Day 1
Consideration of Bill, as amended in the Public Bill Committee
[Relevant Documents: Report of the Joint Committee on the Draft Online Safety Bill, Session 2021-22: Draft Online Safety Bill, HC 609, and the Government Response, CP 640; Letter from the Minister for Tech and the Digital Economy to the Chair of the Joint Committee on Human Rights relating to the Online Safety Bill, dated 16 June 2022; Letter from the Chair of the Joint Committee on Human Rights to the Secretary of State for Digital, Culture, Media and Sport relating to the Online Safety Bill, dated 19 May 2022; First Report of the Digital, Culture, Media and Sport Committee, Amending the Online Safety Bill, HC 271]
New Clause 19
Duties to protect news publisher content
(1) This section sets out the duties to protect news publisher content which apply in relation to Category 1 services.
(2) Subject to subsections (4), (5) and (8), a duty, in relation to a service, to take the steps set out in subsection (3) before—
(a) taking action in relation to content present on the service that is news publisher content, or
(b) taking action against a user who is a recognised news publisher.
(3) The steps referred to in subsection (2) are—
(a) to give the recognised news publisher in question a notification which—
(i) specifies the action that the provider is considering taking,
(ii) gives reasons for that proposed action by reference to each relevant provision of the terms of service,
(iii) where the proposed action relates to news publisher content that is also journalistic content, explains how the provider took the importance of the free expression of journalistic content into account when deciding on the proposed action, and
(iv) specifies a reasonable period within which the recognised news publisher may make representations,
(b) to consider any representations that are made, and
(c) to notify the recognised news publisher of the decision and the reasons for it (addressing any representations made).
(4) If a provider of a service reasonably considers that the provider would incur criminal or civil liability in relation to news publisher content present on the service if it were not taken down swiftly, the provider may take down that content without having taken the steps set out in subsection (3).
(5) A provider of a service may also take down news publisher content present on the service without having taken the steps set out in subsection (3) if that content amounts to a relevant offence (see section 52 and also subsection (10) of this section).
(6) Subject to subsection (8), if a provider takes action in relation to news publisher content or against a recognised news publisher without having taken the steps set out in subsection (3), a duty to take the steps set out in subsection (7).
(7) The steps referred to in subsection (6) are—
(a) to swiftly notify the recognised news publisher in question of the action taken, giving the provider’s justification for not having first taken the steps set out in subsection (3),
(b) to specify a reasonable period within which the recognised news publisher may request that the action is reversed, and
(c) if a request is made as mentioned in paragraph (b)—
(i) to consider the request and whether the steps set out in subsection (3) should have been taken prior to the action being taken,
(ii) if the provider concludes that those steps should have been taken, to swiftly reverse the action, and
(iii) to notify the recognised news publisher of the decision and the reasons for it (addressing any reasons accompanying the request for reversal of the action).
(8) If a recognised news publisher has been banned from using a service (and the ban is still in force), the provider of the service may take action in relation to news publisher content present on the service which was generated or originally published or broadcast by the recognised news publisher without complying with the duties set out in this section.
(9) For the purposes of subsection (2)(a), a provider is not to be regarded as taking action in relation to news publisher content in the following circumstances—
(a) a provider takes action in relation to content which is not news publisher content, that action affects related news publisher content, the grounds for the action only relate to the content which is not news publisher content, and it is not technically feasible for the action only to relate to the content which is not news publisher content;
(b) a provider takes action against a user, and that action affects news publisher content that has been uploaded to or shared on the service by the user.
(10) Section (Providers’ judgements about the status of content) (providers’ judgements about the status of content) applies in relation to judgements by providers about whether news publisher content amounts to a relevant offence as it applies in relation to judgements about whether content is illegal content.
(11) OFCOM’s guidance under section (Guidance about illegal content judgements) (guidance about illegal content judgements) must include guidance about the matters dealt with in section (Providers’ judgements about the status of content) as that section applies by reason of subsection (10).
(12) Any provision of the terms of service has effect subject to this section.
(13) In this section—
(a) references to “news publisher content” are to content that is news publisher content in relation to the service in question;
(b) references to “taking action” in relation to content are to—
(i) taking down content,
(ii) restricting users’ access to content, or
(iii) taking other action in relation to content (for example, adding warning labels to content);
(c) references to “taking action” against a person are to giving a warning to a person, or suspending or banning a person from using a service, or in any way restricting a person’s ability to use a service.
(14) Taking any step set out in subsection (3) or (7) does not count as “taking action” for the purposes of this section.
(15) See—
section 16 for the meaning of “journalistic content”;
section 49 for the meaning of “news publisher content”;
section 50 for the meaning of “recognised news publisher”.”—(Damian Collins.)
Member’s explanatory statement
This new clause requires providers to notify a recognised news publisher and provide a right to make representations before taking action in relation to news publisher content or against the publisher (except in certain circumstances), and to notify a recognised news publisher after action is taken without that process being followed and provide an opportunity for the publisher to request that the action is reversed.
Brought up, and read the First time.
12:50
Mr Speaker

With this it will be convenient to discuss the following:

New clause 2—Secretary of State’s powers to suggest modifications to a code of practice

“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.

(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.

(3) The Secretary of State may only write to OFCOM twice under this section for each code.

(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.

(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”

New clause 3—Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes violence against women and girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).”

This new clause applies the provisions on priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

New clause 4—Duty about content advertising or facilitating prostitution: Category 1 and Category 2B services

“(1) A provider of a Category 1 or Category 2B service must operate the service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of a Category 1 or Category 2B service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one Category 1 or Category 2B service, the duties set out in this section apply in relation to each such service.

(4) The duties set out in this section extend only to the design, operation and use of a Category 1 or Category 2B service in the United Kingdom.

(5) For the meaning of ‘Category 1 service’ and ‘Category 2B service’, see section 81 (register of categories of services).

(6) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 5—Duty about content advertising or facilitating prostitution: Category 2A services

“(1) A provider of a Category 2A service must operate that service so as to minimise the risk of individuals encountering content which advertises or facilitates prostitution in or via search results of the service.

(2) A provider of a Category 2A service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The reference to encountering content which advertises or facilitates prostitution “in or via search results” of a search service does not include a reference to encountering such content as a result of any subsequent interactions with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a Category 2A service in the United Kingdom.

(6) For the meaning of ‘Category 2A service’, see section 81 (register of categories of services).

(7) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 6—Duty about content advertising or facilitating prostitution: internet services providing pornographic content

“(1) A provider of an internet service within the scope of section 67 of this Act must operate that service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of an internet service under this section must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one internet service under this section, the duties set out in this section apply in relation to each such service.

(4) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 8—Duties about advertisements for cosmetic procedures

“(1) A provider of a regulated service must operate the service using systems and processes designed to—

(a) prevent individuals from encountering advertisements for cosmetic procedures that do not meet the conditions specified in subsection (3);

(b) minimise the length of time for which any such advertisement is present;

(c) where the provider is alerted by a person to the presence of such an advertisement, or becomes aware of it in any other way, swiftly take it down.

(2) A provider of a regulated service must include clear and accessible provisions in the terms of service giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The conditions under subsection (1)(a) are that the advertisement—

(a) contains a disclaimer as to the health risks of the cosmetic procedure, and

(b) includes a certified service quality indicator.

(4) If a person is the provider of more than one regulated service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a regulated service in the United Kingdom.

(6) For the meaning of ‘regulated service’, see section 3 (‘Regulated service’, ‘Part 3 service’ etc).”

This new clause would place a duty on all internet service providers regulated by the Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain a disclaimer as to the health risks of the procedure or include a certified service quality indicator.

New clause 9—Content harmful to adults risk assessment duties: regulated search services

“(1) This section sets out the duties about risk assessments which apply in relation to all regulated search services.

(2) A duty to carry out a suitable and sufficient priority adults risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adult risk assessment relating to the impacts of that proposed change.

(5) An ‘adults risk assessment’ of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the level of risk of individuals who are users of the service encountering each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) risks presented by algorithms used by the service, and the way that the service indexes, organises and presents search results;

(b) the level of risk of functionalities of the service facilitating individuals encountering search content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(c) the nature, and severity, of the harm that might be suffered by individuals from the matters identified in accordance with paragraphs (a) and (b);

(d) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section, references to risk profiles are to the risk profiles for the time being published under section 84 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) See also—section 20(2) (records of risk assessments), and Schedule 3 (timing of providers’ assessments).”

New clause 10—Safety Duties Protecting Adults: regulated search services

“(1) This section sets out the duties about protecting adults which apply in relation to all regulated search services.

(2) A duty to summarise in the policies of the search service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the search service policies specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.

(4) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way the search engine is operated and used as well as search content of the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—

(a) regulatory compliance and risk management arrangements,

(b) design of functionalities, algorithms and other features relating to the search engine,

(c) functionalities allowing users to control the content they encounter in search results,

(d) content prioritisation and ranking,

(e) user support measures, and

(f) staff policies and practices.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the policies included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the policies in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) A duty to ensure that the provisions of the publicly available statement referred to in subsections (5) and (7) are clear and accessible.

(9) In this section—

‘adults’ risk assessment’ has the meaning given by section 12;

‘non-designated content that is harmful to adults’ means content that is harmful to adults other than priority content that is harmful to adults.”

New clause 18—Child user empowerment duties

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section ‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.”

New clause 24—Category 1 services: duty not to discriminate against, harass or victimise service users

“(1) The following duties apply to all providers of Category 1 services.

(2) A duty not to discriminate, on the grounds of a protected characteristic, against a person wishing to use the service by not providing the service, if the result of not providing the service is to cause harm to that person.

(3) A duty not to discriminate, on the grounds of a protected characteristic, against any user of the service in a way that causes harm to the user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(4) A duty not to harass, on the grounds of a protected characteristic, a user of the service in a way that causes harm to the user.

(5) A duty not to victimise because of a protected characteristic a person wishing to use the service by not providing the user with the service, if the result of not providing the service is to cause harm to that person.

(6) A duty not to victimise a service user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(7) In this section—

references to harassing, discriminating or victimising have the same meaning as set out in Part 2 of the Equality Act 2010;

‘protected characteristic’ means a characteristic listed in section 4 of the Equality Act 2010.”

This new clause would place a duty, regulated by Ofcom, on Category 1 service providers not to discriminate against, harass or victimise users of their services on the basis of a protected characteristic if doing so would result in them being caused harm. Discrimination, harassment and victimisation, and protected characteristics, have the same meaning as in the Equality Act 2010.

New clause 25—Report on duties that apply to all internet services likely to be accessed by children

“(1) Within 12 months of this Act receiving Royal Assent, the Secretary of State must commission an independent evaluation of the matters under subsection (2) and must lay the report of the evaluation before Parliament.

(2) The evaluation under subsection (1) must consider whether the following duties should be imposed on all providers of services on the internet that are likely to be accessed by children, other than services regulated by this Act—

(a) duties similar to those imposed on regulated services by sections 10 and 25 of this Act to carry out a children’s risk assessment, and

(b) duties similar to those imposed on regulated services by sections 11 and 26 of this Act to protect children’s online safety.”

This new clause would require the Secretary of State to commission an independent evaluation of whether all providers of internet services likely to be accessed by children should be subject to child safety duties and required to carry out a children’s risk assessment.

New clause 26—Safety by design

“(1) In exercising their functions under this Act—

(a) The Secretary of State, and

(b) OFCOM

must have due regard to the principles in subsections (2)-(3).

(2) The first principle is that providers of regulated services should design those services to prevent harmful content from being disseminated widely, and that this is preferable in the first instance to both—

(a) removing harmful content after it has already been disseminated widely, and

(b) restricting which users can access the service or part of it on the basis that harmful content is likely to disseminate widely on that service.

(3) The second principle is that providers of regulated services should safeguard freedom of expression and participation, including the freedom of expression and participation of children.”

This new clause requires the Secretary of State and Ofcom to have due regard to the principle that internet services should be safe by design.

New clause 27—Publication of risk assessments

“Whenever a Category 1 service carries out any risk assessment pursuant to Part 3 of this Act, the service must publish the risk assessment on the service’s website.”

New clause 38—End-to-end encryption

“Nothing in this Act shall prevent providers of user-to-user services protecting their users’ privacy through end-to-end encryption.”

Government amendment 57.

Amendment 202, in clause 6, page 5, line 11, at end insert—

“(ba) the duty about pornographic content set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that user-to-user services must meet the new duties set out in NS1.

Government amendments 163, 58, 59 and 60.

Amendment 17, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Amendment 15, in clause 8, page 7, line 14, at end insert—

“(5A) The duties set out in this section apply in respect of content which reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

This amendment extends the illegal content risk assessment duties to cover content which could be foreseen to facilitate or aid the discovery or dissemination of CSEA content.

Government amendments 61 and 62.

Amendment 18, page 7, line 30 [Clause 9], at end insert—

“(none) ‘, including by being directed while on the service towards priority illegal content hosted by a different service;’

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 16, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 19, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content.”

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Government amendments 63 to 67.

Amendment 190, page 10, line 11, in clause 11, at end insert “, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Government amendments 68 and 69.

Amendment 42, page 11, line 16, in clause 11, at end insert—

“(c) the benefits of the service to children’s well-being.”

Amendment 151, page 12, line 43, leave out Clause 13.

This amendment seeks to remove Clause 13 from the Bill.

Government amendment 70.

Amendment 48, page 13, line 5, in clause 13, leave out “is to be treated” and insert

“the provider decides to treat”

This amendment would mean that providers would be free to decide how to treat content that has been designated ‘legal but harmful’ to adults.

Amendment 49, page 13, line 11, in clause 13, at end insert—

‘(ca) taking no action;”

This amendment provides that providers would be free to take no action in response to content referred to in subsection (3).

Government amendments 71 and 72.

Amendment 157, page 14, line 11, in clause 14, leave out subsections (6) and (7).

This amendment is consequential to Amendment 156, which would require all users of Category 1 services to be verified.

Government amendments 73, 164, 74 and 165.

Amendment 10, page 16, line 16, in clause 16, leave out from “or” until the end of line 17.

Government amendments 166 and 167.

Amendment 50, page 20, line 21, in clause 19, at end insert—

“(6A) A duty to include clear provision in the terms of service that the provider will not take down, or restrict access to, content generated, uploaded or shared by a user save where it reasonably concludes that—

(a) the provider is required to do so pursuant to the provisions of this Act, or

(b) it is otherwise reasonable and proportionate to do so.”

This amendment sets out a duty for providers to include in terms of service a commitment not to take down or restrict access to content generated, uploaded or shared by a user except in particular circumstances.

Government amendment 168.

Amendment 51, page 20, line 37, in clause 19, at end insert—

“(10) In any claim for breach of contract brought in relation to the provisions referred to in subsection (7), where the breach is established, the court may make such award by way of compensation as it considers appropriate for the removal of, or restriction of access to, the content in question.”

This amendment means that where a claim is made for a breach of the terms of service resulting from Amendment 50, the court has the power to award such compensation as it considers appropriate.

Government amendment 169.

Amendment 47, page 22, line 10, in clause 21, at end insert—

“(ba) the adults’ risk assessment duties in section (Content harmful to adults risk assessment duties: regulated search services),

(bb) the safety duties protecting adults in section (Safety duties protecting adults: regulated search services).”

Government amendments 75 to 82.

Amendment 162, page 31, line 19, in clause 31, leave out “significant”

This amendment removes the requirement for there to be a “significant” number of child users, and replaces it with “a number” of child users.

Government amendments 85 to 87.

Amendment 192, page 36, line 31, in clause 37, at end insert—

“(ha) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 about codes of practice.

Amendment 44, page 37, line 25, in clause 39, leave out from beginning to the second “the” in line 26.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 45, page 38, line 8, leave out Clause 40.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 13, page 38, line 12, in clause 40, leave out paragraph (a).

Amendment 46, page 39, line 30, leave out Clause 41.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 14, page 39, line 33, in clause 41, leave out subsection (2).

Amendment 21, page 40, line 29, in clause 43, leave out “may require” and insert “may make representations to”

Amendment 22, page 40, line 33, in clause 43, at end insert—

‘(2A) OFCOM must have due regard to representations by the Secretary of State under subsection (2).”

Government amendments 88 to 89 and 170 to 172.

Amendment 161, page 45, line 23, in clause 49, leave out paragraph (d).

This amendment removes the exemption for one-to-one live aural communications.

Amendment 188, page 45, line 24, in clause 49, leave out paragraph (e).

This amendment removes the exemption for comments and reviews on provider content.

Government amendments 90 and 173.

Amendment 197, page 47, line 12, in clause 50, after “material” insert

“or special interest news material”.

Amendment 11, page 47, line 19, in clause 50, after “has” insert “suitable and sufficient”.

Amendment 198, page 47, line 37, in clause 50, leave out the first “is” and insert

“and special interest news material are”.

Amendment 199, page 48, line 3, in clause 50, at end insert—

““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”

Amendment 12, page 48, line 7, in clause 50, after “a” insert “suitable and sufficient”.

Government amendments 91 to 94.

Amendment 52, page 49, line 13, in clause 52, leave out paragraph (d).

This amendment limits the list of relevant offences to those specifically specified.

Government amendments 95 to 100.

Amendment 20, page 51, line 3, in clause 54, at end insert—

‘(2A) Priority content designated under subsection (2) must include—

(a) content that contains public health related misinformation or disinformation, and

(b) misinformation or disinformation that is promulgated by a foreign state.”

This amendment would require the Secretary of State’s designation of “priority content that is harmful to adults” to include public health-related misinformation or disinformation, and misinformation or disinformation spread by a foreign state.

Amendment 53, page 51, line 47, in clause 55, after “State” insert “reasonably”.

This amendment, together with Amendment 54, would mean that the Secretary of State must reasonably consider the risk of harm to each one of an appreciable number of adults before specifying a description of the content.

Amendment 54, page 52, line 1, in clause 55, after “to” insert “each of”.

This amendment is linked to Amendment 53.

Amendment 55, page 52, line 12, in clause 55, after “OFCOM” insert

“, Parliament and members of the public in a manner the Secretary of State considers appropriate”.

This amendment requires the Secretary of State to consult Parliament and the public, as well as Ofcom, in a manner the Secretary of State considers appropriate before making regulations about harmful content.

Government amendments 147 to 149.

Amendment 43, page 177, line 23, in schedule 4, after “ages” insert

“, including the benefits of the service to their well-being,”

Amendment 196, page 180, line 9, in schedule 4, at end insert—

Amendment 187, page 186, line 32, in schedule 7, at end insert—

Human trafficking

22A An offence under section 2 of the Modern Slavery Act 2015.”

This amendment includes Human Trafficking as a priority offence.

Amendment 211, page 187, line 23, in schedule 7, at end insert—

Government new clause 14.

Government new clause 15.

Government amendments 83 to 84.

Amendment 156, page 53, line 7, in clause 57, leave out subsections (1) and (2) and insert—

‘(1) A provider of a Category 1 service must require all adult users of the service to verify their identity in order to access the service.

(2) The verification process—

(a) may be of any kind (and in particular, it need not require documentation to be provided),

(b) must—

(i) be carried out by a third party on behalf of the provider of the Category 1 service,

(ii) ensure that all anonymous users of the Category 1 service cannot be identified by other users, apart from where provided for by section (Duty to ensure anonymity of users).”

This amendment would require all users of Category 1 services to be verified. The verification process would have to be carried out by a third party and to ensure the anonymity of users.

Government amendment 101.

Amendment 193, page 58, line 33, in clause 65, at end insert—

“(ea) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 in respect of guidance about transparency reports.

Amendment 203, page 60, line 33, in clause 68, at end insert—

‘(2B) A duty to meet the conditions set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that commercial pornographic websites must meet the new duties set out in NS1.

Government amendments 141, 177 to 184, 142 to 145, 185 to 186 and 146.

New schedule 1—Additional duties on pornographic content

“30 All user-to-user services and an internet service which provides regulated provider pornographic content must meet the following conditions for pornographic content and content that includes sexual photographs and films (“relevant content”).

The conditions are—

(a) the service must not contain any prohibited material,

(b) the service must review all relevant content before publication.

31 In this Schedule—

“photographs and films” has the same meaning as in section 34 of the Criminal Justice and Courts Act 2015 (meaning of “disclose” and “photograph or film”)

“prohibited material” has the same meaning as in section 368E(3) of the Communications Act 2003 (harmful material).”

The new schedule sets out additional duties for pornographic content which apply to user-to-user services under Part 3 and commercial pornographic websites under Part 5.

Government amendments 150 and 174.

Amendment 191, page 94, line 24, in clause 12, at end insert—

“Section [Category 1 services: duty not to discriminate against, harass or victimise service users] Duty not to discriminate against, harass or victimise”

This amendment makes NC24 an enforceable requirement.

Government amendment 131.

Mr Speaker

I welcome the new Minister to the Dispatch Box.

Damian Collins

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

On the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend the Member for Haltemprice and Howden (Mr Davis)? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.

Damian Collins

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

Mr Speaker

And the other way, as well.

Damian Collins

Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.

We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

Dame Maria Miller (Basingstoke) (Con)

My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend the Member for Croydon South (Chris Philp), who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.

Damian Collins

My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address.

Joanna Cherry (Edinburgh South West) (SNP)

Does the Minister agree that, in moderating content, category 1 service providers such as Twitter should be bound by the duties under our domestic law not to discriminate against anyone on the grounds of a protected characteristic? Will he take a look at the amendments I have brought forward today on that point, which I had the opportunity of discussing with his predecessor, who I think was sympathetic?

Damian Collins

The hon. and learned Lady makes a very important point. The legislation sets regulatory thresholds at the criminal law level based on existing offences in law. Many of the points she made are covered by existing public law offences, particularly in regards to discriminating against people based on their protected characteristics. As she well knows, the internet is a reserved matter, so the legal threshold is set at where UK law stands, but where law may differ in Scotland, the police authorities in Scotland can still take action against individuals in breach of the law.

Joanna Cherry

The difficulty is that Twitter claims it is not covered by the Equality Act 2010. I have seen legal correspondence to that effect. I am not talking about the criminal law here. I am talking about Twitter’s duty not to discriminate against women, for example, or those who hold gender critical beliefs in its moderation of content. That is the purpose of my amendment today—it would ensure that Twitter and other service providers providing a service in the United Kingdom abide by our domestic law. It is not really a reserved or devolved matter.

Damian Collins

The hon. and learned Lady is right. There are priority offences where the companies, regardless of their terms of service, have to meet their obligations. If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold them to account for that. They also have to be transparent in their terms of service as category 1 companies. If they have clear policies against discrimination, which they on the whole all do, they will have to set out what they would do, and the regulator can hold them to account to make sure they do what they say. The regulator cannot make them take down speech that is legal or below a criminal threshold, but they can hold them to account publicly for the decisions they make.

One of the most important aspects of this Bill with regard to the category 1 companies is transparency. At the moment, the platforms make decisions about curating their content—who to take down, who to suppress, who to leave up—but those are their decisions. There is no external scrutiny of what they do or even whether they do what they say they will do. As a point of basic consumer protection law, if companies say in their terms of service that they will do something, they should be held to account for it. What is put on the label also needs to be in the tin and that is what the Bill will do for the internet.

I now want to talk about journalism and the role of the news media in the online world, which is a very important part of this Bill. The Government are committed to defending the invaluable role of a free media. Online safety legislation must protect the vital role of the press in providing people with reliable and accurate sources of information. Companies must therefore put in place protections for journalistic content. User-to-user services will not have to apply their safety duties in part 3 of the Bill to news publishers’ content shared on their services. News publishers’ content on their own sites will also not be in scope of regulation.

12:59
New clause 19 and associated amendments introduce a further requirement on category 1 services to notify a recognised news publisher and offer a right of appeal before removing or moderating its content or taking any action against its account. This new provision will reduce the risk of major online platforms taking over-zealous, arbitrary or accidental moderation decisions against news publisher content, which plays an invaluable role in UK democracy and society.
We recognise that there are cases where platforms must be able to remove content without having to provide an appeal, and the new clause has been drafted to ensure that platforms will not be required to provide an appeal before removing content that would give rise to civil or criminal liability to the service itself, or where it amounts to a relevant offence as defined by the Bill. This means that platforms can take down without an appeal content that would count as illegal content under the Bill.
Moreover, in response to some of the concerns raised, in particular by my right hon. and learned Friend the Member for Kenilworth and Southam as well as by other Members, about the danger of creating an inadvertent loophole for bad actors, we have committed to further tightening the definition of “recognised news provider” in the House of Lords to ensure that sanctioned entities, such as RT, cannot benefit from these protections.
As the legislation comes into force, the Government are committed to ensuring that protections for journalism and news publisher content effectively safeguard users’ access to such content. We have therefore tabled amendments 167 and 168 to require category 1 companies to assess the impact of their safety duties on how news publisher and journalistic content are treated when hosted on the service. They must then demonstrate the steps they are taking to mitigate any impact.
In addition, a series of amendments, including new clause 20, will require Ofcom to produce a report assessing the impact of the Online Safety Bill on the availability and treatment of news publisher content and journalistic content on category 1 services. This will include consideration of the impact of new clause 19, and Ofcom must do this within two years of the relevant provisions being commenced.
The Bill already excludes comments sections on news publishers’ sites from the Bill’s safety duties. These comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability of the news media. We have tabled a series of amendments to strengthen these protections, reflecting the Government’s commitment to media freedom. The amendments will create a higher bar for removing the protections in place for comments sections on recognised news publishers’ sites by ensuring that these can only be brought into the scope of regulation via primary legislation.
Government amendments 70 and 71 clarify the policy intention of the clause 13 adult safety duties to improve transparency about how providers treat harmful content, rather than incentivise its removal. The changes respond to concerns raised by stakeholders that the drafting did not make it sufficiently clear that providers could choose simply to allow any form of legal content, rather than promote, restrict or remove it, regardless of the harm to users.
This is a really important point that has sometimes been missed in the discussion on the Bill. There are very clear duties relating to illegal harm that companies must proactively identify and mitigate. The transparency requirements for other harmful content are very clear that companies must set out what their policies are. Enforcement action can be taken by the regulator for breach of their policies, but the primary objective is that companies make clear what their policies are. It is not a requirement for companies to remove legal speech if their policies do not allow that.
Dame Margaret Hodge (Barking) (Lab)

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

Damian Collins

The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

Dame Margaret Hodge

I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

Damian Collins

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that, while a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

Chris Philp (Croydon South) (Con)

I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?

Damian Collins

I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

I congratulate the Minister on his promotion and on his excellent chairmanship of the prelegislative scrutiny Committee, which I also served on. Is he satisfied with the Bill in relation to disinformation? It was concerning that there was only one clause on disinformation, and we know the impact—particularly the democratic impact—that that has on our society at large. Is he satisfied that the Bill will address that?

Damian Collins

It was a pleasure to serve alongside the hon. Lady on the Joint Committee. There are clear new offences relating to knowingly false information that will cause harm. As she will know, that was a Law Commission recommendation; it was not in the draft Bill but it is now in the Bill. The Government have also said that as a consequence of the new National Security Bill, which is going through Parliament, we will bring in a new priority offence relating to disinformation spread by hostile foreign states. As she knows, one of the most common areas for organised disinformation has been at state level. As a consequence of the new national security legislation, that will also be reflected in schedule 7 of this Bill, and that is a welcome change.

The Bill requires all services to take robust action to tackle the spread of illegal content and activity. Providers must proactively reduce the risk on their services of illegal activity and the sharing of illegal content, and they must identify and remove illegal content once it appears on their services. That is a proactive responsibility. We have tabled several interrelated amendments to reinforce the principle that companies must take a safety-by-design approach to managing the risk of illegal content and activity on their services. These amendments require platforms to assess the risk of their services being used to commit, or to facilitate the commission of, a priority offence and then to design and operate their services to mitigate that risk. This will ensure that companies put in place preventive measures to mitigate a broad spectrum of factors that enable illegal activity, rather than focusing solely on the removal of illegal content once it appears.

Henry Smith (Crawley) (Con)

I congratulate my hon. Friend on his appointment to his position. On harmful content, there are all too many appalling examples of animal abuse on the internet. What are the Government’s thoughts on how we can mitigate such harmful content, which is facilitating wildlife crime? Might similar online protections be provided for animals to the ones that clause 53 sets out for children?

Damian Collins

My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any illegal activity that could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

Kirsty Blackman (Aberdeen North) (SNP)

I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

Damian Collins

As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look like and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

13:15
Dame Caroline Dinenage (Gosport) (Con)

I congratulate my hon. Friend on taking his new position; we rarely have a new Minister so capable of hitting the ground running. He makes a crucial point about clarity and transparency for both users and the social media providers and other platforms, because it is important that we make sure they are 100% clear about what is expected of them and the penalties for not fulfilling their commitments. Does he agree that opaqueness—a veil of secrecy—has been one of the obstacles, and that a whole raft of content has been taken down for the wrong reasons while other content has been left to proliferate because of the lack of clarity?

Damian Collins

That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.

Alex Davies-Jones (Pontypridd) (Lab)

It is an honour to respond on the first group of amendments on behalf of the Opposition.

For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.

With that in mind, it must be said that time and again this Government claim that the legislation is world-leading but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.

I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.

We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.

Debbie Abrahams

Not only that: people migrate from one platform to another, a fact that just has not been reflected on by the Government.

Alex Davies-Jones

My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.

Damian Collins

I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.

Alex Davies-Jones

I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp) also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.

Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent based on information that is

“reasonably available to a provider”,

with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, make it impractical for them reliably and consistently to identify illegal content.

The second problem arises from the fact that the platforms will need to have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.

That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.

Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made

“on the basis of all relevant information that is reasonably available to a provider.”

However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.

I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.

We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.

Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a

“tsunami of online child abuse”.

We now have the first ever opportunity to legislate for a safer world online for our children.

However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occurs on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, who they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.

I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:

“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”

I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.

It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.

Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that

“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.

Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad of powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary levels of interference and threats to the independence of Ofcom that arise from the powers of direction to Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.

Chris Philp

I thank the shadow Minister for giving way—I will miss our exchanges across the Dispatch Box. She is making a point about the Secretary of State powers in, I think, clause 40. Is she at all reassured by the undertakings given in the written ministerial statement tabled by the Secretary of State last Thursday, in which the Government committed to amending the Bill in the Lords to limit the use of those powers to exceptional circumstances only, and precisely defined those circumstances as only being in connection with issues such as public health and public safety?

Alex Davies-Jones

I thank the former Minister for his intervention, and I am grateful for that clarification. We debated at length in Committee the importance of the regulator’s independence and the prevention of overarching Secretary of State powers, and of Parliament having a say and being reconvened if required. I welcome the fact that that limitation on the power will be tabled in the other place, but it should have been tabled as an amendment here so that we could have discussed it today. We should not have to wait for the Bill to go to the other place for us to have our say. Who knows what will happen to the Bill tomorrow, next week or further down the line with the Government in utter chaos? We need this to be done now. The Minister must recognise that this is an unparalleled level of power, and one with which the sector and Back Benchers in his own party disagree. Let us work together and make sure the Bill really is fit for purpose, and that Ofcom is truly independent and without interference and has the tools available to it to really create meaningful change and keep us all safe online once and for all.

13:30
I must put on record my support for amendments 11 and 12, tabled by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). In Committee, we heard multiple examples of racist, extremist and other harmful publishers, from Holocaust deniers to white supremacists, who would stand to benefit from the recognised news publisher exemption as it stands, either overnight or by making minor administrative changes. As long as the exemption protects antisemites and extremists, it is not fit for purpose. That much should be clear to all of us. In Committee, in response to an amendment tabled by my hon. Friend the Member for Batley and Spen (Kim Leadbeater), the then Minister promised a concession so that Russia Today would be excluded from the recognised news publisher exemption. I welcome the Minister’s comments at the Dispatch Box today to confirm that. I am pleased that the Government have promised to exclude sanctioned news bodies such as Russia Today, but their approach does not go far enough. Disinformation outlets rarely have the profile of Russia Today.
Andrew Percy (Brigg and Goole) (Con)

While the shadow Minister is on the subject of exemptions for antisemites, will she say where the Opposition are on the issue of search? Search platforms and search engines provide some of the most appalling racist, Islamophobic and antisemitic content.

Alex Davies-Jones

I thank the hon. Gentleman, who is absolutely right. In Committee, we debated at length the impact search engines have, and they should be included in the Bill’s categorisation of difficult issues. In one recent example on a search engine, the imagery that comes up when we search for desk ornaments is utterly appalling and needs to be challenged and changed. If we are to truly tackle antisemitism, racism and extremist content online, then the provisions need to be included in the Bill, and journalistic exemptions should not apply to this type of content. Often, they operate more discreetly and are less likely to attract sanctions. Furthermore, any amendment will provide no answer to the many extremist publishers who seek to exploit the terms of the exemption. For those reasons, we need to go further.

The amendments are not a perfect or complete solution. Deficiencies remain, and the amendments do not address the fact that the exemption continues to exclude dozens of independent local newspapers around the country on the arbitrary basis that they have no fixed address. The Independent Media Association, which represents news publishers, describes the news publisher criteria as

“punishing quality journalism with high standards”.

I hope the Minister will reflect further on that point. As a priority, we need to ensure that the exemption cannot be exploited by bad actors. We must not give a free pass to those propagating racist, misogynistic or antisemitic harm and abuse. By requiring some standards of accountability for news providers, however modest, the amendments are an improvement on the Bill as drafted. In the interests of national security and the welfare of the public, we must support the amendments.

Finally, I come to a topic that I have spoken about passionately in this place on a number of occasions and that is extremely close to my heart: violence against women and girls. Put simply, in their approach to the Bill the Government are completely failing and falling short in their responsibilities to keep women and girls safe online. Labour has been calling for better protections for some time now, yet still the Government are failing to see the extent of the problem. They have only just published an initial indicative list of priority harms to adults, in a written statement that many colleagues may have missed. While it is claimed that this will add to scrutiny and debate, the final list of harms will not be on the face of the Bill but will be included in secondary legislation after the Bill has received Royal Assent. Non-designated content that is harmful will not require action on the part of service providers, even though by definition it is still extremely harmful. How can that be acceptable?

Many campaigners have made the case that protections for women and girls are not included in the draft Bill at all, a concern supported by the Petitions Committee in its report on online abuse. Schedule 7 includes a list of sexual offences and aggravated offences, but the Government have so far made no concessions here and the wider context of violence against women and girls has not been addressed. That is why I urge the Minister to carefully consider our new clause 3, which seeks to finally name violence against women and girls as a priority harm. The Minister’s predecessor said in Committee that women and girls receive “disproportionate” levels of abuse online. The Minister in his new role will likely be well briefed on the evidence, and I know this is an issue he cares passionately about. The case has been put forward strongly by hon. Members on all sides of the House, and the message is crystal clear: women and girls must be protected online, and we see this important new clause as the first step.

Later on, we hope to see the Government move further and acknowledge that there must be a code of practice on tackling violence against women and girls content online.

Dame Maria Miller

The hon. Lady raises the issue of codes of practice. She will recall that in Committee we talked about that specifically and pressed the then Minister on that point. It became very clear that Ofcom would be able to issue a code of practice on violence against women and girls, which she talked about. Should we not be seeking an assurance that Ofcom will do that? That would negate the need to amend the Bill further.

Alex Davies-Jones

I welcome the right hon. Lady’s comments. We did discuss this at great length in Committee, and I know she cares deeply and passionately about this issue, as do I. It is welcome that Ofcom can issue a code of practice on violence against women and girls, and we should absolutely be urging it to do that, but we also need to make it a fundamental aim of the Bill. If the Bill is to be truly world leading, if it is truly to make us all safe online, and if we are finally to begin to tackle the scourge of violence against women and girls in all its elements—not just online but offline—then violence against women and girls needs to be named as a priority harm in the Bill. We need to take the brave new step of saying that enough is enough. Words are not enough. We need actions, and this is an action the Minister could take.

Damian Collins

I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.

Alex Davies-Jones

I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to tackle a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.

Mr David Davis (Haltemprice and Howden) (Con)

I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.

I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.

We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from the hon. Member for Pontypridd (Alex Davies-Jones), who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.

There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.

One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend the Member for Croydon South (Chris Philp), the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.

When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.

Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practice raised by the Opposition Front Bencher, the hon. Member for Pontypridd should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.

The Bill’s factsheet, which is still on the Government’s website, states on page 1:

“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.

This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.

Damian Collins

As I said, a company may be asked to address this, which means that it has to set out what its policies are, how it would deal with that content and its terms of service. The Bill does not require a company to remove legal speech that it has no desire to remove. The regulator cannot insist on that, nor can the Government or the Bill. There is nothing to make legal speech online illegal.

Mr Davis

That is exactly what the Minister said earlier and, indeed, said to me yesterday when we spoke about this issue. I do not deny that, but this line of argument ignores the unintended consequences that the Bill may have. Its stated aim is to achieve reductions in online harm, not just illegal content. Page 106 of the Government’s impact assessment lists a reduction in the prevalence of legal but harmful content as a “key evaluation” question. The Bill aims to reduce that—the Government say that both in the online guide and the impact assessment. The impact assessment states that an increase in “content moderation” is expected because of the Bill.

A further concern is that the large service providers already have terms and conditions that address so-called legal but harmful content. A duty to state those clearly and enforce them consistently risks legitimising and strengthening the application of those terms and conditions, possibly through automated scanning and removal. That is precisely what happened to me before the Bill was even dreamed of. That was done under an automated system, backed up by somebody in Florida, Manila or somewhere who decided that they did not like what I said. We have to bear in mind how cautious the companies will be. That is especially worrying because, as I said, providers will be under significant pressure from outside organisations to include restrictive terms and conditions. I say this to Conservative Members, and we have some very well-intentioned and very well-informed Members on these Benches: beware of the gamesmanship that will go on in future years in relation to this.

Ofcom and the Department see these measures as transparency measures—that is the line. Lord Michael Grade, who is an old friend of mine, came to see me and he talked about this not as a pressure, but as a transparency measure. However, these are actually pressure measures. If people are made to announce things and talk about them publicly, that is what they become.

It is worth noting that several free speech and privacy groups have expressed scepticism about the provisions, yet they were not called to give oral evidence in Committee. A lot of other people were, including pressure groups on the other side and the tech companies, which we cannot ignore, but free speech advocates were not.

13:45
The clause is also part of the Bill where real democratic scrutiny is missing. Without being too pious about this, the simple truth is that the comparative power of Parliament has diminished over the past decade or two, with respect to Government, and this is another example. The decision on what counts as
“priority content that is harmful to adults”
will initially be made by the Secretary of State and then be subject to the draft affirmative procedure, in a whipped Statutory Instrument Committee. I have never, ever been asked to serve on an SI Committee and the Whips’ Office has never sought me to volunteer to do that—I wonder why. I hasten to add that I am not volunteering to do so now either, but the simple truth is this will be considered in a whipped, selected Committee. When we talked earlier about constraints on the power, we heard comments such as, “We will only do this in the case of security, and so on.” Heavens above, we have been through two years of ferocious controversy on matters of public health security. This is not something that should somehow be protected from free speech, I’m afraid.
We cannot allow such significant curtailments of free expression to take place without proper parliamentary debate or amendment. These questions need to be discussed and decided in the Chamber, if need be, annually. When I first came into the House of Commons, we had an annual Companies Act, because companies law and accounting law were going through change. They were not changing anything like as fast as the internet. The challenges were not coming up anything like as fast as they do with the internet, so why do we not have an annual Bill on this matter? I would be perfectly happy to see that, so that we can make decisions here. If we do not do that, we could do this on an ad hoc basis as the issues arise, including some that the hon. Member for Pontypridd raised. We could have been dealing with that before now on a simpler basis than that of the Bill.
If a category of speech is important enough to be censored, which is what we are really asking for, it is important enough to be debated in this Chamber and by the whole of Parliament—the Commons and the Lords. Otherwise, the Government’s claim that the Bill will protect free speech will appear absurd. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled some amendments relating to the press. Even there, it is incredibly difficult to get this right because of the sheer complexity of the Bill and the size of the problem that we are trying to address. That is why I have tabled amendment 151, which seeks to remove clause 13 entirely, because it introduces the authoritarian concept of “legal but harmful” content—decided by the Government. “Legal but harmful” is a perfectly reasonable concept, but if it is decided by the Government alone, that is authoritarian. This is described as “priority content”, but everybody knows what it actually means. The Government have run away from using “legal but harmful” in a public context, but they use it everywhere else.
My amendments are designed to protect free speech while making the internet a safer place for everyone. I do not want to see content relating to suicide, self-harm or abuse of women, or whatever it may be, and I tabled two amendments to make them explicitly illegal, and the House can decide on those. That is what we should do. That is where the power of the House and the proper judgment lies.
The Bill has significantly improved since it was in draft form and the new Minister has a very honourable history in that reform. I compliment and commend him on that and thank him for those actions. I also welcome the measures taken against such things as cyber-flashing, but more needs to be done. The Bill falls far short of what it needs to be, and it would be remiss of us, in our duty as MPs, to let it pass without serious alteration.
I say this to the Whip on the Front Bench, and I hope that I have his attention: the Bill needs many more days on Report. I hope that he will reflect that back to the Chief Whip at the end of this business, because only with more days can we get it right. This is probably one of the most important Bills to go through this House in this decade, and we have not quite got it right yet.
John Nicolson (Ochil and South Perthshire) (SNP)

I rise to speak to the amendments in my name and those of other right hon. and hon. Members. I welcome the Minister to his place after his much-deserved promotion; as other hon. Members have said, it is great to have somebody who is both passionate and informed as a Minister. I also pay tribute to the hon. Member for Croydon South (Chris Philp), who is sitting on the Back Benches: he worked incredibly hard on the Bill, displayed a mastery of detail throughout the process and was extremely courteous in his dealings with us. I hope that he will be speedily reshuffled back to the Front Bench, which would be much deserved—but obviously not that he should replace the Minister, who I hope will remain in his current position or indeed be elevated from it.

But enough of all this souking, as we say north of the border. As one can see from the number of amendments tabled, the Bill is not only an enormous piece of legislation but a very complex one. Its aims are admirable—there is no reason why this country should not be the safest place in the world to be online—but a glance through the amendments shows how many holes hon. Members think it still has.

The Government have taken some suggestions on board. I welcome the fact that they have finally legislated outright to stop the wicked people who attempt to trigger epileptic seizures by sending flashing gifs; I did not believe that such cruelty was possible until I was briefed about it in preparation for debates on the Bill. I pay particular tribute to wee Zach, whose name is often attached to what has been called Zach’s law.

The amendments to the Bill show that there has been a great deal of cross-party consensus on some issues, on which it has been a pleasure to work with friends in the Labour party. The first issue is addressed, in various ways, by amendments 44 to 46, 13, 14, 21 and 22, which all try to reduce the Secretary of State’s powers under the Bill. In all the correspondence that I have had about the Bill, and I have had a lot, that is the area that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted too many powers under the Bill, which threatens the independence of the regulator. Businesses are also wary of the powers, in part because they cause uncertainty.

The reduction of ministerial powers under the Bill was advised by the Joint Committee on the Draft Online Safety Bill and by the Select Committee on Digital, Culture, Media and Sport, on both of which I served. In Committee, I asked the then Minister whether any stakeholder had come forward in favour of these powers. None had.

Even DCMS Ministers do not agree with the powers. The new Minister was Chair of the Joint Committee, and his Committee’s report said:

“The powers for the Secretary of State to a) modify Codes of Practice to reflect Government policy and b) give guidance to Ofcom give too much power to interfere in Ofcom’s independence and should be removed.”

The Government have made certain concessions with respect to the powers, but they do not go far enough. As the Minister said, the powers should be removed.

We should be clear about exactly what the powers do. Under clause 40, the Secretary of State can

“modify a draft of a code of practice”.

That allows the Government a huge amount of power over the so-called independent communications regulator. I am glad that the Government have listened to the suggestions that my colleagues and I made on Second Reading and in Committee, and have committed to using the power only in “exceptional circumstances” and by further defining “public policy” motives. But “exceptional circumstances” is still too opaque and nebulous a phrase. What exactly does it mean? We do not know. It is not defined—probably intentionally.

The regulator must not be politicised in this way. Several similar pieces of legislation are going through their respective Parliaments or are already in force. In Germany, Australia, Canada, Ireland and the EU, with the Digital Services Act, different Governments have grappled with the issue of making digital regulation future-proof and flexible. None of them has added political powers. The Bill is sadly unique in making such provision.

When a Government have too much influence over what people can say online, the implications for freedom of speech are particularly troubling, especially when the content that they are regulating is not illegal. There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach that these powers give. When we allow the Executive powers over the communications regulator, the protections must be absolute and iron-clad, but as the Bill stands, it gives leeway for abuse of those powers. No matter how slim the Minister feels the chance of that may be, as parliamentarians we must not allow it.

Amendment 187 on human trafficking is an example of a relatively minor change to the Bill that could make a huge difference to people online. Our amendment seeks to deal explicitly with what Meta and other companies refer to as domestic servitude, which is very newsworthy, today of all days, and which we know better as human trafficking. Sadly, this abhorrent practice has been part of our society for hundreds if not thousands of years. Today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.

Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell and co-ordinate the trafficking of young women. One would have thought that the issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported, The Wall Street Journal found that

“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”

I and my friends across the aisle who sat on the DCMS Committee and the Joint Committee on the draft Bill know exactly what it is like to have Facebook’s high heid yins before us. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.

The omission of human trafficking from schedule 7 is especially worrying, because if human trafficking is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know from their previous behaviour that the platforms never do anything that will cost them money unless they are forced to do so. We understand that it is difficult to regulate in respect of human trafficking on platforms: it requires work across borders and platforms, with moderators speaking different languages. It is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and for the costs that will be entailed. If human trafficking is not designated as a priority harm, I fear that it will fall by the wayside.

In Committee, the then Minister said that the relevant legislation was covered by other parts of the Bill and that it was not necessary to incorporate offences under the Modern Slavery Act 2015 into priority illegal content. He referred to the complexity of offences such as modern slavery, and said how illegal immigration and prostitution priority offences might cover that already. That is simply not good enough. Human traffickers use platforms as part of their arsenal at every stage of the process, from luring in victims to co-ordinating their movements and threatening their families. The largest platforms have ample capacity to tackle these problems and must be forced to be proactive. The consequences of inaction will be grave.

Chris Philp

It is a pleasure to follow the hon. Member for Ochil and South Perthshire (John Nicolson).

Let me begin by repeating my earlier congratulations to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on assuming his place on the Front Bench. Let me also take this opportunity to extend my thanks to those who served on the Bill Committee with me for some 50 sitting hours—it was, generally speaking, a great pleasure—and, having stepped down from the Front Bench, to thank the civil servants who have worked so hard on the Bill, in some cases over many years.

13:59
It would be strange if I did not broadly support the Government amendments, given that I have spent most of the last three or four months concocting them. I will touch on one or two of them, and then mention some areas in which I think the House might consider going further when the Bill proceeds to the other end of the building. I certainly welcome new clause 19, which gives specific protection to content generated by news media publishers by ensuring that there is a right of appeal before it can be removed. I take the view—and I think the Government do as well—that protecting freedom of the press is critical, but as we grant news media publishers this special protection, it is important for us to ensure that we are granting it to organisations that actually deserve it.
That, I think, is the purpose of amendments 11 and 12, tabled by my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). The amendments apply to clause 50, which defines the term “recognised news publisher”. During the evidence sessions in Committee, some concern was expressed that the definition was too wide, and that some organisations—“bad actors”, as the Minister put it—might manage to organise themselves in such a way that they would benefit from this exemption. My right hon. and learned Friend’s amendments are designed to tighten that definition a little bit. There is some concern that the drafting of the amendments might effectively give rise to back-door press regulation because determining whether news publishers’ terms and conditions are “suitable and sufficient” constitutes a value judgment, but I certainly agree that clause 50 needs tightening up.
I welcome—unsurprisingly—the reference in the written ministerial statement to tabling an amendment in the House of Lords providing that sanctioned organisations cannot benefit from this exemption. I suggest, however, that their lordships might like to consider going even further, for example by saying that where content amounts to a foreign interference offence as defined by the National Security Bill, introduced by my hon. Friend the Member for North East Hampshire (Mr Jayawardena)—the Under-Secretary of State for International Trade, who is in his place on the Front Bench—the organisation propagating it should not be able to benefit from the “recognised news publisher” exemption. Their lordships may wish to consider that, along with any other ideas for tightening the definition in clause 50.
Let me now say a word about free speech. It has been widely misreported that the Bill mandates censorship of speech that is legal but harmful. As I said in my intervention on the Minister earlier, that is categorically untrue. While the large social media platforms will have to address such content as part of their terms and conditions, they are not compelled in the actions that they have to take in relation to it; they simply have to risk-assess it, adopt a policy—what that policy is will be up to them—and then apply that policy consistently. They are not obliged to take any action, and they are certainly not obliged to remove the content entirely. Lest there should be any doubt about that, Government amendment 71 to clause 13 makes it explicit that it is reasonable to take no action if the platform sees fit.
Joanna Cherry

I hear what the hon. Gentleman is saying, but he will have heard the speech made by his colleague, the right hon. Member for Haltemprice and Howden (Mr Davis). Does he not accept that it is correct to say that there is a risk of an increase in content moderation, and does he therefore see the force of my amendment, which we have previously discussed privately and which is intended to ensure that Twitter and other online service providers are subject to anti-discrimination law in the United Kingdom under the Equality Act 2010?

Chris Philp

I did of course hear what was said by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis). To be honest, I think that increased scrutiny of content which might constitute abuse or harassment, whether of women or of ethnic minorities, is to be warmly welcomed. The Bill provides that the risk assessors must pay attention to the characteristics of the user. There is no cross-reference to the Equality Act—I know the hon. and learned Lady has submitted a request on that, to which my successor Minister will now be responding—but there are references to characteristics in the provisions on safety duties, and those characteristics do of course include gender and race.

In relation to the risk that these duties are over-interpreted or over-applied, for the first time ever there is a duty for social media firms to have regard to freedom of speech. At present these firms are under no obligation to have regard to it, but clause 19(2) imposes such a duty, and anyone who is concerned about free speech should welcome that. Clauses 15 and 16 go further: clause 15 creates special protections for “content of democratic importance”, while clause 16 does the same for content of journalistic importance. So while I hugely respect and admire my right hon. Friend the Member for Haltemprice and Howden, I do not agree with his analysis in this instance.

I would now like to ask a question of my successor. He may wish to refer to it later or write to me, but if he feels like intervening, I will of course give way to him. I note that four Government amendments have been tabled; I suppose I may have authorised them at some point. Amendments 72, 73, 78 and 82 delete some words in various clauses, for example clauses 13 and 15. They remove the words that refer to treating content “consistently”. The explanatory note attached to amendment 72 acknowledges that, and includes a reference to new clause 14, which defines how providers should go about assessing illegal content, what constitutes illegal content, and how content is to be determined as being in one of the various categories.

As far as I can see, new clause 14 makes no reference to treating, for example, legal but harmful content “consistently”. According to my quick reading—without the benefit of highly capable advice—amendments 72, 73, 78 and 82 remove the obligation to treat content “consistently”, and it is not reintroduced in new clause 14. I may have misread that, or misunderstood it, but I should be grateful if, by way of an intervention, a later speech or a letter, my hon. Friend the Minister could give me some clarification.

Damian Collins

I think that the codes of practice establish what we expect the response of companies to be when dealing with priority illegal harm. We would expect the regulator to apply those methods consistently. If my hon. Friend fears that that is no longer the case, I shall be happy to meet him to discuss the matter.

Chris Philp

Clause 13(6)(b), for instance, states that the terms of service must be

“applied consistently in relation to content”,

and so forth. As far as I can see, amendment 72 removes the word “consistently”, and the explanatory note accompanying the amendment refers to new clause 14, saying that it does the work of the previous wording, but I cannot see any requirement to act consistently in new clause 14. Perhaps we could pick that up in correspondence later.

Damian Collins

If there is any area of doubt, I shall be happy to follow it up, but, as I said earlier, I think we would expect that if the regulator establishes through the codes of practice how a company will respond proactively to identify illegal priority content on its platform, it is inherent that that will be done consistently. We would accept the same approach as part of that process. As I have said, I shall be happy to meet my hon. Friend and discuss any gaps in the process that he thinks may exist, but that is what we expect the outcome to be.

Chris Philp

I am grateful to my hon. Friend for his comments. I merely observe that the “consistency” requirements were written into the Bill, and, as far as I can see, are not there now. Perhaps we could discuss it further in correspondence.

Let me turn briefly to clause 40 and the various amendments to it—amendments 44, 45, 13, 46 and others—and the remarks made by the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), about the Secretary of State’s powers. I intervened on the hon. Lady earlier on this subject. It also arose in Committee, when she and many others made important points on whether the powers in clause 40 went too far and whether they impinged unreasonably on the independence of the regulator, in this case Ofcom. I welcome the commitments made in the written ministerial statement laid last Thursday—coincidentally shortly after my departure—that there will be amendments in the Lords to circumscribe the circumstances in which the Secretary of State can exercise those powers to exceptional circumstances. I heard the point made by the hon. Member for Ochil and South Perthshire that it was unclear what “exceptional” meant. The term has a relatively well-defined meaning in law, but the commitment in the WMS goes further and says that the bases upon which the power can be exercised will be specified and limited to certain matters such as public health or matters concerning international relations. That will severely limit the circumstances in which those powers can be used, and I think it would be unreasonable to expect Ofcom, as a telecommunications regulator, to have expertise in those other areas that I have just mentioned. I think that the narrowing is reasonable, for the reasons that I have set out.

Julian Knight (Solihull) (Con)

Those areas are still incredibly broad and open to interpretation. Would it not be easier just to remove the Secretary of State from the process and allow this place to take directly from Ofcom the code of standards that we are talking about so that it can be debated fully in the House?

Chris Philp

I understand my hon. Friend’s point. Through his work as the Chairman of the Select Committee he has done fantastic work in scrutinising the Bill. There might be circumstances where one needed to move quickly, which would make the parliamentary intervention he describes a little more difficult, but he makes his point well.

Julian Knight

So why not quicken up the process by taking the Secretary of State out of it? We will still have to go through the parliamentary process regardless.

Chris Philp

The Government are often in possession of information—for example, security information relating to the UK intelligence community—that Ofcom, as the proposer of a code or a revised code, may not be in possession of. So the ability of the Secretary of State to propose amendments in those narrow fields, based on information that only the Government have access to, is not wholly unreasonable. My hon. Friend will obviously comment further on this in his speech, and no doubt the other place will give anxious scrutiny to the question as well.

I welcome the architecture in new clause 14 in so far as it relates to the definition of illegal content; that is a helpful clarification. I would also like to draw the House’s attention to amendment 16 to clause 9, which makes it clear that acts that are concerned with the commission of a criminal offence or the facilitation of a criminal offence will also trigger the definitions. That is a very welcome widening.

I do not want to try the House’s patience by making too long a speech, given how much the House has heard from me already on this topic, but there are two areas where, as far as I can see, there are no amendments down but which others who scrutinise this later, particularly in the other place, might want to consider. These are areas that I was minded to look at a bit more over the summer. No doubt it will be a relief to some people that I will not be around to do so. The first of the two areas that might bear more thought is clause 137, which talks about giving academic researchers access to social media platforms. I was struck by Frances Haugen’s evidence on this. The current approach in the Bill is for Ofcom to do a report that will take two years, and I wonder if there could be a way of speeding that up slightly.

The second area concerns the operation of algorithms promoting harmful content. There is of course a duty to consider how that operates, but when it comes to algorithms promoting harmful content, I wonder whether we could be a bit firmer in the way we treat that. I do not think that would restrain free speech, because the right of free speech is the right to say something; it is not the right to have an algorithm automatically promoting it. Again, Frances Haugen had some interesting comments on that.

Sir Jeremy Wright

I agree that there is scope for more to be done to enable those in academia and in broader civil society to understand more clearly what the harm landscape looks like. Does my hon. Friend agree that if they had access to the sort of information he is describing, we would be able to use their help to understand more fully and more clearly what we can do about those harms?

Chris Philp

My right hon. and learned Friend is right, as always. We can only expect Ofcom to do so much, and I think inviting expert academic researchers to look at this material would be welcome. There is already a mechanism in clause 137 to produce a report, but on reflection it might be possible to speed that up. Others who scrutinise the Bill may also reach that conclusion. It is important to think particularly about the operation of algorithmic promotion of harmful content, perhaps in a more prescriptive way than we do already. As I have said, Frances Haugen’s evidence to our Committee in this area was particularly compelling.

14:15
Damian Collins

I agree with my hon. Friend on both points. I discussed the point about researcher access with him last week, when our roles were reversed, so I am sympathetic to that. There is a difference between that and the researcher access that the Digital Services Act in Europe envisages, which will not have the legal powers that Ofcom will have to compel and demand access to information. It will be complementary but it will not replace the primary powers that Ofcom will have, which will really set our regime above those elsewhere. It is certainly my belief that the algorithmic amplification of harmful content must be addressed in the transparency reports and that, where it relates to illegal activities, it must absolutely be within the scope of the regulator to state that actively promoting illegal content to other people is an offence under this legislation.

Chris Philp

On my hon. Friend’s first point, he is right to remind the House that the obligations to disclose information to Ofcom are absolute; they are hard-edged and they carry criminal penalties. Researcher access in no way replaces that; it simply acts as a potential complement to it. On his second point about algorithmic promotion, of course any kind of content that is illegal is prohibited, whether algorithmically promoted or otherwise. The more interesting area relates to content that is legal but perceived as potentially harmful. We have accepted that the judgments on whether that content stays up or not are for the platforms to make. If they wish, they can choose to allow that content simply to stay up. However, it is slightly different when it comes to algorithmically promoting it, because the platform is taking a proactive decision to promote it. That may be an area that is worth thinking about a bit more.

Damian Collins

On that point, if a platform has a policy not to accept a certain sort of content, I think the regulators should expect it to say in its transparency report what it is doing to ensure that it is not actively promoting that content through a newsfeed, on Facebook or “next up” on YouTube. I expect that to be absolutely within the scope of the powers we have in place.

Chris Philp

In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.

I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities to slightly fine-tune it that all of us can make a contribution to.

Dame Margaret Hodge

First, congratulations to the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins). I think his is one of the very few appointments in these latest shenanigans that is based on expertise and ability. I really welcome him, and the work he has done on the Bill this week has been terrific. I also thank the hon. Member for Croydon South (Chris Philp). When he held the position, he was open to discussion and he accepted a lot of ideas from many of us across the House. As a result, I think we have a better Bill before us today than we would have had. My gratitude goes to him as well.

I support much of the Bill, and its aim of making the UK the safest place to be online is one that we all share. I support the systems-based approach and the role of Ofcom. I support holding the platforms to account and the importance of protecting children. I also welcome the cross-party work that we have done as Back Benchers, and the roles played by both Ministers and by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). I thank him for his openness and his willingness to talk to us. Important amendments have been agreed on fraudulent advertising, bringing forward direct liability so there is not a two-year wait, and epilepsy trolling—my hon. Friend the Member for Batley and Spen (Kim Leadbeater) promoted that amendment.

I also welcome the commitment to bring forward amendments in the Lords relating to the amendments tabled by the hon. Member for Brigg and Goole (Andrew Percy) and the right hon. and learned Member for Kenilworth and Southam—I think those amendments are on the amendment paper but it is difficult to tell. It is important that the onus on platforms to be subject to regulation should be based not on size and functionality but on risk of harm. I look forward to seeing those amendments when they come back from the other place. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosques in Christchurch, New Zealand is probably the most egregious example, as the individual concerned had been on 8chan before committing that crime.

I am speaking to amendments 156 and 157 in my name and in the names of other hon. and right hon. Members. These amendments would address the issue of anonymous abuse. I think we all accept that anonymity is hugely important, particularly to vulnerable groups such as victims of domestic violence, victims of child abuse and whistleblowers. We want to retain anonymity for a whole range of groups and, in framing these amendments, I was very conscious of our total commitment to doing so.

Equally, freedom of speech is very important, as the right hon. Member for Haltemprice and Howden (Mr Davis) said, but freedom of speech has never meant freedom to harm, which is not a right this House should promote. It is difficult to define, and it is difficult to get the parameters correct, but we should not think that freedom of speech is an absolute right without constraints.

Joanna Cherry

I agree with the right hon. Lady that freedom of speech is not absolute. As set out in article 10 of the European convention on human rights, there have to be checks and balances. Nevertheless, does she agree freedom of speech is an important right that this House should promote, with the checks and balances set out in article 10 of the ECHR?

Dame Margaret Hodge

Absolutely. I very much welcome the hon. and learned Lady’s amendment, which clarifies the parameters under which freedom of speech can be protected and promoted.

Equally, freedom of speech does not mean freedom from consequences. The police and other enforcement agencies can pursue unlawful abuse, assuming they have the resources, which we have not discussed this afternoon. I know the platforms have committed to providing the finance for such resources, but I still question whether the resources are there.

The problem with the Bill and the Government amendments, particularly Government amendment 70, is that they weaken the platforms’ duty on legal but harmful abuse. Such abuse is mainly anonymous and the abusers are clever. They do not break the law; they avoid the law with the language they use. It might be best if I give an example. People do not say, in an antisemitic way, “I am going to kill all Jews.” We will not necessarily find that online, but we might find, “I am going to harm all globalists.” That is legal but harmful and has the same intent. We should think about that, without being beguiled by the absolute right to freedom of speech that I am afraid the right hon. Member for Haltemprice and Howden is promoting, otherwise we will find that the Bill does not meet the purposes we all want.

Much of the abuse is anonymous. We do not know how much, but much of it is. When there was racist abuse at the Euros, Twitter claimed that 99% of postings of racist abuse were identifiable. Like the Minister, I wrote to Twitter to challenge that claim and found that Twitter was not willing to share its data with me, claiming GDPR constraints.

It is interesting that, in recent days, the papers have said that one reason Elon Musk has given for pulling out of his takeover is that he doubts Twitter’s claim that fake and spam accounts represent less than 5% of users. There is a lack of understanding and knowledge of the extent of anonymous abuse.

In the case I have shared with the Minister on other occasions, I received 90,000 posts in the two months from the publication of the Equality and Human Rights Commission report to the shenanigans about the position of the previous leader of the Labour party—from October to Christmas. The posts were monitored for me by the Community Security Trust. When I asked how many of the posts were anonymous, I was told that it had been unable to do that analysis. I wish there were the resources to do so, but I think most of the posts were anonymous and abusive.

There is certainly public support for trying to tackle abusive posts. A June 2021 YouGov poll found that 78% of the public are in favour of revealing the identity of those who post online, and we should bear that in mind. If people feel strongly about this, and the poll suggests that they do, we should respond and not put it to one side.

The Government have tried to tackle this with a compromise following the very good work by the hon. Member for Stroud (Siobhan Baillie). The Bill places a duty on the platforms to give users the option to verify their identity. If a user chooses to remain unverified, they may not be able to interact with verified accounts. Although I support the motives behind that amendment, I have concerns.

First, the platform itself would have to verify who holds the account, which gives the platforms unprecedented access to personal details. Following Cambridge Analytica, we know how such data can be abused. Data on 87 million identities was stolen, and we know it was used to influence the Trump election in 2016, and it may have been a factor in the Brexit referendum.

Secondly, the police have been very clear on how I should deal with anonymous online abuse. They say that the last thing I should do is remove it, as they need it to be able to judge whether there is a real threat within the abuse that they should take seriously. So individuals having that right does not diminish the real harm they could face if the online abuse is removed.

Thirdly, one of the problems with a lot of online abuse is not just that it is horrible or can be dangerous in particular circumstances, but that it prevents democracy. It inhibits freedom of speech by inhibiting engagement in free, democratic discourse. Online abuse is used to undermine an individual’s credibility. A lot of the abuse I receive seeks to undermine my credibility. It says that I am a bad woman, that I abuse children, that I break tax law and that I do this, that and the other. Building that picture of me as someone who cannot be believed undermines my ability to enter into legitimate democratic debate on issues I care about. Simply removing anonymous online abuse from my account does not stop the circulation of abusive, misleading content that undermines my democratic right to free speech. Therefore, in its own way, it undermines free speech.

Amendments 156 and 157, in my name and in the name of other colleagues, are based on a strong commitment to protecting anonymity, especially for vulnerable groups. We seek to tackle anonymous abuse not by denying anonymity but by ensuring traceability. It is quite simple. The Government recognise the feasibility and importance of that with age verification; they have now accepted the argument on age verification, and I urge them to take it further. Although I have heard that various groups are hostile to what we are suggesting, in a meeting I held last week with HOPE not hate there was agreement that what we are proposing made sense, and therefore we and the Government should pursue it.

14:30
Under our proposed scheme, any individual who chooses to go on a platform would have to have their identity verified, not by the platform but by a third party. We would thus remove the platform’s ability to access the individual’s data, which it could use in an inappropriate way. Such a scheme is perfectly feasible, particularly now that the Government have introduced the age verification mechanism. More than 99% of us have bank accounts, so there is a simple way of verifying someone’s identity through a third-party mechanism without giving platforms the powers I have described. Everybody would be able to enter any platform and have total anonymity, and only if and when an individual posts something that breaks the law will they lose their right to anonymity.
To go back to a point I made in an intervention on the Minister, that would also involve having minimum standards on harmful but legal abuse. Under a minimum standards platform, only if someone posted abuse that is harmful—this would mainly be illegal abuse, but it would also be harmful but legal abuse—would they lose their right to anonymity. I think that is good, because one could name and shame. Most importantly, this would be the most effective tool for preventing a lot of online abuse from happening in the first place, and we should all be focusing our energies on doing so.
My hon. Friend the Member for Pontypridd (Alex Davies-Jones), our Front Bencher, has talked about women who are particularly vulnerable, and I think our measure would be very important—my experience justifies that. It would be a powerful deterrent. I hope that our Front-Bench team will support the proposition we are putting before the House. I will not press it to a vote if they do not, although I would regret the fact that they did not support it.
I regret that the Government do not feel able to support our proposition, but I think its time will come. A lot of the stuff that we are doing in this Bill is innovative, and we are not sure where everything will land. We are likely to get some things wrong and others right. I say to all Members, from across this House, that if we really want to reduce the amount of harmful abuse online, tackling anonymous abuse, rather than anonymity, must be central to our concerns. I urge my Front-Bench team and the Government to think carefully about this.
Nick Fletcher (Don Valley) (Con)

I rise to speak on amendments 50, 51 and 55, and I share the free speech concerns that I think lie behind amendment 151. As I said in Committee to the previous Minister, my hon. Friend the Member for Croydon South (Chris Philp), who knew this Bill inside out—it was amazing to watch him do it—I have deep concerns about how the duty on “legal but harmful” content will affect freedom of speech. I do not want people to be prevented from saying what they think. I am known for saying what I think, and I believe others should be allowed the same freedom, offline and online. What is harmful can be a subjective question, and many of us in this House might have different answers. When we start talking about restricting content that is perfectly legal, we should be very careful.

This Bill is very complex and detailed, as I know full well, having been on the Committee. I support the Bill—it is needed—but when it comes to legal but harmful content, we need to make sure that free speech is given enough protection. We have to get the right balance, but clause 19 does not do that. It says only that social media companies have

“a duty to have regard to the importance of protecting users’ right to freedom of expression within the law.”

There is no duty to do anything about freedom of speech; it just says, “You have to think about the importance of it”. That is not enough.

I know that the Bill does not state that social media companies have to restrict content—I understand that—but in the real world that is what will happen. If the Government define certain content as harmful, no social media company will want to be associated with it. The likes of Meta will want to be seen to get tough on legally defined harmful content, so of course it will be taken down or restricted. We have to counterbalance that instinct by putting stronger free speech duties in the Bill if we insist on it covering legal but harmful.

The Government have said that we cannot have stronger free speech obligations on private companies, and, in general, I agree with that. However, this Bill puts all sorts of other obligations on Facebook, Twitter and Instagram, because they are not like other private companies. These companies and their chief executive officers are household names all around the world, and their power and influence are incredible. In 2021, Facebook’s revenue was $117 billion, which is higher than the GDP—

Andrew Percy

Is that not exactly why there has to be action on legal but harmful content? The cross-boundary, cross-national powers of these organisations mean that we have to insist that they take action against harm, whether lawful or unlawful. We are simply asking those organisations to risk assess and ensure that appropriate warnings are provided, just as they are in respect of lots of harms in society; the Government require corporations and individuals to risk assess those harms and warn about them. The fact that these organisations are so transnational and huge is absolutely why we must require them to risk assess legal but harmful content.

Nick Fletcher

I understand what my hon. Friend is saying, but the list of what is legal but harmful will be set by the Secretary of State, not by Parliament. All we ask is for that to be discussed on the Floor of the House before we place those duties on the companies. That is all I am asking us to do.

Facebook has about 3 billion active users globally. That is more than double the population of China, the world’s most populous nation, and it is well over half the number of internet users in the entire world. These companies are unlike any others we have seen in history. For hundreds of millions of people around the world, they are the public square, which is how the companies have described themselves: Twitter founder Jack Dorsey said in 2018:

“We believe many people use Twitter as a digital public square. They gather from all around the world to see what’s happening, and have a conversation about what they see.”

In 2019, Mark Zuckerberg said:

“Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square.”

Someone who is blocked from these platforms is blocked from the public square, as we saw when the former President of the United States was blocked. Whatever we might think about Donald Trump, it cannot be right that he was banned from Twitter. We have to have stronger protection for free speech in the digital public square than clause 19 gives. The Bill gives the Secretary of State the power to define what is legal but harmful by regulations. As I have said, this is an area where free speech could easily be affected—

Adam Afriyie (Windsor) (Con)

I commend my hon. Friend for the powerful speech he is making. It seems to many of us here that if anyone is going to be setting the law or a regulation, it should really be done in the Chamber of this House. I would be very happy if we had annual debates on what may be harmful but is currently lawful, in order to make it illegal. I very much concur with what he is saying.

Nick Fletcher

I thank my hon. Friend for his contribution, which deals with what I was going to finish with. It is not enough for the Secretary of State to have to consult Ofcom; there should be public consultation too. I support amendment 55, which my hon. Friend has tabled.

Anna McMorrin (Cardiff North) (Lab)

Not too long ago, the tech industry was widely looked up to and the internet was regarded as the way forward for democracy and freedoms. Today that is not the case. Every day we read headlines about data leaks, racist algorithms, online abuse, and social media platforms promoting, and becoming swamped in, misinformation, misogyny and hate. These problems are not simply the fault of those platforms and tech companies; they are the result of a failure to govern technology properly. That has resulted from years of muddled thinking and a failure to bring forward this Bill, and now, a failure to ensure that the Bill is robust enough.

Ministers have talked up the Bill, and I welcome the improvements that were made in Committee. Nevertheless, Ministers had over a decade in which to bring forward proposals, and in that time online crime exploded. Child sexual abuse online has become rife; the dark web provides a location for criminals to run rampant and scams are widespread.

Delay has also allowed disinformation to spread, including state-sponsored propaganda and disinformation, such as from Russia’s current regime. False claims and fake fact checks are going viral. That encourages other groups to adopt such tactics, in an attempt to undermine democracy, from covid deniers to climate change deniers—it is rampant.

Today I shall speak in support of new clause 3, to put violence against women and girls on the face of the Bill. As a female MP, I, along with my colleagues, have faced a torrent of abuse online, attacking me personally and professionally. I have been sent images such as that of a person with a noose around their neck, as well as numerous messages containing antisemitic and misogynistic abuse directed towards both me and my children. It is deeply disturbing, but also unsurprising, that one in five women across the country have been subjected to abuse; I would guess that that figure is actually much higher.

Joanna Cherry

I am really sorry to hear about the abuse that the hon. Lady and her family have received. Many women inside and without this Chamber, such as myself, receive terrible abuse on Twitter, including repeated threats to shoot us if we do not shut the f-u-c-k up. Twitter refuses to take down memes of a real human hand pointing a gun at me and other feminists and lesbians, telling us to shut the f-u-c-k up. Does she see the force of my amendment to ensure that Twitter apply its moderation policy evenly across society with regard to all protected characteristics, including sex?

Anna McMorrin

The hon. and learned Lady makes a very good point, and that illustrates what I am talking about in my speech—the abuse that women face online. We need this legislation to ensure that tech companies take action.

There is a very dark side to the internet, deeply rooted in misogyny. The End Violence Against Women organisation released statistics last year, stating that 85% of women who experienced online abuse from a partner or ex-partner also experienced abuse offline. According to the latest Office for National Statistics figures, 92% of women who were killed in the year ending March 2021 were killed by men. Just yesterday, a woman was stabbed in the back by a male cyclist in east London, near to where Zara Aleena was murdered just two weeks ago. And in the year 2021, nearly 41,000 women were victims of sexual assault—and those were just the ones who reported it. We know that the actual figure was very much higher. That was the highest number of sexual offences ever recorded within a 12-month period. It is highly unlikely that any of those women will ever see their perpetrator brought to justice, because of the current 1.3% prosecution rate of rape cases. Need I continue?

14:45
Violence against women and girls is an ever-growing epidemic, and time is running out. This Government are more concerned with piecemeal actions that fail to tackle the root causes of the issue. Although the introduction of new criminal offences such as cyber-flashing and rape threats is a welcome first step, there are significant concerns about their enforceability. The cyber-flashing offence requires the police to prove a perpetrator’s intent to cause harm, which is incredibly difficult to evidence. That is the loophole through which perpetrators avoid consequences.
I doubt there are many women who have not been sent unsolicited images of male genitals online. There are accounts of women being airdropped images on public transport while on their way to work. What does that leave them feeling? Violated—scared, not knowing who in their train carriage or on their bus has sent those unsolicited images. The online dating platform Bumble conducted research on cyber-flashing and found that of its users, nearly half of women aged 18 to 24 had received a sexual photo that they did not ask for in the last year alone.
So, considering the scale of this issue and the Government’s appalling record on prosecuting sexual assault offences, why would this new offence be any different? Acts of violence towards women are not merely isolated incidents. We know that, unfortunately, there is systemic misogyny within our society that results in a shocking number of women losing their lives, but we refuse to see it. Failing to name violence against women and girls on the face of the Bill is putting the lives of countless women at risk, and will leave behind a dangerous and damning legacy, even by this Government’s standards.
I welcome the new Minister to his place and hope that he will look at this issue in a new light. I hope that the Government can put politics to one side for just a moment, match their words with deeds and commit to protecting women across the country by supporting new clause 3.
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. The House will see that a great many people still wish to speak. May I explain that there are two groups of amendments? We will finish debating this group at 4.30 pm, after which there will be some votes, and debate on the next group of amendments will last until 7 o’clock. By my calculations, there might be more time for speeches during the debate on the next group, so if anyone wishes to speak on that group rather than the current group, I would be grateful if they came and indicated that to me. Meanwhile, if everyone takes about eight minutes and no longer, everyone will have the opportunity to speak. I call Sir Jeremy Wright.

Sir Jeremy Wright

I shall speak to the amendments in my name and the names of other right hon. and hon. Members, to whom I am grateful for their support. I am also grateful to the organisations that helped me to work through some of the problems I am about to identify, including the Carnegie Trust, Reset and the Antisemitism Policy Trust.

On the first amendments I shall talk about, amendments 42 and 43, I have been able to speak to Lego, so I can honestly say that these amendments were put together with Lego. Let me explain. The focus of the Bill, quite rightly, is on safety, and there is no safety more important than the safety of children. In that respect, the Bill is clear: platforms must give the safety of children the utmost priority and pay close attention to ways to enhance it. In other parts of the Bill, however, there are countervailing duties—for example, in relation to freedom of speech and privacy—where, predominantly in relation to adults, we expect platforms to conduct a balancing exercise. It seems right to me to think about that in the context of children, too.

As I said, the emphasis is rightly on children’s safety, but the safest approach would be to prohibit children from any online activity at all. We would not regard such an approach as sensible, because there are benefits to children in being able to engage—safely, of course—in online activity and to use online products and services. It seems to me that we ought to recognise that in the language of the Bill. Amendment 42 would do that when consideration is given to the safety duties designed to protect children set out in clause 11, which requires that “proportionate measures” must be taken to protect children’s safety and goes on to explain what factors might be taken into account when deciding what is proportionate, by adding

“the benefits to children’s well-being”

of the product or service in that list of factors. Amendment 43 would do the same when consideration is given to the online safety objectives set out in schedule 4. Both amendments are designed to ensure that the appropriate balance is struck when judgments are taken by platforms.

Others have spoken about journalistic content, and I am grateful for what the Minister said about that, but my amendment 10 is aimed at the defect that I perceive in clause 16. The Bill gives additional protections and considerations to journalists, which is entirely justifiable, given the important role that journalism plays in our society, but those extra protections mean that it will be harder for platforms to remove potentially harmful content that is also journalistic content. We should be sure, therefore, that the right people get the benefit of that protection.

It is worth having look at what clause 16 says and does. It sets out that a platform—a user-to-user service—in category 1 will have

“A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about…how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and…whether to take action against a user generating, uploading or sharing such content.”

So it is important, because of the significance of those protections, that we get right the definitions of those who should benefit from them. Amendment 10 would amend clause 16(8), which states that:

“For the purposes of this section content is “journalistic content”, in relation to a user-to-user service, if…the content is”

either

“news publisher content in relation to that service”—

the definition of which I will return to—

“or…regulated user-generated content in relation to that service”.

That is the crucial point. The content also has to be

“generated for the purposes of journalism”

and be linked to the UK.

The first problem here is that journalism is not defined in the Bill. There are definitions of journalism, but none appears in the text of this Bill. “UK-linked” does not narrow it down much, and “regulated user-generated content” is a very broad category indeed. Clause 16 as drafted offers the protection given to journalistic content not just to news publishers, but to almost everybody else who chooses to define themselves as a journalist, whether or not that is appropriate. I do not think that that is what the Bill is intended to do, or an approach that this House should endorse. Amendment 10 would close the loophole by removing the second limb, regulated user-generated content that is not news publisher content. Let me be clear: I do not think that that is the perfect answer to the question I have raised, but it is better than the Bill as it stands, and if the Government can come up with a way of reintroducing protections of this kind for types of journalistic content beyond news publisher content that clearly deserve them, I will be delighted and very much open to it. Currently, however, the Bill is defective and needs to be remedied.

That brings us to the definition of news publisher content, because it is important that if we are to give protection to that category of material, we are clear about what we mean by it. Amendments 11 and 12 relate to the definition of news publisher content that arises from the definition of a recognised news publisher in clauses 49 and 50. That matters for the same reason as I just set out: we should give these protections only to those who genuinely deserve them. That requires rigorous definition. Clause 50 states that if an entity is not named in the Bill, as some are, it must fulfil a set of conditions set out in subsection (2), which includes having a standards code and policies and procedures for handling and resolving complaints. The difficulty here is that in neither case does the Bill refer to any quality threshold for those two things, so having any old standards code or any old policy for complaints will apparently qualify. That cannot be right.

I entirely accept that inserting a provision that the standards code and the complaints policies and procedures should be both “suitable and sufficient” opens the question whose job it becomes to decide what is suitable and sufficient. I am familiar with all the problems that may ensue, so again, I do not say that the amendment is the final word on the subject, but I do say that the Government need to look more carefully at what the value of those two items on the list really is if the current definition stands. If we are saying that we want these entities to have a standards code and a complaints process that provide some reassurance that they are worthy of the protections the Bill gives, it seems to me that meaningful criteria must apply, which currently they do not.

The powers of the Secretary of State have also been discussed by others, but I perhaps differ from their view in believing that there should be circumstances in which the Secretary of State should hold powers to act in genuine emergency situations. However, being able to direct Ofcom, as the Bill allows the Secretary of State to do, to modify a code of practice

“for reasons of public policy”

is far too broad. Amendment 13 would simply remove that capacity, with amendment 14 consequential upon it.

I accept that on 7 July the Secretary of State issued a written statement that helps to some extent on that point—it was referred to by my hon. Friend the Member for Croydon South (Chris Philp). First, it states that the Secretary of State would act only in “exceptional circumstances”, although it does not say who defines what exceptional circumstances are, leaving it likely that the Secretary of State would do so, which does not help us much. Secondly, it states the intention to replace the phrase

“for reasons of public policy”

with a list of circumstances in which the Secretary of State might act. I agree with my hon. Friend the Member for Solihull (Julian Knight) that that is still too broad. The proposed list comprises

“national security, public safety, public health, the UK’s international relations and obligations, economic policy and burden to business.”—[Official Report, 7 July 2022; Vol. 717, c. 69WS.]

The platforms we are talking about are businesses. Are we really saying that a burden on them would give the Secretary of State reason to say to Ofcom, the independent regulator, that it must change a code of practice? That clearly cannot be right. This is still too broad a provision. The progress that has been made is welcome, but I am afraid that there needs to be more to further constrain this discretion. That is because, as others have said, the independence of the regulator is crucial not just to this specific part of the Bill but to the credibility of the whole regulatory and legislative structure here, and therefore we should not undermine it unless we have to.

15:00
Madam Deputy Speaker, may I also say something very briefly about new clause 14? This is the Government’s additional new clause, which is designed to assist platforms in understanding some of the judgments that they have to make and how to make them, particularly in relation to illegal content. When people first look at this Bill, they will assume that everyone knows what illegal content is and therefore it should be easy to identify and take it down, or take the appropriate action to avoid its promotion. But, as new clause 14 makes clear, what the platform has to do is not just identify content but have reasonable grounds to infer that all elements of an offence, including the mental elements, are present or satisfied, and, indeed, that the platform does not have reasonable grounds to infer that the defence to the offence may be successfully relied upon. That is right, of course, because criminal offences very often are not committed just by the fact of a piece of content; they may also require an intent, or a particular mental state, and they may require that the individual accused of that offence does not have a proper defence to it. The question of course is how on earth a platform is supposed to know either of those two things in each case. This is helpful guidance, but the Government will have to think carefully about what further guidance they will need to give—or Ofcom will need to give—in order to help a platform to make those very difficult judgments.
Julian Knight

Although this is not contained within these measures, it pertains to them. Does my right hon. and learned Friend agree that, down the line, Ofcom will want to look at a regime of compliance officers in order to give the guidance that he seeks?

Sir Jeremy Wright

Yes, that is a possible way forward. Ofcom will need to produce a code of practice in this area. I am sure my hon. Friend on the Front Bench will say that that is a suitable way to deal with the problem that I have identified. It may well be, but at this stage, it is right for the House to recognise that the drafting of the Bill at the moment seeks to offer support to platforms, for which I am sure they will be grateful, but it will need to offer some more in order to allow these judgments to be made.

I restate the point that I have made in previous debates on this subject: there is little point in this House passing legislation aimed to make the internet a safer place if the legislation does not work as it is intended to. If our regime does not work, we will keep not a single person any safer. It is important, therefore, that we think about this Bill not in its overarching statements and principles but, particularly at this stage of consideration, in terms of how it will actually work.

You will not find a bigger supporter of the Bill in this House than me, Madam Deputy Speaker, but I want to see it work well and be effective. That means that some of the problems that I am highlighting must be addressed. Because humility is a good way to approach debates on something as ground-breaking and complex as this, I do not pretend that I have all the right answers. These amendments have been tabled because the Bill as it stands does not quite yet do the job that we want it to do. It is a good Bill—it needs to pass—but it can be better, and I very much hope that this process will improve it.

Joanna Cherry

I rise to speak to new clause 24 and amendments 193 and 191 tabled in my name. I also want to specifically give my support to new clause 6 and amendments 33 and 34 in the name of the right hon. Member for Kingston upon Hull North (Dame Diana Johnson).

The purpose of my amendments, as I have indicated in a number of interventions, is to ensure that, when moderating content, category 1 service providers such as Twitter abide by the anti-discrimination law of our domestic legal systems—that is to say the duties set out in the Equality Act 2010 not to discriminate against, harass or victimise their users on the grounds of a protected characteristic.

I quickly want to say a preliminary word about the Bill. Like all responsible MPs, I recognise the growing concern about online harms, and the need to protect service users, especially children, from harmful and illegal content online. That said, the House of Lords Communications and Digital Committee was correct to note that the internet is not currently the unregulated Wild West that some people say it is, and that civil and criminal law already applies to activities online as well as offline.

The duty of care, which the Bill seeks to impose on online services, will be a significant departure from existing legislation regulating online content. It will allow for a more preventative approach to regulating illegal online content and will form part of a unified regulatory framework applying to a wide range of online services. I welcome the benefits that this would represent, especially with respect to preventing the proliferation of child sexual and emotional abuse online.

Before I became an MP, I worked for a number of years as a specialist sex crimes prosecutor, so I am all too aware of how children are targeted online. Sadly, there are far too many people in our society, often hiding in plain sight, who seek to exploit children. I must emphasise that child safeguarding should be a No. 1 priority for any Government. In so far as this Bill does that, I applaud it. However, I do have some concerns that there is a significant risk that the Bill will lead to censorship of legal speech by online platforms. For the reasons that were set out by the right hon. Member for Haltemprice and Howden (Mr Davis), I am also a bit worried that it will give the Government unacceptable controls over what we can and cannot say online, so I am keen to support any amendments that would ameliorate those aspects of the Bill. I say this to those Members around the Chamber who might be looking puzzled: make no mistake, when the Bill gives greater power to online service providers to regulate content, there is a very real risk that they will be lobbied by certain groups to regulate what is actually legal free speech by other groups. That is partly what my amendment is designed to avoid.

Sir Jeremy Wright

What the hon. and learned Lady says is sensible, but does she accept—this is a point the Minister made earlier—that, at the moment, the platforms have almost unfettered control over what they take down and what they leave up? What this Bill does is present a framework for the balancing exercise that they ought to apply in making those decisions.

Joanna Cherry

That is why I am giving the Bill a cautious welcome, but I still stand by my very legitimate concerns about the chilling effect of aspects of this Bill. I will give some examples in a moment about the problems that have arisen when organisations such as Twitter are left to their own devices on their moderation of content policy.

As all hon. Members will be aware, under the Equality Act there are a number of protected characteristics. These include: age; gender reassignment; being married or in a civil partnership; being pregnant or on maternity leave; disability; race, including colour, nationality, ethnic or national origin; religion or belief; sex and sexual orientation. It is against the law to discriminate, victimise or harass anyone because of any of those protected characteristics, but Twitter does discriminate against some of the protected characteristics. It often discriminates against women in the way that I described in an intervention earlier. It takes down expressions of feminist belief, but refuses to take down expressions of the utmost violent intent against women. It also discriminates against women who hold gender-critical beliefs. I remind hon. Members that, in terms of the Employment Appeal Tribunal’s decision in the case of Maya Forstater, the belief that sex matters is worthy of respect in a democratic society and, under the Equality Act, people cannot lawfully discriminate against women, or indeed men, who hold those views.

Twitter also sometimes discriminates against lesbians, gay men and bisexual people who assert that their sexual orientation is on the basis of sex, not gender, despite the fact that same-sex orientation, such as I hold, is a protected characteristic under the Equality Act.

At present, Twitter claims not to be covered by the Equality Act. I have seen correspondence from its lawyers that sets out the purported basis for that claim, partly under reference to schedule 25 to the Equality Act, and partly because it says:

“Twitter UK is included in an Irish Company and is incorporated in the Republic of Ireland. It does pursue economic activity through a fixed establishment in the UK but that relates to income through sales and marketing with the main activity being routed through Ireland.”

I very much doubt whether that would stand up in court, since Twitter is clearly providing a service in the United Kingdom, but it would be good if we took the opportunity of this Bill to clarify that the Equality Act applies to Twitter, so that when it applies moderation of content under the Bill, it will not discriminate against any of the protected characteristics.

The Joint Committee on Human Rights, of which I am currently the acting Chair, looked at this three years ago. We had a Twitter executive before our Committee and I questioned her at length about some of the content that Twitter was content to support in relation to violent threats against women and girls and, on the other hand, some of the content that Twitter took down because it did not like the expression of certain beliefs by feminists or lesbians.

We discovered on the Joint Committee on Human Rights that Twitter’s hateful conduct policy does not include sex as a protected characteristic. It does not reflect the domestic law of the United Kingdom in relation to anti-discrimination law. Back in October 2019, in the Committee’s report on democracy, freedom of expression and freedom of association, we recommended that Twitter should include sex as a protected characteristic in its hateful conduct policy, but Twitter has not done that. It seems Twitter thinks it is above the domestic law of the United Kingdom when it comes to anti-discrimination.

At that Committee, the Twitter executive assured me that certain violent memes that often appear on Twitter directed against women such as me and against many feminists in the United Kingdom, threatening us with death by shooting, should be removed. However, just in the past 48 hours I have seen an example of Twitter’s refusing to remove that meme. Colleagues should be assured that there is a problem here, and I would like us to direct our minds to it, as the Bill gives us an opportunity to do.

Whether or not Twitter is correctly praying in aid the loophole it says there is in the Equality Act—I think that is questionable—the Bill gives us the perfect opportunity to clarify matters. Clause 3 clearly brings Twitter and other online service providers within the regulatory scheme of the Bill as a service with

“a significant number of United Kingdom users”.

The Bill squarely recognises that Twitter provides a service in the United Kingdom to UK users, so it is only a very small step to amend the Bill to make it absolutely clear that when it does so it should be subject to the Equality Act. That is what my new clause 24 seeks to do.

I have also tabled new clauses 193 and 191 to ensure that Twitter and other online platforms obey non-discrimination law regarding Ofcom’s production of codes of practice and guidance. The purpose of those amendments is to ensure that Ofcom consults with persons who have expertise in the Equality Act before producing those codes of conduct.

I will not push the new clauses to a vote. I had a very productive meeting with the Minister’s predecessor, the hon. Member for Croydon South (Chris Philp), who expressed a great deal of sympathy when I explained the position to him. I have been encouraged by the cross-party support for the new clauses, both in discussions before today with Members from all parties and in some of the comments made by various hon. Members today.

I am really hoping that the Government will take my new clauses away and give them very serious consideration, that they will look at the Joint Committee’s report from October 2019 and that either they will adopt these amendments or perhaps somebody else will take them forward in the other place.

Damian Collins

I can assure the hon. and learned Lady that I am happy to carry on the dialogue that she had with my predecessor and meet her to discuss this at a further date.

Joanna Cherry

I am delighted to hear that. I must tell the Minister that I have had a huge number of approaches from women, from lesbians and from gay men across the United Kingdom who are suffering as a result of Twitter’s moderation policy. There is a lot of support for new clause 24.

Of course, it is important to remember that the Equality Act protects everyone. Gender reassignment is there with the protected characteristics of sex and sexual orientation. It is really not acceptable for a company such as Twitter, which provides a service in the United Kingdom, to seek to flout and ignore the provisions of our domestic law on anti-discrimination. I am grateful to the Minister for the interest he has shown and for his undertaking to meet me, and I will leave it at that for now.

15:15
Julian Knight

We live in the strangest of times, and the evidence of that is that my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who has knowledge second to none in this area, has ended up in charge of it. I have rarely seen such an occurrence. I hope he is able to have a long and happy tenure and that the blob does not discover that he knows what he is doing.

I backed the Bill on Second Reading and I will continue to back it. I support most of the content within it and, before I move on to speak to the amendments I have tabled, I want to thank the Government for listening to the recommendations of the Digital, Culture, Media and Sport Committee, which I chair. The Government have accepted eight of the Committee’s key recommendations, demonstrating that the Committee is best placed to provide Parliamentary scrutiny of DCMS Bills as they pass through this House and after they are enacted.

I also pay tribute to the work of the Joint Committee on the draft Bill, which my hon. Friend the Member for Folkestone and Hythe chaired, and the Public Bill Committee, which has improved this piece of legislation during its consideration. The Government have rightfully listened to the Select Committee’s established view that it would be inappropriate to establish a permanent joint committee on digital regulation. I also welcome the news that the Government are set to bring forward amendments in the House of Lords to legislate for a new criminal offence for epilepsy trolling, which was recommended by both the Joint Committee and the Select Committee.

That said, the Digital, Culture, Media and Sport Committee continues to have concerns about some aspects of the Bill, particularly the lack of provision for funding digital literacy, a key area in which we are falling behind and need to make progress. However, my primary concern and that of my colleagues on the Committee relates to the powers within this Bill that would, in effect, give the Secretary of State the opportunity to interfere with Ofcom’s role in the issuing of codes of practice to service providers.

It is for that reason that I speak to amendments 44 to 46 standing in my name on the amendment paper. Clause 40, in my view, gives the Secretary of State unprecedented powers and would bring into question the future integrity of Ofcom itself. Removing the ability to exercise those powers in clause 39 would mean we could lose clauses 40 and 41, which outline the powers granted and how they would be sent to the House for consideration.

Presently, Ofcom sets out codes of practice under which,

“companies can compete fairly, and businesses and customers benefit from the choice of a broad range of services”.

Under this Bill Ofcom, which, I remind the House, is an independent media regulator, will be required to issue codes of practice to service providers, for example codes outlining measures that would enable services to comply with duties to mitigate the presence of harmful content.

Currently, codes of practice from Ofcom are presented to the House for consideration “as soon as practicable”, something I support. My concern is the powers given in this Bill that allow the Secretary of State to reject the draft codes of practice and to send them back to Ofcom before this House knows the recommendations exist, let alone having a chance to consider or debate them.

I listened with interest to my hon. Friend the Member for Croydon South (Chris Philp), who is not in his place but who was a very fine Minister during his time in the Department. To answer his query on the written ministerial statement and the letter written to my Committee on this matter, I say to him and to those on the Front Bench that if the Government disagree with what Ofcom is saying, they can bring the matter to the House and explain that disagreement. That would allow things to be entirely transparent and open, allow greater scrutiny rather than less, and allow for less delay than would be the case if there is forever that ping-pong between the Secretary of State and Ofcom until it gets its work right.

I want to make it clear that the DCMS Committee and I believe that this is nothing more than a power grab by the Executive. I am proud that in western Europe we have a free press without any interference from Government, and I believe that the Bill, if constituted in this particular form, has the potential to damage that relationship—I say potential, because I do not believe that is the intention of what is being proposed here, but there is the potential for the Bill to jeopardise that relationship in the long term. That is why I hope that Members will consider supporting my amendments, and I will outline why they should do so.

As William Perrin, a trustee of the Carnegie Trust UK, made clear in evidence to my Committee,

“the underpinning convention of regulation of media in Western Europe is that there is an independent regulator and the Executive does not interfere in their day to day decision-making for very good reason.”

Likewise, Dr Edina Harbinja, a senior lecturer at Aston University, raised concerns that the Bill made her

“fear that Ofcom’s independence may be compromised”

and that

“similar powers are creeping into other law reform pieces and proposals, such as…data protection”.

My amendments seek to cut red tape, bureaucracy and endless recurring loops that in some cases may result in significant delays in Ofcom managing to get some codes of practice approved. The amendments will allow the codes to come directly to this House for consideration by Members without another level of direct interference from the Secretary of State. Let me make it very clear that this is not a comment on any Secretary of State, at any time in the past, but in some of these cases I expect that Ofcom will require a speedy turnaround to get these codes of practices approved—for instance, measures that it wishes to bring forward to better safeguard children online. In addition, the Secretary of State has continually made it clear in our Select Committee hearings that she is a great supporter of more parliamentary scrutiny. I therefore hope that the Government will support my amendment so that we do not end up in a position where future Secretaries of State could potentially prevent draft codes coming before the House due to endless delays and recurring loops.

I also want to make it abundantly clear that my amendment does not seek to prevent the Secretary of State from having any involvement in the formulation of new codes of practice from Ofcom. Indeed, as Ofcom has rightly pointed out, the Secretary of State is already a statutory consultee when Ofcom wishes to draft new codes of practice or amend those that already exist. She can also, every three years, set out guidelines that Ofcom would have to follow when creating such codes of practice. The Government therefore already play a crucial role in influencing the genesis and the direction of travel in this area.

On Friday the Secretary of State wrote to my office outlining some of the concerns shared by Members of this House and providing steps on how her Department would address those concerns. In her letter, she recognises that the unprecedented powers awarded to the Secretary of State are of great concern to Members and goes on to state that

“regulatory independence is vital to the success of the framework”.

I have been informed that in order to appease some of these concerned Members, the Government intend to bring forward amendments around the definitions of “exceptional circumstances” and “public policy”, as referenced earlier. These definitions, including “economic policy” and “business interests”, are so broad that I cannot think of anything that would not be covered by these exceptional circumstances.

If the Secretary of State accepts our legitimate concerns, surely Ministers should accept my amendments becoming part of the Bill today, leaving a cleaner process rather than an increasingly complex system of unscrutinised ministerial interference with the regulator. The DCMS Committee and I are very clear that clause 40 represents a power grab by the Government that potentially threatens the independence of Ofcom, which is a fundamental principle of ensuring freedom of speech and what should be a key component of this legislation. The Government must maintain their approach to ensuring independent, effective, and trustworthy regulation.

I will not press my amendments to a vote, but I hope my concerns will spark not just thoughts and further engagement from Ministers but legislative action in another place as the Bill progresses, because I really do think that this could hole the Bill under the waterline and has the potential for real harm to our democratic way of life going forward as we tackle this whole new area.

Mr Kevan Jones (North Durham) (Lab)

I rise to speak to my new clause 8, which would place a duty on all internet site providers regulated by this Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain disclaimers as to health risks of the procedure or include certified service quality indicators.

I have been campaigning for a number of years for better regulation of the non-surgical and cosmetic surgery industry, which is frankly a wild west in terms of lack of regulation, only made worse by the internet. I pay tribute to my constituent Dawn Knight, who has been a fierce campaigner in this area. We are slowly making progress. I thank the former Health Minister, the hon. Member for Charnwood (Edward Argar), for his work in bringing amendments on licensing to the Bill that became the Health and Care Act 2022. That is now out for consultation. It is a first, welcome step in legislation to tame the wild west that is the cosmetic surgery sector. My amendment would enhance and run parallel to that piece of legislation.

Back in 2013, Sir Bruce Keogh first raised the issue of advertising in his recommendations on regulation of the cosmetic surgery industry, saying that adverts for cosmetic and aesthetic procedures should carry a disclaimer or kitemark, in a manner similar to that used in alcohol or gambling regulation. Years ago, adverts were in newspapers and magazines. Now, increasingly, the sector’s main source of advertising revenue is the internet.

People will say, “Why does this matter?” Well, it links to some of the other things that have been raised in this debate. The first is safety. We do not have any data, for which I have been calling for a while, on how many surgical and non-surgical aesthetic procedures in the UK go wrong, but I know who picks up the tab for it—it is us as taxpayers, as the NHS has to put a lot of those procedures right. The horrendous cases that I have seen over the years show just why people need to be in full control of the facts before they undertake these procedures.

This is a boom industry. It is one where decisions on whether to go ahead with a procedure are not usually made with full information on the potential risks. It is sold, certainly online, as something similar to buying any other service. As we all know, any medical procedure has health risks connected to it, and people should be made aware of them in the adverts that are now online. I have tried writing to Facebook and others to warn them about some of the more spurious claims that some of the providers are making, but have never got a reply from Facebook. This is about patient safety. My amendment would ensure that these adverts at least raise in people’s minds the fact that there is a health risk to these procedures.

Again, people will say, “Why does this matter?” Well, the target for this sector is young people. As I said, a few years ago these adverts were in newspapers and magazines; now they are on Facebook, Twitter, Instagram and so on, and we know what they are selling: they are bombarding young people with the perfect body image.

We only have to look at the Mental Health Foundation’s report on this subject to see the effect the industry is having on young people, with 37% feeling upset and 31% feeling ashamed of their own body image. That is causing anxiety and mental health problems, but it is also forcing some people to go down the route of cosmetic surgery—both surgical and non-surgical—when there is nothing wrong with their body. It is the images, often photoshopped and sadly promoted by certain celebrities, that force them down that route.

Someone has asked me before, “Do you want to close down the cosmetic surgery industry?” I am clear that I do not; what I want is for anyone going forward for these procedures to be in full control of the facts. Personally, if I had a blank sheet of paper, I would say that people should have mental health assessments before they undertake these procedures. If we had a kitemark on adverts, as Sir Bruce Keogh recommended, or something that actually said, “This is not like buying any other service. This is a medical procedure that could go wrong”, people would be fully aware of the facts before they went forward.

15:30
It is a modest proposal for the Bill, but it could have a major impact on the industry out there at the moment, which for many years has been completely unregulated. I do not propose pressing my new clause to a vote, but will the Minister work with his Department of Health and Social Care colleagues? Following the Health and Care Act 2022, there is a consultation on the regulations, and we could make a real difference for those I am worried about and concerned for—the more and more young people who are being bombarded with these adverts. In some cases, dangerous and potentially life-threatening procedures are being sold to them as if they are just like any other service, and they are not.

Damian Collins

The right hon. Gentleman makes a very important point and, as he knows, there is a wider ongoing Government review related to advertising online, which is a very serious issue. I assure him that we will follow up with colleagues in the Department of Health and Social Care to discuss the points he has raised.

Mr Jones

I am grateful to the Minister and I will be keeping a beady eye to see how far things go. The proposal would make a difference. It is a simple but effective way of protecting people, especially young people.

Madam Deputy Speaker (Dame Eleanor Laing)

Very good, that was wonderfully brief.

Damian Hinds (East Hampshire) (Con)

May I join others in welcoming my hon. Friend the Member for Folkestone and Hythe (Damian Collins) to his place on the Front Bench? He brings a considerable amount of expertise. I also, although it is a shame he is not here to hear me say nice things about him, pay tribute, as others have, to my hon. Friend the Member for Croydon South (Chris Philp). I had the opportunity to work with him, his wonderful team of officials and wonderful officials at the Home Office on some aspects of this Bill, and it was a great pleasure to do so. As we saw again today, his passion for this subject is matched only by his grasp of its fine detail.

I particularly echo what my hon. Friend said about algorithmic promotion, because if we address that, alongside what the Government have rightly done on ID verification options and user empowerment, we would tackle some of the core wiring and underpinnings of online harm at an even more elemental level.

I want to talk about two subjects briefly. One is fraud, and the other is disinformation. Opposition amendment 20 refers to disinformation, but that amendment is not necessary because of the amendments that the Government are bringing to the National Security Bill to address state-sponsored disinformation. I refer the House in particular to Government amendment 9 to that Bill. That in turn amends this Bill—it is the link, or so-called bridge, between the two. Disinformation is a core part of state threat activity and it is one of the most disturbing, because it can be done at huge volume and at very low cost, and it can be quite hard to detect. When someone has learned how to change the way people think, that makes that part of their weaponry look incredibly valuable to them.

We often talk about this in the context of elections. I think we are actually pretty good—when I say “we”, I mean our country, some other countries and even the platforms themselves—at addressing disinformation in the context of the elections themselves: the process of voting, eligibility to vote and so on. However, first, that is often not the purpose of disinformation at election time and, secondly, most disinformation occurs outside election times. Although our focus on interference with the democratic process is naturally heightened coming up to big democratic events, it is actually a 365-day-a-year activity.

There are multiple reasons and multiple modes for foreign states to engage in that activity. In fact, in many ways, the word “disinformation” is a bit unsatisfactory because a much wider set of things comes under the heading of information operations. That can range from simple untruths to trying to sow many different versions of an event, particularly a foreign policy or wartime event, to confuse the audience, who are left thinking, “Oh well, whatever story I’m being told by the BBC, my newspaper, or whatever it is, they are all much of a muchness.” Those states are competing for truth, even though in reality, of course, there is one truth. Sometimes the aim is to big up their own country, or to undermine faith in a democracy like ours, or the effectiveness of free societies.

Probably the biggest category of information operations is when there is not a particular line to push at all, but rather the disinformer is seeking to sow division or deepen division in our society, often by telling people things that they already believe, but more loudly and more aggressively to try to make them dislike some other group in society more. The purpose, ultimately, is to destabilise a free and open society such as ours and that has a cancerous effect. We talk sometimes of disinformation being spread by foreign states. Actually, it is not spread by foreign states; it is seeded by foreign states and then spread usually by people here. So they create these fake personas to plant ideas and then other people, seeing those messages and personas, unwittingly pick them up and pass them on themselves. It is incredibly important that we tackle that for the health of our democracy and our society.

The other point I want to mention briefly relates to fraud and the SNP amendments in the following group, but also Government new clause 14 in this group. I strongly support what the Government have done, during the shaping of the Bill, on fraud; there have been three key changes on fraud. The first was to bring user-generated content fraud into the scope of the Bill. That is very important for a particularly wicked form of fraud known as romance fraud. The second was to bring fraudulent advertising into scope, which is particularly important for categories of fraud such as investment fraud and e-commerce. The third big change was to make fraud a priority offence in the Bill, meaning that it is the responsibility of the platforms not just to remove that content when they are made aware of it, but to make strenuous efforts to try to stop it appearing in front of their users in the first place. Those are three big changes that I greatly welcome.

There are three further things I think the Government will need to do on fraud. First, there is a lot of fraudulent content beyond categories 1 and 2A as defined in the Online Safety Bill, so we are going to have to find ways—proportionate ways—to make sure that that fraudulent content is suppressed when it appears elsewhere, but without putting great burdens on the operators of all manner of community websites, village newsletters and so on. That is where the DCMS online advertising programme has an incredibly important part to play.

The second thing is about the huge variety of channels and products. Telecommunications are obviously important, alongside online content, but even within online, as the so-called metaverse develops further, with the internet of things and the massive potential for defrauding people through deep fakes and so on, we need to be one step ahead of these technologies. I hope that in DCMS my hon. Friends will look to create a future threats unit that seeks to do that.

Thirdly, we need to make sure everybody’s incentives are aligned on fraud. At present, the banks reimburse people who are defrauded and I hope that rate of reimbursement will shortly be increasing. They are not the only ones involved in the chain that leads to people being defrauded and often they are not the primary part of that chain. It is only right and fair, as well as economically efficient, to make sure the other parts of the chain that are involved share in that responsibility. The Bill makes sure their incentives are aligned because they have to take proportionate steps to stop fraudulent content appearing in front of customers, but we need to look at how we can sharpen that up to make sure everybody’s incentives are absolutely as one.

This is an incredibly important Bill. It has been a long time coming and I congratulate everybody, starting with my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), my hon. Friend the Member for Croydon South (Chris Philp) and others who have been closely involved in creating it. I wish my hon. Friend the Minister the best of luck.

Several hon. Members rose—

Mr Deputy Speaker (Mr Nigel Evans)

We will now introduce a six-minute limit on speeches. It may come down but, if Members can take less than six minutes, please do so. I intend to call the Minister at 4.20 pm.

Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)

May I, on behalf of my party, welcome the Minister to his place?

I have been reflecting on the contributions made so far and why we are here. I am here because I know of a female parliamentary candidate who pulled out of that process because of the online abuse. I also know of somebody not in my party—it would be unfair to name her or her party—who stood down from public life in Scotland mostly because of online abuse. This is something that threatens democracy, which we surely hold most dear.

Most of us are in favour of the Bill. It is high time that we had legislation that keeps users safe online, tackles illegal content and seeks to protect freedom of speech, while also enforcing the regulation of online spaces. It is clear to me from the myriad amendments that the Bill as it currently stands is not complete and does not go far enough. That is self-evident. It is a little vague on some issues.

I have tabled two amendments, one of which has already been mentioned and is on media literacy. My party and I believe Ofcom should have a duty to promote and improve the media literacy of the public in relation to regulated user-to-user services and search services. That was originally in the Bill but it has gone. Media literacy is mentioned only in the context of risk assessments. There is no active requirement for internet companies to promote media literacy.

The pandemic proved that a level of skill is needed to navigate the online world. I offer myself as an example. The people who help me out in my office here and in my constituency are repeatedly telling me what I can and cannot do and keeping me right. I am of a certain age, but that shows where education is necessary.

My second amendment is on end-to-end encryption. I do not want anything in this Bill to prevent providers of online services from protecting their users’ privacy through end-to-end encryption. It does provide protection to individuals, and if it is circumvented or broken, criminals and hostile foreign states can breach security. Privacy means security.

There are also concerns about the use of the word “harm” in the Bill. It remains vague and threatens to capture a lot of unintended content. I look forward to seeing what comes forward from the Government on that front. It focuses too much on content as opposed to activity and system design. Regulation of social media must respect the rights to privacy and free expression of those who use it. However, as the right hon. Member for Barking (Dame Margaret Hodge) said, that does not mean a laissez-faire approach: bullying and abuse prevent people from expressing themselves and must at all costs be stamped out, not least because of the two examples I mentioned at the start of my contribution.

As I have said before, the provisions on press exemption are poorly drafted. Under the current plans, the Russian propaganda channel Russia Today, about which I have said quite a bit in this place in the past, would qualify as a recognised news publisher and would therefore be exempt from regulation. That cannot be right. It is the same news channel that had its licence revoked by Ofcom.

I will help you by being reasonably brief, Mr Deputy Speaker, and conclude by saying that as many Members have said, the nature of the Bill means that the Secretary of State will have unprecedented powers to decide crucial legislation later. I speak—I will say it again—as a former chair of the Scottish Parliament’s statutory instruments committee, so I know from my own experience that all too often, instruments that have far-reaching effects are not given the consideration in this place that they should receive. Such instruments should be debated by the rest of us in the Commons.

As I said at the beginning of my speech, the myriad amendments to the Bill make it clear that the rest of us are not willing to allow it to remain so inherently undemocratic. We are going in the right direction, but a lot can be done to improve it. I wait with great interest to see how the Minister responds and what is forthcoming in the period ahead.

15:45
Andrew Percy

This has been an interesting debate on a Bill I have followed closely. I have been particularly struck by some of the arguments that claim the Bill is an attack on freedom of speech. I always listen intently to my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and to the hon. and learned Member for Edinburgh South West (Joanna Cherry), but I think they are wrong in the conclusions they have reached about legal but harmful content. Indeed, many of the criticisms that the hon. and learned Member for Edinburgh South West made of the various platforms were criticisms of the present situation, and that is exactly why I think this legislation will improve the position. However, those Members raised important points that I am sure will be responded to. I have also been a strong advocate of the inclusion of small but high-harm platforms, as the Minister and the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), both know—we have all had those discussions.

In the time I have, I want to focus principally on the issue of search and on new clauses 9 and 10, which stand in my name. As the shadow Minister has highlighted, last week we were—like many people in this place, perhaps—sent the most remarkable online prompt, which was to simply search Google for the words “desk ornament”. The top images displayed in response to that very mundane and boring search were of swastikas, SS bolts and other Nazi memorabilia presented as desk ornaments. Despite there having been awareness of that fact since, I believe, the previous weekend, and even though Google is making millions of pounds in seconds from advertising, images promoting Nazism were still available for all to see as a result of those searches.

When he gave evidence to the Bill Committee recently, Danny Stone, the Antisemitism Policy Trust’s very capable chief executive, pointed out that Amazon’s Alexa had used just one comment posted by one individual on Amazon’s website to inform potentially millions of users who cared to ask that George Soros was responsible for all of the world’s evils, and that Alexa had used a comment from another website to inform those who searched for it that the humanitarian group the White Helmets was an illicit operation founded by a British spy.

As we have seen throughout the covid pandemic, similar results come up in response to other searches, such as those around vaccines and covid. The Antisemitism Policy Trust has previously demonstrated that Microsoft Bing, the platform that lies behind Alexa, was directing users to hateful searches such as “Jews are bastards” through autocompletes, as well as pointing people to homophobic stories. We even had the sickening situation of Google’s image carousel highlighting Jewish baby strollers in response to people searching for portable barbecues.

Our own Alexa searches highlighted the issue some time ago. Users who asked Alexa “Do Jews control the media?” were responded to with a quote from a website called Jew Watch—that should tell Members all they need to know about the nature of the platform—saying that Jews control not only the media, but the financial system too. The same problem manifests itself across search platforms in other languages, as we highlighted not so long ago with Siri in Spanish. When asked, “Do the Jews control the media?” she responds with an article that states that Jews do indeed control international media. This goes on and on, irrespective of whether the search is voice or text-based.

The largest search companies in the world are falling at the first hurdle when it comes to risk assessing for harms on their platforms. That is the key point when we ask for lawful but harmful content to be addressed. It is about risk assessment—requiring companies that do not respect borders, operate globally and are in many ways more powerful than Governments to risk assess and warn about lawful but deeply harmful content that all of us in the House would be disgusted by.

At present, large traditional search services including Google and Microsoft Bing, and voice search assistants including Alexa and Siri, will be exempted from having to risk assess their systems and address harm to adults, despite the fact that other large user-to-user services will have to do so. How can it be possible that Google does not have to act, when Meta—Facebook—and Twitter do? That does not seem consistent with the aims of the Bill.

There is a lot more that I would like to have said on the Bill. I welcome the written ministerial statement last week in relation to small but high-harm platforms. I hope that as the Bill progresses to the other place, we can look again at search. Some of the content generated is truly appalling, even though it may very well be considered lawful.

Feryal Clark (Enfield North) (Lab)

I join everyone else in the House in welcoming the Minister to his place.

I rise to speak in support of amendments 15 and 16. At the core of this issue is the first duty of any Government: to keep people safe. Too often in debates, which can become highly technical, we lose sight of that fact. We are not just talking about technology and regulation; we are talking about real lives and real people. It is therefore incumbent on all of us in this place to have that at the forefront of our minds when discussing such legislation.

Labelling social media as the wild west of today is hardly controversial—that is plain and obvious for all to see. There has been a total failure on the part of social media companies to make their platforms safe for everyone to use, and that needs to change. Regulation is not a dirty word, but a crucial part of ensuring that as the internet plays a bigger role in every generation’s lives, it meets the key duty of keeping people safe. It has been a decade since we first heard of this Bill, and almost four years since the Government committed to it, so I am afraid that there is nothing even slightly groundbreaking about the Bill as it is today. We have seen progress being made in this area around the world, and the UK is falling further and further behind.

Of particular concern to me is the impact on children and young people. As a mother, I worry for the world that my young daughter will grow up in, and I will do all I can in this place to ensure that children’s welfare is at the absolute forefront. I can see no other system or institution that children are allowed to engage with that has such a lack of safeguards and regulation. If there was a faulty slide in a playground, it would be closed off and fixed. If a sports field was covered with glass or litter, that would be reported and dealt with. Whether we like it or not, social media has become the streets our children hang out in, the world they grow up in and the playground they use. It is about time we started treating it with the same care and attention.

There are far too many holes in the Bill that allow for the continued exploitation of children. Labour’s amendments 15 and 16 tackle the deeply troubling issue of “breadcrumbing”. That is where child abusers use social networks to lay trails to illegal content elsewhere online and share videos of abuse edited to fall within content moderation guidelines. The amendments would give the regulators powers to tackle that disgusting practice and ensure that there is a proactive response to it. They would bring into regulatory scope the millions of interactions with accounts that actively enable child abuse. Perhaps most importantly, they would ensure that social media companies tackled child abuse at the earliest possible stage.

In its current form, even with Government amendment 14, the Bill merely reinforces companies’ current focus only on material that explicitly reaches the criminal threshold. That is simply not good enough. Rather than acknowledging that issue, Government amendments 71 and 72 let social media companies off the hook. They remove the requirement for companies to apply their terms and conditions “consistently”. That was addressed very eloquently by the hon. Member for Croydon South (Chris Philp) and the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), who highlighted that Government amendment 14 simply does not go far enough.

Damian Collins

On the amendments that the former Minister, my hon. Friend the Member for Croydon South (Chris Philp), spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, but the use of that word will still apply in the context of the companies’ duties to act against illegal content.

Feryal Clark

I welcome the Minister’s clarification and look forward to the amendments being made to the Bill. However, other than tying one of our hands behind our back when it comes to keeping children safe, the proposals as they stand do not achieve very much. This will undermine the entire regulatory system, rendering it practically ineffective.

Although I welcome the Bill and some of the Government amendments, it still lacks a focus on ensuring that tech companies have the proper systems in place to fulfil their duty of care and keep our children safe. The children of this country deserve better. That is why I wholeheartedly welcome the amendments tabled by my hon. Friend the Member for Pontypridd (Alex Davies-Jones) and urge Government Members to support them.

Several hon. Members rose—

Mr Deputy Speaker (Mr Nigel Evans)

Order. We will stick with a time limit of six minutes, but I put everybody on notice that we may have to move that down to five.

Adam Afriyie

I very much welcome the Bill, which has been a long time in the making. It has travelled from my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) to my hon. Friend the Member for Croydon South (Chris Philp) and now to my hon. Friend the Member for Folkestone and Hythe (Damian Collins); I say a huge thank you to them for their work. The Bill required time because this is a very complex matter. There are huge dangers and challenges in terms of committing offences against freedom of speech. I am glad that Ministers have recognised that and that we are very close to an outcome.

The Bill is really about protection—it is about protecting our children and our society from serious harms—and nobody here would disagree that we want to protect children from harm online. That is what 70% to 80% of the Bill achieves. Nobody would disagree that we need to prevent acts of terror and incitement to violence. We are all on the same page on that across the House. What we are talking about today, and what we have been talking about over the past several months, are nips and tucks to try to improve elements of the Bill. The framework appears to be generally correct. We need to drill down into some of the details to ensure that the areas that each of us is concerned about are dealt with in the Bill we finally produce, as it becomes an Act of Parliament.

There are several amendments tabled in my name and those of other right hon. and hon. Members. I can only canter through them cursorily in the four minutes and 30 seconds remaining to me, but I will put these points on the record in the hope that the Minister will respond positively to many of them.

Amendments 48 and 49 would ensure that providers can decide to keep user-generated content online, taking no action if that content is not harmful. In effect, the Government have accepted those amendments by tabling amendment 71, so I thank the Minister for that.

My amendment 50 says that the presumption should be tipped further in favour of freedom of expression and debate by ensuring that under their contractual terms of service, except in particular circumstances, providers are obliged to leave content online. I emphasise that I am not talking about harmful or illegal content; amendment 50 seeks purely to address content that may be controversial but does not cross the line.

16:00
I turn to amendment 51. It appears that the Bill protects the media, journalists, Governments and us politicians, while providers have some protections against being fined unjustly. In many ways, the only people who are not protected are the public—the users with user-generated legal content. It seems to me that we need to increase the powers for citizens to get an outcome if their content is taken down inaccurately, incorrectly or inappropriately. We would do well to look at ensuring, as amendment 51 would, that citizens can also seek compensation. Tens of millions of pounds, if not hundreds of millions, are about to go to the regulator and to the Government in a form of quasi-taxation, so there needs to be a mechanism for a judge to decide that if somebody has been harmed by the inappropriate removal of legal content, they can get some redress.

Government amendment 94 is quite interesting. I can certainly see the reason for it and the purpose that it seeks to achieve, but it will require providers to take into account the entire criminal code. Effectively, they will have to act as a policeman, policing all internet content against all legislation. I am sure that that is not the intent behind amendment 94. I simply urge the Government to take a look at my amendment 52, which would ensure that relevant offences include only those specified, so that providers do not need to understand the entire criminal code.

The primary area of concern, which many other hon. Members have voiced, is that it looks as if the Secretary of State will be given the power to specify priority harms without that decision necessarily being passed on the Floor of the House. It seems to me that it is Parliament that should primarily be making regulations and legislation, so I really urge the Government to take another look and ensure that if a Secretary of State seeks to modify the priority harms or specify certain content as harmful or illegal, it is debated in the Chamber of the House of Commons. That is the primary function of this place.

Technology moves very quickly, so personally I would welcome an annual debate on areas that may need improvement. Now that we are outside the European Union and have autonomy, those are the kinds of things that we must decide in this Chamber.

Munira Wilson (Twickenham) (LD)

I rise to speak to new clauses 25 and 26 in my name. The Government rightly seek to make the UK the safest place in the world to go online, especially for our children, and some of their amendments will start to address previous gaps in the Bill. However, I believe that the Bill still falls short in its aim not only to protect children from harm and abuse, but, importantly, to empower and enable young people to make the most of the online world.

I welcome the comments that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) made about how we achieve the balance between rights and protecting children from harm. I also welcome his amendments on children’s wellbeing, which seek to achieve that balance.

With one in five children going online, keeping them safe is more difficult but more important than ever. I speak not only as the mother of two very young children who are growing up with iPads in their hands, but as—like everyone else in the Chamber—a constituency Member of Parliament who speaks regularly to school staff and parents who are concerned about the harms caused by social media in particular, but also those caused by games and other services to which children have access.

The Bill proffers a broad and vague definition of content that is legal yet harmful. As many have already said, it should not be the responsibility of the Secretary of State, in secondary legislation, to make decisions about how and where to draw the line; Parliament should set clear laws that address specific, well-defined harms, based on strong evidence. The clear difficulty that the Government have in defining what content is harmful could have been eased had the Bill focused less on removing harmful content and more on why service providers allow harmful content to spread so quickly and widely. Last year, the 5Rights Foundation conducted an experiment in which it created several fake Instagram profiles for children aged between 14 and 17. When the accounts were searched for the term “skinny”, while a warning pop-up message appeared, among the top results were

“accounts promoting eating disorders and diets, as well as pages advertising appetite-suppressant gummy bears.”

Ultimately, the business models of these services profit from the spread of such content. New clause 26 requires the Government and Ofcom to focus on ensuring that internet services are safe by design. They should not be using algorithms that give prominence to harmful content. The Bill should focus on harmful systems rather than on harmful content.

Damian Collins

It does focus on systems as well as content. We often talk about content because it is the exemplar for the failure of the systems, but the systems are entirely within the scope of the Bill.

Munira Wilson

I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.

Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.

I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term

“likely to be accessed by children”

appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.

Mr Deputy Speaker (Mr Nigel Evans)

I am grateful to the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for keeping her powder dry and deferring her speech until the next group of amendments, so Members now have five minutes each.

Kim Leadbeater (Batley and Spen) (Lab)

I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).

As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.

Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:

“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.

I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.

Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.

Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exception is protected from abuse.

I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.

We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.

In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.

Mr Deputy Speaker (Mr Nigel Evans)

You will resume your seat no later than 4.20 pm. We will therefore not put the clock on you.

16:14
Kirsty Blackman

I will try to avoid too much preamble, but I thank the former Minister, the hon. Member for Croydon South (Chris Philp), for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.

As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.

Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.

New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.

Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix. I regularly accessed its predecessor, The Site, more than 20 years ago. The Mix is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in their use of social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.

Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.

Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.

Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot do that, I ask that the NSPCC have a meeting with him.

Damian Collins

We have had a wide-ranging debate of passion and expert opinion from Members in all parts of the House, which shows the depth of interest in this subject, and the depth of concern that the Bill is delivered and that we make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, where Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime when it has been created.

I wish to start by addressing remarks made by the hon. Member for Pontypridd (Alex Davies-Jones), the shadow Minister, and by the hon. Member for Cardiff North (Anna McMorrin) about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of harm that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold but where a young girl who is vulnerable is being sent and prompted with content that can make her more vulnerable and could lead her to harm herself or worse. It is absolutely right that that was in the scope of the Bill.

New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 was approved, as tabled, it would be a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—

Anna McMorrin

The Minister is not giving accurate information there. Violence against women and girls is defined by article 3 of the Council of Europe convention on preventing violence against women and domestic violence—the Istanbul convention. So there is that definition and it would be valid to put that in the Bill to ensure that all of that is covered.

Damian Collins

I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill. The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition. We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. But of course as law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to those primary illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.

The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere. There has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and only then might they act. Actually, we want them to pre-empt that, based on a clear understanding of where the legal threshold is. That is how the regulatory codes work. So it is an attempt, not to weaken the provision but to bring clarity to the companies and the regulator over the application.

The hon. Member for Ochil and South Perthshire (John Nicolson) raised an important point with regard to the Modern Slavery Act. As the Bill has gone along, we have included existing migration offences and trafficking offences. I would be happy to meet him further to discuss that aspect. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way, I would be happy to meet him and to discuss this further.

My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, which was a common theme raised by several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they have to be by a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas that are set out. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.

I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament. This is not a general provision. I think sometimes there has been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House they are fixed. The codes do not relate to the duties. The duties are set out in the legislation. This is just the guidance that is given to companies on how they comply. There may well be circumstances in which the Secretary of State might look at those draft codes and say, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” Therefore it is understandable that in the draft form the Secretary of State might wish to have the power to raise that question, and not dictate to Ofcom but ask it to come back with amendments.

I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues that they set out in the letter that was sent round last week to Members, showing how we seek to bring in that definition.

A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are. Where we have transparency requirements, it is absolutely clear what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done, setting out that if we are discussing the terms of service of tech companies, it should be perfectly possible for them to say that this is not an area where they intend to take enforcement action and the Bill does not require them to do so.

The hon. Member for Batley and Spen (Kim Leadbeater) mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised that before the Joint Committee. So, too, did my hon. Friend the Member for Watford (Dean Russell); he and the hon. Member for Ochil and South Perthshire are great advocates on that. It is a good example of how a clear offence, something that we all agree to be wrong, can be tackled through this legislation; in this case, a new offence will be created, to prevent the pernicious targeting of people with epilepsy with flashing images.

Finally, in response to the speech by the hon. Member for Aberdeen North (Kirsty Blackman), I certainly will continue dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our mind as we consider the legislation. She made some important points about the ability to scan for encrypted images. The Government have recently made further announcements on that, to be reflected as the Bill progresses through the House.

Mr Deputy Speaker (Mr Nigel Evans)

To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.

16:30
Proceedings interrupted (Programme Order, this day).
The Deputy Speaker put forthwith the Question already proposed from the Chair (Standing Order No. 83E), That the clause be read a Second time.
Question agreed to.
New clause 19 accordingly read a Second time, and added to the Bill.
The Deputy Speaker then put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).
New Clause 3
Priority illegal content: violence against women and girls
“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—
(a) constitutes,
(b) encourages, or
(c) promotes
violence against women and girls.
(2) “Violence against women and girls” is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (“the Istanbul Convention”).”—(Alex Davies-Jones.)
This new clause applies provisions to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.
Brought up.
Question put, That the clause be added to the Bill.
16:30

Division 35

Ayes: 226

Noes: 292

Clause 5
Overview of Part 3
Amendment made: 57, page 4, line 36, at beginning insert ““priority offence”,”.—(Damian Collins.)
This is a technical amendment providing for a signpost to the definition of “priority offence” in the clause giving an overview of Part 3.
Clause 6
Providers of user-to-user services: duties of care
Amendment made: 163, page 5, line 30, at end insert—
“(da) the duties to protect news publisher content set out in section (Duties to protect news publisher content),”. —(Damian Collins.)
This amendment is consequential on NC19.
Clause 8
Illegal content risk assessment duties
Amendments made: 58, page 6, line 45, at end insert—
“(ba) the level of risk of the service being used for the commission or facilitation of a priority offence;”
This amendment adds another matter to the matters that should be included in a provider’s risk assessment regarding illegal content on a user-to-user service, so that the risks around use of a service for the commission or facilitation of priority offences are included.
Amendment 59, page 7, line , at end insert
“or by the use of the service for the commission or facilitation of a priority offence”.
This amendment ensures that providers’ risk assessments about illegal content on a user-to-user service must consider the risk of harm from use of the service for the commission or facilitation of priority offences.
Amendment 60, page 7, line 4, after “content” insert
“or the use of the service for the commission or facilitation of a priority offence”.—(Damian Collins.)
This amendment ensures that providers’ risk assessments about illegal content on a user-to-user service must consider the risk of functionalities of the service facilitating the use of the service for the commission or facilitation of priority offences.
Clause 9
Safety duties about illegal content
Amendments made: 61, page 7, line 24, leave out from “measures” to the end of line 26 and insert
“relating to the design or operation of the service to—
(a) prevent individuals from encountering priority illegal content by means of the service,
(b) effectively mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence, as identified in the most recent illegal content risk assessment of the service, and
(c) effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service (see section 8(5)(f)).”
The substantive changes made by this amendment are: (1) making it clear that compliance with duties to mitigate risks as mentioned in paragraphs (a) to (c) is to be achieved by the way a service is designed or operated, and (2) paragraph (b) is a new risk mitigation duty on providers to deal with the risks around use of a user-to-user service for the commission or facilitation of priority offences.
Amendment 62, page 7, line 29, leave out paragraph (a).
This amendment omits the provision that now appears in subsection (2) of this clause: see paragraph (a) of the provision inserted by Amendment 61.
Amendment 63, page 7, line 37, after “is” insert “designed”.
This adds a reference to design to clause 9(4) which provides for the illegal content duties for user-to-user services to apply across all areas of a service.
Amendment 64, page 8, line 5, leave out “paragraphs (a) and (b)” and insert “paragraph (b)”.
This amendment is a technical change consequential on Amendment 62.
Amendment 65, page 8, line 9, leave out from “consistently” to end of line 10.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a user-to-user service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 11
Safety duties protecting children
Amendments made: 66, page 10, line 5, after “measures” insert
“relating to the design or operation of the service”.
This amendment makes it clear that compliance with duties to mitigate risks of harm from content harmful to children on user-to-user services is to be achieved by the way a service is designed or operated.
Amendment 67, page 10, line 9, after “service” insert “(see section 10(6)(g))”.
This is a technical amendment to put beyond doubt the meaning of a provision about risks identified in a risk assessment relating to content that is harmful to children on user-to-user services.
Amendment 68, page 10, line 22, after “is” insert “designed”.
This adds a reference to design to clause 11(4) which provides for the children’s safety duties for user-to-user services to apply across all areas of a service.
Amendment 69, page 11, line 2, leave out from “consistently” to end of line 4.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a user-to-user service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 13
Safety duties protecting adults
Amendments made: 70, page 13, line 4, leave out subsection (3) and insert—
“(3) If a provider decides to treat a kind of priority content that is harmful to adults in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults which a provider decides to treat in one of those ways).”
This amendment ensures that a provider must state in the terms of service how priority content that is harmful to adults is to be treated, if the provider decides to treat it in one of the ways listed in clause 13(4).
Amendment 71, page 13, line 12, at end insert—
“(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).”
This amendment expands the list in clause 13(4) with the effect that if a provider decides to treat a kind of priority content that is harmful to adults by not recommending or promoting it, nor taking it down or restricting it etc, the terms of service must make that clear.
Amendment 72, page 13, line 23, leave out from “consistently” to end of line 25.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a Category 1 service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 15
Duties to protect content of democratic importance
Amendment made: 73, page 15, line 4, leave out from “consistently” to end of line 5.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a Category 1 service consistently. The omitted words relate to the material now dealt with in NC14.
Clause 16
Duties to protect journalistic content
Amendments made: 164, page 15, line 44, at end insert—
“(5A) Subsections (3) and (4) do not require a provider to make a dedicated and expedited complaints procedure available to a recognised news publisher in relation to a decision if the provider has taken the steps set out in section (Duties to protect news publisher content)(3) in relation to that decision.”
This amendment ensures that where a recognised news publisher has a right to make representations about a proposal to take action in relation to news publisher content or against the recognised news publisher under the new clause introduced by NC19, a provider is not also required to offer that publisher a complaints procedure under clause 16.
Amendment 74, page 16, line 11, leave out from “consistently” to end of line 12.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply the terms of service of a Category 1 service consistently. The omitted words relate to the material now dealt with in NC14.
Amendment 165, page 16, line 13, leave out “section” and insert “Part”.—(Damian Collins.)
This is a technical amendment ensuring that the definition of “journalistic content” applies for the purposes of Part 3 of the Bill.
Clause 18
Duties about complaints procedures
Amendment made: 166, page 19, line 14, at end insert—
“(iiia) section (Duties to protect news publisher content) (news publisher content)”.
This amendment ensures that users and affected persons can complain if a provider is not complying with a duty set out in NC19.(Damian Collins.)
Clause 19
Duties about freedom of expression and privacy
Amendments made: 167, page 20, line 18, at end insert—
“(5A) An impact assessment relating to a service must include a section which considers the impact of the safety measures and policies on the availability and treatment on the service of content which is news publisher content or journalistic content in relation to the service.”
This amendment requires a provider of a Category 1 service to include a section in their impact assessments considering the effect of the provider’s measures and policies on the availability on the service of news publisher content and journalistic content.
Amendment 168, page 20, line 37, at end insert—
“(10) See—
section 16 for the meaning of “journalistic content”;
section 49 for the meaning of “news publisher content”.” —(Damian Collins.)
This amendment inserts a signpost to definitions of terms used in the new subsection inserted by Amendment 167.
Clause 20
Record-keeping and review duties
Amendment made: 169, page 21, line 45, at end insert
“and for the purposes of subsection (6), also includes the duties set out in section (Duties to protect news publisher content) (news publisher content).”—(Damian Collins.)
This amendment ensures that providers have a duty to review compliance with the duties set out in NC19 regularly, and after making any significant change to the design or operation of the service.
Clause 24
Safety duties about illegal content
Amendments made: 75, page 23, line 43, after “measures” insert
“relating to the design or operation of the service”.
This amendment makes it clear that compliance with duties to mitigate risks of harm from illegal content on search services is to be achieved by the way a service is designed or operated.
Amendment 76, page 23, line 45, at end insert “(see section 23(5)(c))”.
This is a technical amendment to put beyond doubt the meaning of a provision about risks identified in a risk assessment relating to illegal content on search services.
Amendment 77, page 24, line 8, after “is” insert “designed”.
This adds a reference to design to clause 24(4) which provides for the illegal content duties for search services to apply across all areas of a service.
Amendment 78, page 24, line 23, leave out from “consistently” to end of line 24.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply a search service’s publicly available statement consistently. The omitted words relate to the material now dealt with in NC14.
Clause 26
Safety duties protecting children
Amendments made: 79, page 26, line 5, after “measures” insert
“relating to the design or operation of the service”.
This amendment makes it clear that compliance with duties to mitigate risks of harm from content that is harmful to children on search services is to be achieved by the way a service is designed or operated.
Amendment 80, page 26, line 9, after “service” insert “(see section 25(5)(d))”.
This is a technical amendment to put beyond doubt the meaning of a provision about risks identified in a risk assessment relating to content that is harmful to children on search services.
Amendment 81, page 26, line 20, after “is” insert “designed”.
This adds a reference to design to clause 26(4) which provides for the children’s safety duties for search services to apply across all areas of a service.
Amendment 82, page 26, line 40, leave out from “consistently” to end of line 42.—(Damian Collins.)
This amendment omits words from a provision imposing a duty to apply a search service’s publicly available statement consistently. The omitted words relate to the material now dealt with in NC14.
Clause 37
Codes of practice about duties
Amendments made: 85, page 35, line 32, at end insert “or offences within Schedule 5 (terrorism offences)”.
This amendment ensures that a code of practice under clause 37(1) encompasses duties to deal with the use of a service in connection with terrorism offences as well as terrorism content.
Amendment 86, page 35, line 36, at end insert “or offences within Schedule 6 (child sexual exploitation and abuse offences)”.
This amendment ensures that a code of practice under clause 37(2) encompasses duties to deal with the use of a service in connection with child sexual exploitation and abuse offences as well as CSEA content.
Amendment 87, page 36, line 21, leave out “content” and insert “matters”. —(Damian Collins.)
This amendment is consequential on Amendments 85 and 86.
Clause 47
Duties and the first codes of practice
Amendments made: 88, page 44, line 32, at end insert “or offences within Schedule 5 (terrorism offences)”.
This amendment ensures that a reference to the illegal content duties on providers encompasses a reference to terrorism offences as well as terrorism content.
Amendment 89, page 44, line 35, after “content” insert “or offences within Schedule 6 (child sexual exploitation and abuse offences)”.—(Damian Collins.)
Clause 48
OFCOM’s guidance: record-keeping duties and children’s access assessments
Amendments made: 170, page 45, line 4, at end insert—
‘(A1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in section (Duties to protect news publisher content) (news publisher content).”
This amendment requires Ofcom to produce guidance for providers of Category 1 services to assist them with complying with their duties under NC19.
Amendment 171, page 45, line 9, leave out “the guidance” and insert “guidance under subsection (1)”.
This amendment means that the consultation requirements in clause 48 would not apply to guidance required to be produced as a result of Amendment 170.
Amendment 172, page 45, leave out “the guidance” and insert “guidance under this section”. —(Damian Collins.)
This amendment requires Ofcom to publish guidance required to be produced as a result of Amendment 170.
Clause 49
“Regulated user-generated content”, “user-generated content”, “news publisher content”
Amendments made: 90, page 46, line 1, leave out “operated” and insert “controlled”.
This amendment uses the term “control” in relation to a person responsible for a bot, which is the language used in NC14.
Amendment 173, page 46, line 11, leave out “on, or reviews of,” and insert “or reviews relating to”. —(Damian Collins.)
This amendment ensures that the wording in this provision is consistent with the wording in paragraph 4(1)(a) of Schedule 1.
Clause 52
“Illegal content” etc
Amendments made: 91, page 49, line 1, leave out paragraph (b).
This amendment leaves out material which is now dealt with in NC14.
Amendment 92, page 49, line 9, leave out paragraph (a) and insert—
“(a) a priority offence, or”
This is a technical amendment to insert a defined term, a “priority offence” (see Amendment 95).
Amendment 93, page 49, line 10, leave out paragraphs (b) and (c).
This amendment is consequential on the new approach of referring to a “priority offence”.
Amendment 94, page 49, line 13, leave out paragraph (d) and insert—
“(d) an offence within subsection (4A).
‘(4A) An offence is within this subsection if—
(a) it is not a priority offence,
(b) the victim or intended victim of the offence is an individual (or individuals), and
(c) the offence is created by this Act or, before or after this Act is passed, by—
(i) another Act,
(ii) an Order in Council,
(iii) an order, rules or regulations made under an Act by the Secretary of State or other Minister of the Crown, including such an instrument made jointly with a devolved authority, or
(iv) devolved subordinate legislation made by a devolved authority with the consent of the Secretary of State or other Minister of the Crown.”
New subsection (4A), inserted by this amendment, describes offences which are relevant for the purposes of the concept of “illegal content”, but which are not priority offences as defined by new subsection (4B) (see Amendment 95). Subsection (4A)(c) requires there to have been some involvement of HMG in relation to the creation of the offence.
Amendment 95, page 49, line 14, at end insert—
‘(4B) “Priority offence” means—
(a) an offence specified in Schedule 5 (terrorism offences),
(b) an offence specified in Schedule 6 (offences related to child sexual exploitation and abuse), or
(c) an offence specified in Schedule 7 (other priority offences).”
This amendment inserts a definition of “priority offence” into clause 52.
Amendment 96, page 49, line 23, that subsection (8) of clause 52 be transferred to the end of line 14 on page 49.
This is a technical amendment moving provision to a more appropriate position in clause 52.
Amendment 97, page 49, leave out line 23 and insert “But an offence is not within subsection (4A)”.
This is a technical amendment consequential on the changes to clause 52 made by Amendment 94.
Amendment 98, page 49, line 35, at end insert—
‘(9A) References in subsection (3) to conduct of particular kinds are not to be taken to prevent content generated by a bot or other automated tool from being capable of amounting to an offence (see also section (Providers’ judgements about the status of content)(7) (providers’ judgements about the status of content)).”
This amendment ensures that content generated by bots is capable of being illegal content (so that the duties about dealing with illegal content may apply to such content).
Amendment 99, page 50, line 1, leave out subsection (12) and insert—
‘(12) In this section—
“devolved authority” means—
(a) the Scottish Ministers,
(b) the Welsh Ministers, or
(c) a Northern Ireland department;
“devolved subordinate legislation” means—
(a) an instrument made under an Act of the Scottish Parliament,
(b) an instrument made under an Act or Measure of Senedd Cymru, or
(c) an instrument made under Northern Ireland legislation;
“Minister of the Crown” has the meaning given by section 8 of the Ministers of the Crown Act 1975 and also includes the Commissioners for Her Majesty’s Revenue and Customs;
“offence” means an offence under the law of any part of the United Kingdom.”
This amendment inserts definitions into clause 52 that are needed as a result of Amendment 94.
Amendment 100, page 50, line 2, at end insert—
‘(13) See also section (Providers’ judgements about the status of content) (providers’ judgements about the status of content).” —(Damian Collins.)
This amendment inserts a signpost into clause 52 pointing to the NC about Providers’ judgements inserted by NC14.
Schedule 3
Timing of providers’ assessments
Amendments made: 147, page 175, line 23, leave out the definition of “illegal content risk assessment guidance”.
This technical amendment is consequential on Amendment 107.
Amendment 148, page 175, line 30, leave out from second “to” to end of line 31 and insert “OFCOM’s guidance under section 85(1).”
Schedule 3 is about the timing of risk assessments etc. This amendment ensures that the provisions about guidance re risk assessments work with the changes made by Amendment 107.
Amendment 149, page 175, line 36, leave out from second “to” to end of line 37 and insert “OFCOM’s guidance under section 85(1A).” —(Damian Collins.)
Schedule 3 is about the timing of risk assessments etc. This amendment ensures that the provisions about guidance re risk assessments work with the changes made by Amendment 107.
Schedule 7
Priority offences
Amendment proposed: 187, page 186, line 32, at end insert—
“Human trafficking
22A An offence under section 2 of the Modern Slavery Act 2015.” —(John Nicolson.)
This amendment includes Human Trafficking as a priority offence.
16:45

Division 36

Ayes: 229

Noes: 294

Mr Deputy Speaker (Mr Nigel Evans)

I am anticipating another Division, as I said, and then I understand there may be some points of order, which I will hear after that Division.

That concludes proceedings on new clauses, new schedules and amendments to those parts of the Bill that have to be concluded by 4.30 pm.

It has been pointed out to me that, in this unusually hot weather, Members should please remember to drink more water. I tried it myself once. [Laughter.]

In accordance with the programme (No. 2) order of today, we now come to new clauses, new schedules and amendments relating to those parts of the Bill to be concluded by 7 pm. We begin with new clause 14, which the House has already debated. I therefore call the Minister to move new clause 14 formally.

New Clause 14

Providers’ judgements about the status of content

“(1) This section sets out the approach to be taken where—

(a) a system or process operated or used by a provider of a Part 3 service for the purpose of compliance with relevant requirements, or

(b) a risk assessment required to be carried out by Part 3, involves a judgement by a provider about whether content is content of a particular kind.

(2) Such judgements are to be made on the basis of all relevant information that is reasonably available to a provider.

(3) In construing the reference to information that is reasonably available to a provider, the following factors, in particular, are relevant—

(a) the size and capacity of the provider, and

(b) whether a judgement is made by human moderators, by means of automated systems or processes or by means of automated systems or processes together with human moderators.

(4) Subsections (5) to (7) apply (as well as subsection (2)) in relation to judgements by providers about whether content is—

(a) illegal content, or illegal content of a particular kind, or

(b) a fraudulent advertisement.

(5) In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is content of the kind in question (and a provider must treat content as content of the kind in question if reasonable grounds for that inference exist).

(6) Reasonable grounds for that inference exist in relation to content and an offence if, following the approach in subsection (2), a provider—

(a) has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and

(b) does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.

(7) In the case of content generated by a bot or other automated tool, the tests mentioned in subsection (6)(a) and (b) are to be applied in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool).

(8) In considering a provider’s compliance with relevant requirements to which this section is relevant, OFCOM may take into account whether providers’ judgements follow the approaches set out in this section (including judgements made by means of automated systems or processes, alone or together with human moderators).

(9) In this section—

“fraudulent advertisement” has the meaning given by section 34 or 35 (depending on the kind of service in question);

“illegal content” has the same meaning as in Part 3 (see section 52);

“relevant requirements” means—

(a) duties and requirements under this Act, and

(b) requirements of a notice given by OFCOM under this Act.”—(Damian Collins.)

This new clause clarifies how providers are to approach judgements (human or automated) about whether content is content of a particular kind, and in particular, makes provision about how questions of mental state and defences are to be approached when considering whether content is illegal content or a fraudulent advertisement.

Brought up.

Question put, That the clause be added to the Bill.

16:59

Division 37

Ayes: 288

Noes: 229

New clause 14 read a Second time, and added to the Bill.
New Clause 15
Guidance about illegal content judgements
“(1) OFCOM must produce guidance for providers of Part 3 services about the matters dealt with in section (Providers’ judgements about the status of content) so far as relating to illegal content judgements.
(2) “Illegal content judgements” means judgements of a kind mentioned in subsection (4) of that section.
(3) Before producing the guidance (including revised or replacement guidance), OFCOM must consult such persons as they consider appropriate.
(4) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Damian Collins.)
This new clause requires OFCOM to give guidance to providers about how they should approach judgements about whether content is illegal content or a fraudulent advertisement.
Brought up, and added to the Bill.
Thangam Debbonaire (Bristol West) (Lab)

On a point of order, Mr Deputy Speaker. Despite over 50 members of the Government resigning last week and many more Tory MPs submitting letters of no confidence in their own leader, the Conservative party continues to prop up this failed Prime Minister until September. They are complicit. They know—indeed, they have said—he is not fit to govern. They told the public so just days ago. Now they seem to be running scared and will not allow the Opposition to table a vote of no confidence. [Hon. Members: “Shame!”] Yes. This is yet another outrageous breach of the conventions that govern our country from a man who disrespected the Queen and illegally prorogued Parliament. Now he is breaking yet another convention. Every single day he is propped up by his Conservative colleagues, he is doing more damage to this country.

Mr Deputy Speaker, are you aware of any other instances where a Prime Minister has so flagrantly ignored the will of this House by refusing to grant time to debate a motion of no confidence in the Government, despite the fact that even his own party does not believe he should be Prime Minister any more? Do you agree with me that this egregious breach of democratic convention only further undermines confidence in this rotten Government?

Margaret Beckett (Derby South) (Lab)

Further to that point of order, Mr Deputy Speaker. I recognise that under the present Prime Minister, this Government have specialised in constitutional innovation. Nevertheless, it certainly seems to me, and I hope it does to you and to the House authorities, that this is stretching the boundaries of what is permissible into the outrageous and beyond, and threatening the democracy of this House.

Dame Angela Eagle (Wallasey) (Lab)

Further to that point of order, Mr Deputy Speaker. The convention is that if the Leader of the Opposition tables a motion of no confidence, it is taken as the next available business. That is what has been done, yet even though we know that large swathes of the party in Government have no confidence in their Prime Minister, they are refusing to acknowledge and honour a time-honoured convention that is the only way to make a debate on that possible. Do you not agree that it is for this House of Commons to test whether any given Prime Minister has its confidence and that his or her Prime Ministership is always based on that? One of the prerequisites for being appointed Prime Minister of this country by the Queen is that that person shall have the confidence of the House of Commons. If we are not allowed to test that now, when on earth will we be allowed to test it?

17:15
Chris Bryant (Rhondda) (Lab)

Further to that point of order, Mr Deputy Speaker. As you know, “Erskine May” says very clearly:

“By established convention, the Government always accedes to the demand from the Leader of the Opposition”

in regard to a no-confidence motion. There has been a very long tradition of all sorts of different kinds of votes of no confidence. Baldwin, Melbourne, Wellington and Salisbury all resigned after a vote on an amendment to the Loyal Address; they considered that to be a vote of no confidence. Derby and Gladstone resigned after an amendment to the Budget; they considered that to be a vote of no confidence. Neville Chamberlain resigned after a motion to adjourn the House, even though he won the vote, because he saw that as a motion of no confidence. So it is preposterous that the Government are trying to say that the motion that is being tabled for tomorrow somehow does not count.

Let me remind Government Members that on 2 August 1965, the motion tabled by the Conservatives was:

“That this House has no confidence in Her Majesty’s Government and deplores the Prime Minister’s conduct of the nation’s affairs.”

I think this House agrees with that today.

Of course, we briefly had the Fixed-term Parliaments Act 2011, which set in statute that there was only one way of having a motion of no confidence, but this Government overturned and repealed that Act. The then Minister, the right hon. Member for Surrey Heath (Michael Gove), came on behalf of the Government to tell the Joint Committee on the Fixed-term Parliaments Act:

“It seems to us to be cleaner and clearer to have a return to a more classical understanding of what a vote of confidence involves.”

It is simple: the Prime Minister is disgraced, he does not enjoy the confidence of the House, and if he simply tries to prevent the House from coming to that decision, it is because he is a coward.

Several hon. Members rose—

Mr Deputy Speaker (Mr Nigel Evans)

I will only allow three more points of order, because this is eating into time for very important business. [Interruption.] They are all similar points of order and we could carry on with them until 7 o’clock, but we are not going to do so.

Karin Smyth (Bristol South) (Lab)

Further to that point of order, Mr Deputy Speaker. At the Public Administration and Constitutional Affairs Committee this morning, Sir John Major presented evidence to us about propriety and ethics. In that very sombre presentation, he talked about being

“at the top of a slope”

down towards the loss of democracy in this country. Ultimately, the will of Parliament is all we have, so if we do not have Parliament to make the case, what other option do we have?

Several hon. Members rose—

Mr Deputy Speaker (Mr Nigel Evans)

Order. I ask the final Members please to show restraint as far as language is concerned, because I am not happy with some of the language that has been used.

Clive Efford (Eltham) (Lab)

Further to that point of order, Mr Deputy Speaker. There have been 50 resignations of Ministers; the Government are mired in controversy; people are acting up as Ministers who are not quite Ministers, as I understand it; and legislation is being delayed. When was there ever a better time for the House to table a motion of no confidence in a Government? This is a cowardly act not by the Prime Minister, but by the Conservative party, which does not want a vote on this issue. Conservative Members should support the move to have a vote of no confidence and have the courage to stand up for their convictions.

Angus Brendan MacNeil (Na h-Eileanan an Iar) (SNP)

Further to that point of order, Mr Deputy Speaker. How can the Conservative party have no confidence in and write letters about the Prime Minister one week yet refuse to come to Parliament the following week to declare that in front of the public?

Kevin Brennan (Cardiff West) (Lab)

Further to that point of order, Mr Deputy Speaker. Can you inform the House of whether Mr Speaker has received any explanation from the Government for this craven and egregious breach of parliamentary convention? If someone were to table a motion under Standing Order No. 24 for tomorrow, has he given any indication of what his attitude would be towards such a motion?

Mr Deputy Speaker (Mr Nigel Evans)

I will answer the question about Standing Order No. 24 first, because I can deal with it immediately: clearly, if an application is made, Mr Speaker will determine it himself.

The principles concerning motions of no confidence are set out at paragraph 18.44 of “Erskine May”, which also gives examples of motions that have been debated and those that have not. “May” says:

“By established convention, the Government always accedes to the demand from the Leader of the Opposition to allot a day for the discussion of a motion tabled by the official Opposition which, in the Government’s view, would have the effect of testing the confidence of the House.”

I can only conclude, therefore, that the Government have concluded that the motion, as tabled by the official Opposition, does not have that effect. That is a matter for the Government, though, rather than for the Chair.

May I say that there are seven more sitting days before recess? As Deputy Speaker, I would anticipate that there will be further discussions.

We now have to move on with the continuation of business on the Bill.

New Clause 7

Duties regarding user-generated pornographic content: regulated services

“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.

(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.

(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.

(4) For the meaning of ‘pornographic content’, see section 66(2).

(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, and may be encountered by another user, or other users, of the service.

(6) For the meaning of ‘regulated service’, see section 2(4).”—(Dame Diana Johnson.)

Brought up, and read the First time.

Dame Diana Johnson (Kingston upon Hull North) (Lab)

I beg to move, That the clause be read a Second time.

Mr Deputy Speaker

With this it will be convenient to discuss the following:

New clause 33—Meaning of “pornographic content”

“(1) In this Act ‘pornographic content’ means any of the following—

(a) a video work in respect of which the video works authority has issued an R18 certificate;

(b) content that was included in a video work to which paragraph (a) applies, if it is reasonable to assume from its nature that its inclusion was among the reasons why the certificate was an R18 certificate;

(c) any other content if it is reasonable to assume from its nature that any classification certificate issued in respect of a video work including it would be an R18 certificate;

(d) a video work in respect of which the video works authority has issued an 18 certificate, and that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal;

(e) content that was included in a video work to which paragraph (d) applies, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the certificate was an 18 certificate;

(f) any other content if it is reasonable to assume from its nature—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that any classification certificate issued in respect of a video work including it would be an 18 certificate;

(g) a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if—

(i) it includes content that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal, and

(ii) it is reasonable to assume from the nature of that content that its inclusion was among the reasons why the video works authority made that determination;

(h) content that was included in a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the video works authority made that determination;

(i) any other content if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that the video works authority would determine that a video work including it was not suitable for a classification certificate to be issued in respect of it.

(2) In this section—

‘18 certificate’ means a classification certificate which—

(a) contains, pursuant to section 7(2)(b) of the Video Recordings Act 1984, a statement that the video work is suitable for viewing only by persons who have attained the age of 18 and that no video recording containing that work is to be supplied to any person who has not attained that age, and

(b) does not contain the statement mentioned in section 7(2)(c) of that Act that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘classification certificate’ has the same meaning as in the Video Recordings Act 1984 (see section 7 of that Act);

‘content’ means—

(a) a series of visual images shown as a moving picture, with or without sound;

(b) a still image or series of still images, with or without sound; or

(c) sound;

‘R18 certificate’ means a classification certificate which contains the statement mentioned in section 7(2)(c) of the Video Recordings Act 1984 that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘the video works authority’ means the person or persons designated under section 4(1) of the Video Recordings Act 1984 as the authority responsible for making arrangements in respect of video works other than video games;

‘video work’ means a video work within the meaning of the Video Recordings Act 1984, other than a video game within the meaning of that Act.”

This new clause defines pornographic content for the purposes of the Act and would apply to user-to-user services and commercial pornographic content.

Amendment 205, in clause 34, page 33, line 23, at end insert—

“(3A) But an advertisement shall not be regarded as regulated user-generated content and precluded from being a ‘fraudulent advertisement’ by reason of the content constituting the advertisement being generated directly on, uploaded to, or shared on a user-to-user service before being modified to a paid-for advertisement.”

Amendment 206, page 33, line 30, after “has” insert

“or may reasonably be expected to have”.

Amendment 207, in clause 36, page 35, line 12, at end insert—

“(3A) An offence under section 993 of the Companies Act 2006 (fraudulent trading).”

Amendment 208, page 35, line 18, after “(3)” insert “, (3A)”.

Amendment 209, page 35, line 20, after “(3)” insert “, (3A)”.

Amendment 210, page 35, line 23, after “(3)” insert “, (3A)”.

Amendment 201, in clause 66, page 59, line 8, leave out from “Pornographic content” to end of line 10 and insert

“has the same meaning as in section [meaning of pornographic content]”.

This amendment defines pornographic content for the purposes of Part 5. It is consequential on NC33.

Amendment 56, page 59, line 8, after “content” insert “, taken as a whole,”

This amendment would require that content is considered as a whole before being defined as pornographic content.

Amendment 33, in clause 68, page 60, line 33, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

Amendment 34, page 60, line 37, leave out “subsection (2)” and insert “subsections (2) to (2D)”.

This amendment is consequential on Amendment 33.

Amendment 31, in clause 182, page 147, line 16, leave out from “unless” to end of line 17 and insert—

“(a) a draft of the instrument has been laid before each House of Parliament,

(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and

(c) the draft instrument has been approved by a resolution of each House of Parliament.”

This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).

Amendment 158, in clause 192, page 155, line 26, after “including” insert “but not limited to”.

This amendment clarifies that the list of types of content in clause 192 is not exhaustive.

Dame Diana Johnson

May I welcome the Minister to his place, as I did not get an opportunity to speak on the previous group of amendments?

New clause 7 and amendments 33 and 34 would require online platforms to verify the age and consent of all individuals featured in pornographic videos uploaded to their site, as well as enabling individuals to withdraw their consent to the footage remaining on the website. Why are the amendments necessary? Let me read a quotation from a young woman:

“I sent Pornhub begging emails. I pleaded with them. I wrote, ‘Please, I’m a minor, this was assault, please take it down.’”

She received no reply and the videos remained live. That is from a BBC article entitled “I was raped at 14, and the video ended up on a porn site”.

This was no one-off. Some of the world’s biggest pornography websites allow members of the public to upload videos without verifying that everyone in the film is an adult or that everyone in the film gave their permission for it to be uploaded. As a result, leading pornography websites have been found to be hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse.

In 2020, The New York Times documented the presence of child abuse videos on Pornhub, one of the most popular pornography websites in the world, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site. The New York Times reporter Nicholas Kristof wrote about Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

Even before that, in 2019, PayPal took the decision to stop processing payments for Pornhub after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. The newspaper reported:

“Pornhub is awash with secretly filmed ‘creepshots’ of schoolgirls and clips of men performing sex acts in front of teenagers on buses. It has also hosted indecent images of children as young as three.

The website says it bans content showing under-18s and removes it swiftly. But some of the videos identified by this newspaper’s investigation had 350,000 views and had been on the platform for more than three years.”

One of the women who is now being forced to take legal action against Pornhub’s parent company, MindGeek, is Crystal Palace footballer Leigh Nicol. Leigh’s phone was hacked and private content was uploaded to Pornhub without her knowledge. She said in an interview:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do…The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

Leigh Nicol is spot on.

Unfortunately, when this subject was debated in Committee, the previous Minister, the hon. Member for Croydon South (Chris Philp), argued that the content I have described—including child sexual abuse images and videos—was already illegal, and there was therefore no need for the Government to introduce further measures. However, that misses the point: the Minister was arguing against the very basis of his own Government’s Bill. At the core of the Bill, as I understand it, is a legal duty placed on online platforms to combat and remove content that is already illegal, such as material relating to terrorism. In keeping with that, my amendments would place a legal duty on online platforms hosting pornographic content to combat and remove illegal content through the specific and targeted measure of verifying the age and consent of every individual featured in pornographic content on their sites. The owners and operators of pornography websites are getting very rich from hosting footage of rape, trafficking and child sexual abuse, and they must be held to account under the law and required to take preventive action.

The Organisation for Security and Co-operation in Europe, which leads action to combat human trafficking across 57 member states, recommends that Governments require age and consent verification on pornography websites in order to combat exploitation. The OSCE told me:

“These sites routinely feature sexual violence, exploitation and abuse, and trafficking victims. Repeatedly these sites have chosen profits over reasonable prevention and protection measures. At the most basic level, these sites should be required to ensure that each person depicted is a consenting adult, with robust age verification and the right to withdraw consent at any time. Since self-regulation hasn’t worked, this will only work through strong, state-led regulation”.

Who else supports that? Legislation requiring online platforms to verify the age and consent of all individuals featured in pornographic content on their sites is backed by leading anti-sexual exploitation organisations including CEASE—the Centre to End All Sexual Exploitation—UK Feminista and the Traffickinghub movement, which has driven the global campaign to expose the abuses committed by, in particular, Pornhub.

New clause 7 and amendments 33 and 34 are minimum safety measures that would stop the well-documented practice of pornography websites hosting and profiting from videos of rape, trafficking and child sexual abuse. I urge the Government to reconsider their position, and I will seek to test the will of the House on new clause 7 later this evening.

Adam Afriyie

I echo the concerns expressed by the right hon. Member for Kingston upon Hull North (Dame Diana Johnson). Some appalling abuses are taking place online, and I hope that the Bill goes some way to address them, to the extent that that is possible within the framework that it sets up. I greatly appreciate the right hon. Lady’s comments and her contribution to the debate.

I have a tight and narrow point for the Minister. In amendment 56, I seek to ensure that only pornographic material is caught by the definition in the Bill. My concern is that we catch these abuses online, catch them quickly and penalise them harshly, but also that sites that may display, for example, works of art featuring nudes—or body positivity community sites, of which there are several—are not inadvertently caught in our desire to clamp down on illegal pornographic sites. Perhaps the Minister will say a few words about that in his closing remarks.

Barbara Keeley (Worsley and Eccles South) (Lab)

I rise to speak to this small group of amendments on behalf of the Opposition. Despite everything that is going on at the moment, we must remember that this Bill has the potential to change lives for the better. It is an important piece of legislation, and we cannot miss the opportunity to get it right. I would like to join my hon. Friend the Member for Pontypridd (Alex Davies-Jones) in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins) to his role. His work as Chair of the Joint Committee on this Bill was an important part of the pre-legislative scrutiny process, and I look forward to working in collaboration with him to ensure that this legislation does as it should in keeping us all safe online. I welcome the support of the former Minister, the hon. Member for Croydon South (Chris Philp), on giving access to data to academic researchers and on looking at the changes needed to deal with the harm caused by the way in which algorithmic prompts work. It was a pity he was not persuaded by the amendments in Committee, but better late than never.

17:34
Earlier we debated new clause 14, which will reduce the amount of illegal content and fraudulent advertising that is identified and acted upon. In our view, this new clause undermines and weakens the safety mechanisms that members of the Joint Committee and the Public Bill Committee worked so hard to get right. I hope the Government will reconsider this part of the Bill when it goes through its stages in the House of Lords. Even without new clause 14, though, there are problems with the provisions around fraudulent advertising. Having said that, we were pleased that the Government conceded to our calls in Committee to ensure that major search engines and social media sites would be subject to the same duties to prevent fraudulent advertising from appearing on their sites.
However, there are other changes that we need to see if the Bill is to be successful in reducing the soaring rates of online fraud and changing the UK’s reputation as the
“scam capital of the world”,
according to Which? The Government voted against other amendments tabled in Committee by me and my hon. Friend the Member for Pontypridd that would have tackled the reasons why people become subject to online fraud. Our amendments would have ensured that customers had better protection against scams and a clearer understanding of which search results were paid-for ads. In rejecting our amendments, the Government have missed an opportunity to tackle the many forms of scamming that people experience online.
One of those forms of scamming is in the world of online ticketing. In my role as shadow Minister for the Arts and Civil Society, I have worked on this issue and been informed by the expertise of my hon. Friend the Member for Washington and Sunderland West (Mrs Hodgson), who chairs the all-party parliamentary group on ticket abuse. I would like to thank her and those who have worked with the APPG on the anti-ticket touting campaign for their insights. Ticket reselling websites have a well-documented history of breaching consumer protection laws. These breaches include cases of fraud such as the sale of non-existent tickets. If our amendment had been passed, secondary ticketing websites such as Viagogo would have had to be members of a regulatory body responsible for secondary ticketing such as the Society of Ticket Agents and Retailers, and they would have had to comply with established standards.
I have used ticket touting as an example, but the repercussions of this change go wider to include scamming by holiday websites, debt services and fraudulent passport renewal companies. Our amendments, together with amendments 205 to 210, which were tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman), would improve protection against scams and close loopholes in the definitions of fraudulent advertising. I hope the Minister recognises how many more scams these clauses would prevent if the amendments were accepted.
Part 5 of the Bill includes provisions that relate to pornographic content, which we have already heard about in this debate. For too long, we have seen a proliferation of websites with illegal and harmful content rife with representations of sexual violence, incest, rape and exploitation, and I thank my right hon. Friend the Member for Kingston upon Hull North (Dame Diana Johnson) for the examples she has just given us. We welcomed the important changes made to the Bill before the Committee stage, which meant that all pornographic content, not just user-generated content, would now be included within the duties in part 5.
Other Members have tabled important amendments to this part of the Bill. New clause 33 and new schedule 1, tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North, will ensure parity between online and offline content standards for pornography. New clause 33 is important in specifying that content that fails to obtain an R18 certificate has to be removed, just as happens in the offline world under the Video Recordings Act 1984. My right hon. Friend the Member for Kingston upon Hull North tabled amendment 33 and new clause 7, which place new duties on user-generated commercial pornography sites to verify the age and obtain consent of people featured in pornographic content, and to remove content should that consent be withdrawn. These are safeguards that should have been put in place by pornography platforms from the very start.
I would like to raise our concern about how quickly these duties can be brought into force. Clause 196 lays out that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be those covering the definitions—clauses 66 and 67(4)—and not those covering the duties. Children cannot wait another three years for protections from harm, having been promised this five years ago under part 3 of the Digital Economy Act 2017, which was never implemented. I hope the Minister appreciates the need for speed in regulating this particularly high harm part of the internet.
Part 11 clarifies companies’ liability and outlines the type of information offences contained in the Bill. It is important that liability is at the heart of discussions about the practical applications of the Bill, because we know that big internet companies have got away with doing nothing for far too long. However, the current focus on information offences means that criminal liability for repeated and systematic failures resulting in serious harm to users remains a crucial omission from the Bill.
My hon. Friend the Member for Pontypridd was vocal in making the point, but it needs to be made again, that we are very concerned about the volume of last-minute amendments tabled by the Government, and particularly their last-ditch attempt at power grabbing through amendment 144. The Secretary of State should not have the ability to decide what constitutes a priority offence without appropriate scrutiny, and our amendments would bring appropriate parliamentary oversight.
Amendment 31, in my name and in the name of my hon. Friend, would require that any changes to clauses 53 or 54, on harmful content, are debated on the Floor of the House rather than in a Delegated Legislation Committee. Without this change, the Secretary of State of the day will have the power to make decisions about priority content quietly through secondary legislation, which could have real-life consequences. Any changes to priority content are worthy of proper debate. If the Minister is serious about proper scrutiny of the online safety regime, he should carefully consider amendment 31. I urge hon. Members to support the amendment.
Finally, part 12 includes clarifications and definitions. The hon. Members for Ochil and South Perthshire and for Aberdeen North tabled amendment 158, which would expand the definition of content in the Bill. This is an important future-proofing measure.
As I mentioned, we are concerned about the delays to the implementation of certain duties set out in part 12. We are now in a situation in which many children who need protection will no longer even be children by the time this legislation and its protections come into effect. Current uncertainty about the running of Government will compound the concerns of many charities and children’s advocacy groups. I hope the Minister will agree that we cannot risk further delays.
At its core, the Online Safety Bill should be about reducing harm, and we are all aligned on that aim. I am disappointed that the Government have reversed some of the effectiveness of the scrutiny in Committee by now amending the Bill to such a degree. I hope the Minister considers our amendments in the collaborative spirit in which they are intended, and recognises their potential to make this Bill stronger and more effective for all.
Sir Jeremy Wright

I think it is extraordinarily important that this Bill does what the hon. Member for Worsley and Eccles South (Barbara Keeley) has just described. As the Bill moves from this place to the other place, we must debate what the right balance is between what the Secretary of State must do—in the previous group of amendments, we heard that many of us believe that is too extensive as the Bill stands—what the regulator, Ofcom, must do and what Parliament must do. There is an important judgment call for this House to make on whether we have that balance right in the Bill as it stands.

These amendments are very interesting. I am not convinced that the amendments addressed by the hon. Lady get the balance exactly right either, but there is cause for further discussion about where we in this House believe the correct boundary is between what an independent regulator should be given authority to do under this legislative and regulatory structure and what we wish to retain to ourselves as a legislature.

Adam Afriyie

My right hon. and learned Friend is highlighting, and I completely agree, that there is a very sensitive balance between different power bases and between different approaches to achieving the same outcome. Does he agree that as even more modifications are made—the nipping and tucking I described earlier—this debate and future debates, and these amendments, will contribute to those improvements over the weeks and months ahead?

Sir Jeremy Wright

Yes, I agree with my hon. Friend about that. I hope it is some comfort to the hon. Member for Worsley and Eccles South when I say that if the House does not support her amendment, it should not be taken that she has not made a good point that needs further discussion—probably in the other place, I fear. We are going to have to think carefully about that balance. It is also important that we do not retain to ourselves as a legislature those things that the regulator ought to have in its own armoury. If we want Ofcom to be an effective and independent regulator in this space, we must give it sufficient authority to fulfil that role. She makes interesting points, although I am not sure I can go as far as supporting her amendments. I know that is disappointing, but I do think that what she has done is prompt a further debate on exactly this balance between Secretary of State, Executive, legislature and regulator, which is exactly where we need to be.

I have two other things to mention. The first relates to new clause 7 and amendment 33, which the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) tabled. She speaks powerfully to a clear need to ensure that this area is properly covered. My question, however, is about practicalities. I am happy to take an intervention if she can answer it immediately. If not, I am happy to discuss it with her another time. She has heard me speak many times about making sure that this Bill is workable. The challenge in what she has described in her amendments may be that a platform needs to know how it is to determine and “verify”—that is the word she has used—that a participant in a pornographic video is an adult and a willing participant. It is clearly desirable that the platform should know both of those things, but the question that will have to be answered is: by what mechanism will it establish that? Will it ask the maker of the pornographic video and be prepared to accept the assurances it is given? If not, by what other mechanism should it do this? For example, there may be a discussion to be had on what technology is available to establish whether someone is an adult or is not—that bleeds into the discussion we have had about age assurance. It may be hard for a platform to establish whether someone is a willing participant.

Jess Phillips (Birmingham, Yardley) (Lab)

This has been quite topical this week. When we have things on any platform that is on our television, people absolutely have to have signed forms to say that they are a willing participant. It is completely regular within all other broadcast media that people sign consent forms and that people’s images are not allowed to be used without their consent.

Sir Jeremy Wright

Yes, I am grateful to the hon. Lady for that useful addition to this debate, but it speaks to the point I was seeking to clarify, which is whether what the right hon. Member for Kingston upon Hull North has in mind is to ensure that a platform would be expected to make use of those mechanisms that already exist in order to satisfy itself of the things that she rightly asks it to be satisfied of, or whether something beyond that would be required to meet her threshold. If it is the former, that is manageable for platforms and perfectly reasonable for us to expect of them. If it is the latter, we need to understand a little more clearly how she expects a platform to achieve that greater assurance. If it is that, she makes an interesting point.

Finally, let me come to amendment 56, tabled by my hon. Friend the Member for Windsor (Adam Afriyie). Again, I have a practical concern. He seeks to ensure that the pornographic content is “taken as a whole”, but I think it is worth remembering why we have included pornographic content in the context of this Bill. We have done it to ensure that children are not exposed to this content online and that where platforms are capable of preventing that from happening, that is exactly what they do. There is a risk that if we take this content as a whole, it is perfectly conceivable that there may be content online that is four hours long, only 10 minutes of which is pornographic in nature. It does not seem to me that that in any way diminishes our requirement of a platform to ensure that children do not see those 10 minutes of pornographic content.

Adam Afriyie

I am very sympathetic to that view. I am merely flagging up for the Minister that if we get the opportunity, we need to have a look at it again in the Lords, to be absolutely certain that we are not ruling out certain types of art, and certain types of community sites that we would all think were perfectly acceptable, that are probably not accessible to children, just to ensure that we are not creating further problems down the road that we would have to correct.

17:45
Sir Jeremy Wright

I follow that point. I will channel, with some effort, the hon. Member for Birmingham, Yardley (Jess Phillips), who I suspect would say that these things are already up for debate and discussed in other contexts—the ability to distinguish between art and pornography is something that we have wrestled with in other media. Actually, in relation to the Bill, I think that one of our guiding principles ought to be that we do not reinvent the wheel where we do not have to, and that we seek to apply to the online world the principles and approaches that we would expect in all other environments. That is probably the answer to my hon. Friend’s point.

I think it is very important that we recognise the need for platforms to do all they can to ensure that the wrong type of material does not reach vulnerable users, even if that material is a brief part of a fairly long piece. Those, of course, are exactly the principles that we apply to the classification of films and television. It may well be that a small portion of a programme constitutes material that is unsuitable for a child, but we would still seek to put it the wrong side of the 9 o’clock watershed or use whatever methods we think the regulator ought to adopt to ensure that children do not see it.

Good points are being made. The practicalities are important; it may be that because of a lack of available time and effort in this place, we have to resolve those elsewhere.

John Nicolson

I wish to speak to new clause 33, my proposed new schedule 1 and amendments 201 to 203. I notice that the Secretary of State is off again. I place on record my thanks to Naomi Miles of CEASE—the Centre to End All Sexual Exploitation—and Ceri Finnegan of Barnardos for their support.

The UK Government have taken some steps to strengthen protections on pornography and I welcome the fact that young teenagers will no longer be able to access pornography online. However, huge quantities of extreme and harmful pornography remain online, and we need to address the damage that it does. New clause 33 would seek to create parity between online and offline content—consistent legal standards for pornography. It includes a comprehensive definition of pornography and puts a duty on websites not to host content that would fail to attain the British Board of Film Classification standard for R18 classification.

The point of the Bill, as the Minister has repeatedly said, is to make the online world a safer place, by doing what we all agree must be done—making what is illegal offline, illegal online. That is why so many Members think that the lack of regulation around pornography is a major omission in the Bill.

The new clause stipulates age and consent checks for anyone featured in pornographic content. It addresses the proliferation of pornographic content that is both illegal and harmful, protecting women, children and minorities on both sides of the camera.

The Bill presents an opportunity to end the proliferation of illegal and harmful content on the internet. Representations of sexual violence, animal abuse, incest, rape, coercion, abuse and exploitation—particularly directed towards women and children—are rife. Such content can normalise dangerous and abusive acts and attitudes, leading to real-world harm. As my hon. Friend the Member for Pontypridd (Alex Davies-Jones) said in her eloquent speech earlier, we are seeing an epidemic of violence against women and girls online. When bile and hatred is so prolific online, it bleeds into the offline space. There are real-world harms that flow from that.

The Minister has said how much of a priority tackling violence against women and girls is for him. Knowing that, and knowing him, he will understand that pornography is always harmful to children, and certain kinds of pornographic content are also potentially harmful to adults. Under the Video Recordings Act 1984, the BBFC has responsibility for classifying pornographic content to ensure that it is not illegal, and that it does not promote an interest in abusive relationships, such as incest. Nor can it promote acts likely to cause serious physical harm, such as breath restriction or strangulation. In the United Kingdom, it is against the law to supply pornographic material that does not meet this established BBFC classification standard, but there is no equivalent standard in the online world because the internet evolved without equivalent regulatory oversight.

I know too that the Minister is determined to tackle some of the abusive and dangerous pornographic content online. The Bill does include a definition of pornography, in clause 66(2), but that definition is inadequate; it is too brief and narrow in scope. In my amendment, I propose a tighter and more comprehensive definition, based on that in part 3 of the Digital Economy Act 2017, which was debated in this place and passed into law. The amendment will remove ambiguity and prevent confusion, ensuring that all websites know where they stand with regard to the law.

The new duty on pornographic websites aligns with the UK Government’s 2020 legislation regulating UK-established video-sharing platforms and video-on-demand services, both of which appeal to the BBFC’s R18 classification standards. The same “high standard of rules in place to protect audiences”, as the 2020 legislation put it, and “certain content standards” should apply equally to online pornography and offline pornography, UK-established video-sharing platforms and video-on-demand services.

Let me give some examples sent to me by Barnardo’s, the children’s charity, which, with CEASE, has done incredibly important work in this area. The names have been changed in these examples, for obvious reasons.

“There are also children who view pornography to try to understand their own sexual abuse. Unfortunately, what these children find is content that normalises the most abhorrent and illegal behaviours, such as 15-year-old Elizabeth, who has been sexually abused by a much older relative for a number of years. The content she found on pornography sites depicted older relatives having sex with young girls and the girls enjoying it. It wasn’t until she disclosed her abuse that she realised that it was not normal.

Carrie is a 16-year-old who was being sexually abused by her stepfather. She thought this was not unusual due to the significant amount of content she had seen on pornography sites showing sexual relationships within stepfamilies.”

That is deeply disturbing evidence from Barnardo’s.

Although in theory the Bill will prevent under-18s from accessing such content, the Minister knows that under-18s will be able to bypass regulation through technology like VPNs, as the DCMS Committee and the Bill Committee—I served on both—were told by experts in various evidence sessions. The amendment does not create a new law; it merely moves existing laws into the online space. There is good cause to regulate and sometimes prohibit certain damaging offline content; I believe it is now our duty to provide consistency with legislation in the online world.

Kirsty Blackman

I want to talk about several things, but particularly new clause 7. I am really pleased that the new clause has come back on Report, as we discussed it in the Bill Committee but unfortunately did not get enough support for it there—as was the case with everything we proposed—so I thank the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for tabling it. I also thank my hon. Friend the Member for Inverclyde (Ronnie Cowan) for his lobbying and for providing us with lots of background information. I agree that it is incredibly important that new clause 7 is agreed, particularly the provisions on consent and making sure that participants are of an appropriate age to be taking part. We have heard so many stories of so many people whose videos are online—whose bodies are online—and there is nothing they can do about it because of the lack of regulation. My hon. Friend the Member for Ochil and South Perthshire (John Nicolson) has covered new clause 33 in an awful lot of detail—very good detail—so I will not comment on that.

The right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) mentioned how we need to get the balance right, and specifically talked about the role of the regulator. In many ways, this Bill has failed to get the balance right in its attempts to protect children online. Many people who have been involved in writing this Bill, talking about this Bill, scrutinising this Bill and taking part in every piece of work that we have done around it do not understand how children use the internet. Some people do, absolutely, but far too many of the people who have had any involvement in this Bill do not. They do not understand the massive benefits to children of using the internet, the immense amount of fun they can have playing Fortnite, Fall Guys, Minecraft, or whatever it is they happen to be playing online and how important that is to them in today’s crazy world with all of the social media pressures. Children need to decompress. This is a great place for children to have fun—to have a wonderful time—but they need to be protected, just as we would protect them going out to play in the park, just the same as we would protect them in all other areas of life. We have a legal age for smoking, for example. We need to make sure that the protections are in place, and the protections that are in place need to be stronger than the ones that are currently in the Bill.

I did not have a chance earlier—or I do not think I did—to support the clause about violence against women and girls. As I said in Committee, I absolutely support that being in the Bill. The Government may say, “Oh we don’t need to have this in the Bill because it runs through everything,” but having that written in the Bill would make it clear to internet service providers—to all those people providing services online and having user-generated content on their sites—how important this is and how much of a scourge it is. Young women who spend their time on social media are more likely to have lower outcomes in life as a result of problematic social media use, as a result of the pain and suffering that is caused. We should be putting such a measure in the Bill, and I will continue to argue for that.

We have talked a lot about pornographic content in this section. There is not enough futureproofing in the Bill. My hon. Friend the Member for Ochil and South Perthshire and I tabled amendment 158 because we are concerned about that lack of futureproofing. The amendment edits the definition of “content”. The current definition of “content” says basically anything online, and it includes a list of stuff. We have suggested that it should say “including but not limited to”, on the basis that we do not know what the internet will look like in two years’ time, let alone what it will look like in 20 years’ time. If this Bill is to stand the test of time, it needs to be clear that that list is not exhaustive. It needs to be clear that, when we are getting into virtual reality metaverses where people are meeting each other, that counts as well. It needs to be clear that the sex dungeon that exists in the child’s game Roblox is an issue—that that content is an issue no matter whether it fits the definition of “content” or whether it fits the fact that it is written communication, images or whatever. It does not need to fit any of that. If it is anything harmful that children can find on the internet, it should be included in that definition of “content”, no matter whether it fits any of those specific categories. We just do not know what the internet is going to look like.

I have one other specific thing in relation to the issues of content and pornography. One of the biggest concerns that we heard is the massive increase in the amount of self-generated child sexual abuse images. A significant number of new images of child sexual abuse are self-generated. Everybody has a camera phone these days. Kids have camera phones these days. They have much more potential to get themselves into really uncomfortable and difficult situations than when most of us were younger. There is so much potential for that to be manipulated unless we get this right.

18:03
I have concerns about the age assurance that was mentioned. If it is livestreamed content, content that is being generated right now at this moment, scanning for those child sexual abuse images will be very difficult. There will not be a hash. It has not been discovered before. It is not an image that will have been looked at, categorised and worked out before. It is something that evil people are convincing and forcing children to do today.
That is another place where the Bill fails to recognise how young people use the internet today. It fails to talk specifically about things such as livestreaming and how duties on providing services for children on the internet should reduce things such as the ability to livestream and to have private conversations with potential abusers. All those things are not in the Bill in the way I would like. I understand that the codes of practice will come through and there will be guidance on the risk assessments, but I have not seen enough so far to convince me that people know what they are doing when they are writing those codes of practice.
From what I have heard from Ofcom, it has generally been pretty sensible, but nearly every person I have encountered talking about this Bill who has had or continues to have any say over it does not understand how children actually use the internet. I have been online for nearly 30 years, since I was younger than my children are now. I grew up on the internet. I spend a lot of time on the internet. I have spoken to so many people who I did not know online. I have had so many ridiculous, harmful conversations that I would be aghast and devastated if my children were having now. I do not want that to be happening to tomorrow’s generation of children.
No matter what we put in place, there will always be loopholes and bad actors and there will always be issues, but we want the Bill to be genuinely the best possible. The biggest failing is that lack of understanding of how children interact with the internet. We want it to be a safe place for them. We want them to be able to have great experiences online and to enjoy themselves, and we want to put up those protections in the same way that we put up crossing patrols and so on to protect children, and we are just not there yet.
I appreciate that the Minister and the previous Minister, the hon. Member for Croydon South (Chris Philp), have brought forward a significant number of amendments, although it is unfortunate that they have come at such short notice that we have not had enough time to look at them properly, and I appreciate that they will bring forward more, but there is still more they need to do. Even with all the amendments and all the commitments we have seen, I am still not comfortable enough that my children and my children’s children will be as safe on the internet as they should be.
Lloyd Russell-Moyle (Brighton, Kemptown) (Lab/Co-op)

I rise to support new clauses 7 and 33 in particular. I support them sometimes from a different angle from my hon. Friends, but fundamentally from the same angle: consent. I am not afraid to say that I have a different perspective from some hon. Members in this House in that I view sex work as a legitimate form of work under regulated and protected conditions, and pornography as part of that. What I do have a problem with is the lack of consent that occurs far too often not only in the industry—that may be too broad a term—but in particular content that we see online at the moment.

That is true particularly for those sex workers who might have produced content with consent at the time, as adults, but who later in life realise that they do not wish that material to be available any more—not just because they may be embarrassed about it, but perhaps because they just do not want that material commercially available and people making profits off their bodies any more. They are struggling to get content taken down because they are told, “You gave consent at the time and that can’t now be removed. You have to allow your body to be used.” We would not allow any other form of worker or artist to suffer that. In any other form of music or production, if they wished to remove their consent for it to be played, it would be taken down, but in pornography there seems to be a free-for-all where, even if people remove their consent, it still proliferates in copies of copies that are put all over the internet. That is not even to mention people who never gave their consent at all and experience revenge porn or their phones being hacked and the devastation that that can cause.

I might come from a different position on some of this, but I think we can be united in saying that of course we need better action on under-18s, which is very important, but even for those who have supposedly given their consent at one point or another, the removal of consent must be put into the Bill and platforms must have a strict responsibility to remove that content. Without that being in the Bill, there is a danger that platforms will continue to play loophole after loophole and the content will still be there when it should not be.

Ronnie Cowan (Inverclyde) (SNP)

I was not planning to speak, but we have a couple of minutes so I will abuse that position.

I just want to say that I do not want new clause 7 to be lost in this debate and become part of the flotsam and jetsam of the tide of opinion that goes back and forth in this place, because new clause 7 is about consent. We are trying very hard to teach young men all about consent, and if we cannot do it from this place, then when can we do it? We can work out the details of the technology in time, as we always do. It is out there. Other people are way ahead of us in this matter. In fact, the people who produce this pornography are way ahead of us in this matter.

Dame Diana Johnson

While we have been having this debate, Iain Corby, executive director at the Age Verification Providers Association, has sent me an email in which he said that the House may be interested to know that one of the members of that organisation offers adult sites a service that facilitates age verification and the obtaining and maintaining of records of consent. So it is possible to do this if the will is there.

Ronnie Cowan

I absolutely agree. We can also look at this from the point of view of gambling reform and age verification for that. The technology is there, and we can harness and use it to protect people. All I am asking is that we do not let this slip through the cracks this evening.

Damian Collins

We have had an important debate raising a series of extremely important topics. While the Government may not agree with the amendments that have been tabled, that is not because of a lack of seriousness of concern about the issues that have been raised.

The right hon. Member for Kingston upon Hull North (Dame Diana Johnson) spoke very powerfully. I have also met Leigh Nicol, the lady she cited, and she discussed with me the experience that she had. Sadly, it was during lockdown and it was a virtual meeting rather than face to face. There are many young women, in particular, who have experienced the horror of having intimate images shared online without their knowledge or consent and then gone through the difficult experience of trying to get them removed, even when it is absolutely clear that they should be removed and are there without their consent. That is the responsibility of the companies and the platforms to act on.

Thinking about where we are now, before the Bill passes, the requirement to deal with illegal content, even the worst illegal content, on the platforms is still largely based on the reporting of that content, without the ability for us to know how effective they are at actually removing it. That is largely based on old legislation. The Bill will move on significantly by creating proactive responsibilities not just to discover illegal content but to act to mitigate it and to be audited to see how effectively it is done. Under the Bill, that now includes not just content that would be considered to be an abuse of children. A child cannot give consent to have sex or to appear in pornographic content. Companies need to make sure that what they are doing is sufficient to meet that need.

It should be for the regulator, Ofcom, as part of putting together the codes of practice, to understand, even on more extreme content, what systems companies have in place to ensure that they are complying with the law and certainly not knowingly hosting content that has been flagged to them as being non-consensual pornography or child abuse images, which is effectively what pornography with minors would be; and to understand what systems they have in place to make sure that they are complying with the law and, as hon. Members have said, making sure that they are using available technologies in order to deliver that.

Jess Phillips

We have an opportunity here today to make sure that the companies are doing it. I am not entirely sure why we would not take that opportunity to legislate to make sure that they are. With the greatest of respect to the Minister, who is back in a position of authority, it sounds an awful lot like the triumph of hope over experience.

Damian Collins

It is because of the danger of such a sentiment that this Bill is so important. It not only sets the targets and requirements of companies to act against illegal content, but enables a regulator to ensure that they have the systems and processes in place to do it, that they are using appropriate technology and that they apply the principle that their system should be effective at addressing this issue. If those systems are defective, that is a failure on the company’s part. It cannot be good enough that the company says, “It is too difficult to do”, when they are not using technologies that would readily solve that problem. We believe that the technologies that the companies have and the powers of the regulator to have proper codes of practice in place and to order the companies to make sure they are doing it will be sufficient to address the concern that the hon. Lady raises.

Dame Diana Johnson

I am a little taken aback that the Minister believes that the legislation will be sufficient. I do not understand why he has not responded to the point that my hon. Friend the Member for Birmingham, Yardley (Jess Phillips) was making that we could make this happen by putting the proposal in the Bill and saying, “This is a requirement.” I am not sure why he thinks that is not the best way forward.

Damian Collins

It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.

If I may, I will give a different example based on the fraud example given by the shadow Minister, the hon. Member for Worsley and Eccles South (Barbara Keeley). On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider what systems the company should have in place to identify fraud, but also what technologies it employs to make it far less likely that fraud would be there in the first place. Google has a deal with the Financial Conduct Authority, whereby it restricts advertising by non-accredited companies on its platform. That makes it far less likely that fraud will be encountered because, if the system works, only properly recognised organisations will be advertising.

Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraud ads on Google, but a substantial increase in fraud ads on Facebook and platforms such as Instagram. That shows that if we have the right systems in place, we can have a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and say, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure of the codes.

Barbara Keeley

The Minister is quoting a case that I quoted in Committee, and the former Minister, the hon. Member for Croydon South (Chris Philp), would not accept amendments on this issue. We could have tightened up on fraudulent advertising. If Google can do that for financial ads, other platforms can do it. We tabled an amendment that the Government did not accept. I do not know why this Minister is quoting something that we quoted in Committee—I know he was not there, but he needs to know that we tried this and the former Minister did not accept what we called for.

Damian Collins

I am quoting that case merely because it is a good example of how, if we have better systems, we can get a better result. As part of the codes of practice, Ofcom will be able to look at some of these other systems and say to companies, “This is not just about content moderation; it is about having better systems that detect known illegal activity earlier and prevent it from getting on to the platform.” It is not about how quickly it is removed, but how effective companies are at stopping it ever being there in the first place. That is within the scope of regulation, and my belief is that those powers exist at the moment and therefore should be used.

Jess Phillips

Just to push on this point, images of me have appeared on pornographic sites. They were not necessarily illegal images of anything bad happening to me, but other Members of Parliament in this House and I have suffered from that. Is the Minister telling me that this Bill will allow me to get in touch with that site and have an assurance that that image will be taken down and that it would be breaking the law if it did not do so?

Damian Collins

The Bill absolutely addresses the sharing of non-consensual images in that way, so that would be something the regulator should take enforcement action against—

Damian Collins

Well, the regulator is required, and has the power, to take enforcement action against companies for failing to do so. That is what the legislation sets out, and we will be in a very different place from where we are now. That is why the Bill constitutes a very significant reform.

Lloyd Russell-Moyle

Will the Minister give way?

Damian Collins

Very briefly, and then I want to wrap up.

Lloyd Russell-Moyle

Could the Minister give me a reassurance about when consent is withdrawn? The image may initially have been there “consensually”—I would put that in inverted commas—so the platform is okay to put it there. However, if someone contacts the platform saying that they now want to change their consent—they may want to take a role in public life, having previously had a different role; I am not saying that about my hon. Friend the Member for Birmingham, Yardley (Jess Phillips)—my understanding is that there is no ability legally to enforce that content coming down. Can the Minister correct me, and if not, why is he not supporting new clause 7?

18:15
Damian Collins

With people who have appeared in pornographic films consensually and signed contracts to do so, that would be a very different matter from the question of intimate images being shared without consent. When someone has not consented for such images to be there, that would be a very different matter. I am saying that the Bill sets out very clearly—it did not do so in draft form—that non-consensual sexual images and extreme pornography are within the scope of the regulator’s power. The regulator should be taking action not just on what a company does to take such content down when it is discovered after the event, but on what systems the company has in place and whether it deploys all available technology to make sure that such content is never there in the first place.

Before closing, I want to touch briefly on the point raised about the Secretary of State’s powers to designate priority areas of harm. This is now under the affirmative procedure in the Bill, and it requires the approval of both Houses of Parliament. The priority illegal harms will be based on offences that already exist in law, and we are writing those priority offences into the Bill. The other priorities will be areas where the regulator will seek to test whether companies adhere to their terms of service. The new transparency requirements will set that out, and the Government have said that we will set out in more detail which of those priority areas of harm such transparency will apply to. There is still more work to be done on that, but we have given an indicative example. However, when it comes to adding a new priority illegal offence to the Bill, the premise is that it will already be an offence that Parliament has created, and writing it into the Bill will be done with the positive consent of Parliament. I think that is a substantial improvement on where the Bill was before. I am conscious that I have filled my time.

Question put, That the clause be read a Second time.

18:17

Division 38

Ayes: 220

Noes: 285

Clause 34
Duties about fraudulent advertising: Category 1 services
Amendment made: 83, page 33, line 20, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 35
Duties about fraudulent advertising: Category 2A services
Amendment made: 84, page 34, line 21, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 63
Interpretation of this Chapter
Amendment made: 101, page 56, line 32, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 176
Powers to amend section 36
Amendment made: 141, page 141, line 39, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 177
Powers to amend or repeal provisions relating to exempt content or services
Amendments made: 177, page 142, line 6, at beginning insert “Subject to subsection (2A),”.
Amendment 178, page 142, line 7, after “49” insert “(2)(e),”.
Amendment 179, page 142, line 8, leave out paragraph (b).
Amendment 180, page 142, line 12, at end insert—
“(2A) Regulations under subsection (2) may not have the effect that comments and reviews on provider content present on a service of which the provider is a recognised news publisher become regulated user-generated content within the meaning of Part 3.”
Amendment 181, page 142, line 25, leave out “any” and insert “either”.
Amendment 182, page 142, line 28, leave out paragraph (b).
Amendment 183, page 142, line 34, at end insert—
“(7A) Subject to subsection (7B), the Secretary of State may by regulations amend paragraph 4 of Schedule 1 (limited functionality services) if the Secretary of State considers that it is appropriate because of the risk of harm to individuals in the United Kingdom presented by a service described in that paragraph.
(7B) Regulations under subsection (7A) may not have the effect that a service described in paragraph 4 of Schedule 1 of which the provider is a recognised news publisher is no longer exempt under that paragraph.”
Amendment 184, page 142, line 46, leave out subsection (11) and insert—
‘(11) In this section—
“comments and reviews on provider content” and “one-to-one live aural communications” have the meaning given by section 49;
“recognised news publisher” has the meaning given by section 50;
“regulated provider pornographic content” and “published or displayed” have the same meaning as in Part 5 (see section 66).’—(Damian Collins.)
Amendments 177 to 184 ensure that the power to amend the definition of “regulated user-generated content” in clause 49 cannot be exercised so as to include comments and reviews on content on services provided by recognised news publishers, and the power to amend paragraph 4 of Schedule 1 (limited functionality services) cannot be exercised so as to remove such services which are provided by recognised news publishers from the exemption.
Clause 179
Powers to amend Schedules 5, 6 and 7
Amendments made: 142, page 145, leave out lines 13 and 14 and insert—
“But an offence may be added to that Schedule only on the grounds in subsection (4) or (4A), and subsection (5) limits the power to add an offence.”
This amendment is consequential on Amendments 143 and 144.
Amendment 143, page 145, line 15, leave out from beginning to “the” and insert
“The first ground for adding an offence to Schedule 7 is that”.
This amendment is consequential on Amendment 144.
Amendment 144, page 145, line 24, at end insert—
“(4A) The second ground for adding an offence to Schedule 7 is that the Secretary of State considers it appropriate to do so because of—
(a) the prevalence of the use of regulated user-to-user services for the commission or facilitation of that offence,
(b) the risk of harm to individuals in the United Kingdom presented by the use of such services for the commission or facilitation of that offence, and
(c) the severity of that harm.”
This amendment extends the Secretary of State’s power to make regulations adding an offence to Schedule 7 so that it will be a priority offence. The new grounds concern the prevalence of user-to-user services being used to commit or facilitate the offence in question.
Amendment 145, page 146, line 5, leave out “and (9)” and insert “, (9) and (9A)”.—(Damian Collins.)
This technical amendment ensures that a reference to clause 52 takes account of the new subsection inserted by Amendment 98.
Clause 182
Parliamentary procedure for regulations
Amendment made: 185, page 147, line 3, after “(6)” insert “, (7A)”.—(Damian Collins.)
This amendment ensures that regulations under clause 177(7A) (inserted by Amendment 183) are subject to the affirmative procedure.
Amendment proposed: 31, page 147, line 16, leave out from “unless” to end of line 17 and insert—
“(a) a draft of the instrument has been laid before each House of Parliament,
(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and
(c) the draft instrument has been approved by a resolution of each House of Parliament.”—(Barbara Keeley.)
This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).
Question put, That the amendment be made.
18:34

Division 39

Ayes: 188

Noes: 283

Clause 193
Index of defined terms
Amendments made: 186, page 158, line 25, at end insert—

“journalistic content (in Part 3)
section 16”

This is a technical amendment adding a definition of “journalistic content” to the index of defined terms.
Amendment 146, page 159, line 18, at end insert—

“priority offence (in Part 3)
section 52”

—(Damian Collins.)
This technical amendment adds a definition of “priority offence” to the index of defined terms.
Schedule 8
Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendment made: 150, page 188, line 29, at end insert—
“7A Features, including functionalities, that a provider considers may contribute to risks of harm to individuals using the service, and measures taken or in use by the provider to mitigate and manage those risks.”—(Damian Collins.)
This amendment adds a new matter to Schedule 8, which is about things that providers can be asked to provide transparency reports about. The new matter is about risks around functionalities used by user-to-user services.
Ordered, That further consideration be now adjourned. —(James Duddridge.)
Bill to be further considered tomorrow.
Business of the House (Today)
Ordered,
That, at today’s sitting, the Speaker shall put the Questions necessary to dispose of proceedings on the Motion in the name of Mark Spencer relating to Restoration and Renewal of the Palace of Westminster not later than two hours after the commencement of proceedings on the Motion for this Order; such Questions shall include the Questions on any Amendments selected by the Speaker which may then be moved; the business on that Motion may be entered upon and proceeded with at any hour, though opposed; and Standing Order No. 41A (Deferred divisions) shall not apply.—(James Duddridge.)