All 2 Lindsay Hoyle contributions to the Online Safety Act 2023

Tue 12th Jul 2022: Online Safety Bill, Commons Chamber, Report stage (day 1)

Mon 5th Dec 2022: Online Safety Bill, Report stage

Online Safety Bill

Lindsay Hoyle Excerpts
Lindsay Hoyle (Mr Speaker)

With this it will be convenient to discuss the following:

New clause 2—Secretary of State’s powers to suggest modifications to a code of practice

“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.

(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.

(3) The Secretary of State may only write to OFCOM twice under this section for each code.

(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.

(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”

New clause 3—Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women and girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).”

This new clause applies provisions to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

New clause 4—Duty about content advertising or facilitating prostitution: Category 1 and Category 2B services

“(1) A provider of a Category 1 or Category 2B service must operate the service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of a Category 1 or Category 2B service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one Category 1 or Category 2B service, the duties set out in this section apply in relation to each such service.

(4) The duties set out in this section extend only to the design, operation and use of a Category 1 or Category 2B service in the United Kingdom.

(5) For the meaning of ‘Category 1 service’ and ‘Category 2B service’, see section 81 (register of categories of services).

(6) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 5—Duty about content advertising or facilitating prostitution: Category 2A services

“(1) A provider of a Category 2A service must operate that service so as to minimise the risk of individuals encountering content which advertises or facilitates prostitution in or via search results of the service.

(2) A provider of a Category 2A service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The reference to encountering content which advertises or facilitates prostitution “in or via search results” of a search service does not include a reference to encountering such content as a result of any subsequent interactions with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a Category 2A service in the United Kingdom.

(6) For the meaning of ‘Category 2A service’, see section 81 (register of categories of services).

(7) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 6—Duty about content advertising or facilitating prostitution: internet services providing pornographic content

“(1) A provider of an internet service within the scope of section 67 of this Act must operate that service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of an internet service under this section must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one internet service under this section, the duties set out in this section apply in relation to each such service.

(4) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 8—Duties about advertisements for cosmetic procedures

“(1) A provider of a regulated service must operate the service using systems and processes designed to—

(a) prevent individuals from encountering advertisements for cosmetic procedures that do not meet the conditions specified in subsection (3);

(b) minimise the length of time for which any such advertisement is present;

(c) where the provider is alerted by a person to the presence of such an advertisement, or becomes aware of it in any other way, swiftly take it down.

(2) A provider of a regulated service must include clear and accessible provisions in the terms of service giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The conditions under subsection (1)(a) are that the advertisement—

(a) contains a disclaimer as to the health risks of the cosmetic procedure, and

(b) includes a certified service quality indicator.

(4) If a person is the provider of more than one regulated service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a regulated service in the United Kingdom.

(6) For the meaning of ‘regulated service’, see section 3 (‘Regulated service’, ‘Part 3 service’ etc).”

This new clause would place a duty on all internet service providers regulated by the Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain a disclaimer as to the health risks of the procedure nor include a certified service quality indicator.

New clause 9—Content harmful to adults risk assessment duties: regulated search services

“(1) This section sets out the duties about risk assessments which apply in relation to all regulated search services.

“(2) A duty to carry out a suitable and sufficient priority adults risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adult risk assessment relating to the impacts of that proposed change.

(5) An ‘adults risk assessment’ of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the level of risk of individuals who are users of the service encountering each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) risks presented by algorithms used by the service, and the way that the service indexes, organises and presents search results;

(b) the level of risk of functionalities of the service facilitating individuals encountering search content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(c) the nature, and severity, of the harm that might be suffered by individuals from the matters identified in accordance with paragraphs (a) and (b);

(d) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section, references to risk profiles are to the risk profiles for the time being published under section 84 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) See also—section 20(2) (records of risk assessments), and Schedule 3 (timing of providers’ assessments).”

New clause 10—Safety Duties Protecting Adults: regulated search services

“(1) This section sets out the duties about protecting adults which apply in relation to all regulated search services.

(2) A duty to summarise in the policies of the search service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the search service policies specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.

(4) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way the search engine is operated and used as well as search content of the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—

(a) regulatory compliance and risk management arrangements,

(b) design of functionalities, algorithms and other features relating to the search engine,

(c) functionalities allowing users to control the content they encounter in search results,

(d) content prioritisation and ranking,

(e) user support measures, and

(f) staff policies and practices.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the policies included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the policies in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) A duty to ensure that the provisions of the publicly available statement referred to in subsections (5) and (7) are clear and accessible.

(9) In this section—

‘adults’ risk assessment’ has the meaning given by section 12;

‘non-designated content that is harmful to adults’ means content that is harmful to adults other than priority content that is harmful to adults.”

New clause 18—Child user empowerment duties

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section ‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.”

New clause 24—Category 1 services: duty not to discriminate, harass or victimise against service users

“(1) The following duties apply to all providers of Category 1 services.

(2) A duty not to discriminate, on the grounds of a protected characteristic, against a person wishing to use the service by not providing the service, if the result of not providing the service is to cause harm to that person.

(3) A duty not to discriminate, on the grounds of a protected characteristic, against any user of the service in a way that causes harm to the user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(4) A duty not to harass, on the grounds of a protected characteristic, a user of the service in a way that causes harm to the user.

(5) A duty not to victimise because of a protected characteristic a person wishing to use the service by not providing the user with the service, if the result of not providing the service is to cause harm to that person.

(6) A duty not to victimise a service user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(7) In this section—

references to harassing, discriminating or victimising have the same meaning as set out in Part 2 of the Equality Act 2010;

‘protected characteristic’ means a characteristic listed in section 4 of the Equality Act 2010.”

This new clause would place a duty, regulated by Ofcom, on Category 1 service providers not to discriminate, harass or victimise users of their services on the basis of a protected characteristic if doing so would result in them being caused harm. Discrimination, harassment and victimisation, and protected characteristics, have the same meaning as in the Equality Act 2010.

New clause 25—Report on duties that apply to all internet services likely to be accessed by children

“(1) Within 12 months of this Act receiving Royal Assent, the Secretary of State must commission an independent evaluation of the matters under subsection (2) and must lay the report of the evaluation before Parliament.

(2) The evaluation under subsection (1) must consider whether the following duties should be imposed on all providers of services on the internet that are likely to be accessed by children, other than services regulated by this Act—

(a) duties similar to those imposed on regulated services by sections 10 and 25 of this Act to carry out a children’s risk assessment, and

(b) duties similar to those imposed on regulated services by sections 11 and 26 of this Act to protect children’s online safety.”

This new clause would require the Secretary of State to commission an independent evaluation on whether all providers of internet services likely to be accessed by children should be subject to child safety duties and must conduct a children’s risk assessment.

New clause 26—Safety by design

“(1) In exercising their functions under this Act—

(a) The Secretary of State, and

(b) OFCOM

must have due regard to the principles in subsections (2)-(3).

(2) The first principle is that providers of regulated services should design those services to prevent harmful content from being disseminated widely, and that this is preferable in the first instance to both—

(a) removing harmful content after it has already been disseminated widely, and

(b) restricting which users can access the service or part of it on the basis that harmful content is likely to disseminate widely on that service.

(3) The second principle is that providers of regulated services should safeguard freedom of expression and participation, including the freedom of expression and participation of children.”

This new clause requires the Secretary of State and Ofcom to have due regard to the principle that internet services should be safe by design.

New clause 27—Publication of risk assessments

“Whenever a Category 1 service carries out any risk assessment pursuant to Part 3 of this Act, the service must publish the risk assessment on the service’s website.”

New clause 38—End-to-end encryption

“Nothing in this Act shall prevent providers of user-to-user services protecting their users’ privacy through end-to-end encryption.”

Government amendment 57.

Amendment 202, in clause 6, page 5, line 11, at end insert—

“(ba) the duty about pornographic content set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that user-to-user services must meet the new duties set out in NS1.

Government amendments 163, 58, 59 and 60.

Amendment 17, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Amendment 15, in clause 8, page 7, line 14, at end insert—

“(5A) The duties set out in this section apply in respect of content which reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

This amendment extends the illegal content risk assessment duties to cover content which could be foreseen to facilitate or aid the discovery or dissemination of CSEA content.

Government amendments 61 and 62.

Amendment 18, page 7, line 30 [Clause 9], at end insert—

“(none) ‘, including by being directed while on the service towards priority illegal content hosted by a different service;’

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 16, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 19, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content.”

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Government amendments 63 to 67.

Amendment 190, page 10, line 11, in clause 11, at end insert “, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Government amendments 68 and 69.

Amendment 42, page 11, line 16, in clause 11, at end insert—

“(c) the benefits of the service to children’s well-being.”

Amendment 151, page 12, line 43, leave out Clause 13.

This amendment seeks to remove Clause 13 from the Bill.

Government amendment 70.

Amendment 48, page 13, line 5, in clause 13, leave out “is to be treated” and insert

“the provider decides to treat”

This amendment would mean that providers would be free to decide how to treat content that has been designated ‘legal but harmful’ to adults.

Amendment 49, page 13, line 11, in clause 13, at end insert—

‘(ca) taking no action;”

This amendment provides that providers would be free to take no action in response to content referred to in subsection (3).

Government amendments 71 and 72.

Amendment 157, page 14, line 11, in clause 14, leave out subsections (6) and (7).

This amendment is consequential to Amendment 156, which would require all users of Category 1 services to be verified.

Government amendments 73, 164, 74 and 165.

Amendment 10, page 16, line 16, in clause 16, leave out from “or” until the end of line 17.

Government amendments 166 and 167.

Amendment 50, page 20, line 21, in clause 19, at end insert—

“(6A) A duty to include clear provision in the terms of service that the provider will not take down, or restrict access to content generated, uploaded or shared by a user save where it reasonably concludes that—

(a) the provider is required to do so pursuant to the provisions of this Act, or

(b) it is otherwise reasonable and proportionate to do so.”

This amendment sets out a duty for providers to include in terms of service a commitment not to take down or restrict access to content generated, uploaded or shared by a user except in particular circumstances.

Government amendment 168.

Amendment 51, page 20, line 37, in clause 19, at end insert—

“(10) In any claim for breach of contract brought in relation to the provisions referred to in subsection (7), where the breach is established, the court may make such award by way of compensation as it considers appropriate for the removal of, or restriction of access to, the content in question.”

This amendment means that where a claim is made for a breach of the terms of service resulting from Amendment 50, the court has the power to award such compensation as it considers appropriate.

Government amendment 169.

Amendment 47, page 22, line 10, in clause 21, at end insert—

“(ba) the duties about adults’ risk assessment duties in section (Content harmful to adult risk assessment duties: regulated search services),

(bb) the safety duties protecting adults in section (Safety duties protecting adults: regulated search services).”

Government amendments 75 to 82.

Amendment 162, page 31, line 19, in clause 31, leave out “significant”

This amendment removes the requirement for there to be a “significant” number of child users, and replaces it with “a number” of child users.

Government amendments 85 to 87.

Amendment 192, page 36, line 31, in clause 37, at end insert—

“(ha) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 about codes of practice.

Amendment 44, page 37, line 25, in clause 39, leave out from beginning to the second “the” in line 26.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 45, page 38, line 8, leave out Clause 40.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 13, page 38, line 12, in clause 40, leave out paragraph (a).

Amendment 46, page 39, line 30, leave out Clause 41.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 14, page 39, line 33, in clause 41, leave out subsection (2).

Amendment 21, page 40, line 29, in clause 43, leave out “may require” and insert “may make representations to”

Amendment 22, page 40, line 33, in clause 43, at end insert—

‘(2A) OFCOM must have due regard to representations by the Secretary of State under subsection (2).”

Government amendments 88 to 89 and 170 to 172.

Amendment 161, page 45, line 23, in clause 49, leave out paragraph (d).

This amendment removes the exemption for one-to-one live aural communications.

Amendment 188, page 45, line 24, in clause 49, leave out paragraph (e).

This amendment removes the exemption for comments and reviews on provider content.

Government amendments 90 and 173.

Amendment 197, page 47, line 12, in clause 50, after “material” insert

“or special interest news material”.

Amendment 11, page 47, line 19, in clause 50, after “has” insert “suitable and sufficient”.

Amendment 198, page 47, line 37, in clause 50, leave out the first “is” and insert

“and special interest news material are”.

Amendment 199, page 48, line 3, in clause 50, at end insert—

““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”

Amendment 12, page 48, line 7, in clause 50, after “a” insert “suitable and sufficient”.

Government amendments 91 to 94.

Amendment 52, page 49, line 13, in clause 52, leave out paragraph (d).

This amendment limits the list of relevant offences to those specifically specified.

Government amendments 95 to 100.

Amendment 20, page 51, line 3, in clause 54, at end insert—

‘(2A) Priority content designated under subsection (2) must include—

(a) content that contains public health related misinformation or disinformation, and

(b) misinformation or disinformation that is promulgated by a foreign state.”

This amendment would require the Secretary of State’s designation of “priority content that is harmful to adults” to include public health-related misinformation or disinformation, and misinformation or disinformation spread by a foreign state.

Amendment 53, page 51, line 47, in clause 55, after “State” insert “reasonably”.

This amendment, together with Amendment 54, would mean that the Secretary of State must reasonably consider the risk of harm to each one of an appreciable number of adults before specifying a description of the content.

Amendment 54, page 52, line 1, in clause 55, after “to” insert “each of”.

This amendment is linked to Amendment 53.

Amendment 55, page 52, line 12, in clause 55, after “OFCOM” insert

“, Parliament and members of the public in a manner the Secretary of State considers appropriate”.

This amendment requires the Secretary of State to consult Parliament and the public, as well as Ofcom, in a manner the Secretary of State considers appropriate before making regulations about harmful content.

Government amendments 147 to 149.

Amendment 43, page 177, line 23, in schedule 4, after “ages” insert

“, including the benefits of the service to their well-being,”

Amendment 196, page 180, line 9, in schedule 4, at end insert—

Amendment 187, page 186, line 32, in schedule 7, at end insert—

“Human trafficking

22A An offence under section 2 of the Modern Slavery Act 2015.”

This amendment includes Human Trafficking as a priority offence.

Amendment 211, page 187, line 23, in schedule 7, at end insert—

Government new clause 14.

Government new clause 15.

Government amendments 83 to 84.

Amendment 156, page 53, line 7, in clause 57, leave out subsections (1) and (2) and insert—

‘(1) A provider of a Category 1 service must require all adult users of the service to verify their identity in order to access the service.

(2) The verification process—

(a) may be of any kind (and in particular, it need not require documentation to be provided),

(b) must—

(i) be carried out by a third party on behalf of the provider of the Category 1 service,

(ii) ensure that all anonymous users of the Category 1 service cannot be identified by other users, apart from where provided for by section (Duty to ensure anonymity of users).”

This amendment would require all users of Category 1 services to be verified. The verification process would have to be carried out by a third party and to ensure the anonymity of users.

Government amendment 101.

Amendment 193, page 58, line 33, in clause 65, at end insert—

“(ea) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 in respect of guidance about transparency reports.

Amendment 203, page 60, line 33, in clause 68, at end insert—

‘(2B) A duty to meet the conditions set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that commercial pornographic websites must meet the new duties set out in NS1.

Government amendments 141, 177 to 184, 142 to 145, 185 to 186 and 146.

New schedule 1—Additional duties on pornographic content

“30 All user-to-user services and an internet service which provides regulated provider pornographic content must meet the following conditions for pornographic content and content that includes sexual photographs and films (“relevant content”).

The conditions are—

(a) the service must not contain any prohibited material,

(b) the service must review all relevant content before publication.

31 In this Schedule—

“photographs and films” has the same meaning as section 34 of the Criminal Justice and Courts Act 2015 (meaning of “disclose” and “photograph or film”)

“prohibited material” has the same meaning as section 368E(3) of the Communications Act 2003 (harmful material).”

The new schedule sets out additional duties for pornographic content which apply to user-to-user services under Part 3 and commercial pornographic websites under Part 5.

Government amendments 150 and 174.

Amendment 191, page 94, line 24, in clause 12, at end insert—

“Section [Category 1 services: duty not to discriminate against, harass or victimise service users] Duty not to discriminate against, harass or victimise

This amendment makes NC24 an enforceable requirement.

Government amendment 131.

Lindsay Hoyle (Mr Speaker)

I welcome the new Minister to the Dispatch Box.

Damian Collins

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

Relative to the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Damian Collins

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

Lindsay Hoyle (Mr Speaker)

And the other way, as well.

Damian Collins

Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.

We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

--- Later in debate ---
Dame Diana Johnson (Kingston upon Hull North) (Lab)

I beg to move, That the clause be read a Second time.

Lindsay Hoyle (Mr Deputy Speaker)

With this it will be convenient to discuss the following:

New clause 33—Meaning of “pornographic content”

“(1) In this Act ‘pornographic content’ means any of the following—

(a) a video work in respect of which the video works authority has issued an R18 certificate;

(b) content that was included in a video work to which paragraph (a) applies, if it is reasonable to assume from its nature that its inclusion was among the reasons why the certificate was an R18 certificate;

(c) any other content if it is reasonable to assume from its nature that any classification certificate issued in respect of a video work including it would be an R18 certificate;

(d) a video work in respect of which the video works authority has issued an 18 certificate, and that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal;

(e) content that was included in a video work to which paragraph (d) applies, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the certificate was an 18 certificate;

(f) any other content if it is reasonable to assume from its nature—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that any classification certificate issued in respect of a video work including it would be an 18 certificate;

(g) a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if—

(i) it includes content that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal, and

(ii) it is reasonable to assume from the nature of that content that its inclusion was among the reasons why the video works authority made that determination;

(h) content that was included in a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the video works authority made that determination;

(i) any other content if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that the video works authority would determine that a video work including it was not suitable for a classification certificate to be issued in respect of it.

(2) In this section—

‘18 certificate’ means a classification certificate which—

(a) contains, pursuant to section 7(2)(b) of the Video Recordings Act 1984, a statement that the video work is suitable for viewing only by persons who have attained the age of 18 and that no video recording containing that work is to be supplied to any person who has not attained that age, and

(b) does not contain the statement mentioned in section 7(2)(c) of that Act that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘classification certificate’ has the same meaning as in the Video Recordings Act 1984 (see section 7 of that Act);

‘content’ means—

(a) a series of visual images shown as a moving picture, with or without sound;

(b) a still image or series of still images, with or without sound; or

(c) sound;

‘R18 certificate’ means a classification certificate which contains the statement mentioned in section 7(2)(c) of the Video Recordings Act 1984 that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘the video works authority’ means the person or persons designated under section 4(1) of the Video Recordings Act 1984 as the authority responsible for making arrangements in respect of video works other than video games;

‘video work’ means a video work within the meaning of the Video Recordings Act 1984, other than a video game within the meaning of that Act.”

This new clause defines pornographic content for the purposes of the Act and would apply to user-to-user services and commercial pornographic content.

Amendment 205, in clause 34, page 33, line 23, at end insert—

“(3A) But an advertisement shall not be regarded as regulated user-generated content and precluded from being a ‘fraudulent advertisement’ by reason of the content constituting the advertisement being generated directly on, uploaded to, or shared on a user-to-user service before being modified to a paid-for advertisement.”

Amendment 206, page 33, line 30, after “has” insert

“or may reasonably be expected to have”.

Amendment 207, in clause 36, page 35, line 12, at end insert—

“(3A) An offence under section 993 of the Companies Act 2006 (fraudulent trading).”

Amendment 208, page 35, line 18, after “(3)” insert “, 3(A)”.

Amendment 209, page 35, line 20, after “(3)” insert “, 3(A)”

Amendment 210, page 35, line 23, after “(3)” insert “, 3(A)”

Amendment 201, in clause 66, page 59, line 8, leave out from “Pornographic content” to end of line 10 and insert

“has the same meaning as section [meaning of pornographic content]”.

This amendment defines pornographic content for the purposes of Part 5. It is consequential on NC33.

Amendment 56, page 59, line 8, after “content” insert “, taken as a whole,”

This amendment would require that content is considered as a whole before being defined as pornographic content.

Amendment 33, in clause 68, page 60, line 33, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

Amendment 34, page 60, line 37, leave out “subsection (2)” and insert “subsections (2) to (2D)”.

This amendment is consequential on Amendment 33.

Amendment 31, in clause 182, page 147, line 16, leave out from “unless” to end of line 17 and insert—

“(a) a draft of the instrument has been laid before each House of Parliament,

(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and

(c) the draft instrument has been approved by a resolution of each House of Parliament.”

This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).

Amendment 158, in clause 192, page 155, line 26, after “including” insert “but not limited to”.

This amendment clarifies that the list of types of content in clause 192 is not exhaustive.

Dame Diana Johnson

May I welcome the Minister to his place, as I did not get an opportunity to speak on the previous group of amendments?

New clause 7 and amendments 33 and 34 would require online platforms to verify the age and consent of all individuals featured in pornographic videos uploaded to their site, as well as enabling individuals to withdraw their consent to the footage remaining on the website. Why are the amendments necessary? Let me read a quotation from a young woman:

“I sent Pornhub begging emails. I pleaded with them. I wrote, ‘Please, I’m a minor, this was assault, please take it down.’”

She received no reply and the videos remained live. That is from a BBC article entitled “I was raped at 14, and the video ended up on a porn site”.

This was no one-off. Some of the world’s biggest pornography websites allow members of the public to upload videos without verifying that everyone in the film is an adult or that everyone in the film gave their permission for it to be uploaded. As a result, leading pornography websites have been found to be hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse.

In 2020, The New York Times documented the presence of child abuse videos on Pornhub, one of the most popular pornography websites in the world, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site. The New York Times reporter Nicholas Kristof wrote about Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

Even before that, in 2019, PayPal took the decision to stop processing payments for Pornhub after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. The newspaper reported:

“Pornhub is awash with secretly filmed ‘creepshots’ of schoolgirls and clips of men performing sex acts in front of teenagers on buses. It has also hosted indecent images of children as young as three.

The website says it bans content showing under-18s and removes it swiftly. But some of the videos identified by this newspaper’s investigation had 350,000 views and had been on the platform for more than three years.”

One of the women who is now being forced to take legal action against Pornhub’s parent company, MindGeek, is Crystal Palace footballer Leigh Nicol. Leigh’s phone was hacked and private content was uploaded to Pornhub without her knowledge. She said in an interview:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do…The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

Leigh Nicol is spot on.

Unfortunately, when this subject was debated in Committee, the previous Minister, the hon. Member for Croydon South (Chris Philp), argued that the content I have described—including child sexual abuse images and videos—was already illegal, and there was therefore no need for the Government to introduce further measures. However, that misses the point: the Minister was arguing against the very basis of his own Government’s Bill. At the core of the Bill, as I understand it, is a legal duty placed on online platforms to combat and remove content that is already illegal, such as material relating to terrorism. In keeping with that, my amendments would place a legal duty on online platforms hosting pornographic content to combat and remove illegal content through the specific and targeted measure of verifying the age and consent of every individual featured in pornographic content on their sites. The owners and operators of pornography websites are getting very rich from hosting footage of rape, trafficking and child sexual abuse, and they must be held to account under the law and required to take preventive action.

The Organisation for Security and Co-operation in Europe, which leads action to combat human trafficking across 57 member states, recommends that Governments require age and consent verification on pornography websites in order to combat exploitation. The OSCE told me:

“These sites routinely feature sexual violence, exploitation and abuse, and trafficking victims. Repeatedly these sites have chosen profits over reasonable prevention and protection measures. At the most basic level, these sites should be required to ensure that each person depicted is a consenting adult, with robust age verification and the right to withdraw consent at any time. Since self-regulation hasn’t worked, this will only work through strong, state-led regulation”.

Who else supports that? Legislation requiring online platforms to verify the age and consent of all individuals featured in pornographic content on their sites is backed by leading anti-sexual exploitation organisations including CEASE—the Centre to End All Sexual Exploitation—UK Feminista and the Traffickinghub movement, which has driven the global campaign to expose the abuses committed by, in particular, Pornhub.

New clause 7 and amendments 33 and 34 are minimum safety measures that would stop the well-documented practice of pornography websites hosting and profiting from videos of rape, trafficking and child sexual abuse. I urge the Government to reconsider their position, and I will seek to test the will of the House on new clause 7 later this evening.

Online Safety Bill

Lindsay Hoyle Excerpts
Further consideration of Bill, as amended in the Public Bill Committee
Lindsay Hoyle (Mr Speaker)

Before I call the Minister to open the debate, I have something to say about the scope of today’s debate. This is day 2 of debate on consideration of the Bill as amended in the Public Bill Committee. We are debating today only the new clauses, amendments and new schedules listed on the selection paper that I have issued today.

Members may be aware that the Government have tabled a programme motion that would recommit certain clauses and schedules to a Public Bill Committee. There will be an opportunity to debate that motion following proceedings on consideration. The Government have also published a draft list of proposed amendments to the Bill that they intend to bring forward during the recommittal process. These amendments are not in scope for today. There will be an opportunity to debate, at a future Report stage, the recommitted clauses and schedules, as amended on recommittal in the Public Bill Committee.

Most of today’s amendments and new clauses do not relate to the clauses and schedules that are being recommitted. These amendments and new clauses have been highlighted on the selection paper. Today will be the final chance for the Commons to consider them: there will be no opportunity for them to be tabled and considered again at any point during the remaining Commons stages.

New Clause 11

Notices to deal with terrorism content or CSEA content (or both)

“(1) If OFCOM consider that it is necessary and proportionate to do so, they may give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service.

(2) A notice under subsection (1) that relates to a regulated user-to-user service is a notice requiring the provider of the service—

(a) to do any or all of the following—

(i) use accredited technology to identify terrorism content communicated publicly by means of the service and to swiftly take down that content;

(ii) use accredited technology to prevent individuals from encountering terrorism content communicated publicly by means of the service;

(iii) use accredited technology to identify CSEA content, whether communicated publicly or privately by means of the service, and to swiftly take down that content;

(iv) use accredited technology to prevent individuals from encountering CSEA content, whether communicated publicly or privately, by means of the service; or

(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service or part of the service, which—

(i) achieves the purpose mentioned in paragraph (a)(iii) or (iv), and

(ii) meets the standards published by the Secretary of State (see section 106(10)).

(3) A notice under subsection (1) that relates to a regulated search service is a notice requiring the provider of the service—

(a) to do either or both of the following—

(i) use accredited technology to identify search content of the service that is terrorism content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes terrorism content identified by the technology;

(ii) use accredited technology to identify search content of the service that is CSEA content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes CSEA content identified by the technology; or

(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service which—

(i) achieves the purpose mentioned in paragraph (a)(ii), and

(ii) meets the standards published by the Secretary of State (see section 106(10)).

(4) A notice under subsection (1) that relates to a combined service is a notice requiring the provider of the service—

(a) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service, or to use best endeavours to develop or source technology as described in subsection (2)(b) for use on or in relation to that part of the service;

(b) to do either or both of the things described in subsection (3)(a) in relation to the search engine of the service, or to use best endeavours to develop or source technology as described in subsection (3)(b) for use on or in relation to the search engine of the service;

(c) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service and either or both of the things described in subsection (3)(a) in relation to the search engine of the service; or

(d) to use best endeavours to develop or source—

(i) technology as described in subsection (2)(b) for use on or in relation to the user-to-user part of the service, and

(ii) technology as described in subsection (3)(b) for use on or in relation to the search engine of the service.

(5) For the purposes of subsections (2) and (3), a requirement to use accredited technology may be complied with by the use of the technology alone or by means of the technology together with the use of human moderators.

(6) See—

(a) section (Warning notices), which requires OFCOM to give a warning notice before giving a notice under subsection (1), and

(b) section 105 for provision about matters which OFCOM must consider before giving a notice under subsection (1).

(7) A notice under subsection (1) relating to terrorism content present on a service must identify the content, or parts of the service that include content, that OFCOM consider is communicated publicly on that service (see section 188).

(8) For the meaning of “accredited” technology, see section 106(9) and (10).”—(Julia Lopez.)

This clause replaces existing clause 104. The main changes are: for user-to-user services, a notice may require the use of accredited technology to prevent individuals from encountering terrorism or CSEA content; for user-to-user and search services, a notice may require a provider to use best endeavours to develop or source technology to deal with CSEA content.

Brought up, and read the First time.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - -

With this it will be convenient to discuss the following:

Government new clause 12—Warning notices.

Government new clause 20—OFCOM’s reports about news publisher content and journalistic content.

Government new clause 40—Amendment of Enterprise Act 2002.

Government new clause 42—Former providers of regulated services.

Government new clause 43—Amendments of Part 4B of the Communications Act.

Government new clause 44—Repeal of Part 4B of the Communications Act: transitional provision etc.

Government new clause 51—Publication by providers of details of enforcement action.

Government new clause 52—Exemptions from offence under section 152.

Government new clause 53—Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2).

New clause 1—Provisional re-categorisation of a Part 3 service

“(1) This section applies in relation to OFCOM’s duty to maintain the register of categories of regulated user-to-user services and regulated search services under section 83.

(2) If OFCOM—

(a) consider that a Part 3 service not included in a particular part of the register is likely to meet the threshold conditions relevant to that part, and

(b) reasonably consider that urgent application of duties relevant to that part is necessary to avoid or mitigate significant harm,

New clause 16—Communication offence for encouraging or assisting self-harm

“(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“D”) commits an offence if—

(a) D sends a message,

(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and

(c) D’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.

(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—

(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and

(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and

(c) the message was wholly motivated by compassion towards P or to promote the interests of P’s health or wellbeing.””

This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.

New clause 17—Liability of directors for compliance failure

“(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.

(2) If OFCOM considers that the failure results from any—

(a) action,

(b) direction,

(c) neglect, or

(d) with the consent

This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.

New clause 23—Financial support for victims support services

“(1) The Secretary of State must by regulations make provision for penalties paid under Chapter 6 to be used for funding for victims support services.

(2) Those regulations must—

(a) specify criteria setting out which victim support services are eligible for financial support under this provision;

(b) set out a means by which the amount of funding available should be determined;

(c) make provision for the funding to be reviewed and allocated on a three year basis.

(3) Regulations under this section—

(a) shall be made by statutory instrument, and

(b) may not be made unless a draft has been laid before and approved by resolution of each House of Parliament.”

New clause 28—Establishment of Advocacy Body

“(1) There is to be a body corporate (“the Advocacy Body”) to represent the interests of child users of regulated services.

(2) A “child user”—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) “enforceable requirements” relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.

(8) The Advocacy Body may undertake research on their own account.

(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.

(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.

(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”

New clause 29—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;

(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;

(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—

(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;

(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;

(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub paras (2)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);

(e) to promote better coordination within the media literacy sector.

(4) OFCOM may prepare such guidance about the matters referred to in subsection (2) as it considers appropriate.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 30—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 31—Research conducted by regulated services

“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.

(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—

(a) a specific piece of research held by the service, or

(b) all research the service holds on a topic specified by OFCOM.”

New clause 34—Factual Accuracy

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service must provide an index of the historic factual accuracy of material published by each user who has—

(a) produced user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content

(3) The index under subsection (1) must—

(a) satisfy minimum quality criteria to be set by OFCOM, and

(b) be displayed in a way which allows any user easily to reach an informed view of the likely factual accuracy of the content at the same time as they encounter it.”

New clause 35—Duty of balance

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service which selects or prioritises particular—

(a) user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content

New clause 36—Identification of information incidents by OFCOM

“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.

(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—

(a) identifying, and assessing the severity of, actual or potential information incidents; and

(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).

(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—

(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and

(b) publish such recommendations or other information that OFCOM considers appropriate.

(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.

(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—

(a) the matters it will take into account in determining whether an information incident has arisen;

(b) the matters it will take into account in determining the severity of an incident; and

(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.

(6) For the purposes of this section—

“harmful misinformation or disinformation” means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;

“information incident” means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”

This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.

New clause 37—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to encourage the development and use of technologies and systems in relation to user-to-user services and search services which help to improve the media literacy of members of the public, including in particular technologies and systems which—

(i) indicate the nature of content on a service (for example, show where it is an advertisement);

(ii) indicate the reliability and accuracy of the content; and

(iii) facilitate control over what content is received;

(b) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(c) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public.

(4) OFCOM must prepare guidance about—

(a) the matters referred to in subsection (3) as it considers appropriate; and

(b) minimum standards that media literacy initiatives must meet.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) Every report under paragraph 12 of the Schedule to the Office of Communications Act 2002 (OFCOM’s annual report) for a financial year must contain a summary of the steps that OFCOM have taken under subsection (1) in that year.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 45—Sharing etc intimate photographs or film without consent

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if A shares a photograph or film of another person (B) with B or a third person (C) if—

(a) the photograph or film only shows activity that would be ordinarily seen on a public street, except for a photograph or film of breastfeeding;

(b) the photograph or film was taken in public, where the person depicted was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(c) A reasonably believed that the photograph or film, taken in public, showed a person depicted who was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(d) the photograph or film has been previously shared with consent in public;

(e) A reasonably believed that the photograph or film had been previously shared with consent in public;

(f) the photograph or film shows a young child and is of a kind ordinarily shared by family and friends;

(g) the photograph or film is of a child shared for that child’s medical care or treatment, where there is parental consent.

(4) A person (A) does not commit an offence under this section if A shares information about where to access a photograph or film where this photograph or film has already been made available to A.

(5) It is a defence for a person charged with an offence under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime;

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings;

(c) reasonably believed that the sharing was necessary for the administration of justice;

(d) reasonably believed that the sharing was necessary for a genuine medical, scientific or educational purpose; and

(e) reasonably believed that the sharing was in the public interest.

(6) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(7) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(8) “Photograph” includes the negative as well as the positive version.

(9) “Film” means a moving image.

(10) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(11) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(12) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding 6 months or a fine (or both).”

This new clause creates the offence of sharing an intimate image without consent, providing the necessary exclusions such as for children’s medical care or images taken in public places, and establishing the offence as triable by magistrates only, with a maximum penalty of 6 months’ imprisonment.

New clause 46—Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A intends that the subject of the photograph or film will be caused alarm, distress or humiliation by the sharing of the photograph or film.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 47—Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A shared the photograph or film for the purpose of obtaining sexual gratification (whether for the sender or recipient).

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where an intimate image is shared for the purpose of obtaining sexual gratification, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 48—Threatening to share etc intimate photographs or film

“(1) A person (A) commits an offence if—

(a) A threatens to share an intimate photograph or film of another person (B) with B or a third person (C); and

(i) A intends B to fear that the threat will be carried out; or A is reckless as to whether B will fear that the threat will be carried out.

(2) “Threatening to share” should be read to include threatening to share an intimate photograph or film that does not exist and other circumstances where it is impossible for A to carry out the threat.

(3) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(4) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(5) References to sharing, or threatening to share, such a photograph or film with another person include—

(a) sending, or threatening to send, it to another person by any means, electronically or otherwise;

(b) showing, or threatening to show, it to another person;

(c) placing, or threatening to place, it for another person to find; or

(d) sharing, or threatening to share, it on or uploading it to a user-to-user service, including websites or online public forums.

(6) “Photograph” includes the negative as well as the positive version.

(7) “Film” means a moving image.

(8) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(9) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(10) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates another more serious offence of threatening to share an intimate image, regardless of whether such an image actually exists, and where the sender intends to cause fear, or is reckless as to whether they would cause fear, punishable by 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 49—Special measures in criminal proceedings for offences involving the sharing of intimate images

“(1) Chapter 1 of Part 2 of the Youth Justice and Criminal Evidence Act 1999 (giving of evidence or information for purposes of criminal proceedings: special measures directions in case of vulnerable and intimidated witnesses) is amended as follows.

(2) In section 17 (witnesses eligible for assistance on grounds of fear or distress about testifying), in subsection (4A) after paragraph (b) insert “(c) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

This new clause inserts intimate image abuse into legislation that qualifies victims for special measures when testifying in court (such as partitions to hide them from view, giving evidence by video, etc.), as already prescribed by law.

New clause 50—Anonymity for victims of offences involving the sharing of intimate images

“(1) Section 2 of the Sexual Offences (Amendment) Act 1992 (Offences to which this Act applies) is amended as follows.

(2) In subsection 1 after paragraph (db) insert—

(dc) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

Similar to NC49, this new clause gives victims of intimate image abuse the same access to anonymity as victims of other sexual offences, protecting their identities and giving them the confidence to testify against their abuser without fear of repercussions.

New clause 54—Report on the effect of Virtual Private Networks on OFCOM’s ability to enforce requirements

“(1) The Secretary of State must publish a report on the effect of the use of Virtual Private Networks on OFCOM’s ability to enforce requirements under section 112.

(2) The report must be laid before Parliament within six months of the passing of this Act.”

New clause 55—Offence of sending communication facilitating modern slavery and illegal immigration

“(1) A person (A) commits an offence if—

(a) (A) intentionally shares with a person (B) or with a third person (C) a photograph or film which is reasonably considered to be, or to be intended to be, facilitating or promoting any activities which do, or could reasonably be expected to, give rise to an offence under—

(i) sections 1 (Slavery, servitude and forced labour), 2 (Human trafficking) or 4 (Committing offence with intent to commit an offence under section 2) of the Modern Slavery Act 2015; or

(ii) sections 24 (Illegal Entry and Similar Offences) or 25 (Assisting unlawful immigration etc) of the Immigration Act 1971; and

(b) (A) does so knowing, or when they reasonably ought to have known, that the activities being depicted are unlawful.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if—

(a) the sharing is undertaken by or on behalf of a journalist or for journalistic purposes;

(b) the sharing is by a refugee organisation registered in the UK and which falls within the scope of subsection (3) of section 25A of the Immigration Act 1971;

(c) the sharing is by or on behalf of a duly elected Member of Parliament or other elected representative in the UK.

(4) It is a defence for a person charged under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime and

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings.

(5) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).”

This new clause would create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration.

Government amendments 234 and 102 to 117.

Amendment 195, in clause 104, page 87, line 10, leave out subsection 1 and insert—

“(1) If OFCOM consider that it is necessary and proportionate to do so, they may—

(a) give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service;

(b) give a notice described in subsection (2), (3) or (4) to a provider or providers of Part 3 services taking into account risk profiles produced by OFCOM under section 84.”

Amendment 152, page 87, line 18, leave out ‘whether’.

This amendment is consequential on Amendment 153.

Amendment 153, page 87, line 19, leave out ‘or privately’.

This amendment removes the ability to monitor encrypted communications.

Government amendment 118.

Amendment 204, in clause 105, page 89, line 17, at end insert—

“(ia) the level of risk of the use of the specified technology accessing, retaining or disclosing the identity or provenance of any confidential journalistic source or confidential journalistic material.”

This amendment would require Ofcom to consider the risk of the use of accredited technology by a Part 3 service accessing, retaining or disclosing the identity or provenance of journalistic sources or confidential journalistic material, when deciding whether to give a notice under Clause 104(1) of the Bill.

Government amendments 119 to 130, 132 to 134, 212, 213, 135 and 214.

Amendment 23, in clause 130, page 114, line 3, leave out paragraph (a).

Government amendment 175.

Amendment 160, in clause 141, page 121, line 9, leave out subsection (2).

This amendment removes the bar of conditionality that must be met for super complaints that relate to a single regulated service.

Amendment 24, page 121, line 16, leave out “The Secretary of State” and insert “OFCOM”.

Amendment 25, page 121, line 21, leave out from “(3),” to end of line 24 and insert “OFCOM must consult—

(a) The Secretary of State, and

(b) such other persons as OFCOM considers appropriate.”

This amendment would provide that regulations under clause 141 are to be made by OFCOM rather than by the Secretary of State.

Amendment 189, in clause 142, page 121, line 45, leave out from “including” to end of line 46 and insert

“90 day maximum time limits in relation to the determination and notification to the complainant of—”.

This requires the Secretary of State’s guidance to require Ofcom to determine whether a complaint is eligible for the super-complaints procedure within 90 days.

Amendment 26, in clause 146, page 123, line 33, leave out

“give OFCOM a direction requiring”

and insert “may make representations to”.

Amendment 27, page 123, line 36, leave out subsection (2) and insert—

“(2) OFCOM must have due regard to any representations made by the Secretary of State under subsection (1).”

Amendment 28, page 123, line 38, leave out from “committee” to end of line 39 and insert

“established under this section is to consist of the following members—”.

Amendment 29, page 124, line 1, leave out from “committee” to “publish” in line 2 and insert

“established under this section must”.

Amendment 30, page 124, line 4, leave out subsection (5).

Amendment 32, page 124, line 4, leave out clause 148.

Government amendments 176, 239, 138, 240, 215, 241, 242, 217, 218, 243, 219, 244, 245, 220, 221, 140, 246, 222 to 224, 247, 225, 248, 226 and 227.

Amendment 194, in clause 157, page 131, line 16, leave out from beginning to end of line 17 and insert—

“(a) B has not consented for A to send or give the photograph or film to B, and”.

Government amendments 249 to 252, 228, 229 and 235 to 237.

Government new schedule 2—Amendments of Part 4B of the Communications Act.

Government new schedule 3—Video-sharing platform services: transitional provision etc.

Government amendment 238.

Amendment 35, schedule 11, page 198, line 5, leave out “The Secretary of State” and insert “OFCOM”.

This amendment would give the power to make regulations under Schedule 11 to OFCOM.

Amendment 2, page 198, line 9, leave out “functionalities” and insert “characteristics”.

Amendment 1, page 198, line 9, at end insert—

“(1A) In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”

Amendment 159, page 198, line 9, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

Amendment 36, page 198, line 10, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 37, page 198, line 16, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 3, page 198, line 2, leave out “functionalities” and insert “characteristics”.

Amendment 9, page 198, line 28, leave out “and” and insert “or”.

Amendment 4, page 198, line 29, leave out “functionality” and insert “characteristic”.

Amendment 38, page 198, line 32, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 5, page 198, line 34, leave out “functionalities” and insert “characteristics”.

Amendment 39, page 198, line 37, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 40, page 198, line 41, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 6, page 198, line 4, leave out “functionalities” and insert “characteristics”.

Amendment 7, page 199, line 11, leave out “functionalities” and insert “characteristics”.

Amendment 8, page 199, line 28, leave out “functionalities” and insert “characteristics”.

Amendment 41, page 199, line 3, leave out subparagraphs (5) to (11).

This amendment is consequential on Amendment 35.

Government amendments 230, 253 to 261 and 233.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.

I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.

The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.

Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.

--- Later in debate ---
Luke Evans Portrait Dr Luke Evans (Bosworth) (Con)
- Hansard - - - Excerpts

To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation, but is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - -

Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.

The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.

--- Later in debate ---
Lindsay Hoyle Portrait Mr Speaker
- Hansard - -

Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Thank you, Mr Speaker; I will try to keep my remarks very much in scope.

The harmful communications offence in clause 151 was a reform to communication offences proposed in the Bill. Since the Bill has been made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and ensure that protections for free speech are robust.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.

Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - -

I call the shadow Minister.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.

We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.