Crime and Policing Bill Debate

Department: Home Office
Moved by
Baroness Levitt

That this House do not insist on its Amendments 256 and 257 and do agree with the Commons in their Amendments 257A and 257B in lieu.

257A: Page 99, line 24, at end insert the following new Clause—
“Taking down intimate image content
(1) The Online Safety Act 2023 is amended as follows.
(2) In section 10 (regulated user-to-user services: safety duties about illegal content) after subsection (3) insert—
“(3A) A duty to operate a service using proportionate systems and processes designed to take down—
(a) content in relation to which an intimate image content report is made to the provider (see section 20A(2)), and
(b) any other content identified by the provider as the same, or substantially the same, as that content, as soon as reasonably practicable, and no later than 48 hours, after the provider receives the report (unless subsection (3B) applies).
(3B) This subsection applies if the provider considers that—
(a) the content is not intimate image content, or
(b) the person making the report is not—
(i) the subject of the content, or
(ii) a person acting on that person’s behalf.”
(3) After section 20 (duty about content reporting) insert—
“20A Reporting of intimate image content
(1) The duty in section 20(2) includes a duty to operate a service using systems and processes that allow users and affected persons to easily make an intimate image content report to the provider.
(2) An “intimate image content report” is a report which—
(a) declares that content present on the service is intimate image content,
(b) declares that the report is made by—
(i) the subject of the content, or
(ii) a person acting on that person’s behalf,
(c) declares that the report—
(i) is made in good faith, and
(ii) to the best of the knowledge and belief of the person making the report, is true,
(d) provides sufficient information about the content for the provider to identify it,
(e) provides contact details for the person making the report, and
(f) complies with any other requirements specified in regulations made by the Secretary of State.
(3) The Secretary of State may by regulations make provision about how the requirements in subsection (2)(a) to (e) are to be met.”
(4) In section 21 (duties about complaints procedures) after subsection (2) insert—
“(2A) The duty in subsection (2) includes a duty to operate an expedited complaints procedure in relation to complaints within subsection (4)(a), (b)(i) or (b)(ii) that—
(a) are made by users or affected persons who have made an intimate image content report (see section 20A(2)), and
(b) are about the content to which the report relates.”
(5) In section 27 (regulated search services: safety duties about illegal content) after subsection (3) insert—
“(3A) A duty to operate a service using proportionate systems and processes designed to ensure that individuals are no longer able to encounter—
(a) search content in relation to which an intimate image content report is made to the provider (see section 31A(2)), and
(b) any other search content identified by the provider as the same, or substantially the same, as that content, as soon as reasonably practicable, and no later than 48 hours, after the provider receives the report (unless subsection (3B) applies).
(3B) This subsection applies if the provider considers that—
(a) the search content is not intimate image content, or
(b) the person making the report is not—
(i) the subject of the content, or
(ii) a person acting on that person’s behalf.”
(6) After section 31 (duty about content reporting) insert—
“31A Reporting of intimate image content
(1) The duty in section 31(2) includes a duty to operate a service using systems and processes that allow users and affected persons to easily make an intimate image content report to the provider.
(2) An “intimate image content report” is a report which—
(a) declares that search content is intimate image content,
(b) declares that the report is made by—
(i) the subject of the content, or
(ii) a person acting on that person’s behalf,
(c) declares that the report—
(i) is made in good faith, and
(ii) to the best of the knowledge and belief of the person making the report, is true,
(d) provides sufficient information about the search content for the provider to identify it,
(e) provides contact details for the person making the report, and
(f) complies with any other requirements specified in regulations made by the Secretary of State.
(3) The Secretary of State may by regulations make provision about how the requirements in subsection (2)(a) to (e) are to be met.”
(7) In section 32 (duties about complaints procedures) after subsection (2) insert—
“(2A) The duty in subsection (2) includes a duty to operate an expedited complaints procedure in relation to complaints within subsection (4)(a), (b)(i) or (b)(ii) that—
(a) are made by users or affected persons who have made an intimate image content report (see section 31A(2)), and
(b) are about the search content to which the report relates.”
(8) In section 59 (meaning of “illegal content” etc) after subsection (10) insert—
“(10A) “Intimate image content” means content that amounts to an offence under section 66B(1), (2) or (3) of the Sexual Offences Act 2003 (sharing intimate image of a person without consent).”
(9) In section 133 (confirmation decisions: requirements to take steps)—
(a) in subsection (4) after paragraph (c) insert—
“(ca) specify which of those requirements (if any) have been designated as intimate image content requirements (see subsections (7A) and (7B)),”;
(b) after subsection (7) insert—
“(7A) If the condition in subsection (7B) is met in relation to a requirement imposed by a confirmation decision which is of a kind described in subsection (1), OFCOM must designate the requirement as an “intimate image content requirement” for the purposes of section 138(3A) (offence of failure to comply with confirmation decision).
(7B) The condition referred to in subsection (7A) is that the requirement is imposed (whether or not exclusively) in relation to—
(a) a failure to comply with a provision listed in column 1 of the table, which
(b) where there is an entry for the provision in column 2 of the table, is in respect of a matter listed in column 2.

Provision: Section 10(2)(a)
Failure in respect of: (1) Intimate image content; (2) Priority illegal content which includes intimate image content

Provision: Section 10(2)(b)
Failure in respect of: (1) An offence under section 66B of the Sexual Offences Act 2003; (2) Priority offences which include an offence under that section

Provision: Section 10(3)(a)
Failure in respect of: (1) Intimate image content; (2) Priority illegal content which includes intimate image content

Provision: Section 10(3)(b)
Failure in respect of: (1) Intimate image content; (2) Illegal content which includes intimate image content

Provision: Section 10(3A)

Provision: Section 27(3)(a)
Failure in respect of: (1) Intimate image content; (2) Priority illegal content which includes intimate image content

Provision: Section 27(3)(b)
Failure in respect of: (1) Intimate image content; (2) Illegal content which includes intimate image content

Provision: Section 27(3A)”;

(c) in subsection (10) after ““CSEA content”,” insert ““intimate image content”,”.
(10) In section 138 (offence of failing to comply with requirements imposed by confirmation decision) after subsection (3) insert—
“(3A) A person to whom a confirmation decision is given commits an offence if, without reasonable excuse, the person fails to comply with an intimate image content requirement imposed by the decision (see section 133(7A) and (7B)).””
257B: Page 99, line 24, at end insert the following new Clause—
“Taking down intimate image content: consequential amendments
(1) The Online Safety Act 2023 is amended as follows.
(2) In section 10 (regulated user-to-user services: safety duties about illegal content)—
(a) in subsection (4) for “and (3)” substitute “to (3A)”;
(b) in subsection (5)—
(i) the words from “each paragraph” to the end become paragraph (a);
(ii) at the end of that paragraph insert “, and
(b) subsection (3A).”;
(c) in subsection (7) for “subsection (2) or (3)” substitute “subsections (2) to (3A)”.
(3) In section 23(5) (record-keeping and review duties) for “or (3)”, in the first place it occurs, substitute “, (3) or (3A)”.
(4) In section 27 (regulated search services: safety duties about illegal content)—
(a) in subsection (4) for “and (3)” substitute “to (3A)”;
(b) in subsection (7) for “subsection (2) or (3)” substitute “subsections (2) to (3A)”.
(5) In section 34(5) (record-keeping and review duties), for “or (3)”, in the first place it occurs, substitute “, (3) or (3A)”.
(6) In section 59(14) (meaning of “illegal content” etc) for “and “priority illegal content”” substitute “, “priority illegal content” and “intimate image content””.
(7) In section 71(2)(a)(i) (duty not to take down content except in accordance with terms of service: exceptions) for “or (3)” substitute “, (3) or (3A)”.
(8) In section 136(5) (confirmation decisions: proactive technology)—
(a) in paragraph (a) for “or (3)” substitute “, (3) or (3A)”;
(b) in paragraph (c) for “or (3)” substitute “, (3) or (3A)”.
(9) In section 237 (index of defined terms) at the appropriate place insert—
“intimate image content (in Part 3) section 59”.
(10) In Schedule 4 (codes of practice)—
(a) in paragraph 9(1) for “or (3)” substitute “, (3) or (3A)”;
(b) in paragraph 9(3) for “or (3)” substitute “, (3) or (3A)”;
(c) in paragraph 13(3)(a) for “or (3)” substitute “, (3) or (3A)”;
(d) in paragraph 13(3)(c) for “or (3)” substitute “, (3) or (3A)”.”
The Parliamentary Under-Secretary of State, Ministry of Justice (Baroness Levitt) (Lab)

My Lords, in moving Motion G, I will also speak to Motions H, J, K, L, M, W and Y.

I will start with the collection of intimate image abuse-related amendments in lieu. These all flow from the amendments tabled by the noble Baroness, Lady Owen of Alderley Edge, on Report. I say once more, with feeling, that I thank the noble Baroness for engaging with us so extensively over the past few weeks. It is in the best traditions of your Lordships’ House. We have worked together to ensure—I hope—that the Government have taken the right direction with these amendments across the piece.

I turn to the take-down powers. Amendments 257A and 257B in Motion G build on the Government’s existing provisions. They do that by making failure to comply with an Ofcom enforcement decision relating to the new take-down duties a criminal offence. That means that senior executives of the service could be personally criminally liable for the failure. Alongside that enforcement approach, the Government are also strengthening safeguards against malicious reporting. We will bring forward regulations that will enable Ofcom to scrutinise both the speed of intimate image removals and how clearly and effectively platforms enable users to report such content. We are determined that victims of non-consensual intimate image abuse should see swift action, clear routes for redress and transparency from platforms.

Therefore, in addition to these amendments, the Government are working with Ofcom to create a clear route for reporting complaints regarding compliance with the NCII duty, signposting to specialist organisations. We will also use existing powers under the Online Safety Act to strengthen transparency, enabling Ofcom to require services to report on and publish their average NCII take-down times. I reiterate my thanks to the noble Baroness, Lady Owen, for her continued advocacy on this important topic.

I turn to deletion orders and Motion H. The Government recognise the serious harm caused by perpetrators retaining copies of intimate images. We have listened to the will of the House on Report, which is why we have brought forward a new deletion order that will be available on conviction for a broader range of offences. This new order can be made for all intimate image abuse offences, including the breastfeeding voyeurism recording offence and the new offence of sharing semen-defaced images; I will refer to these throughout my speech as “intimate image-related offences”.

It will enable courts to order the deletion and destruction of all copies of a relevant image, and it creates a presumption in favour of doing so: the court must give reasons if it declines to make a deletion order for images related to the offence. This mirrors the criminal law in relation to compensation orders. It strikes the right balance between protecting victims and preserving judicial discretion in appropriate cases. Importantly, the order can cover all copies related to a specified offence in the offender’s possession or control, as well as any other relevant images of the same victim. Breaching such an order will be a separate criminal offence, itself carrying a maximum penalty of five years’ imprisonment.

On hashing and the non-consensual intimate image register, the Government will give statutory backing to a register of non-consensual intimate images. Amendments 260A to 260D in Motion J enable the Government to designate a trusted flagger—which will most likely be the Revenge Porn Helpline—and, following a scoping exercise, to make further provisions by regulations about the operation of a statutory register. That includes provisions for the Secretary of State to impose requirements on providers to share hashes and any other information deemed necessary with the register. As Lords Amendments 260A to 260D recognise, proceeding by regulations will enable us properly to evaluate the requirements necessary to ensure a register operates as effectively as possible.

I turn to the question of pornography. I again thank the noble Baroness, Lady Bertin, for the time she has taken over the past few weeks to meet many Ministers, who really are grateful to her for the time she has spent with us, making sure that we get this right. I stress to your Lordships that the changes to which I now turn are just the start, and the Government mean it when we say that we look forward to working with the noble Baroness even more in future.

I will speak briefly to the parity sprint. This is a key piece of ongoing work that will build on the provisions in the Bill, and the discussions have already started. This work will identify the best way to fix the gap between the regulation of pornography online and offline. It will address content that is not caught by our proposed offences that would otherwise be illegal offline. This could include, for example, pornography where there is a suggestion that a person is under 18, a relationship is portrayed as abusive, or there is a clear exploitative power imbalance or breach of trust, including some examples of depictions of step-incest between adults, or a teacher and a student. At the end of this work, the Government are fully committed to implementation. If regulators need to be assigned, this will happen. If legislation is needed, this will happen. This Government are serious about this.

In a similar vein, on the verification of age and consent, we agree with the sentiment that underlies the amendment: non-consensual intimate images and child sexual abuse have no place online, and the tech platforms need to do more to prevent this type of illegal content. Having said this, further work is needed to identify the most effective approach. For this reason, the Government’s amendments in lieu, Amendments 264A to 264F in Motion L, provide for a further statutory sprint to test which mechanisms will be most effective for tackling this kind of content. The amendments will also place a duty on the Secretary of State to report to Parliament on the outcome of this work within 12 months of the Bill receiving Royal Assent, and will provide a power to make regulations to give effect to that outcome.

Given the existing criminal and regulatory legal frameworks, we need carefully to consider the gaps and how best they can be addressed. Upon completion of the review, we have the option of putting in place regulations to impose new duties on providers of internet services relating to verification of age and consent. The power would allow the appointment of a regulator to oversee these duties.

With regard to adults role-playing as children in pornography, we have listened to the concerns that were raised. We must protect the legislative regime that protects actual children from harm, which is why we cannot support Lords Amendment 265. However, we absolutely agree with the noble Baroness, Lady Bertin, that content that mimics child sexual abuse must be tackled. That is why we have brought forward amendments in lieu, Amendments 265A to 265H in Motion M, which will criminalise the possession and publication of pornographic images portraying sexual activity between persons where one person is or is pretending to be under 16. This will be a priority offence under the Online Safety Act. Our intention with these amendments is clearly to signal that content which mimics, and thus risks normalising, child sexual abuse is totally unacceptable and should not be available online.

I need to make clear what the provision criminalises. It includes pornographic depictions of any sexual activity where one party is pretending to be under 16. It is intentionally wide and will capture harmful content that we know exists. I apologise for being graphic here, but there is no way of avoiding it. I can see the noble Lord, Lord Pannick, laughing as I say this—I am here again talking about rather graphic acts. This provision will capture images such as an actor role-playing as an underage girl, where, for example, her underwear has been moved aside, or male genitals are in shot.

As I have just said, this offence is just the start. Content that is illegal offline but not caught by this offence will be addressed through the Government’s work on parity.

Similarly, in relation to pornography depicting incest, we have listened to the concerns about the extent to which the Government’s Lords Amendment 263 should cover other troubling relationships, such as sex between step-relations. We completely agree with the need to curtail the depiction of step-incest pornography in cases where it portrays conduct that is illegal in the real world. To that extent, the Government’s amendments in lieu, Amendments 263A to 263G in Motion K, will restore and extend the new offence of possession and publication of incest pornography. They will list the relevant family relationships and expand this to include step-parents and step-children, step-siblings, and foster parents and foster children, where one of the persons is or is pretending to be under the age of 18. Where there are grey areas, such as step-relationships over 18 that show a clear power imbalance and would be illegal offline, this will be addressed through the parity sprint, about which I have already spoken.

Through Lords Amendments 255 and 395, the Government are criminalising the making, adapting and supplying of nudification tools and are bringing chatbots into the scope of the Online Safety Act. This means that the requirements of the Online Safety Act will kick in. Social media services will be required to take down content that supplies nudification tools, and search engines will have to reduce the visibility of search results linked to these tools. Once chatbot services come into the scope of the Online Safety Act, they too will have to ensure that illegal nudification tools and images cannot be made, supplied or appear on those services. Taken together, these measures will deliver an effective ban on nudification tools.

Given this, we do not believe that a separate possession offence, as provided for in Lords Amendment 505, would make a meaningful difference, not least as many such tools are not possessed in the technical legal sense, but rather are accessed online. For this reason, we are seeking its removal via Motion Y, but we are very grateful to the noble Baroness, Lady Bertin, for engaging with us on this and for supporting the approach that we have discussed at length with her and finally fixed upon.

--- Later in debate ---
Lord Davies of Gower (Con)

My Lords, I thank my noble friends Lady Owen and Lady Bertin, on behalf of all noble Lords on the Conservative Benches, for their sustained efforts on these important issues. Their work and amendments will surely help to protect women and girls, whether through legislation on the taking down of intimate images or greater protection for age verification in pornographic content. I also thank the Government, particularly the Minister, for their continued engagement on these topics. These Motions are evidence of what this Chamber can achieve through collaborative and productive dialogue.

Baroness Levitt (Lab)

My Lords, I thank all noble Lords for their contributions not just today but during the passage of this Bill, and for the thoughtful and constructive way in which everybody has engaged with these issues.

I shall be brief and address only one or two of the points that were raised. The first is in relation to Motion G1, tabled by the noble Baroness, Lady Owen. Motion G strengthens accountability where platforms fail to comply with their duties to deal with non-consensual intimate images. Regarding Motion G1, we recognise the noble Baroness’s concern and want transparency beyond just the biggest platforms. That is why every regulated user-to-user service must be clear with users about how it is meeting the 48-hour takedown duty, while Ofcom can require detailed reporting where it will make the biggest difference. Through Schedule 8, the Online Safety Act allows Ofcom to require detailed information about how providers identify, deal with and take down illegal content. We will amend this through regulations to make it clear that these requirements cover compliance with the new NCII takedown duty, including average takedown times.

Turning to the verification of age, again the Government recognise the concerns raised by the noble Baroness, Lady Bertin. We are not intentionally delaying these important changes for the sake of it. I think that the noble Baroness recognises that we all agree that this issue is important, but we cannot shy away from the complex legal and practical issues that it presents. These considerations must sit alongside, and flow from, the existing six-month review into parity, which will close the gap between the regulation of online and offline pornography. For this reason, the 12 months is needed to ensure that we get it right. We are grateful to the noble Baroness for supporting this approach.

On the issue of adults role-playing as children and the question of step-incest, in relation to the point made by the noble Lord, Lord Clement-Jones, the differential in age is there to ensure that the online offences mirror the underlying offline criminal offences, so that there is parity between the two. I should stress that, for both the adult role-playing and the extended step-incest offences, this is a first step. The provisions in this Bill already create significant changes in the criminal law, and the parity work to which we have all referred will build on this to address the grey areas where conduct is illegal offline but difficult to address online via the criminal law.

It remains for me only to thank once again the two noble Baronesses, Lady Bertin and Lady Owen. I genuinely look forward to continuing to work with them in future.

Baroness Owen of Alderley Edge (Con)

I thank the Minister for her response and am assured by it. I beg leave to withdraw Motion G1.

--- Later in debate ---
Moved by
Baroness Levitt

That this House do not insist on its Amendment 258 and do agree with the Commons in their Amendment 258A in lieu.

258A: Page 99, line 24, at end insert the following new Clause—
“Image deletion orders
(1) The Sentencing Code is amended as follows.
(2) In Part 7 (financial orders and orders relating to property), after Chapter 4 insert—
“CHAPTER 4A
IMAGE DELETION ORDERS
161ZA Image deletion orders
(1) In this code “image deletion order” means an order under this Chapter which—
(a) is made in respect of an offender for an offence,
(b) relates to a photograph or film which is in the offender’s possession or under their control, and
(c) requires the offender to take steps specified in the order to ensure, so far as is reasonably practicable, that the photograph or film is put beyond use.
(2) For the purposes of subsection (1)(c), a photograph or film is put beyond use if—
(a) in the case of a physical item, it is destroyed;
(b) in the case of data stored by any means by or on behalf of the offender, it is deleted;
(c) in the case of content on an internet service, it is removed from the service or permanently hidden.
(3) For the purposes of this section—
(a) something is “deleted” if it is irrecoverable;
(b) “content”, in relation to an internet service, has the meaning given by section 236(1) of the Online Safety Act 2023;
(c) “internet service” has the meaning given by section 228 of that Act (and section 204(1) of that Act applies).
161ZB Image deletion orders: availability
(1) This section applies where a person commits an offence under any of the following provisions of the Sexual Offences Act 2003—
(a) section 66AA (sharing semen-defaced image);
(b) section 66AA (taking or recording intimate photograph or film);
(c) section 66AD (creating a copy of intimate photograph or film shared temporarily);
(d) section 66B (sharing or threatening to share intimate photograph or film);
(e) section 66E (creating purported intimate image of adult);
(f) section 66F (requesting the creation of purported intimate image of adult);
(g) section 67A(2B) (recording a person breast-feeding child).
(2) This section also applies where a person commits an inchoate offence in relation to an offence specified in subsection (1).
(3) The court by or before which the offender is convicted of the offence may make an image deletion order in respect of—
(a) a photograph or film to which the offence relates, and
(b) any other photograph or film—
(i) which shows, or appears to show, a person who is the subject of the photograph or film to which the offence relates in an intimate state,
(ii) which is a semen-defaced image of a person who is the subject of the photograph or film to which the offence relates, or
(iii) which shows a person who is the subject of the photograph or film to which the offence relates breast-feeding a child.
(4) The following provisions of the Sexual Offences Act 2003 apply for the purposes of this section—
(a) section 66AA(2) (meaning of “semen-defaced image”);
(b) section 66D(5) to (9) (meaning of “showing, or appearing to show, another person in an intimate state”);
(c) section 67A(3A) and (3B) (meaning of references to a person breast-feeding a child), ignoring references to the intention of the person who recorded the photograph or film.
(5) In relation to an offence under section 66F of the Sexual Offences Act 2003, a photograph or film is a photograph or film to which the offence relates for the purposes of this section if—
(a) it appears to be of a person who was the subject of the request to which the offence relates (whether or not it is what was requested), and
(b) it was in the offender’s possession, or under the offender’s control, as a result of that request.
(6) An image deletion order is not available if the offence was committed before the day on which section (Image deletion orders) of the Crime and Policing Act 2026 comes into force.
161ZC Period for complying with requirements
(1) An image deletion order must specify, in respect of each step the order requires the offender to take, the date by which the step must be taken (and different dates may be specified in respect of different steps).
(2) Where the order requires the offender to take a step in relation to a photograph or film that would result in the offender being unable to recover the photograph or film—
(a) the order must not require the step to be taken before the end of the period for giving notice of appeal against the conviction or order, and
(b) where notice of appeal against the conviction or order is given, the offender is not required to take the step until the appeal is finally determined or withdrawn.
161ZD Offence of failing to comply with an image deletion order
(1) It is an offence for a person in respect of whom an image deletion order made under this Chapter is in force to fail without reasonable excuse to comply with any requirement included in the order.
(2) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding the general limit in a magistrates’ court, or a fine not exceeding the statutory maximum, or both;
(b) on conviction on indictment, to imprisonment for a term not exceeding five years, or a fine, or both.
161ZE Image deletion orders: interpretation
(1) This section applies for the purposes of this Chapter.
(2) “Photograph” includes the negative as well as the positive version.
(3) “Film” means a moving image.
(4) References to a photograph or film also include—
(a) an image, whether made or altered by computer graphics or in any other way, which appears to be a photograph or film,
(b) a copy of a photograph, film or image within paragraph (a), and
(c) data stored by any means which is capable of conversion into a photograph, film or image within paragraph (a).”
(3) In Chapter 5 of Part 3 (duties to explain or give reasons), after section 55 insert—
“55A Duty to give reasons where image deletion order not made
Where—
(a) a court is dealing with an offender for an offence, and
(b) an image deletion order is available, the court must give reasons if it does not make an image deletion order in respect of a photograph or film to which the offence relates (see section 161ZB(3)(a)).””
Moved by
Baroness Levitt

That this House do not insist on its Amendments 259 and 260 and do agree with the Commons in their Amendments 260A to 260D in lieu.

260A: Page 99, line 24, at end insert the following new Clause—
“Intimate image material: reporting and registration
Schedule (Intimate image material: reporting and registration) makes provision about the reporting and registration of intimate image material.”
--- Later in debate ---
Moved by
Baroness Levitt

That this House do agree with the Commons in their Amendments 263A to 263G.

263A: Line 20, leave out from “think” to the end of line 22 and insert “what is set out in subsection (1A) or (1B).
(1A) That A and B were related, or pretending to be related, such that A was related to B as parent, grandparent, child, grandchild, brother, sister, half-brother, half-sister, uncle, aunt, nephew or niece.
(1B) That—
(a) A and B were related or had been related, or were pretending to be related or to have been related, such that A was or had been related to B as step-parent, step-child, stepbrother, stepsister, foster parent or foster child, and
(b) at least one of A and B was, or was pretending to be, under 18.”
--- Later in debate ---
Moved by
Baroness Levitt

That this House do not insist on its Amendment 264 and do agree with the Commons in their Amendments 264A to 264F in lieu.

264A: Page 99, line 24, at end insert the following new Clause—
“Online pornography (age and consent verification): duty to review and report
(1) The Secretary of State must conduct a review of the role of providers of internet services in—
(a) verifying the age of individuals appearing in pornographic content published or displayed on their services;
(b) verifying whether individuals appearing in pornographic content published or displayed on their services consent to the content being published or displayed.
(2) The Secretary of State must lay before Parliament, and publish, a report of the review.
(3) The Secretary of State must comply with subsections (1) and (2) before the end of the 12 month period beginning with the day on which this Act is passed.
(4) In this section the following terms have the same meaning as in the Online Safety Act 2023—
“internet service” (see section 228 of that Act);
“pornographic content” (see section 236 of that Act);
“provider”, in relation to an internet service of any kind (see section 226 of that Act).”
--- Later in debate ---
Moved by
Baroness Levitt

That this House do not insist on its Amendment 265 and do agree with the Commons in their Amendments 265A to 265C in lieu with the following amendments to Commons Amendment 265A—

265A: Page 99, line 24, at end insert the following new Clause—
“Pornographic images of sexual activity with child under 16
(1) After section 67G of the Criminal Justice and Immigration Act 2008 insert—
“67H Possession or publication of pornographic images of sexual activity with child under 16
(1) It is an offence for a person (P) to be in possession of an image if—
(a) the image is pornographic, within the meaning of section 63,
(b) the image portrays, in an explicit and realistic way, a person (A) engaged in sexual activity with another person (B),
(c) a reasonable person looking at the image would think that A and B were real, and
(d) a reasonable person—
(i) looking at the image, and
(ii) taking into account any sound or information associated with the image, would think that at least one of A or B was, or was pretending to be, under 16.
(2) It is an offence for a person to publish an image of the kind mentioned in subsection (1).
(3) Publishing an image includes giving or making it available to another person by any means.
(4) For the purposes of subsection (1)(d)—
(a) the reference to sound or information associated with the image is—
(i) when subsection (1)(d) applies for the purpose of an offence under subsection (1), to sound, or information, associated with the image that is in P’s possession, and
(ii) when subsection (1)(d) applies for the purpose of an offence under subsection (2), to sound, or information, associated with the image that the person in subsection (2) publishes with the image, and
(b) a person is not to be taken as pretending to be under 16 if it is fanciful that they are actually under 16 in the way pretended.
(5) In this section “image” has the same meaning as in section 63.
(6) Subsections (1) and (2) do not apply to excluded images, within the meaning of section 64.
(7) Proceedings for an offence under this section may not be instituted except by or with the consent of the Director of Public Prosecutions.
67I Defences to offences under section 67H
(1) Where a person is charged with an offence under section 67H(1), it is a defence for the person to prove any of the matters mentioned in subsection (2).
(2) The matters are—
(a) that the person had a legitimate reason for being in possession of the image concerned;
(b) that the person had not seen the image concerned and did not know, nor had any cause to suspect, it to be an image of the kind mentioned in section 67H(1);
(c) that the person—
(i) was sent the image concerned without any prior request having been made by or on behalf of the person, and
(ii) did not keep it for an unreasonable time;
(d) that—
(i) the person directly participated in the act portrayed as person A or person B mentioned in section 67H(1)(b),
(ii) the act did not involve the infliction of any non-consensual harm on any person, and
(iii) neither A nor B was under 16.
(3) Where a person is charged with an offence under section 67H(2), it is a defence for the person to prove any of the matters mentioned in subsection (4).
(4) The matters are—
(a) that the person had a legitimate reason for publishing the image concerned to the persons to whom they published it;
(b) that the person had not seen the image concerned and did not know, nor had any cause to suspect, it to be an image of the kind mentioned in section 67H(1);
(c) that—
(i) the person directly participated in the act portrayed as person A or person B mentioned in section 67H(1)(b),
(ii) the act did not involve the infliction of any non-consensual harm on any person,
(iii) neither A nor B was under 16, and
(iv) the person only published the image to person B or A (as the case may be).
(5) In this section “non-consensual harm” has the same meaning as in section 66.
67J Penalties for offences under section 67H
(1) A person who commits an offence under section 67H(1) is liable—
(a) on summary conviction, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years or a fine (or both).
(2) A person who commits an offence under section 67H(2) is liable—
(a) on summary conviction, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding five years or a fine (or both).”
(2) In section 68 of that Act (special rules relating to providers of information society services), after “67E” insert “and 67H”.
(3) In Schedule 14 to that Act (special rules relating to providers of information society services), in paragraphs 3(1), 4(2) and 5(1) after “67E” insert “or 67H”.
(4) In Schedule 34A to the Criminal Justice Act 2003 (child sex offences for the purposes of section 327A), after paragraph 13ZB insert—
“13ZC An offence under section 67H of that Act (possession or publication of pornographic images of sexual activity with child under 16) in relation to an image showing a person under 18.”
(5) In Schedule 7 to the Online Safety Act 2023 (priority offences), in paragraph 29 after paragraph (c) insert—
“(d) section 67H (possession or publication of pornographic images of sexual activity with child under 16).””