Serious Crime Bill [HL] Debate

Department: Home Office


Lord Harris of Haringey Excerpts
Tuesday 28th October 2014


Lords Chamber
Moved by
46: After Clause 66, insert the following new Clause—
“Protection of children from sexual communications
(1) A person (“A”) commits an offence where A intentionally communicates with another person (“B”) in the following circumstances—
(a) A is aged 18 or over,
(b) either—
(i) B is under 16 and A does not reasonably believe that B is 16 or over, or
(ii) B is under 13,
(c) the content of the communication is sexual or intended to elicit a response that is sexual, and
(d) subject to subsection (3) below, A’s purpose in sending the communication or seeking a response is sexual.
(2) The communication may be in any form including verbal, written or pictorial (which may include still or moving images) and may be conveyed by any means whatever.
(3) A does not commit the offence in subsection (1) above where the purpose of the communication is for the protection of the child to which the communication is sent.
(4) For the purposes of subsection (3), a person acts for the protection of a child if he acts for the purpose of—
(a) protecting the child from sexually transmitted infection,
(b) protecting the physical safety of the child,
(c) preventing the child from becoming pregnant, or
(d) promoting the child’s emotional well-being by the giving of advice,
and not for a sexual purpose.”
Lord Harris of Haringey (Lab)

My Lords, the purpose of this amendment is to create an offence where an adult engages in a sexual communication with a child or—this is very important—seeks to elicit from that child a sexual communication in response.

The amendment covers verbal, written or pictorial communication. It includes video communication, and it covers all forms of communication, whether by telephone, the internet, instant messaging or even gaming systems such as the Xbox. This would bring the law in this part of the United Kingdom into line with the law in Scotland, so this is not new territory. I am grateful to the NSPCC for the discussions and briefings I have had, and I know it has had discussions with a number of other noble Lords on this matter. I note that on Friday it launched an online petition on precisely this issue and that by last night it had already achieved 20,000 signatures, so there is a degree of interest and of belief that this is necessary. Indeed, if you speak to many parents, you hear time and time again why this is important and how concerned they are for their teenage and younger children.

The reality is that the current law that purports to cover these issues is fragmented and confused. It makes it hard for the police to bring suitable cases against perpetrators and what legislation there is by and large pre-dates the widespread use of the internet and social networking sites. In practice, the current law fails to recognise the nature of grooming. In grooming the perpetrator is not trying to be offensive to the child, to frighten the child or to intimidate the child. The abuser is trying to flatter the child and to persuade the child that they are the person who matters and the only person who cares for them and, as part of that, to persuade the child to respond to them sexually and send them sexual or indecent communications.

This is a widespread problem. Last year, ChildLine reported an increase of 168% in contacts of this nature. ChildLine is receiving reports daily of large numbers of these cases. For example, a 15 year-old girl was groomed by someone who she thought was 17. In fact, he was 44. She met him through a social networking site, and they chatted online most nights. In his guise as a 17 year-old boy, he said that he was in love with her. He started talking about more sexual things. At first she was not too worried as her friends told her that this was just what boys did. She then sent him a picture of herself naked. He had elicited that picture. At this point, he admitted that in fact he was 44 but said that age did not matter and that he really loved her. When the girl said that she was going to stop the contact, he threatened to share her images on the internet and tell all her friends what she had done. That is a real case from ChildLine of the sort of thing that happens. It would have been quite difficult to take the man concerned to court, as I understand it, on the existing basis.

By contrast, there is a case study from Scotland. It concerns a Mr James Sinclair, who was 25. He gave a 14 year-old girl a mobile phone and sent her a series of sexual text messages. The girl’s family found the messages and contacted the police to report the matter. The family had reportedly tried for some time to stop the victim having any contact with the accused, but those efforts proved unsuccessful. Police officers examined her mobile phone and traced and detained the offender. Sinclair was placed on the sex offenders register. Under the current law in England, Wales and Northern Ireland, he could not have been prosecuted because he could have mounted a defence that he did not intend to cause distress or anxiety, as the child seemed willingly to engage in the sexualised conversation. That is the context in which we are talking here. The current law is inadequate.

--- Later in debate ---
Lord Harris of Haringey

I shall try not to intervene too often, given that we are on Report, but I would be grateful for this clarification. The Minister has referred to Section 127 of the Communications Act, which requires the message from the perpetrator to be,

“grossly offensive or of an indecent, obscene or menacing character”.

He also referred to Section 1 of the Malicious Communications Act where the offence is,

“with intent to cause distress or anxiety”.

In the sorts of cases that I have been talking about, there is no intent to cause distress or anxiety. There is no need to be,

“grossly offensive … indecent, obscene or menacing”,

because this is about coaxing the young person through flattery to send a naked image of themselves. Clearly, if it falls into these categories, there is no question that the Act covers it, but these are communications of a different nature.

Lord Bates

I accept that—and this may not endear me to the noble Lord, but I am only halfway through my speech. I will go through some other laws that could catch that particular matter. If they do not, I shall certainly come back and address the specific point that he raises.

It has been pointed out that the Section 1 offence in the Malicious Communications Act is not suitable because it is a summary one and subject to a six-month time limitation on prosecutions. I assure the House that the Criminal Justice and Courts Bill includes an amendment to the 1988 Act, making that offence triable either way, which would have the effect of removing the six-month time limit. The material, depending on the content, could also be caught under the Obscene Publications Act 1959. There was a recent conviction under the Act which captured a paedophilic sexual discussion being held in a private e-mail conversation between paedophiles. This significant conviction demonstrates that the offence can be made out by a publication to one person.

If the contact or messaging involves the creation of indecent photographs of children under the age of 18, legislation such as the Protection of Children Act 1978 could be used against those circulating such images if, for example, an adult is inciting a child to self-produce indecent images. That was a specific issue that the noble Lord focused on. Section 160 of the Criminal Justice Act 1988 covers the simple possession of these images. There are a range of offences under the Sexual Offences Act 2003, including laws on attempting these offences, which would very likely cover this behaviour, its consequences or intended consequences. I shall spare the House a list of all the offences in the 2003 Act that might be engaged, but let me offer one example. Under Section 10 of the 2003 Act it is an offence for a person over 18 to cause or incite a child to engage in sexual activity. This carries a maximum 14-year sentence. Depending on the individual circumstances, this offence would very likely come into play when sexual communications were exchanged with children, or when they were coaxed, or when non-sexual communications were intended to elicit a sexual response.

There are other offences to deal with exploiting children through involvement in pornography and prostitution. I take the point that the noble Baroness took from the example in Manchester. But this is something that is constantly under review, and has to be, as part of wider efforts to tackle this issue. We have had conversations with the Crown Prosecution Service, which does not feel that there is a gap in the law at present. We have had conversations with the national policing lead, who also does not feel that there is a gap at present. These discussions are ongoing, and I will be very happy to include noble Lords—and specifically the noble Lord, Lord Harris, in the context of this amendment, as well as the noble Baronesses, Lady Howe and Lady Benjamin, in some of the discussions with the CPS and the police to see what needs to be done and whether the provisions are sufficiently robust to deal with the specific examples and case studies that they have given.

Even if the messages are not themselves illegal, if their distribution or sending to a child is carried out as part of a course of conduct that alarms the child or causes distress—something raised by a number of noble Lords—this could amount to a criminal offence under the Protection from Harassment Act 1997. On the face of it, therefore, it would appear to be the case that the current law, if applied properly, already does what the amendment seeks to do. We should be very wary of adding new offences to the statute book if to do so would result in an unnecessary and undesirable duplication of the existing criminal law. However, the Government are always open to suggestions that could strengthen the law in this difficult and sensitive area.

I agree with this amendment to the extent that we want to be absolutely sure that offenders who communicate sexual messages to children or elicit sexual replies are appropriately dealt with by the criminal law. We are therefore investigating with the Crown Prosecution Service and the police to ensure that there are no such gaps that could let those who offend against our young people in this manner escape justice. I am very happy to include noble Lords in that discussion, and as part of our ongoing consideration of this issue I have extended that invitation. I trust therefore that the noble Lord might accept that this is not a “resist”; rather, the Government are considering carefully what is being proposed, in the light of the existing legislation, and will continue that discussion. In the mean time, I ask him to consider withdrawing his amendment.

Lord Harris of Haringey

My Lords, I am grateful for the support that this amendment has had from the noble Baronesses, Lady Howe and Lady Benjamin, as well as my noble friend Lady Smith. The Minister said clearly that he shared its objectives. I have the advantage of seeing his colleagues behind him and I noticed that not only did quite a number of them seem to share the objectives but they were also not entirely convinced by some of his suggestions that these offences were met by the Bill.

I shall deal quickly with the noble Baroness, Lady Hamwee. She did not disappoint us in that she made her usual series of very precise and small points on the amendment. I am clear that this is not a professionally drafted amendment or one that would meet all the best requirements of those who sit in garrets in the Home Office or the Ministry of Justice producing these things. My hope was that the Minister would say that there were sufficient points here that he would come back to us at Third Reading with a beautifully professionally drafted amendment. However, I am not sure that the points that the noble Baroness, Lady Hamwee, made were terribly helpful. She talked about the recent amendment on revenge porn. The issue there was publishing material that had been shared in a private capacity more widely because the relationship had broken up. This does not apply in this instance; this is about eliciting an image from a child, not necessarily to share—although that might happen—but simply to obtain the image. So I am not sure that that change necessarily helps us on this issue. I am sure that we could all struggle with defining age and knowledge of age and we could no doubt find ways in which this proposal could be improved. I hope that the Government can accept that there are at least some points here that need to be looked at.

The Minister then went through, as predicted, some of the various sections that we talked about. Most of them require an intent to cause distress or anxiety, or that the matter is grossly offensive, or of an indecent, obscene or menacing character. As I have said repeatedly—I do not think that the Minister has addressed this issue—those are not the circumstances in which such messages are sent. They are sent not to cause offence to the child concerned, but to make children feel sufficiently comfortable to be able to share naked pictures of themselves.

The Minister referred to the Sexual Offences Act 2003, and causing or inciting a child to engage in sexual activity. I appreciate that there is a fine line to be drawn here, but I wonder whether it would be sufficient to achieve a conviction under Section 10 of that Act if all that the perpetrator has done is to persuade the child to stand naked in front of a webcam. No sexual activity is taking place there, so there are some issues around that.

The provision in the Protection from Harassment Act 1997 depends on whether the sender knows or ought to know that what is happening amounts to harassment of another. Harassment includes alarming a person or causing a person distress—but the child concerned may not be alarmed or distressed at the point when the actions take place. The child may only realise many years later what they have done, and what the implications are. Again, I am simply not convinced that this is covered. Scotland has legislation covering this point; there is a gap in England, Wales and Northern Ireland.

I am disappointed in the Minister’s reply. I take his offer for further consultation at face value, but I am conscious that Third Reading is only just over a week away, and I hope we can make some progress before then. Without that, I would feel that we need to return to these issues at that stage. However, on the basis of the promised discussions, I beg leave to withdraw the amendment.

Amendment 46 withdrawn.
--- Later in debate ---
Moved by
47: After Clause 67, insert the following new Clause—
“Protection of children: duty on internet service providers
(1) Internet service providers which provide third parties with any means or mechanisms to store digital content on the internet or other location remote from the third party must consider whether and to what extent the services they provide might be open to abuse by such third parties to store or transmit indecent images of children, contrary to section 1 of the Protection of Children Act 1978 (indecent photographs of children).
(2) Where an internet service provider considers that there is a material risk that their network or other facilities could be misused as set out in subsection (1), they must take such reasonable steps as might mitigate, reduce, eliminate or otherwise disrupt said behaviour or restrict access to such images.
(3) In this section, “internet service provider” has the same meaning as in section 124N of the Communications Act 2003 (interpretation).”
Lord Harris of Haringey

I am grateful to the Lord Chairman for allowing me to collect my thoughts on this amendment while he was going through those other amendments. The purpose of this amendment, which is rather different from that of the previous one, is to create a requirement for an internet service provider that provides a facility for the storage of digital content to consider—no more than that—whether and to what extent that facility might be open to abuse by the storage of indecent images of children. Where the service provider,

“considers that there is a material risk … they must take such reasonable steps as might mitigate, reduce, eliminate or … disrupt”,

such actions.

The context of the amendment is the fact that there are tools available to internet service providers to find out whether such indecent material is contained on their systems. As I am sure noble Lords are aware, images are reduced to digital content as a series of zeroes and ones, so even a very complex image, whether pornographic or otherwise, is simply reduced to a series of zeroes and ones. Most abuse photographs are circulated and recirculated. Many of them are known to the law enforcement authorities, and it is possible for those authorities to search for identical images, so that they know whether a particular image has appeared before, and in what circumstances.
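The point about identical images can be illustrated with a short sketch in Python. This is an illustration only, using SHA-256 purely as a stand-in for whatever digests a real law-enforcement database would hold: a byte-for-byte copy of a known file produces exactly the same fingerprint and is caught, while changing even a single value produces an entirely different fingerprint, which is the weakness the next paragraph describes.

```python
import hashlib

def exact_fingerprint(image_bytes: bytes) -> str:
    """Cryptographic digest of the raw bytes: identical files yield an
    identical fingerprint; any change at all yields a different one."""
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in for image data: a recirculated byte-for-byte copy, and a
# copy with one "pixel" flipped.
original = bytes([0, 1] * 100)
recirculated = bytes([0, 1] * 100)
altered = bytearray(original)
altered[0] ^= 1

# Exact matching catches the recirculated copy...
assert exact_fingerprint(original) == exact_fingerprint(recirculated)
# ...but a one-byte change evades it entirely.
assert exact_fingerprint(original) != exact_fingerprint(bytes(altered))
```

In practice this is why searching for identical images is cheap and reliable, and also why it fails as soon as an abuser alters a single pixel.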

However, I am told that increasingly, abusers are making tiny changes to images—sometimes no more than one pixel—so that the images are not identical, and are not picked up in the same way by those methods. However, I understand that Microsoft has developed a system called PhotoDNA, which it is making available free to providers. This converts images into greyscale and breaks the greyscale image down into a grid. Then each individual square on the grid is given what is called a histogram of intensity gradients; essentially, that decides how grey each square is. The signature based on those values provides a hash value, as they call it, which is unique to that particular image—I appreciate that these are technical terms, and until I started looking into this I did not know about them either. This technique allows people to identify images that are essentially the same.

Until now, the only way to identify which images are essentially the same has been for some poor police officer or analyst to look at all the images concerned. But it is now possible to do that automatically. Because the technology operates in a robust fashion, it can identify what images are appearing, and whether they are essentially the same. It is not possible to recreate the image concerned from that PhotoDNA signature; it is only possible to scan systems or databases for signature matches. What is more, because the data for each signature are so small, the technology can scan a large volume of images extremely quickly. Apparently there is a 98% recognition rate.
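The steps just described can be sketched in a few lines of Python. This is a simplified illustration in the spirit of the description above, not Microsoft's actual PhotoDNA algorithm: the image is treated as a greyscale grid, each grid cell is summarised by its mean intensity, the signature records crude brightness gradients between neighbouring cells, and near-identical images are recognised because their signatures differ in at most a few bits.

```python
def perceptual_hash(pixels, grid=8):
    """pixels: 2D list of greyscale values (0-255). Returns an int whose
    bits record, cell by cell, whether each grid cell is brighter than
    its right-hand neighbour (a crude gradient signature)."""
    h, w = len(pixels), len(pixels[0])
    ch, cw = h // grid, w // grid
    # Mean brightness of each grid cell.
    means = [
        [sum(pixels[y][x] for y in range(gy * ch, (gy + 1) * ch)
                          for x in range(gx * cw, (gx + 1) * cw)) / (ch * cw)
         for gx in range(grid)]
        for gy in range(grid)
    ]
    bits = 0
    for gy in range(grid):
        for gx in range(grid - 1):  # gradient to the right-hand neighbour
            bits = (bits << 1) | (means[gy][gx] > means[gy][gx + 1])
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing signature bits: a small distance means the
    images are essentially the same."""
    return bin(a ^ b).count("1")

# A synthetic 64x64 "image", and a copy with a single pixel tweaked:
# the tweak barely moves one cell's mean, so the signatures stay close.
img = [[(x * 4 + y) % 256 for x in range(64)] for y in range(64)]
tweaked = [row[:] for row in img]
tweaked[0][0] = (tweaked[0][0] + 1) % 256

assert hamming(perceptual_hash(img), perceptual_hash(tweaked)) <= 2
```

In a deployment of this kind, a system would flag two images as matching when the distance between their signatures falls below some small threshold, which is how a one-pixel alteration no longer defeats the search.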

I have gone through that in some detail simply to illustrate that there are such techniques available. I believe that Google is working on something—which would, of course, have to be bigger and more complex than what has been produced by Microsoft—which will do the same for videos. It will then be possible to identify similar videos in the same fashion.

The benefit of these techniques is that they make it possible for ISPs to trawl their entire database—to trawl what people are storing online and to identify whether some of the previously known indecent images are in the system. They will then be able to see whether there is a package, or a pattern, and whether particular users are storing more than others. That then gives them the opportunity to raise that issue with law enforcement officials or take disruptive action, perhaps by withdrawing service from that user.

The benefits of the specific technology are that humans do not have to scan the individual images. A number of noble Lords have seen the suites used by CEOP or New Scotland Yard whereby a row of police officers sit viewing indecent images of children, which is distressing for those officers and possibly harmful to them in the long term. That does not need to happen in this case. The service providers do not have to store the images that they are matching to carry out this exercise, because all they are storing are the PhotoDNA hash values of the images concerned, and they are therefore not exposing themselves to potential charges in that regard. The technology makes this comparatively easy and simple to do and does not involve a great deal of data. It also means that the service providers are not interfering in any way with the privacy of their users, other than to check, in this anonymised way where they do not view the images, that no images contained there are known indecent images of children.

The purpose of this amendment is to place an obligation on service providers to make use of these technologies as they are developed. Some providers already do this and are willing to do this. I think that Facebook has quite a good record as far as this is concerned. However, the amendment would place an obligation on all of them to consider whether they should use these techniques. As I say, in this instance Microsoft is making the technology and the system available free to providers.

Before the noble Baroness, Lady Hamwee, goes through whatever drafting faults the amendment may contain, I should point out why I think it is important. In our discussions just three months ago on the DRIPA legislation it was suggested that one of the reasons why the relevant changes were being made was to provide service providers with legal cover against legal challenge in other countries in which people asked why they were allowing law enforcement officials to do these things. The amendment would provide some legal cover for those service providers—in exactly the same way as the DRIPA legislation does—against challenges that this measure somehow infringes the freedom of speech of people who want to store pornographic images of children. The purpose of this amendment is to require service providers to consider whether or not they might be at risk of this misuse and then to take appropriate reasonable steps using the best available techniques to,

“mitigate, reduce, eliminate or … disrupt”,

it. I beg to move.

Baroness Howe of Idlicote

My Lords, I rise briefly to speak in support of Amendment 47 of the noble Lord, Lord Harris. Some may take the view that internet service providers cannot be held responsible for the content that others use their services to store. Although, in my view, ISPs certainly do not have responsibility for generating content, they do, however, play a very important role in facilitating it: first, in the sense that storage protects the material in question and thereby helps to guarantee its continued existence; and, secondly, in the sense of providing a basis from which the said material may be transmitted. In so doing, they have a responsibility actively to take all reasonable steps to ensure, on an ongoing basis, that they are not facilitating the storage and/or transmission of material of the kind set out in subsection (1) of the clause proposed in the amendment.

For myself, I would also like ISPs to have to demonstrate that these active steps have indeed been taken, and are being taken, on an ongoing basis. We must foster a legislative framework that exhibits zero tolerance of all aspects of child sex abuse images, including ISPs facilitating the storage and/or transmission of such images. I very much look forward to listening to what the Minister has to say in his response to this important amendment.

--- Later in debate ---
Lord Bates

My Lords, the noble Baroness is absolutely right again, in the sense that technology is the problem and therefore technology needs to offer the solution. Simply put, the numbers and the scale—of course, she has had those briefings and I have had them, too—are both distressing and mind-blowing in terms of their reach. As the technology is not limited to, and does not respect, geographies or jurisdictions, the matter is a global one. Therefore, we need to work very closely with the industry to ensure that this can be done.

I want to cover some of the issues that are being addressed at present which noble Lords may not be aware of. We recognise the concerns that the noble Lord has raised about the use of the internet to store and circulate indecent images of children. We fully accept that more needs to be done to address this issue, but I hope to be able to persuade the noble Lord that legislation is not required at this point, although we continue to keep that option under review.

We believe that the internet industry operating in the UK has taken significant steps, on a self-regulatory basis, to tackle the availability of indecent images online. The internet industry in the UK has worked closely for many years with the Internet Watch Foundation and the Child Exploitation and Online Protection command of the National Crime Agency to tackle illegal images. We recognise the support that responsible internet service providers have given to the Internet Watch Foundation, both financially and through taking action on the Internet Watch Foundation’s list of web pages identified as containing illegal images by either taking down such sites, if they are hosted in the UK, or blocking access to them if they are overseas.

The public and businesses can report images to the Internet Watch Foundation, which assesses them and determines whether they are illegal. Indeed, the Internet Watch Foundation took more than 51,000 reports from all sources last year. If the site containing the image is hosted in the UK, the details will be passed to law enforcement agencies, and the ISP will be asked to take down the web page using the “notice and take down” process. In 2013, the Internet Watch Foundation found that 47% of UK child abuse web pages were removed within 60 minutes. Thanks to the work of the Internet Watch Foundation, and the internet industry, less than 1% of the global total of indecent images of children is hosted in the UK.

However, we are not complacent, and we recognise the need to adapt to changing uses of technology by paedophiles. As the Prime Minister made clear in his speech to the NSPCC in July last year, we need to do more to eradicate these images from the internet and, in particular, ensure that the internet industry plays its full part in doing so. We have been working closely with the industry, and with its support we believe that significant steps have been taken towards removing these images. We have asked internet search engine providers such as Google—which was referred to by the noble Baroness and also by the noble Lord—and Microsoft to make changes to their search mechanisms, and these measures have been effective in preventing access to child abuse images.

We are also creating a new child abuse image database, using much of the same technology that the noble Lord, Lord Harris, referred to in setting out and introducing his amendment. This will enable the police to identify known images more quickly on suspects’ computers and will improve their ability to identify and safeguard victims from the images. A key part of this is not just about lining up prosecutions by identifying these images or getting the images taken down; it is about realising that the children behind them are vulnerable victims and need to be protected and get the help and support that they need.

Not only do we want the industry to remove such images, we want it to use its technical skills and capability to help develop the technical solutions to prevent the dissemination of these images online. The Home Office and the US Department of Justice have created a taskforce that provides a platform for industry to develop technical solutions to online child sexual exploitation. This work is ongoing under the chairmanship of my noble friend Lady Shields.

The UK will host a summit in December on online child exploitation. We have invited representatives of key partner Governments and organisations, including the internet industry, to participate in the summit, which will focus on protecting the victims of online child abuse and examine how we can work internationally to prevent children being exploited online.

The Government are very clear that those who provide services online, particularly those where images can be stored—a point that the noble Baroness, Lady Howe, made—have a responsibility to take action to prevent those services being used for the purposes of storing and sharing indecent images of children. In that regard, as she rightly said, we should have zero tolerance. We believe that internet service providers operating in the UK have a good record in this respect, both through their support for the Internet Watch Foundation and through the actions that they are taking to support the Prime Minister’s call for action.

Against this background of good co-operation and progress at present, we believe that the current system of self-regulation has been effective, and we are not persuaded at this time that more would be achieved by placing a legal requirement on these companies. In that regard I hope that, having heard the progress that has been made and our undertaking to keep this under review, the noble Lord will feel sufficiently reassured to consider withdrawing his amendment.

Lord Harris of Haringey

My Lords, I am grateful to the noble Baroness, Lady Howe, and my noble friend Lady Smith for the support that they have given to this amendment. To the noble Baroness, Lady Hamwee, I say that, as I am not doing this on behalf of the Government or anyone else, I am not engaged in a lengthy process of consultation with internet service providers, but I would make the point that this is a very soft change. It is simply asking them to consider and, where they think there is a material risk, to take reasonable steps. It is difficult to imagine any internet service provider, unless it wants to provide a service for expressly illicit purposes, finding this difficult.

I am of course encouraged by what the Minister has described. Most of it does not in fact apply to the issues that I have raised, because this is about images stored for private purposes rather than public purposes. The web page stuff and the work of the Internet Watch Foundation, with which I am very familiar—I think I am an ambassador or a champion; I cannot quite recall what the certificate says—are clearly about public-facing material which people may access. All that work is extremely good. I accept that many internet service providers are extremely responsible and are operating as one would hope in a self-regulatory way. I think this would have helped encourage those that are not being quite so public-spirited or sensitive to these issues to be more so in the future.

However, in the light of the Minister’s undertakings that this is something that will continue to be looked at, I beg leave to withdraw the amendment.

Amendment 47 withdrawn.