Protecting Children Online Debate

Department: Ministry of Justice

Helen Goodman Excerpts
Wednesday 12th June 2013

Commons Chamber

Helen Goodman (Bishop Auckland) (Lab)

I beg to move,

That this House deplores the growth in child abuse images online; deeply regrets that up to one and a half million people have seen such images; notes with alarm the lack of resources available to the police to tackle this problem; further notes the correlation between viewing such images and further child abuse; notes with concern the Government’s failure to implement the recommendations of the Bailey Review and the Independent Parliamentary Inquiry into Online Child Protection on ensuring children’s safe access to the internet; and calls on the Government to set a timetable for the introduction of safe search as a default, effective age verification and splash page warnings and to bring forward legislative proposals to ensure these changes are speedily implemented.

The motion is in the name of my right hon. Friend the Member for Doncaster North (Edward Miliband).

The whole country was shocked and revolted by the trials of Mark Bridger and Stuart Hazell, the two men who brutally murdered April Jones and Tia Sharp. Those trials sent a shiver of horror down the spine of every parent in the land. In both cases, the men were found to have huge libraries of child abuse images on their computers. In both cases, it was their first known offence against children. Surely it is now beyond doubt that what a person sees influences how they behave.

Let us be clear: there is no such thing as child pornography. There is child abuse online. Any image depicting a sexual act with or on a child under 18 is illegal. Child abuse images are illegal under international law and in every country on the globe. The Internet Watch Foundation is the UK hotline for reporting child abuse. It has pioneered this work since 1996. It can disrupt and delete content on the web within an hour and it protects child victims by working in co-operation with the police at the Child Exploitation and Online Protection Centre. It also aims to prevent people from stumbling across such images. We all owe an immense debt of gratitude to the IWF.

However, the surge in the scale of the problem threatens to overwhelm both the IWF and the police. An independent ComRes survey for the IWF found that up to 1.5 million people have stumbled on child abuse images, yet last year the IWF received only 40,000 notifications and some 13,000 web pages were taken down as a result. Its latest figures show a 40% rise on last year.

Dr Julian Huppert (Cambridge) (LD)

I support the hon. Lady’s opening words. I declare an interest as an IWF champion; the IWF does great work. Does she accept that her figure of 1.5 million people having seen child pornography is based on a sample of 2,000 people, of whom about 50 said that they had seen such images? We do not know how much those people have seen, or whether they have seen anything at all. To extrapolate that far may be misleading.

--- Later in debate ---
Helen Goodman

Yes; I discussed the numbers with the IWF, of course. It says that the survey on which it based that estimate was typical of the surveys it has been conducting over several years, so I think the problem is widespread and we should not argue too much over the precise figure. It is clear that the numbers are far, far too big.

Up to 88% of the child victims appear to be 10 years old or under, and 61% of the images depicted sexual activity between adults and children, including rape and sexual torture.

Ann Coffey (Stockport) (Lab)

Google is one of the biggest hosts of child sexual abuse images, albeit inadvertently, and it should therefore accept the major responsibility for proactively monitoring and removing those images. Does my hon. Friend agree that if Google spent as much money on monitoring and removing illegal child sexual abuse images as it does on paying accountants to avoid tax in the UK, it might go some way towards living up to its motto, “Don’t be evil”?

Helen Goodman

My hon. Friend makes a good point. By a happy coincidence, I received an e-mail at 12.35 pm announcing that Google is increasing its contribution to the IWF to £1 million.

Mrs Madeleine Moon (Bridgend) (Lab)

My hon. Friend spoke about the number of reported cases. As an Internet Watch Foundation champion, I went into every one of the primary schools in my constituency, spoke to the 10 and 11-year-olds there, and asked how many of them had seen indecent images online. Every child in every class had been exposed to such material. Is that not a national disgrace?

Helen Goodman

My hon. Friend is obviously doing great work in her constituency and what she says is truly shocking. It is confirmed by the statistics which the NSPCC has been collecting.

Alun Cairns (Vale of Glamorgan) (Con)

Reference has been made to Google. I do not defend that or any other search engine other than to say that this debate is highly technical and we need to be accurate. Google does not host anything; nor does any other search engine. Google merely provides the means of finding a site, and the hosting, which is an international problem, needs to be addressed appropriately.

Helen Goodman

This is not an occasion for nit-picking—[Interruption.] It is important to take an international approach and I am disappointed in the Government for, among other things, not taking any international initiatives.

The police say their resources are inadequate to the task. Peter Davies, the head of CEOP, said that the police are aware of 60,000 people swapping or downloading images over peer-to-peer networks but they lack the resources to arrest them all. In any case, the IWF currently deals only with images on the web, not peer-to-peer images.

In answer to my parliamentary question last week, the Minister of State, Home Department, the hon. Member for Taunton Deane (Mr Browne) revealed that in 2012, despite the fact that the police are aware of those 60,000 people, only 1,570 were convicted of such offences. What do Ministers intend to do about the problem? I hope that in his winding-up remarks the Home Office Minister will tell us. There is no point huffing and puffing about the problem if Ministers do not take the necessary action. It is obvious to the whole country that the current situation is totally unacceptable. It is obvious that Ministers have not got a grip. It is obvious that we need a change.

That is why our motion proposes a complete shift in approach from a reactive stance to a proactive strategy. We are calling for three things—first, safe search as the default option. The industry has already made the filters that are needed to screen out not just child abuse but pornography and adult content generally. We are saying that the filters should be the default, either on all computers and devices connected to the internet or by requiring internet service providers to install them by default. Then we can institute the second part of an effective system: robust age verification. A person seeking to get past the filter would be asked to confirm their name, age and address, all of which can be independently checked. Again, we know that this works. It is what Labour did for gambling sites in 2005. It is what mobile phone companies do when someone opens an account and gets a SIM card. It is what people do when they get a driving licence.

Dr Huppert

Does the hon. Lady accept that if we had safe search and such controls, young people would not be able to access information about homophobic bullying, about how to deal with child abuse or about a range of other subjects? Indeed, such things are already filtered out by mobile phone providers, to the great detriment of many children.

Helen Goodman

No, I do not accept that. I shall go on to explain why that is a misconception on the part of the hon. Gentleman.

The approach that we are suggesting would cut demand for sites as well as reducing the supply of them. It would tackle child abuse online and the other major issue addressed by the Bailey review and the independent parliamentary inquiry—children accessing unsuitable material online. In recent days I have had the benefit of energetic lobbying from Google in particular, pressing its view that except for child abuse images, which are illegal, all other images should be available unfiltered on the internet. I have heard its views and come to my own conclusion.

I hope the Government’s vacillation on this point is not because they cannot put children before powerful vested interests. I say safe search filters are not a free speech issue. This is not censorship. This is about child protection and reproducing online the conditions established over a long period in the real world.

Geraint Davies (Swansea West) (Lab/Co-op)

Is my hon. Friend aware of the Council of Europe One in Five campaign, which is built on the fact that one in five children across Europe is likely to be a victim of sexual violence? Does she agree that the magnitude of sexual violence is enormously inflamed by the open gateway of internet child abuse?

Helen Goodman

My hon. Friend makes a powerful point. Once again, he emphasises the importance of the international dimension.

What we are proposing is aimed at reproducing the conditions that we have already established in the real world. The distinction between legal and illegal content is far too simplistic. For cinemas we have the highly respected independent British Board of Film Classification. It produces age ratings—12, 15 or 18. Any cinema found to be regularly flouting the age restrictions would lose its local authority licence. Furthermore, material classified as R18 can be seen only in certain cinemas, and some material deemed obscene is cut entirely. Yet on the internet it is all freely accessible to every 12-year-old. Indeed—this relates to what my hon. Friend the Member for Bridgend (Mrs Moon) said a moment ago—the NSPCC believes that one quarter of nine to 16-year olds have seen sexual images online. We are not talking about young women baring their breasts—that is like something from Enid Blyton compared with the Frankenstein images now available.

The dangers are clear. On average, 29% of nine to 16-year-olds have contact online with someone they have never met face to face. Of course there is a real difference between child abuse online and extreme pornography, but unfortunately in the real world people who become addicted to pornography look for more and more extreme images, and that sometimes tips into child abuse images. Addiction is the issue. Users are found to have literally millions of images on their computer, and child abuse sites are signposted on pornography sites. Both are shared peer to peer.

Therefore, an effective age verification system would mean that paedophiles would lose the anonymity behind which they currently hide. Their denial of what they are really doing would be addressed by the third proposal in the motion: splash warnings before entering filtered sites. Work by Professor Richard Wortley at University College London suggests that that might halve the numbers viewing child abuse online.

Of course, those measures would have a cost to industry. TalkTalk, which has led the way in offering filters, has spent over £20 million. Some in the industry tell us that they do not want to lose their competitive edge, and some say that they do not want to act as censors. That is why the Government should act by putting a clear timetable for those reforms into law in order to speed up change, level the playing field and support parents. We know that most parents want to do what is right by their children, because 66% of people, and 78% of women, want an automatic block, according to a YouGov poll conducted last year, but the industry is not helping them enough. At the moment, some still require people to download their own filters—a near-impossible task for many of us—some see it as a marketing device, and others want to give the option of filters only to new customers. At the current rate of turnover, it would be 2019 before that approach had any hope of reaching total coverage. It simply is not good enough. [Interruption.] Does the hon. Member for Devizes (Claire Perry) wish to intervene?

Helen Goodman

So what have the Government been doing? Before the general election, the Prime Minister promised that he would lead the most family-friendly Government ever, but so far there has been lots of talking and much less action. After three years and two Secretaries of State, the Government still seem to think that a voluntary approach will work. Do they not know when they are being strung along, or do they not care? How many more years must we wait? How many child deaths will it take to shock them into action?

Let us look at the record. First, the Prime Minister set up the Bailey review, which reported in June 2011. It recommended that after 18 months the internet industry must, as a matter of urgency, act decisively to develop and introduce effective parental controls—with Government regulation if voluntary action is not forthcoming within a reasonable time scale—and robust age verification. But here we are, fully two years on, and nothing has changed. Contrary to the answer I received from the Under-Secretary of State for Education, the hon. Member for Crewe and Nantwich (Mr Timpson), who is in his place, the fact is that BT, Sky and Virgin are yet to come forward to announce their proposals on how they intend to deliver.

Then we had the independent parliamentary inquiry into child protection online, an all-party group. It recommended an accelerated implementation timetable, a formal consultation on the introduction of an opt-in content filtering system, and that the Government should seek back-stop legal powers to intervene should the ISPs fail to implement an appropriate solution. A year later, no solution has been implemented. Why did the Government not introduce a communications Bill with appropriate measures in the Queen’s Speech?

Finally, last autumn the Government undertook a consultation. It was so badly advertised that 68% of respondents were members of the Open Rights Group, an important group, but nevertheless a lobbying group with only 1,500 members, compared with the 34% of respondents who were parents of Britain’s 11 million children. Despite that, the Government concluded that parents did not want to see parental controls turned on by default.

The Government have zig-zagged back and forth but we have seen no action in the real world. The Secretary of State has called a meeting with industry representatives next week. What will she say to them? I hope that she will not engage in yet another round of fruitless pleas and requests. There is a total lack of strategy from the Department for Culture, Media and Sport.

I want to make an offer to the Under-Secretary of State for Culture, Media and Sport, the hon. Member for Wantage (Mr Vaizey): if he brings forward measures, with a speedy timetable, for the introduction of safe search as a default, robust age verification and splash warnings, we will support him. I gather that Ministers are urging their colleagues to vote against the motion. It is time that the Government stopped hoping that everything will turn out for the best and started taking responsibility. The time for talking is over. The time for action is now. We must put our children first. I hope that all hon. Members will vote for the motion in the Lobby this afternoon.

--- Later in debate ---
Dr Julian Huppert (Cambridge) (LD)

I was not planning to speak, but I found the tenor of so much of what has been said so frustrating in its lack of accuracy that I had to speak. I would exempt some speeches, particularly that of the hon. Member for Vale of Glamorgan (Alun Cairns), who spoke with technical accuracy, which does matter. It is a pleasure to follow the hon. Member for Slough (Fiona Mactaggart), for whom I usually have great respect, but she gave it away when she complained about “the technocrats”. Technical accuracy matters if we are going to do things that work. We need to know exactly what “inviting URLs to a meeting” is supposed to mean.

There is a huge danger of falling into the trap of the politician’s syllogism: we must do something; this is something; therefore we must do this. That is the danger we face. Is there a problem? Absolutely, there is a huge problem with child pornography, which is nasty, cruel and illegal. We have to stop it. The Internet Watch Foundation does an excellent job in trying to do so. Is there a problem with young people having inappropriate access? Yes. Is there a problem with online grooming? Yes. Is there a problem with online cyber-bullying? Absolutely. Is there a problem with the widespread sexualisation of young women in particular? Absolutely, and I pay tribute to the Under-Secretary of State for Women and Equalities, my hon. Friend the Member for East Dunbartonshire (Jo Swinson) for her consistent work to combat it.

The approach highlighted today, particularly by the hon. Member for Bishop Auckland (Helen Goodman), simply will not work. I find that frustrating, as it does not engage with the facts or reality of what is happening. The right hon. and learned Member for Camberwell and Peckham (Ms Harman) was heckling earlier and said that we should not focus on the detail. If we do not focus on the detail, we will not get something that works.

What would work? I absolutely endorse the work of the Internet Watch Foundation. It does excellent work and I am delighted to see it getting more funding, as I think it should have extra support. I am pleased, too, that the Government are supporting CEOP so that when we find people carrying out illegal activities, we take the correct legal action. That is what should happen. We should never allow a situation in which the police simply do not have the money to arrest somebody who they know is doing something illegal.

The things we have heard about today will not make a difference. The people who are heavily engaged in child pornography will not be tackled. Those people are very internet savvy. They will use virtual private networks that are not listed, so nothing we have heard about today will tackle any of those problems. We have to work at the technical level to get things right rather than just try to make it look as if we are doing something.

In some ways, child pornography is easier to deal with because it is possible to define it. We know what is illegal and there are clear definitions. The IWF checks sites manually; a site can be blocked only when the IWF knows that there is something wrong with it. That is very different from the space around legal material, or trying to come up with ways of filtering out things that are fundamentally legal and making a judgment call based on them.

Helen Goodman

We are making judgment calls all the time.

Dr Huppert

The hon. Lady is absolutely right, but writing algorithms to do that on millions and millions of websites simply cannot be done correctly. I shall come back to that, although I know that the hon. Lady and the right hon. and learned Member for Camberwell and Peckham are not concerned about the errors that would be made.

It is absolutely right to provide tools for parents to control what is happening. They should be the ones empowered to look after their children. I would rather trust the parents to look after their children than require state-level controls. It is absolutely right to have those available for people to use and to make them easy and clear to use. I think there should be no default because I think we should encourage parents to engage with the question before they make a decision. They should be faced with a box that they have to tick, but they should be in charge. The Byron review was very clear that a false sense of security could be created if we just tell people that everything is safe.