Internet-based Media Companies Debate

Wednesday 31st October 2012

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record.

Fiona Mactaggart (Slough) (Lab)

It is a pleasure to appear in this Chamber in front of you, Mr Owen. I feel as though I have spent most of the day here. I am pleased to have been able to secure this debate, for which I have been pressing for some weeks.

Politicians and companies alike have failed to address the new challenges that the internet brings. I am not arguing at this point that the state needs to do more, although it might need to in future. Companies that use the internet need to have robust policies to protect vulnerable users, and they need to take responsibility for the impact of what they do from the start of their operations.

Children and young people are a substantial and persistent body of internet users. A report published in September 2012 by McAfee suggested that 82% of five-year-olds

“have access to a computer, smartphone, tablet or other way of getting online.”

Nine out of 10 of those aged between 12 and 15 live in homes with internet access. In schools, use of the internet is now more or less universal. Increasingly, it is being integrated into lesson plans to make use of richer content, and it is often a regular part of how schools communicate with parents.

The internet is used at home to enable children to do their homework. It is a major linchpin or communications hub in huge numbers of children’s social lives. Indeed, not having access to the internet can mark someone out as odd, or as coming from a disadvantaged family.

With the rise of smartphones and other internet-enabled portable devices such as games consoles, and the emergence of large-scale public wi-fi, internet access is also pretty ubiquitous, or soon will be in all our major cities. Thus the notion that parents could in any meaningful sense provide constant support or supervision of their children’s use of the internet is becoming impossible to sustain. I make these points in part to underline a core element of my argument about industry’s responsibility, which I will come to later.

First, I will say a word about the industry. In fact, there is no such thing as the internet industry. At one point there was, back in the 1980s and early ’90s. Computers and networking had been well established for years, and the then new internet industry consisted essentially of internet service providers and the geeks who wrote software. It was all very neat and tidy, and easy to identify and deal with.

Today almost every business of any size has some sort of stake in the internet. All of them have a responsibility of some sort to people who go online, especially to children. Many of them make great efforts to discharge that responsibility with care and attention, but I am afraid that it is also quite plain that many do not. It is the latter that we need to focus on.

The internet, social networking sites included, is not a sort of social service or an extension of the classroom with knobs on. Just as money is said to make the world go round, it most certainly makes the internet go round, and children are right in the middle of it. In 2006, children and young people in the UK up to the age of 19 spent £12 billion from their pocket money or from earnings from part-time jobs. Of that, £1.53 billion went on clothes, and £1 billion on food and drink; music and computer-related products took another £1 billion. In the same year, when account is taken of the amounts spent by parents on their children or in the home—spending over which children and young people often have influence—the total value of the market increased to almost £100 billion.

One of the largest of the virtual worlds aimed expressly at young children is Club Penguin. When Disney acquired the business in 2007, it was reported to have paid $700 million. According to the Financial Times, in June 2011, the UK-based children’s site, Moshi Monsters, was reported to be valued at £125 million. Children and young people are therefore major economic actors, both in their own right and through the influence that they exert on patterns of consumption within the wider family.

The size of the market helps to explain why so many different companies are interested in children and young people. It is not just about cash tomorrow; it is very much about cash today. Moreover, the sums indicated suggest that this market matters not only to the individual firms that may be competing for parts of it, but also to the national economy.

Children’s and young people’s online spending is also growing. A report published in December 2010 suggested that British kids between the ages of 7 and 16 spent £448 million, with eight out of 10 using their parents’ cards, online accounts, or PayPal. Apparently, £64 million was spent without parents’ knowledge.

The emergence of the internet as a major force in commerce, particularly in retailing, has created a number of anomalies in policy, as well as market distortions that discriminate against companies that trade solely or principally on the high street, but some of those anomalies are connected to wider risks to children and young people. Many of the rules established to protect children and young people from unfair or age-inappropriate commercial practices in the real world do not yet seem to have been fully translated into the virtual space, or to have found an online equivalent or proxy. There is a tendency for firms to say that what children do when they go online is entirely the responsibility of the parents or carers. While no one would dispute that parents and carers have a role to play, what we need to clarify is the extent of the obligations placed on companies and on every part of the internet value chain.

Can manufacturers of internet-enabled devices, perhaps especially portable devices, simply wash their hands of any and all liability for anything and everything that happens to children and young people when they use them? What about the companies engaged in providing access to the internet, whether via a fixed-line connection or via wi-fi? Then there are the online service providers, such as Google and Facebook, and online vendors such as Amazon and Tesco. What parameters are applicable to them? Where are the boundaries? This whole area has been largely neglected by scholars and the legal profession, and, I am ashamed to say, politicians.

No doubt companies have considered their position, but if they have, they have been slow to publicise their legal advisers’ views. Even if they did, it is likely that such views would take a very particular perspective.

Rushanara Ali (Bethnal Green and Bow) (Lab)

One of my constituents came to see me after being sexually harassed for years on Facebook. Her identity was stolen and her Facebook pages were photoshopped to damage her reputation. It took her a great deal of time to get any attention from the police or the organisation concerned—in this case, Facebook. Does my hon. Friend think that there should be greater clarity and transparency about what the process and principles should be, and what citizens and consumers can expect from suppliers such as Facebook, and from the police? Only when a death threat was made against my constituent did the police feel that they could take action. Until that point, they had to advise her to complain to Facebook.

Fiona Mactaggart

The case that my hon. Friend cites is an example of exactly why I called for this debate. In that case, Facebook was not taking proper responsibility. It did not have a transparent complaints process that my hon. Friend’s constituent was able to use. It did not have a mechanism for remedying the harm that she had experienced and, frankly, the police are not up to date enough with the online world. That is not true of the whole of the police service—for example, when it comes to child abuse images, the police have quite well-developed policing strategies—but in the case of online bullying, I think they are behind the game.

The fundamental responsibility, in that case, belongs to Facebook, but the police must take more seriously the fact that things happen in the virtual world that they would not tolerate in the real world, and they must ensure that their policies and procedures function appropriately in both. We have not grown up, as it were, and ensured that we have modernised our systems, including those of the police. My big argument is with companies such as Facebook. If they were to take their responsibilities more seriously, my hon. Friend’s constituent would have been much safer, and the problem would perhaps not have got as far as requiring police action.

Some new media companies seem persistently to fail to establish clear values and procedures for handling matters, such as the one that my hon. Friend raised, that can profoundly affect individuals and wider society. In the early days of the internet, that was perhaps understandable to a degree. They were learning; we were all learning. We are, however, no longer in the early days, and now such failure looks more like negligence or lack of concern. Too often, companies seem to struggle to recover a position, by which time a great deal of damage might have been done. I want to establish a new norm, whereby we expect companies, from very early on in their lives, to have an enforceable social responsibility code, which contains a publicly declared process for dealing with objectionable or illegal content.

Mr Tom Harris (Glasgow South) (Lab)

Does my hon. Friend not accept that putting “objectionable” in with “illegal” poses a danger to freedom of expression? The two terms mean completely different things. As a party that has generally supported freedom of speech, surely we should protect the right of someone to be offended if they so wish, or to say something offensive, as long as it is not illegal. We should be careful about merging the two definitions.

Fiona Mactaggart

My view is that because the internet so substantially broadens the audience for material, those who are responsible for doing that must take some responsibility for the content, in a way that they are not currently prepared to do. They obviously need to do that when the content is illegal, but I will go on to argue that they should also do it when it is objectionable. They should not necessarily delete everything in the first instance, but they must have a process by which someone who wants to object can properly make a case and argue for something to be taken down. The process should be transparent and contain a right of appeal, so that the matter can be dealt with.

Our publishers in the real world take responsibility for what they publish, choosing not to publish material that they deem profoundly offensive, and YouTube is effectively a publisher. It is dodging its responsibility as an institution that broadens the audience so significantly for the material that it carries. It is pretending not to be a publisher, and that is a bit of a fraud. I will go on to deal further with the issue that my hon. Friend the Member for Glasgow South (Mr Harris) raised.

A policy should guide companies when they decide whether to take down material, and there should be a right of appeal where appropriate. I would want companies to work with groups such as the Internet Watch Foundation and the UK Council for Child Internet Safety to ensure the promotion of public safety.

I initially intended to raise this issue because of the evidence that paedophiles have been using Twitter to groom young children; Members might have seen reports on that in The Sunday Mirror. I praise the newspaper for its campaign, because it has forced Twitter to take action to protect children. However, Twitter has still not joined the Internet Watch Foundation to show its support for the wider industry’s measures to keep child abuse images off the internet as a whole. That is a shameful example of a profound disregard for the interests of British children and young people. What is worse is that when the storm broke, Twitter simply retreated into a Californian bunker. It seems to me that it cynically decided to sit out the storm, in the hope that it would blow over and people would forget about it. Well, here is the bad news: it did not.

Habbo Hotel took a similar line when Channel 4 exposed how its site was being grossly misused and was putting children in danger. This case was, in a sense, much worse, because Habbo had at least signed up to various voluntary codes of practice. The only problem was that it was not honouring them, which speaks volumes about the weakness of our so-called self-regulatory regime for the internet in the UK. Even BlackBerry, a company in my constituency that is ethical in many important ways, was found wanting when it emerged that child pornography was not being blocked for users of its handsets on any network except T-Mobile, and the same was true for adult content. Given how popular BlackBerry handsets are with kids, that was truly appalling, but I am happy to say that both matters have now been put right.

Failure to act can lead to tragedy. It is only two weeks since Tallulah Wilson killed herself after visiting suicide websites. At the time, a spokesman for the Samaritans put the need for more responsible behaviour well:

“It is important that organisations which run sites that are highly popular with young people develop responsible practices around suicide-related content, including promoting sources of support and by removing content which actively encourages or glorifies self-harm or suicide”.

Glorifying self-harm or suicide is not illegal, but it is profoundly dangerous. The new Health Minister, the hon. Member for North Norfolk (Norman Lamb), last month warned that telecommunications companies faced being regulated by the Government if they failed to block websites offering advice on suicide. It is time for the companies to act.

Then there was the unrest caused by the publication on YouTube of the provocative American-made video insulting Mohammed. It caused deaths and injuries around the world when so many people saw or heard of it.

Mr Tom Harris

I feared that the debate was heading in that direction. Can we just be absolutely clear that the deaths and injuries throughout the world were not caused by the YouTube video, obnoxious and appalling though it was? They were caused by fanatics who chose to resort to violence against innocent people. No one forced them to do that.

Fiona Mactaggart

My hon. Friend is right, but what happened was completely predictable. Responsible publishers choose not to publish things that are designed to provoke. I have not seen the video, but I persuaded someone in my office to watch it, and the clear intention of the material is absolutely to provoke. It was irresponsible for YouTube to carry the video.

In its response, Google, rather like my hon. Friend, uttered pious words about free speech and the first amendment, but I would like to make some observations about that. Google is an exceptionally profitable business. It is not a charity, or an agency that can lay claim to moral or political leadership in any credible way. I say that not just because of the mounting number of times Google is being hauled before the courts and regulators, in relation to other parts of the internet, and losing. The company seems to be highly selective about the parts of the law that it wishes to observe.

Many Muslims in the UK and throughout the world—some of whom reacted in the way my hon. Friend described, and some of whom simply demonstrated peacefully outside Google’s UK headquarters—were deeply offended by the video and by YouTube’s failure to remove it, except in the two countries where the company acknowledged that there might be violent protests. I understand that YouTube has now also disabled links to the clip in at least two other countries, including India. It became clear, therefore, as the tragedy of the video unfolded, that the company did not have an absolute fixed position that it would defend to the nth degree. It was a movable feast, but it moved too slowly, and only after too many people had died, been injured or had their property destroyed. That highlights the inadequacy, or at any rate the inconsistency, of YouTube’s processes. I have looked at those processes so that I can try to advise people who have been hurt by the video, and the processes are almost deliberately opaque and make it hard for people to find any mechanism to address their hurt.

I shall not address the issues that the hon. Member for Devizes (Claire Perry) has led on in Parliament, because she wants to speak later, and I want other Members to have a chance to contribute to this debate, but I am concerned that decisions—the Muslim video is one example—appear to be taken on an ad hoc basis. A codified, publicly available system would help to show that Google—this applies to other companies, too—is serious about its responsibilities. The companies need to grow up. They are not young cowboys battling on the wilder edges of a new territory about which we know little; we now know a lot, and it is time that that was reflected in the behaviour of internet businesses.

Mr Gregory Campbell (East Londonderry) (DUP)

The hon. Lady is outlining the thrust of her powerful argument against the likes of Google, Facebook and Twitter, but she has not said what sanctions, if she were successful and her campaign moved to a logical conclusion, a Parliament in an individual nation state might apply that could protect the people whom she and I seek to defend.

Fiona Mactaggart

The hon. Gentleman is right that I have not stated the sanctions that Parliament could apply, because in this debate I am arguing, in the first place, for the industry to grow up, take responsibility and properly self-regulate, and not to say, “Oh, whoops, we are being embarrassed, so we are going to do something,” or, “Oh, whoops, it is dangerous in that country, so we will sort it there.” I am saying, “Come on; you are in the last chance saloon, and you need to take responsibility. If you do it well and right, the Minister will not need to intervene, but if you do not, I will be the first person, not just in this Chamber but in the House, arguing for much more powerful regulation.” That is not where I want to go first. I expect companies not to be surprised when they get it wrong, and to ensure that they put in place proper mechanisms to protect not just vulnerable internet users, but all of us.

My final point is about child abuse images. The Internet Watch Foundation is a model and example to the rest of the world, but it addresses only a narrow, albeit important, part of the internet—the web and newsgroups. Figures recently released by five police forces in England and Wales—Cambridgeshire, Dyfed-Powys, Humberside, Lincolnshire and Nottinghamshire—show that between 2010 and mid-2012, they seized 26 million pornographic images of children, which is an incredibly troubling number, but think about this: someone calculated that that might mean that more than 300 million images were seized across the country in the same period. Not only does that beggar belief, but it tells us that something is definitely not working as it should. Somehow or other, the industry and all of us need to up our game and confront such harm.