Internet-based Media Companies Debate



Rushanara Ali Excerpts
Wednesday 31st October 2012


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record.

Fiona Mactaggart (Slough) (Lab)

It is a pleasure to appear in this Chamber in front of you, Mr Owen. I feel as though I have spent most of the day here. I am pleased to have been able to secure this debate, for which I have been pressing for some weeks.

Politicians and companies alike have failed to address the new challenges that the internet brings. I am not at this point arguing that the state needs to do more right now, although it might need to in future. Companies that use the internet need to have robust policies to protect vulnerable users. They need to take responsibility for the impact of what they do from the start of their operations.

Children and young people are a substantial and persistent body of internet users. A report published in September 2012 by McAfee suggested that 82% of five-year-olds “have access to a computer, smartphone, tablet or other way of getting online.”

Nine out of 10 of those aged between 12 and 15 live in homes with internet access. In schools, use of the internet is now more or less universal. Increasingly, it is being integrated into lesson plans to make use of richer content, and it is often a regular part of how schools communicate with parents.

The internet is used at home to enable children to do their homework. It is a major linchpin or communications hub in huge numbers of children’s social lives. Indeed, not having access to the internet can mark someone out as odd, or as coming from a disadvantaged family.

With the rise of smartphones and other internet-enabled portable devices such as games consoles, and the emergence of large-scale public wi-fi, internet access is also pretty ubiquitous, or soon will be in all our major cities. Thus the notion that parents could in any meaningful sense provide constant support or supervision of their children’s use of the internet is becoming impossible to sustain. I make these points in part to underline a core element of my argument about industry’s responsibility, which I will come to later.

First, I will say a word about the industry. In fact, there is no such thing as the internet industry. At one point there was: back in the 1980s and early ’90s. Computers and networking had been well-established for years, so the then new internet industry essentially consisted solely of internet service providers and geeks who wrote software. It was all very neat and tidy, and easy to identify and deal with.

Today almost every business of any size has some sort of stake in the internet. All of them have a responsibility of some sort to people who go online, especially to children. Many of them make great efforts to discharge that responsibility with great care and attention, but I am afraid that it is also quite plain that many do not. It is the many that we need to focus on.

The internet, like social networking sites, is not a sort of social service or an extension of the classroom with knobs on. Just as money is said to make the world go round, it most certainly makes the internet go round, and children are right in the middle of it. In 2006, children and young people in the UK up to the age of 19 spent £12 billion from their pocket money, or from earnings derived from part-time jobs. Of that, £1.53 billion went on clothes, and £1 billion on food and drink; music and computer-related products took another £1 billion. In the same year, when account is taken of the amounts spent by parents on their children or in the home—spending over which children and young people often have influence—the total value of the market increased to almost £100 billion.

One of the largest of the virtual worlds aimed expressly at young children is Club Penguin. When Disney acquired the business in 2007, it was reported to have paid $700 million. According to the Financial Times, in June 2011, the UK-based children’s site, Moshi Monsters, was reported to be valued at £125 million. Children and young people are therefore major economic actors, both in their own right and through the influence that they exert on patterns of consumption within the wider family.

The size of the market helps to explain why so many different companies are interested in children and young people. It is not just about cash tomorrow; it is very much about cash today. Moreover, the sums indicated suggest that this market matters not only to the individual firms that may be competing for parts of it, but also to the national economy.

Children’s and young people’s online spending is also growing. A report published in December 2010 suggested that British kids between the ages of 7 and 16 spent £448 million, with eight out of 10 using their parents’ cards, online accounts, or PayPal. Apparently, £64 million was spent without parents’ knowledge.

The emergence of the internet as a major force in commerce, particularly in retailing, has created a number of anomalies in policy, as well as market distortions that discriminate against companies that trade solely or principally on the high street, but some of those anomalies are connected to wider risks to children and young people. Many of the rules established to protect children and young people from unfair or age-inappropriate commercial practices in the real world do not yet seem to have been fully translated into the virtual space, or to have found an online equivalent or proxy. There is a tendency for firms to say that what children do when they go online is entirely the responsibility of the parents or carers. While no one would dispute that parents and carers have a role to play, what we need to clarify is the extent of the obligations placed on companies and on every part of the internet value chain.

Can manufacturers of internet-enabled devices, perhaps especially portable devices, simply wash their hands of any and all liability for anything and everything that happens to children and young people when they use them? What about the companies engaged in providing access to the internet, whether via a fixed-line connection or via wi-fi? Then there are the online service providers, such as Google and Facebook, and online vendors such as Amazon and Tesco. What parameters are applicable to them? Where are the boundaries? This whole area has been largely neglected by scholars and the legal profession, and, I am ashamed to say, politicians.

No doubt companies have considered their position, but if they have, they have been slow to publicise their legal advisers’ views. Even if they did, it is likely that such views would take a very particular perspective.

Rushanara Ali (Bethnal Green and Bow) (Lab)

One of my constituents came to see me after being sexually harassed for years on Facebook. Her identity was stolen and her Facebook pages were photoshopped to damage her reputation. It took her a great deal of time to get any attention from the police or the organisation concerned—in this case, Facebook. Does my hon. Friend think that there should be greater clarity and transparency about what the process and principles should be, and what citizens and consumers can expect from suppliers such as Facebook, and from the police? Only when a death threat was made against my constituent did the police feel that they could take action. Until that point, they had to advise her to complain to Facebook.

Fiona Mactaggart

The case that my hon. Friend cites is an example of exactly why I called for this debate. In that case, Facebook was not taking proper responsibility. It did not have a transparent complaints process that my hon. Friend’s constituent was able to use. It did not have a mechanism for remedying the harm that she had experienced and, frankly, the police are not up to date enough with the online world. That is not true of the whole of the police service—for example, when it comes to child abuse images, the police have quite well-developed policing strategies—but in the case of online bullying, I think they are behind the game.

The fundamental responsibility, in that case, belongs to Facebook, but the police must take more seriously the fact that things happen in the virtual world that they would not tolerate in the real world, and they must ensure that their policies and procedures function appropriately in both. We have not grown up, as it were, and ensured that we have modernised our systems, including those of the police. My big argument is with companies such as Facebook. If they were to take their responsibilities more seriously, my hon. Friend’s constituent would have been much safer, and the problem would perhaps not have got as far as requiring police action.

Some new media companies seem persistently to fail to establish clear values and procedures for handling matters, such as the one that my hon. Friend raised, that can profoundly affect individuals and wider society. In the early days of the internet, that was perhaps understandable to a degree. They were learning; we were all learning. We are, however, no longer in the early days, and now such failure looks more like negligence or lack of concern. Too often, companies seem to struggle to recover a position, by which time a great deal of damage might have been done. I want to establish a new norm, whereby we expect companies, from very early on in their lives, to have an enforceable social responsibility code, which contains a publicly declared process for dealing with objectionable or illegal content.

--- Later in debate ---
Claire Perry

I thank the hon. Gentleman for that thoughtful intervention. Those are some of the questions that get raised: blocking sites that help children with their homework, or that concern sexual health, sexuality and other things that we know children are more comfortable talking about to friends and others on the internet than to their family.

We asked the Family Planning Association, a laudable organisation that publishes a lot of material about sexual health and guidance, and it was supportive. The FPA says that the problem right now is that children are accessing porn as a way of receiving sex education. That is not good sex education. It teaches children nothing about relationships. The FPA felt that using an age verification system—

Rushanara Ali

I support the hon. Lady’s proposal. It will protect young people not only from being groomed but from being radicalised on the internet; we have seen examples of that. It particularly affects Muslim parents, but also others—those whose children are converts, for instance. The individual responsible for the attack on my right hon. Friend the Member for East Ham (Stephen Timms) was radicalised on the internet. We need action not just to protect children against harassment but also on those kinds of issue. Anything that can address the problem would be welcome from both perspectives.

Claire Perry

I thank the hon. Lady for pointing out that it is not just what we might think of as pure pornography that is a problem, but many other things too. I say to both hon. Members that in the debate on this issue, we have always been in danger of letting the perfect be the enemy of the good. Filtering systems are well established. A lot of human intelligence goes into the filtering systems used by companies such as TalkTalk, which has gone furthest. It is completely possible to amend the system while ensuring that appropriate levels of material are available, just as they might be in a school environment. However, it is a worthy point.

I will continue, as I know that others are keen to speak. I was extremely proud that, with the help of Members from across the House, we were able to persuade the Government to hold a formal inquiry into the opt-in proposal, led by UKCCIS. I will raise the question of Government complexity in a moment, but the inquiry had more than 3,500 responses, and I was proud to help deliver a petition with more than 115,000 signatures to No. 10 calling for an opt-in system and urging the Government to take the issue seriously.

I think the Government do take the issue seriously, but there are many complications that must be addressed. First, as the hon. Member for Slough said, we do not have a regulator; we have a mish-mash of organisations involved in regulating the internet. In such a system, it is easy for companies to behave in an irresponsible manner or, as she mentioned in referring to a large search company, to basically make it up as they go along, with every test case being a different case. There is no clear regulation setting out a course of direction or what responsible behaviour looks like. That was one of our recommendations: give the issue to one regulator.

Secondly, there is the ideological question. It behoves us all not to have the debate about free speech versus censorship here. Of course, we must have that debate, but it is a false debate here. We are talking about children in unprotected households accessing damaging, dangerous and violent material, and we know that people are concerned about it. It is important to have a pragmatic solution rather than an ideological response.

I say not to the Minister, to whom I know it does not apply, but to others that we run in fear of the internet companies in many cases. I have asked repeatedly for evidence suggesting that an opt-in solution would be disproportionately costly or technologically impossible, or would somehow damage Britain’s internet economy, which is extremely valuable—it contributes about 8% of GDP—and is growing rapidly. Evidence there is none. It is a pence-per-1,000-users solution. It already exists, the technology is there and it is well developed. We can deal with the question of false positives and false negatives. If I ask start-up companies located at the Shoreditch roundabout, “Do you care if we have opt-in filtering on home broadband or internet provision?”—that is the most developed part of the market; only six companies offer 95% of services—they look at me as though I am mad. It has nothing to do with their business model.

I urge the Government to review the evidence. We have not yet had the evidence review session that we were promised on the inquiry. I understand that faces have changed. I would like to get it right rather than do it quickly, but also to focus as best we can, given the number of Departments involved, on the right solution to protect our children.

Yasmin Qureshi (Bolton South East) (Lab)

It is a pleasure to speak in this debate under your chairmanship, Mr Owen. I congratulate my hon. Friend the Member for Slough (Fiona Mactaggart) on securing it.

I start from where the hon. Member for Devizes (Claire Perry) stopped. Asking for self-imposed regulation of the industry does not mean that the economy of our country, the booming internet trade or what happens on the internet will suddenly come to a stop and that we as a country will somehow become less economically effective. This debate is about the fact that, as has been said, the internet reaches out to billions and billions of people around the world. Unlike what is in newspapers or on television, which may be limited to particular countries—although somebody travelling to a country might be able to see it—something posted on the internet can be seen by everyone in the world who has access to a computer.

What the internet says is therefore powerful. It is amazing that such a powerful institution or body has no regulation and no sense of responsibility for what is put on it or taken off. As has been said, a lot of internet companies act differently in different countries, so they seem to be sensitive in relation to different countries, although that sensitivity is probably based on economic rationales rather than anything else. Although economics is important, so is the internet’s effect on people.

This debate always ends up with arguments about freedom of expression and the idea that saying that there should be an element of regulation of what appears on the internet, or even in the print media or on TV, somehow curtails people’s freedom of expression. Freedom of expression has never been completely unfettered. As has been said, there have always been things that are illegal to say. Some people might say that if we want to take freedom of expression to its extreme, people should even be allowed to say things that are illegal, and that there should be no restrictions at all. However, we do have restrictions, and rightly so. There is nothing wrong with talking about objectionable material.

I will not discuss sexualisation or the effect of pornography, as the hon. Member for Devizes spoke about it in detail and it is pointless to repeat the same thing. However, I entirely agree with her about the dangers to young people, adults and others who are vulnerable, and I agree with everything that my hon. Friend the Member for Slough said.

May I put on record that I agree with self-regulation rather than a statutory framework? An awful lot is said on the internet that can harm people’s reputation, for instance. I do not see why everybody always says that people’s sensitivities should be ignored completely and that everything objectionable should be on the internet. I am sorry, but while there is freedom of expression—I know that there is no such thing as the freedom not to be offended—we must draw sensible parameters.

If I, or anyone, was to say on the internet that everyone with pink eyes should be put to death at birth, some might say, “Well, what is wrong with that? That is not too objectionable. Pink is not my favourite colour, so why not?” That is a bizarre example, but people might want to say it—in the past, people have used expressions regarding specific groups of people in the world. That would be objectionable and it might be illegal, but I do not think people should be putting things like that on the internet. If they do, there should be a mechanism for regulation. Even if material is not as extreme as saying that people with pink eyes should be put to death at birth, it is still objectionable. I do not see why there should not be a system in place to enable people to raise the issue with the companies concerned and explain why it is a problem.

We touched on the issue of the American film on YouTube. My hon. Friend the Member for Glasgow South (Mr Harris) said that this debate would end up going in that direction, but I want to address the point because a lot of people wrote to me to complain about the content of that film and said that it was objectionable. If people want to discuss a concept in any religion or culture, they should be able to write about it. Nobody is saying that there should not be a discussion or dissemination of ideas. However, when the whole intent is to provoke people, abuse people and vilify people, that cannot be right. Surely somewhere along the line common sense must come into play.

Rushanara Ali

Does my hon. Friend agree that it would be helpful for people, particularly those who do not have power and money and are not clear about their rights, to be able to receive free, high-quality and accessible advice on some of these questions? I am not aware that such a provision exists, but perhaps the Minister could consider that as a first step, particularly to help vulnerable people—parents who worry about what their rights are and how they can be enforced—or to help put pressure, as I found in a case with my constituent, on the police to take action so that these issues do not get passed around before they become more serious. Related to that point is libel—where people’s reputations are damaged, something that I experienced myself during my election campaign. It takes a long time and many threats of legal action before libellous material posted on the walls of host sites, or on sites that are themselves libellous and wrong, is taken down. Surely the Minister could help with that.

Yasmin Qureshi

I agree with my hon. Friend. Such an example would be the famous case of Max Mosley. Even though what was written in newspapers was found to be defamatory, it continues to be published on the internet.

I was a member of the Joint Committee on Privacy and Injunctions. The managing directors of Google, Facebook and Twitter gave evidence, and the Committee explored the issue of why content that a nation state has clearly declared illegal is not removed. There were not many issues on which the members of the Committee were unanimous, but we all agreed that all three companies were just twisting and turning and not giving us direct answers. They had to be pressed hard. Initially, they said that it was technically not possible, or difficult, or expensive, or impossible to monitor. When the Committee asked more detailed questions, such as, “Do you have the technology? Is there no software available?” basically, it boiled down to the fact that they did not want to do it—it was as simple as that. It was not in their financial interests to do it. It was not in their profit-making interests to do it. It was not that they could not do it because it was so difficult; they just did not want to. We got that answer—not even then was there complete acceptance—after God knows how many questions. Eventually, there was an admission that, technically, there was no reason why they could not do it. We at least got to the bottom of that.

The Committee looked at the whole issue of regulating the internet. Everybody accepts that there are challenges—they may be technical challenges, but they certainly can be overcome if the desire and intention is there. The issue is all about saying, “We know you can do these things. Why don’t you self-regulate?” If there is content on the internet, whether via YouTube, Facebook or Twitter, that is offensive, rude or defamatory, people should not have to go through the long process of dealing with the law. Max Mosley is a rich man and is able to do so. I think he has challenged Google many times. Every time he makes a challenge, content is deleted before it eventually reappears. Most ordinary people cannot do that—they do not have the money, time or resources. There should be an internal mechanism to deal with such cases. When there is freedom of expression and people can say what they like, it is important for there to be responsibility.

I will return to the recent YouTube case. I accept that YouTube did not cause the deaths, but it is right to say that those behind the film knew what would happen. It was done deliberately to provoke, annoy, vilify and abuse. It was not done to discuss and disseminate issues and ideas. It was not done as an academic discussion about a particular aspect of a particular religion, or any particular character in any religious history. It was done purely as a form of abuse. At that point, we have to think about the level of abuse that is aimed at people, whether they are dead or alive.