Regulating in a Digital World (Communications Committee Report) Debate

Department: Department for Digital, Culture, Media & Sport

Wednesday 12th June 2019
Lords Chamber
Baroness Kidron (CB)

My Lords, it is always a pleasure to follow the noble Baroness, Lady Harding, who, not for the first time, has beautifully articulated some of my points. But I intend to repeat them, and I hope that they will emerge not as stolen thunder but as a common cause, and perhaps a storm around the House as others also speak.

Since my time on the committee comes to an end shortly, I take this opportunity to record my personal thanks to the noble Lord, Lord Gilbert, for his excellent chairmanship throughout, and to pay tribute to my colleagues, who make our meetings so fantastically interesting, collaborative and, occasionally, robust. I also thank the clerk, Theo Pembroke, who has always met our insatiable curiosity with extraordinary patience and good humour. I draw the attention of the House to my interests as set out in the register, particularly as chair of the 5Rights Foundation.

In its introduction, Regulating in a Digital World offers the following observation:

“The need for regulation goes beyond online harms. The digital world has become dominated by a small number of very large companies. These companies enjoy a substantial advantage, operating with an unprecedented knowledge of users and other businesses”.


Having heard from scores of witnesses and read a mountain of written evidence, the committee concludes that regulatory intervention is required to tackle this “power imbalance” between those who use technology and those who own it. As witness after witness pointed out,

“regulation of the digital world has not kept pace with its role in our lives”;

the tech sector’s response to “growing public concern” has been “piecemeal”; and effective, comprehensive, and future-proof regulation is urgent and long overdue. It is to this point, how the sector has responded to these calls for regulation, that I will address the bulk of my remarks today.

Earlier this year, Mark Zuckerberg said:

“I believe we need a more active role for government and regulators. By updating the rules for the internet, we can preserve what’s best about it ... while also protecting society from broader harms”.


Meanwhile, Jeff Bezos said that Amazon will,

“work with any set of regulations we are given. Ultimately, society decides that, and we will follow those rules, regardless of the impact that they have on our business”.

These are just two of several tech leaders who have publicly accepted the inevitability of a regulated online world, which should, in theory, make the implementation of regulation passed in this House a collaborative affair. However, no sooner is regulation drafted than the warm words of sector leaders are quickly replaced by concerted efforts to dilute, delay and disrupt. Rather than letting society decide, the tech sector is putting its considerable resource and creativity into preventing society, and society’s representatives, from applying its democratically agreed rules.

The committee’s proposal for a digital authority would provide independence from the conflicts built into the DNA of DCMS, whose remit to innovate and grow the sector necessarily demands a hand-in-glove relationship but which also has a mandate to speak up for the rights and protections of users. More broadly, such an authority would militate against the conflicts between several government departments, which, in speaking variously and vigorously on digital matters across security, education, health and business, are ultimately divided in their purpose. Amid this divide and rule, the industry position that can be summed up as, “Yes, the status quo needs to change but it shouldn’t happen now or to me, and it mustn’t cost a penny”, remains unassailable.

The noble Lord, Lord Gilbert, set out many of the 10 principles by which to shape regulation into an agreed and enforceable set of societal expectations, but they are worth repeating: parity on- and offline, accountability, transparency, openness, privacy, ethical design, recognition of childhood, respect for human rights and equality, education and awareness-raising, and democratic accountability.

I want to pick up on a single aspect of design because, if we lived in a world in which the 10 principles were routinely applied, maybe I would not have been profoundly disturbed by an article by Max Fisher and Amanda Taub in the New York Times last week, which reported on a new study by researchers from Harvard’s Berkman Klein Center. The researchers found that perfectly innocent videos of children, often simply playing around outside, were receiving hundreds of thousands of views. Why? Because YouTube’s algorithms were auto-recommending the videos to viewers who had just watched “prepubescent, partially clothed children”. The American news network MSNBC put it a little more bluntly:

“YouTube algorithm recommends videos of kids to paedophiles”.


However, although YouTube’s product director for trust and safety, Jennifer O’Connor, is quoted as saying that,

“protecting kids is at the top of our list”,

YouTube has so far declined to make the one change that researchers say would prevent this happening again: to identify videos of prepubescent children—which it can do automatically—and turn off its auto-recommendation system on those videos.

The article goes on to describe what it calls the “rabbit hole effect”, whereby viewing one thing results in the recommendation of something more extreme. In this case, the researchers noticed that viewing sexual content led to the recommendation of videos of ever younger women, then young adults in school uniforms and gradually to toddlers in swimming costumes or doing the splits. The reason for not turning off the auto-recommend for videos featuring prepubescent children is—again, I quote the YouTube representative’s answer to the New York Times—that

“recommendations are the biggest traffic driver; removing them would hurt ‘creators’ who rely on those clicks”.

This is what self-regulation looks like.

Auto-recommend is also at the heart of provision 11 in the ICO’s recently published Age Appropriate Design Code, which, as the right reverend Prelate said, is commonly known as the “kids’ code”. Conceived in this House and supported by many noble Lords who are in the Chamber tonight, provision 11 prevents a company from using a child’s data to recommend material or behaviours detrimental to children. In reality, this provision, and the kids’ code in general, does no more than what Mark Zuckerberg and Jeff Bezos have agreed is necessary and publicly promised to adhere to. It puts societal rules—in this case, the established rights of children, including their right to privacy and protection—above the commercial interests of the sector and into enforceable regulation.

Sadly, and yet unsurprisingly, the trade association of the global internet companies here in the UK, the Internet Association, which represents, among others, Amazon, Facebook, Google, Twitter and Snapchat, is furiously lobbying to delay, dilute and disrupt the code’s introduction. The kids’ code offers a world in which the committee’s principle—the recognition of childhood—is fundamental; a principle that, when enacted, would require online services likely to be accessed by children to introduce safeguards for all users under the age of 18.

The Internet Association cynically argues that the kids’ code should be restricted to services that are “targeted at children”, in effect putting CBeebies and “Sesame Street” in scope, while YouTube, Instagram, Facebook, Snapchat, et cetera, would be free to continue to serve millions of children as they alone deem fit. The Internet Association has also demanded that children be defined only as those under 13, so that anyone over 13 is effectively treated like an adult. This is out of step with the Data Protection Act 2018 that we passed in this House with government agreement, which defines a child as a person under 18. Moreover, in the event that it is successful in derailing the code in this way, it would leave huge numbers of children unprotected during some of the most vulnerable years of their life.

Perhaps the most disingenuous pushback of all is the Internet Association’s claim that complying with regulations is not technically feasible. This is a sector that promises eye-watering innovation and technical prowess, that intends to take us to the moon on holiday and fill our streets with driverless cars. In my extensive conversations with engineers and computer scientists both in and out of the sector, no one has ever suggested that the kids’ code presents an insurmountable technical problem, a fact underlined by conversations I had in Silicon Valley only a few weeks ago. Yes, it requires a culture change and it may have a price, but the digital sector must accept, as all other industries have before it, that promoting children’s welfare—indeed, citizens’ and community welfare more generally—is simply a price of doing business. Let us not make the mistake of muddling up price and cost, since the cost of not regulating the digital world is one that our children are already paying.

Regulating in a Digital World establishes beyond doubt that if we want a better digital world, we must act now to shape it according to societal values, one of which is to recognise the vulnerabilities and privileges of childhood. I recognise and very much welcome the future plans of the Government in this area, but if we cannot get one exemplar code effectively and robustly into the real world, what message does that send to the sector about our seriousness in fulfilling the grand ambitions of the online harms White Paper?

When replying, could the Minister give some reassurance that the Government will indeed stand four-square behind the Information Commissioner and her ground-breaking kids’ code? In doing so, will they meet the expectations of parents, who have been promised a great deal by this Government but have not yet seen the change in the lived experience of their children? More importantly still, will they meet the needs and uphold the rights of UK children, rather than once again giving in to tech sector lobbying?

I will finish with the words of a 12-year-old boy whom I met last Thursday in a 5Rights workshop. A self-professed lover of technology, he said, “They sacrifice people for cash. It makes me so angry. I can’t believe that people are so unnecessarily greedy”. His words, remarkable from someone so young, eloquently sum up the committee’s report.