Baroness Kidron debates involving the Department for Digital, Culture, Media & Sport during the 2017-2019 Parliament

Thu 11th Jul 2019: Social Media (Lords Chamber)
Wed 12th Jun 2019: Regulating in a Digital World (Communications Committee Report) (Lords Chamber)
Mon 8th Apr 2019: Online Harms (Lords Chamber)
Thu 17th Jan 2019: Children and Young People: Digital Technology (Lords Chamber)
Mon 12th Nov 2018: Social Media Services (Lords Chamber)
Mon 14th May 2018: Data Protection Bill [HL] (Lords Chamber)
Wed 17th Jan 2018: Data Protection Bill [HL], 3rd reading (Lords Chamber)
Thu 11th Jan 2018: Social Media: News (Lords Chamber)

Social Media

Baroness Kidron Excerpts
Thursday 11th July 2019

Lords Chamber

Baroness Kidron (CB)

I thank the right reverend Prelate for tabling today’s debate and draw the attention of the House to my interests as set out in the register. I very much welcome the Church of England’s social media guidelines. They have great force in their simplicity and generosity of spirit, and clearly outline our responsibilities to conduct our online interactions respectfully and honestly. I will focus my contribution on how they might be applied to the social media companies themselves.

For example, the first guideline is:

“Be safe. The safety of children, young people and vulnerable adults must be maintained”.


Far from taking reasonable steps to maintain the safety of children or to support their emotional and social development, social media companies refuse even to recognise the global consensus that a child is a person under the age of 18, as codified by the UN Convention on the Rights of the Child. Tick a box and a child of 13 can gain access to an environment that routinely exposes them to adult risks and deprives them of the rights that we have fought for decades to establish. Furthermore, minimum age limits are routinely bypassed and poorly enforced, a fact freely admitted by both Snap and Facebook when they appeared before Parliament in recent months. This leaves children of all ages unprotected through many of their most vulnerable years. For children to be safe online, social media companies first have to provide a safe environment.

A similar scenario unfolds when you consider the guideline:

“Be honest. Don’t mislead people about who you are”.


The spread of misinformation and disinformation polarises debate, impacts on elections, drives the rise in intolerance and fuels spurious health claims and conspiracy theories. This is an area of considerable attention for legislators around the globe but, while much is said about those who create the misinformation, it is important to note that the platforms are not neutral bystanders. In an attention economy where clicks mean money, and where the longer someone stays online the more you maximise your opportunity to serve them an ad or learn something about them that you can sell later, the spread of the extraordinary, the extreme or the loud is not an unintended consequence of your service; it becomes central to its purpose.

Being honest is not only about information but about the nature of the service itself. When we walk into a tea room, a cinema, a pub or a strip club, we understand the opportunities and risks that those environments offer and are given nuanced indicators about their suitability for ourselves or our children. Social media companies, by contrast, parade as tea rooms but behave like strip clubs. A simple answer would be greater honesty about the nature of the service on offer.

This leads me quite neatly to the guidance to,

“Follow the rules. Abide by the terms and conditions”.


Terms and conditions should enable users to decide whether a service is offering them an environment that will treat them fairly. They are, by any measure, a contract between user and platform; it is therefore unacceptable that these published rules are so opaque, so asymmetrical in the distribution of rights and responsibilities, so interminably long—and then so inconsistently and poorly upheld by the platforms themselves.

This failure to follow the rules is not without consequence. Noble Lords will remember the case of Molly Russell, who took her own life in 2017 after viewing and being auto-recommended graphic self-harm and suicide content. The spokesperson for one of the platforms responsible, Pinterest, said:

“Our existing self-harm policy already does not allow for anything that promotes self-harm. However, we know a policy isn’t enough. What we do is more important than what we say”.


Indeed, and while that tragedy has been widely and bravely publicised by Molly’s father, it is neither the only tragedy nor the only failure. Failure is built into the system. The responsibility for upholding terms and conditions must be a two-way street. I warmly welcome the Government’s proposal in the online harms White Paper:

“The regulator will assess how effectively these terms are enforced as part of any regulatory action”,


and I welcome the Information Commissioner’s similar commitment in the recently published age-appropriate design code.

Let me finish with this. On Monday, 22 children came to the House to see me and offer their thoughts on a 5Rights data literacy workshop that they had been doing for some months. Their observations can be usefully summed up by the fifth of the Church’s guidelines:

“Take responsibility. You are accountable for the things you do”.


These children and young people categorically understood their responsibilities, but they powerfully and explicitly expressed the requirement for the platforms to meet theirs too. It is for the platforms to make their services safe and respectful, for government to put in place the unavoidable requirement that they do so, and for the rest of us to keep speaking up until it is done. With that in mind, I commend the right reverend Prelate for his tireless work to that end and ask the Minister to reassure the House that the promises made to children and parents by the outgoing Executive will be implemented by the incoming Executive.

Regulating in a Digital World (Communications Committee Report)

Baroness Kidron Excerpts
Wednesday 12th June 2019

Lords Chamber

Baroness Kidron (CB)

My Lords, it is always a pleasure to follow the noble Baroness, Lady Harding, who, not for the first time, has beautifully articulated some of my points. But I intend to repeat them, and I hope that they will emerge not as stolen thunder but as a common cause, and perhaps a storm around the House as others speak also.

Since my time on the committee shortly comes to an end, I take this opportunity to record my personal thanks to the noble Lord, Lord Gilbert, for his excellent chairmanship throughout, and to pay tribute to my colleagues, who make our meetings so fantastically interesting, collaborative and, occasionally, robust. I also thank the clerk, Theo Pembroke, who has always met our insatiable curiosity with extraordinary patience and good humour. I draw the attention of the House to my interests as set out in the register, particularly as chair of the 5Rights Foundation.

In its introduction, Regulating in a Digital World offers the following observation:

“The need for regulation goes beyond online harms. The digital world has become dominated by a small number of very large companies. These companies enjoy a substantial advantage, operating with an unprecedented knowledge of users and other businesses”.


Having heard from scores of witnesses and read a mountain of written evidence, the committee concludes that regulatory intervention is required to tackle this “power imbalance” between those who use technology and those who own it. As witness after witness pointed out,

“regulation of the digital world has not kept pace with its role in our lives”;

the tech sector’s response to “growing public concern” has been “piecemeal”; and effective, comprehensive and future-proof regulation is urgent and long overdue. It is on this point of how the sector has responded to these calls for regulation that I will address the bulk of my remarks today.

Earlier this year, Mark Zuckerberg said:

“I believe we need a more active role for government and regulators. By updating the rules for the internet, we can preserve what’s best about it ... while also protecting society from broader harms”.


Meanwhile, Jeff Bezos said that Amazon will,

“work with any set of regulations we are given. Ultimately, society decides that, and we will follow those rules, regardless of the impact that they have on our business”.

Zuckerberg and Bezos are just two of several tech leaders who have publicly accepted the inevitability of a regulated online world, which should, in theory, make the implementation of regulation passed in this House a collaborative affair. However, no sooner is regulation drafted than the warm words of sector leaders are quickly replaced by concerted efforts to dilute, delay and disrupt. Rather than letting society decide, the tech sector is putting its considerable resource and creativity into preventing society, and society’s representatives, from applying its democratically agreed rules.

The committee’s proposal for a digital authority would provide independence from the conflicts built into the DNA of DCMS, whose remit to innovate and grow the sector necessarily demands a hand-in-glove relationship but which also has a mandate to speak up for the rights and protections of users. More broadly, such an authority would militate against the conflicts between several government departments, which, in speaking variously and vigorously on digital matters across security, education, health and business, are ultimately divided in their purpose. In this divide and rule, the industry position that can be summed up as, “Yes, the status quo needs to change but it shouldn’t happen now or to me, and it mustn’t cost a penny” remains unassailable.

The noble Lord, Lord Gilbert, set out many of the 10 principles by which to shape regulation into an agreed and enforceable set of societal expectations, but they are worth repeating: parity on- and offline, accountability, transparency, openness, privacy, ethical design, recognition of childhood, respect for human rights and equality, education and awareness-raising, and democratic accountability. I want to pick up on one single aspect of design because, if we lived in a world in which the 10 principles were routinely applied, maybe I would not have been profoundly disturbed by an article by Max Fisher and Amanda Taub in the New York Times last week, which reported on a new study by researchers from Harvard’s Berkman Klein Center. The researchers found that perfectly innocent videos of children, often simply playing around outside, were receiving hundreds of thousands of views. Why? Because YouTube algorithms were auto-recommending the videos to viewers who had just watched “prepubescent, partially clothed children”. The American news network MSNBC put it a little more bluntly:

“YouTube algorithm recommends videos of kids to paedophiles”.


However, although YouTube’s product director for trust and safety, Jennifer O’Connor, is quoted as saying that,

“protecting kids is at the top of our list”,

YouTube has so far declined to make the one change that researchers say would prevent this happening again: to identify videos of prepubescent children—which it can do automatically—and turn off its auto-recommendation system on those videos.

The article goes on to describe what it calls the “rabbit hole effect”, which makes the viewing of one thing result in the recommendation of something more extreme. In this case, the researchers noticed that viewing sexual content led to the recommendation of videos of ever younger women, then young adults in school uniforms and gradually to toddlers in swimming costumes or doing the splits. The reason for not turning off the auto-recommend for videos featuring prepubescent children is—again, I quote the YouTube representative’s answer to the New York Times—because,

“recommendations are the biggest traffic driver; removing them would hurt ‘creators’ who rely on those clicks”.

This is what self-regulation looks like.

Auto-recommend is also at the heart of provision 11 in the ICO’s recently published Age Appropriate Design Code, which, as the right reverend Prelate said, is commonly known as the “kids’ code”. Conceived in this House and supported by many noble Lords who are in the Chamber tonight, provision 11 prevents a company using a child’s data to recommend material or behaviours detrimental to children. In reality, this provision, and the kids’ code in general, does no more than what Mark Zuckerberg and Jeff Bezos have agreed is necessary and publicly promised to adhere to. It puts societal rules—in this case, the established rights of children, including their right to privacy and protection—above the commercial interests of the sector and into enforceable regulation.

Sadly, and yet unsurprisingly, the trade association of the global internet companies here in the UK, the Internet Association, which represents, among others, Amazon, Facebook, Google, Twitter and Snapchat, is furiously lobbying to delay, dilute and disrupt the code’s introduction. The kids’ code offers a world in which the committee’s principle—the recognition of childhood—is fundamental; a principle that, when enacted, would require online services likely to be accessed by children to introduce safeguards for all users under the age of 18.

The Internet Association cynically argues that the kids’ code should be restricted to services that are “targeted at children”, in effect putting CBeebies and “Sesame Street” in scope, while YouTube, Instagram, Facebook, Snapchat, et cetera, would be free to continue to serve millions of children as they alone deem fit. The Internet Association has also demanded that children be defined only as those under 13, so that anyone over 13 is effectively treated like an adult. This is out of step with the Data Protection Act 2018 that we passed in this House with government agreement, which defines a child as a person under 18. Moreover, in the event that it is successful in derailing the code in this way, it would leave huge numbers of children unprotected during some of the most vulnerable years of their life.

Perhaps the most disingenuous pushback of all is the Internet Association’s claim that complying with regulations is not technically feasible. This is a sector that promises eye-watering innovation and technical prowess, that intends to get us to the moon on holiday and fill our streets with driverless cars. In my extensive conversations with engineers and computer scientists both in and out of the sector, no one has ever suggested that the kids’ code presents an insurmountable technical problem, a fact underlined by conversations I had in Silicon Valley only a few weeks ago. Yes, it requires a culture change and it may have a price, but the digital sector must accept, like all other industries have before it, that promoting children’s welfare—indeed, citizens’ and community welfare more generally—is simply a price of doing business. Let us not make the mistake of muddling up price and cost, since the cost of not regulating the digital world is one that our children are already paying.

Regulating in a Digital World establishes beyond doubt that if we want a better digital world, we must act now to shape it according to societal values, one of which is to recognise the vulnerabilities and privileges of childhood. I recognise and very much welcome the future plans of the Government in this area, but if we cannot get one exemplar code effectively and robustly into the real world, what message does that send to the sector about our seriousness in fulfilling the grand ambitions of the online harms White Paper?

When replying, could the Minister give some reassurance that the Government will indeed stand four-square behind the Information Commissioner and her ground-breaking kids’ code? In doing so, will they meet the expectations of parents, who have been promised a great deal by this Government but have not yet seen the change in the lived experience of their children? More importantly still, will they meet the needs and uphold the rights of UK children, rather than once again giving in to tech sector lobbying?

I will finish with the words of a 12-year-old boy whom I met last Thursday in a 5Rights workshop. A self-professed lover of technology, he said, “They sacrifice people for cash. It makes me so angry. I can’t believe that people are so unnecessarily greedy”. His words, remarkable from someone so young, eloquently sum up the committee’s report.

Online Harms

Baroness Kidron Excerpts
Monday 8th April 2019

Lords Chamber

Lord Ashton of Hyde

My Lords, with regard to disinformation connected with democracy and those essential questions, the White Paper deals with disinformation generally. With regard to electoral reform and how elections can be affected by the use of the internet, as I said, the Cabinet Office is bringing out a report soon to deal with that. It is right that constitutional affairs are dealt with there.

On disinformation, we have listed in the White Paper some of the areas we expect the regulator to include, such as:

“Promoting diverse news content … Improving the transparency of political advertising”—


noble Lords can read it themselves; there are other things. That is how we are trying to do it across government. As I said, there are other areas that we deliberately do not cover in the White Paper, but that should not be taken to mean that work is not going on. However, I accept the noble Lord’s suggestion that it is important and needs to be done soon. I take that on board.

As far as time is concerned, we are having a consultation, as the noble Lord said, which will end on 1 July. Obviously, it is not possible for me to say today when legislation will come before the House. That is a decision for the Government and the Leaders of both Houses. Judging by the discussions we have had today, and the feeling I get from across the House, all noble Lords think that this is an important issue. The Government think that this is an important issue. We are aware that we have taken time over the consultation. As far as the Home Office and DCMS are concerned, we want to get on with it.

We have just announced a review of advertising that will report in due course.

Baroness Kidron (CB)

My Lords, I too welcome the White Paper. I thank the Minister and the Secretary of State for being open to discussions during the process, and for indicating that there will be more discussions. I feel that more discussions are required because it is a little lacking in detail, and I share others’ concerns about the definition of harms. I was particularly upset not to see a little more work done on the everyday harms: the gaming, the gambling and the addictive loops that drive such unhealthy behaviours online. There are a lot of questions in the paper and I look forward to us all getting together to answer them—I hope quickly and soon. I really welcome the Minister’s words about the anxiety of the Government and both Houses to bring a Bill forward, because that is the litmus test of this White Paper: how quickly we get something on the books.

I feel encouraged by the noble Lord, Lord Griffiths, to mention that on Monday next week we have the launch of the final stage of the age-appropriate design code, which takes a safety-by-design approach. That is what I most welcome in the White Paper, in the Government’s attitude and in the work that we have in front of us: what we want to do is drive good behaviour. We want to drive corporate responsibility. We want to drive shareholders to take responsibility for those massive profits and to make sure that we do not allow the tech sector its exceptionality. It is a business like any other and it must do no harm. In relation to that I mention Will Perrin and Lorna Woods, who brought it forth and did so much work.

Finally, I am really grateful for what the Minister said about the international community. It is worth saying that these problems exist in all parts of the world—we are not alone—and others wait and look at what we are doing. I congratulate the Government on acting first.

Lord Ashton of Hyde

Obviously, there are details that need to be ironed out, and that is partly what the consultation is about. I expect there to be a lot of detail, which we will go over when a Bill finally comes to this House. In the past we have dealt with things like the Data Protection Act and have shown that we can do that well. The list in the White Paper of legal harms and everyday harms, as the noble Baroness calls them, is indicative. I completely agree with her that the White Paper is attempting to drive good behaviour. The difference it will make is that companies cannot now say, “It’s not my problem”. If we incorporate this safety by design, they will have to do that, because they will have a duty of care right from the word go. They cannot say, “It’s not my responsibility”, because we have given them the responsibility, and if they do not exercise it there will be serious consequences.

Children and Young People: Digital Technology

Baroness Kidron Excerpts
Thursday 17th January 2019

Lords Chamber

Moved by
Baroness Kidron

That this House takes note of the relationship between the use of digital technology and the health and well-being of children and young people.

Baroness Kidron (CB)

My Lords, I am very grateful to all noble Lords who have chosen to speak this afternoon, and very much look forward to each of their contributions. I refer the House to my interests on the register, particularly that as founder and chair of 5Rights.

Fundamental to this debate is the fact that we invented a technology that assumes that all users are equal when, in fact, a third of users worldwide and a fifth of users in the UK are children. It has been 150 years since we pulled children out of the chimneys and put them into school. Since that time we have fought on their behalf for privileges, protections and inalienable rights that collectively constitute the concept of, and offer a legal framework for, childhood.

Childhood is the journey from infancy to maturity, from dependence to autonomy. We design and mitigate for it in multiple ways across all aspects of society. We educate; we require doctors to obtain additional skills to practise paediatric medicine; we do not hold children to contractual obligations; we put pedestrian crossings near schools; we rate films according to age. Children have special protections around sexual activity. It is illegal for kids to smoke, drink and gamble. We even take steps to protect them in environments where adults smoke, drink and gamble.

In short, we provide a complex but widely understood and respected set of social norms, educational frameworks, regulatory interventions and national and international laws reflecting the global consensus that society as a whole must act in the best interests of the child, in the light of the vulnerabilities and immaturities associated with their age. The digital environment fails to reflect that consensus, and the cost of that failure is played out on the health and well-being of our children.

In setting out this afternoon’s debate, I shall concentrate on three areas: the nature of the digital environment, my concern about the way we conceive online harms and, finally, how we might support children to flourish. For children in the connected world, there is no off or on. Their lives are mediated by technological devices and services that capture infinitesimal detail about their activities, frame the choices available to them and make assumptions—not always accurate—about who they are. Theirs is not a world divided by real and virtual; it is a single lived experience augmented by technology. The vast majority of a child’s interactions are not deliberate decisions of a conscious mind but are predetermined. A child may consciously choose to play a game, but it is machine-engineered Pavlovian reward loops embedded in the game that keep them playing. A child may consciously opt to participate in a social group, but it is the stream of personalised alerts and the engineered measures of popularity that create the compulsive need to attend to that social group. A child may wish to look up a piece of information, but it is the nudge of promoted content and automated recommendation that largely determines what information they receive.

Those predetermined systems are predicated on a business model that profiles users for commercial purposes, yet businesses that sell devices and services in the digital environment deliver them to children with impunity—even though we know that screens eradicate the boredom and capacity for free play that very young children require to develop language, motor skills and imagination; even though we know that a single tired child, kept awake through the night by the hooks and notifications of a sector competing for their attention, affects the educational attainment of the entire class; and even though we know that for teenagers, the feedback loops of social validation and competition intrinsic to social media play an overwhelming role in their state of mind and ability to make safe choices.

The children we work with at 5Rights make the case that it is simply not possible to act your age online. As one young boy said, “Online, I am not a kid but an underage adult”. His Royal Highness the Duke of Cambridge said about the tech sector:

“Their self-image is so grounded in their positive power for good that they seem unable to engage in constructive discussion about the social problems that they are creating”,


including,

“fake news, extremism, polarisation, hate speech, trolling, mental health, privacy and bullying”.

Last year, I was in Africa when a young girl was auctioned as a bride on Facebook. I have sat with the parents of a child bullied to death online. I have been with a young girl at the devastating moment in which she realised that she had been taping sexual acts for a group, not just for the man with whom she thought she was in a relationship. I have been witness to scores of children who have ruined their family life, educational opportunities, reputation and self-esteem through overuse, misuse, misunderstandings and straightforward commercial abuse. An individual child does not, and should not be expected to, have the maturity to meet the social, sexual, political and commercial currency of the adult world.

In December, the Nurture Network, a multidisciplinary group of academics, mental health workers and child development experts, agreed that the three existing agencies of socialisation—family, friends and school—have now been joined by a fourth: the digital environment, an environment of socialisation in which the status of children is not recognised. In an interconnected world, the erosion of the privileges, protections and rights of childhood in one environment results in an erosion of childhood itself.

That brings me to my concerns about how we conceive harms. I will briefly raise three issues. First, our public discourse focuses on a narrow set of extreme harms of a violent or sexual nature. Ignoring so-called “lesser harms” misunderstands that for a child, harms are often cumulative. It fails to deal with the fact that one child will react violently to an interaction that does not harm another, or that vulnerable groups of children might merit specific and particular protection. Crucially, it ignores the fact that for most children, it is the quotidian that lowers their self-esteem, creates anxiety, and inflicts an opportunity cost in which education, relationships and physical and personal development are denuded, rendering children—or, should I say, “underage adults”?—exposed and unprotected. Children’s rights are deliberately conceived as non-hierarchical. We must take all harms seriously.

Secondly, it is not adequate to define children’s experience of the digital environment in terms of an absence of harm. As long ago as 1946, the World Health Organization declared that well-being was,

“not merely the absence of disease or infirmity”.

The NHS defines it as a feeling of “physical, emotional and psychological” well-being. We must set our sights not on the absence of harm but on a child’s right to well-being and human flourishing.

Thirdly, whether we are tackling the problems of live streaming, child sexual abuse, gaming addiction or thinking towards a new world order in which the fridge knows more about your child’s dietary tastes than you do and can exploit that fact, we must not wait until harm has been done but consider in advance the risks that children face. Technology changes fast, but the risks consistently fall into four categories: content risks, both unsuitable and illegal; contact risks, often, but not always, involving an adult; conduct risks, involving risky behaviour or social humiliation; and contract risks, such as exploitative contractual relationships, gambling, aggressive marketing, unfair terms and conditions, discriminatory profiling and so on. Most experts, including many in the enforcement community, consider that upstream prevention based on militating against risk rather than waiting for the manifestation of harm is by far the most effective approach.

There is much we can do. The Minister knows that I am not short of suggestions, but I will finish with a modest list. The digital environment is now indivisible from other environments in which our legal and regulatory arrangements embody our values. Parity of protection has been called for by the NSPCC. It was the approach taken in the Law Commission’s Abusive and Offensive Online Communications: A Scoping Report, and was articulated by the noble Lord, Lord Stevenson, in establishing that the Health and Safety at Work Act 1974 applies equally to artificial intelligence. What plans do the Government have to bring clarity to how our laws apply to the digital environment? Specifically, will the Government bring forward a harmonisation Bill to create an obligation to interpret legislation in a manner that offers parity of protection and redress online and offline, in a similar manner to Section 3 of the Human Rights Act?

Designing out known risk, often referred to as safety by design, is standard across other sectors. We like our brakes to work, our food to be free of poisons and our contracts to be fair in law. The Secretary of State has said that he is minded to introduce a duty of care on the sector. That is very welcome—but to be effective, it must be accompanied by impact assessments, design standards, transparency reporting, robust oversight and a regulator with the full toolkit of persuasion and penalty. Can the Minister confirm that the Government are planning this full suite of provisions?

The age-appropriate design code introduced by this House demands that companies anticipate the presence of children and meet their development needs in the area of data protection. I hope that the Minister will confirm the Government’s determination to produce a robust code across all areas of design agreed during the passage of the Data Protection Act. The code’s safety by design approach could and should be an exemplar of the codes and standards that must eventually form part of an online safety Bill.

Finally, companies make many promises in their published guidelines that set age limits, content rules and standards of behaviour, but then they do not uphold them. It is ludicrous that 61% of 12-year-olds have a social media account in spite of a joining age of 13, that Facebook says that it cannot work to its own definition of hate speech or that Twitter can have half a million pornographic images posted on it daily and still be characterised as a news app. Subjecting routine failure to uphold published terms to regulatory penalty would prevent companies entering into commercial contracts with underage children, drive services to categorise themselves accurately and ensure that companies say what they do, do what they said and are held to account if they fail to do it. I would be grateful if the Minister could confirm that this measure will be included in the upcoming White Paper.

Technology is often said to be neutral, and when we criticise the sector we are told that we are endangering its promise to cure cancer, educate the world and have us experience space travel without leaving our home, or threatening the future prosperity of the nation. Technology is indeed neutral, but we must ask to what end it is being deployed. It could in the future fulfil the hope of its founders and offer the beneficial outcomes for society that we all long for—but not if the price is the privileges, protections and inalienable rights of childhood. A child is a child until they reach maturity, not until the moment they reach for their smartphone.

--- Later in debate ---
Baroness Kidron

My Lords, this has turned into something of a “Today” programme moment, where, having been asked the question, you have no time at all to answer. I am very sorry about that but I thank everybody for their contributions. It has been a hugely interesting debate and very diverse. The one thing that I would like to say in concluding—

Social Media Services

Baroness Kidron Excerpts
Monday 12th November 2018

Lords Chamber

Baroness Kidron (CB)

I thank the noble Lord, Lord Stevenson of Balmacara, for introducing this timely debate and illustrating why it is so important. I also thank him for his kind words. I refer the House to my broad interests in this area.

The statutory duty of care as set out by Will Perrin and Professor Lorna Woods is an important and very welcome prospect. A duty of care is proportionate. The higher the risk, the greater the responsibility of the company to consider its impact in advance. A duty of care is a concept that users themselves can understand. It offers an element of future-proofing, since companies would have to evaluate the risk of a service or product failing to meet the standard of “reasonably foreseeable harm”. It would also ensure that powerful global companies that hide behind the status of being “mere conduits” are held responsible for the safety of the online services they provide. However, a duty of care works only if it applies to all digital services, all harms and all users.

The risk of drawing too narrowly the parameters with which services must comply is highlighted by the provisions of the Digital Economy Act 2017, which sought to restrict children’s access to pornography based on scale and yet failed to bring platforms such as Twitter within scope, despite 500,000 pornographic images being posted daily. Equally, if the duty of care applies to some harms and not others, the opportunity to develop a systemic approach will be missed. Many headlines are preoccupied with the harms associated with content or contact but there is a host of others. For example, behavioural design—otherwise known as “nudge and sludge”—is a central component of many of the services we use. The nudge pushes us to act in the interests of the online service, while the sludge features are those deliberately designed to undermine or obfuscate our ability to act in our own best interests. It is designed to be addictive and involves the deliberate manipulation of free will.

It is also necessary to consider how a duty of care characterises whom we are protecting. We know that children often experience specific harms online differently from adult users. Some categories of people whom we would not consider vulnerable in other settings become targets online—for example, female MPs or journalists. Some harms are prejudicial to whole groups. Examples are the racial bias found in algorithms used to determine bail conditions and sentencing terms in the US, or the evidence that just a handful of sleep-deprived children in a classroom diminishes the academic achievement of the entire class. Of course, there are harms to society as a whole, such as the undeclared political profiling that influences electoral outcomes.

I understand that the proposal for a duty of care policy is still under consideration, but I would be grateful if the Minister would outline the Government’s current thinking about scope, including the type and size of services, what harms the Government seek to address and whether they will be restricted to harms against individuals.

When setting out their safety strategy in 2017, the Government made a commitment that what is unacceptable offline should be unacceptable online. That is an excellent place to start, not least because the distinction between online and offline increasingly does not apply. The harms we face are cross-cutting and only by seeing them as an integrated part of our new augmented reality can we begin to consider how to address them.

But defence against harm is not the only driver; we should hope that the technology we use is designed to fulfil our rights, to enable our development and to reflect the values embodied in our laws and international agreements. With that in mind, I propose four pillars of safety that might usefully be incorporated into a broader strategy: parity, safety by design, accountability and enforcement. Parity online and offline could be supported by the publication of guidance to provide clarity about how existing protections apply to the digital environment. The noble Lord, Lord Stevenson, mentioned the Health and Safety at Work Act, and the Law Commission recently published a scoping report on abusive and offensive online communications.

Alongside such sector-by-sector analysis, the Government might also consider an overarching harmonisation Bill. Such a Bill would operate in a similar way to Section 3 of the Human Rights Act by creating an obligation to interpret legislation in a way that creates parity of protection and redress online and offline to the extent that it is possible to do so.

This approach applies also to international agreements. At the 5Rights Foundation we are supporting the United Nations Committee on the Rights of the Child in writing a general comment that will formally outline the relevance of the convention’s 40-plus articles to the digital environment. Clarifying, harmonising, consolidating and enhancing existing agreements, laws and regulations would underpin the parity principle and deliver offline norms and expectations in online settings. Will the Minister say whether the Government are considering this approach?

The second pillar is the widely supported principle of safety and privacy by design. In its March 2018 report, Secure by Design, DCMS concluded that government and industry action was “urgently” required to ensure that internet-connected devices have,

“strong security … built in by design”.

Minimum universal standards are also a demand of the Department for Business, Energy and Industrial Strategy and the consumer organisation Which?. They are also a central concern of the Child Dignity Alliance technical working group to prevent the spread of images of child sexual abuse. It will publish its report and make recommendations on Friday.

We should also look upstream at the design of smart devices and operating systems. For example, if Google and Apple were to engineer safety and privacy by design into the Android and iOS operating systems, it would be transformative.

There is also the age-appropriate design code that many of us put our names to. The Government’s response to the safety strategy acknowledges the code, but it is not clear that they have recognised its potential to address a considerable number of interrelated harms, nor its value as a precedent for safety by design that could be applied more widely. At the time, the Minister undertook that the Secretary of State would work closely in consultation with the Information Commissioner and me to ensure that the code is robust and practical, and meets the development needs of children. I ask the Minister to restate that commitment this evening.

The third pillar is accountability—saying what you will do, doing what you said and demonstrating that you have done it. Accountability must be an obligation, not a tool of lobbyists to account only for what they wish us to know. The argument made by services that they cannot publish data about complaints, or offer a breakdown of data by age, harm and outcome because of commercial sensitivities, remains preposterous. Research access to commercial data should be mandated so that we can have independent benchmarking against which to measure progress, and transparency reporting must be comprehensive, standardised and subject to regulatory scrutiny.

This brings me to enforcement. What is illegal should be clearly defined, not by private companies but by Parliament. Failure to comply must have legal consequences. What is contractually promised must be upheld. Among the most powerful ways to change the culture of the online world would be the introduction of a regulatory backstop for community standards, terms and conditions, age restrictions and privacy notices. This would allow companies the freedom to set their own rules, and routine failure by a company to adhere to its own published rules would be subject to enforcement notices and penalties.

Where users have existing vulnerabilities, a higher bar of safety by default must be the norm. Most importantly, the nuanced approaches that we have developed offline to live together must apply online. Any safety strategy worth its title must not balk at the complexity but must cover all harms from the extreme to the quotidian.

While it is inappropriate for me to leap ahead of the findings of the House of Lords committee inquiry on who should be the regulator, it is clear that this is a sector that requires oversight and that all in the enforcement chain need resources and training.

I appreciate the Government’s desire to be confident that their response is evidence-based, but this is a fast-moving world. A regulator needs to be independent of industry and government, with significant powers and resources. The priorities of the regulator may change but the pillars—parity, safety by design, accountability and enforcement—could remain constant.

The inventor of the web, Sir Tim Berners-Lee, recently said that,

“the web is functioning in a dystopian way. We have online abuse, prejudice, bias, polarisation, fake news, there are lots of ways in which it is broken”.

It is time to fix what is broken. A duty of care as part of that fix is warmly welcome, but I hope that the Minister will offer us a sneak preview of a much bolder vision of what we might expect from the Government’s White Paper when it comes.

Data Protection Bill [HL]

Baroness Kidron Excerpts
Monday 14th May 2018

Lords Chamber

Lord Ashton of Hyde

My Lords, the main amendments in this group relate to the representation of data subjects by not-for-profit bodies. Last time we discussed this matter, the question before us was whether those bodies should have to seek the mandate—that is, the consent—of data subjects before pursuing claims on their behalf.

As I said then,

“the Government have reflected on the principles at stake here and agree it would be reasonable for a review to be undertaken, two years after Royal Assent, of the effectiveness of”—

Clause 183—

“as it is currently drafted. The Government are fully prepared to look again at the issue”,

of representation without prior mandate in the context of that review.

“We are serious about this. We will therefore amend the Bill in the other place to provide for such a review and to provide the power for the Government to implement its conclusions”.—[Official Report, 10/1/18; col. 287.]


Commons Amendments 122 and 123 duly deliver on that promise, while Commons Amendment 121 allows the Secretary of State to make regulations to ensure that, where a not-for-profit seeks to represent a large number of data subjects in court proceedings, it can file one claim and not hundreds.

I am grateful to the noble Baroness, Lady Kidron, for her continued engagement on this subject. She and I are in total agreement that children merit specific protection in relation to their personal data, and that the review should look accordingly at the specific barriers young people face in exercising their rights. Therefore, Commons Amendment 122 makes provision for that in subsections (4), (5) and (6) of the proposed new clause. Of course, as some noble Lords have mentioned previously, such provision is not to the exclusion of other vulnerable groups in our society, and the Government fully expect that review to consider their position, too.

Commons Amendment 126 would allow Her Majesty’s Revenue & Customs to share contact detail information with the Ministry of Defence to ensure that the Ministry of Defence is better able to locate and contact members of the ex-regular reserve. The amendment does not alter the liability for ex-regular reserves, nor does it affect the rules regarding the call-out or recall of ex-regular reserves; it is simply about being better able to contact them. The security of the United Kingdom is the primary responsibility of government. Commons Amendment 126 offers us the opportunity to strengthen that security.

Finally, Commons Amendment 282 would insert a schedule making transitional, transitory and saving provision in connection with the coming into force of the Bill, including provision about subject access requests, the Information Commissioner’s enforcement powers and national security certificates. This comprehensive new schedule, running to some 19 pages, is designed to ensure a seamless shift between the 1998 Act and the new data protection law we are scrutinising today. I beg to move.

Baroness Kidron (CB)

I thank the Government for listening, and I thank the Bill team, the Secretary of State and the Minister, Margot James. The point is that rights are only as good as one’s ability to enact them, so I really welcome the review and I thank all concerned for the very great care and detail with which they have laid it out in the Bill.

Lord Clement-Jones

My Lords, very briefly, we had considerable debate on whether we should incorporate Article 18(2) while the Bill was going through this House, and we obviously did not prevail. Although this does not go as far as incorporating Article 18(2), which I regret—I would clearly like to see the whole loaf, so to speak—at least this gives the possibility of Article 18(2) being incorporated through a review. Will the Minister say when he thinks the review will be laid, in the form of a report? I am assuming that,

“within 30 months of commencement of the Bill”,

means within 30 months from 25 May this year. I am making that assumption so that we can all count the days to when the report will come back for debate in Parliament.

Data Protection Bill [HL]

Baroness Kidron Excerpts
3rd reading (Hansard): House of Lords & Report: 2nd sitting (Hansard): House of Lords
Wednesday 17th January 2018

Lords Chamber

Baroness Howe of Idlicote (CB)

My Lords, I am pleased to speak to my Amendment 4, which I regard as small but important for the purposes of clarification.

Last month, there was universal support from your Lordships when my noble friend Lady Kidron introduced her excellent amendment on the age-appropriate design code, which is now the subject of Clause 124. At the time, I raised a question about the intention regarding the scope of the amendment, as there is no definition of “children” either in the amendment or in the Bill. I said that, as the amendment refers to the United Nations Convention on the Rights of the Child,

“I assume that the intention is that the age-appropriate design code of practice will cover all children up to the age of 18”.—[Official Report, 11/12/17; col. 1430.]

During the debate, my noble friend Lady Kidron said:

“The code created by the amendment will apply to all services,


‘likely to be accessed by children’,

irrespective of age and of whether consent has been asked for. This particular aspect of the amendment could not have been achieved without the help of the Government. In my view it is to their great credit that they agreed to extend age-appropriate standards to all children”.—[Official Report, 11/12/17; col. 1427.]

I was reassured by this statement about the intent of the clause but I remain concerned that there is no explicit definition in the Bill to indicate that we are indeed talking about any person under the age of 18, especially as the reference to the requirement to engage with the UN Convention on the Rights of the Child in Clause 124(4) is an obligation only to “have regard to”.

The truth is that there is no clear or consistent reference to a child or children in the Data Protection Bill. Clause 9 defines the right of a child to consent to their data’s use and says that this right starts at 13. Clause 201 covers children in Scotland, suggesting that there the right commences at the age of 12. These different approaches open up the door for arguments about the age at which the rights conferred by Clause 124 are operational for children. I would hate us to find ourselves in a position where, once this Bill was passed, a debate began about the ages at which the benefits of Clause 124 applied to children. This could result in a narrowing of the definition of children benefiting from Clause 124 so that it related only to some people under 18, rather than to all those under 18, on account of the Bill not being clear.

Years of experience have taught me that it is best to be crystal clear about what we are talking about, and that is why I have tabled this amendment. If the Government do not think it necessary, I hope the Minister will clearly state in his reply that the Government intend that Clause 124 should indeed relate to all persons under the age of 18. I look forward to hearing what he has to say. I beg to move.

Baroness Kidron (CB)

My Lords, I thank my noble friend for bringing this issue to the attention of the House. It is my understanding that, by invoking the UNCRC, we are talking about children being people under the age of 18. I would very much welcome the Minister’s saying that that extends beyond Clause 124, which we brought forward, to everywhere in the Bill that “children” is mentioned.

Lord Swinfen (Con)

My Lords, can the Minister tell the House at what age the United Nations considers that a child ceases to be a child?

Social Media: News

Baroness Kidron Excerpts
Thursday 11th January 2018

Lords Chamber

Moved by
Baroness Kidron

That this House takes note of the role played by social media and online platforms as news and content publishers.

Baroness Kidron (CB)

My Lords, it is a great privilege to open a debate with such a broad range of informed speakers to follow. The question in front of us produces a number of interrelated and extremely important issues. I shall not attempt to cover them all but, instead, simply to set the scene for the detailed contributions that are to follow.

The interface between humans and information, be it visual, graphic, moving images, sound or text, has a history as long as our own. Our understanding of what to expect from those interactions is seen through the prism of technological innovations, cultural understanding and legal frameworks. It is encapsulated by the concepts of broadcast and publishing.

In this long history, the online service providers are an anomaly. The military and academic labs where the web originated were home to groups of skilled and active participants in an open web who saw the potential of decentralised networked computers as liberating and democratising. This was a physical network; these were academics and computer scientists bound by cables, not commerce. They did not consider themselves publishers, nor responsible for the content of others.

This view was almost immediately contested and overturned by early court judgments, but founders of the nascent platform successfully fought back. Citing the First Amendment, they insisted that their network of small networks had no controlling force and that the occasional misuse or obscenity was a small price to pay for a world with no gatekeepers.

The US “safe harbor” provisions in Section 230 of the Communications Decency Act 1996 allowed online service providers to host, hold and transfer information with no liability for content. This principle was mirrored around the world, including in the e-commerce directive of 2000 that codified online service providers as “mere conduits”. This was Web 1.0.

Much of the internet’s utopian promise came true. But what nobody anticipated, including its founders, was how rapidly it would become highly commercialised. Ironically, the “safe harbor” provisions of Section 230, established to protect the common good from a few dissonant voices, now work against that common good. Those who publish online are incentivised to categorise themselves as online service providers in order to benefit from having no liability for content. It is a commercial advantage that has seen the exponential rise of a vanishingly small number of companies with unparalleled power, no collective oversight and unlimited piles of cash. This is Web 2.0, and it is in that context that we are having our debate.

Amazon has set up a movie studio. Facebook has earmarked $1 billion to commission original content this year. YouTube has fully equipped studios in eight countries. The Twitter Moments strand exists to,

“organize and present compelling content”.

Apple reviews every app submitted to its store,

“based on a set of technical, content, and design criteria”.

By any other frame of reference, this commissioning, editing and curating is for broadcasting or publishing.

In giving evidence to the Communications Committee on 19 December, representatives of Facebook and Google agreed that the vast proportion of their income comes from advertising—87% and 98% respectively. This advertising is embedded in, pops up in between and floats across the content that their users engage with. Sir Martin Sorrell, chief executive of WPP, was clear what that means when he said that,

“Google, Facebook and others are media companies … They cannot masquerade as technology companies, particularly when they place advertisements”.

In common with publishers and broadcasters, these companies use editorial content as bait for advertising. They aggregate and spread the news, and provide data points and key words: behaviours that determine what is most important, how widely it should be viewed and by whom. In common with news publishers, they offer a curated view of what is going on in the world.

The Silicon Valley companies are content creators, aggregators, editors, information cataloguers, broadcasters and publishers. Indeed, severally and together they publish far more media than any other publisher in any other context—but, in claiming to be “mere conduits”, they are ducking the responsibilities that the rest of the media ecosystem is charged with.

The media is understood to be a matter of huge public and social interest because it affects common values, certain freedoms and individual rights. For the same set of reasons, it is subject to a complex matrix of regulatory and legal frameworks. But publishing and, by extension, broadcasting are not only legal and commercial constructs but cultural constructs with operating norms that reflect a long history of societal values and expectations, one of which is that those involved are responsible for content. They are responsible because, traditionally, they make large sums of money; they are responsible because they juggle those commercial interests with editorial interests; they are responsible because, within those editorial interests, they are expected to balance freedom of expression against the vulnerabilities, sensitivities and rights of the individual; and they are responsible because they are a controlling force over the veracity, availability and quality of information that is central to the outcome of our collective civic life.

In November, there was an outcry after a journalist reported that algorithms were auto-suggesting horrific videos to young users of YouTube Kids. Google’s response was not proactively to look at the content on its kids’ channel but to ask users to flag content, thereby leaving it to pre-schoolers to police the platform. Google did not dispute that the videos were disturbing or that the channel would be better off without them, but in its determination to uphold the fallacy of being a “mere conduit”, it was prepared to outsource its responsibilities to children as young as four and five.

Whatever the protestations, this is not a question of free speech; it is a question of money. The Google representative giving evidence to the Communications Committee said that to moderate all content on YouTube would take a workforce of 180,000 people. Irrespective of the veracity of that statement, for a publisher or broadcaster, checking that your content is safe for children is not an optional extra; it is the price of doing business, a cost before profit. In October last year, Google’s parent company, Alphabet, was worth $700 billion.

I am not suggesting a return to a pre-tech era; nor am I advocating censorship. The media environment has never been, and hopefully will never be, home to a homogenous worldview. Nor should one romanticise its ability to “do the right thing”. It is a changing and fraught public space in which standards and taste are hotly contested and often crushingly low. But editorial standards and oversight, retraction, industry codes, statutory regulation, legal liability and parliamentary scrutiny are no hazard to free speech. On the contrary—as information technologies have become ever more powerful, in democracies we demand that they uphold minimum standards precisely to protect free speech from powerful corporate and political interests.

The advances and possibilities of the networked world will always excite and will hopefully, in time, answer some of society’s greatest needs—but these companies occupy a legal space on a false premise, giving them a commercial advantage based on their ability to publish with impunity. That in turn undermines other media, threatens plurality and increasingly contributes to an insupportable cultural environment fuelled by a business model that trades attention for advertising revenue.

Sean Parker, co-founder of Facebook, said that when setting up Facebook the question on the table was:

“How do we consume as much of your time and conscious attention as possible?”.

The answer was that,

“we … give you a little dopamine hit every once in a while, because someone liked or commented on a photo … to get you to contribute more content … It’s a social-validation feedback loop … exploiting a vulnerability in human psychology”.

The hermetic spiral of content ends in ever more polarised views as users become blind to other perspectives, denuding us of a common space. The result is the abuse of public figures and the spread of bullying, hate and misogynistic content at unparalleled levels. The ad revenue model fails to compensate content creators adequately, and we have seen the wholesale collapse of other creative industries, the long-term cultural costs of which we have yet to calculate.

In the battle for our attention we have seen the weaponisation of information to political ends. While nothing new in itself, the commoditisation of political narratives and the lack of accountability have promoted a surge of fake news, locally and internationally funded, and with it comes a democratic deficit. This was frighteningly illustrated by the outcome of a Channel 4 survey last year in which fewer than 4% of people were able correctly to distinguish false news stories from true ones. The cost goes beyond the cultural and political. Our attention is secured by an eye-watering regime of data collection, and with it comes a disturbing invasion of privacy and free will. The insights and potential for social and political control enabled by unfettered data profiling, without redress or oversight, undermine our human rights, our rights as citizens and the need for privacy in which to determine who we are as people.

The appropriation of our personal data is predicated on the use of intellectual property law. The very same companies that rigorously avoid editorial standards and regulatory responsibilities for content are happy to employ the protection of terms and conditions running to hundreds of pages that protect their commercial interests. This cherry-picking of regulatory structures is at best hypocritical. Lionel Barber, editor of the FT, suggests that we “drop the pretence”. A soon-to-be-published paper from a group of industry insiders suggests a new status of “online content provider”, with an accompanying online responsibility Bill and a new regulator. But perhaps, just as the arrival of networked computers led to a new legal status of “safe harbor”, the arrival of networked tech conglomerates requires an entirely new definition, based on the interrelation of society and technology.

Because, while big tech has yet to wake up to the societal responsibilities of its current businesses, the rest of us are hurtling towards Web 3.0: a fully networked world of smart homes and smart cities that will see the big five companies—seven if we include China—monopolise whole sectors and particular technologies, controlling both demand and supply, mediating all our behaviours and yet remaining beyond the jurisdiction of Governments.

We must never forget the extraordinary potential and social good in the technologies already invented and in use, and in those still emerging, including publishing at a grand scale. However, while the internet is young, it is no longer young enough to be exempt from its adult responsibilities. This is no longer an industry in need of protection while it incubates. These are the most powerful companies in the world.

In finishing, I ask the Minister to tell the House whether the scope of the Government’s digital charter will include a review of the legal status of online service providers and an ethical framework for content. Perhaps he will also say whether he agrees with me that the same standards and responsibilities should apply to the media activities of online service providers in parity with other media players. Finally, what steps are the Government taking to create an international consensus for a global governance strategy for online service providers? I beg to move.

Viscount Younger of Leckie (Con)

My Lords, I may sound like a long-playing record, but in this debate we have just a few minutes to spare on timings. I ask that every Back-Bench speech concludes as the clock reaches four minutes, as otherwise the wind-up speeches may have to be shortened.

--- Later in debate ---
Baroness Kidron

I would like to be the first member of the data users’ union proposed by the noble Lord, Lord Knight. I hope that other noble Lords would like to join me. It is an excellent idea. In my excitement at starting the debate, I forgot to declare my interests as set out in the register, including that as founder of 5Rights.

I think it is fair to say that you could characterise our feelings towards online services and social media as a combination of love for someone who is behaving badly, frustration with an 18-year-old who does not quite realise that they have grown up and are supposed to behave differently, and palpable fury at corporate indifference on certain important subjects. However, it is too easy to look at it only that way. All those things are true and I share all those views, and I thank all noble Lords who have spoken in this fantastically interesting and progressive debate. However, what I heard most was our failure to articulate an ethical standard by which we want the industry to behave so that we can then meet it halfway. That was what came out of today’s excellent debate—questions of democracy, accountability, transparency, monopoly, tax regimes, codes of conduct and global consensus on governance. These are matters for society. If we are to have a society that lives by the values we want, we have to show leadership. I say to the Minister that I think the Government are showing leadership, which I welcome. I again thank all noble Lords for their contributions. This has been, by any standard, a wonderful debate.

Motion agreed.

Data Protection Bill [HL]

Baroness Kidron Excerpts
Moved by
109: After Clause 120, insert the following new Clause—
“Age-appropriate design code
(1) The Commissioner must prepare a code of practice which contains such guidance as the Commissioner considers appropriate on standards of age-appropriate design of relevant information society services which are likely to be accessed by children.
(2) Where a code under this section is in force, the Commissioner may prepare amendments of the code or a replacement code.
(3) Before preparing a code or amendments under this section, the Commissioner must consult the Secretary of State and such other persons as the Commissioner considers appropriate, including—
(a) children,
(b) parents,
(c) persons who appear to the Commissioner to represent the interests of children,
(d) child development experts, and
(e) trade associations.
(4) In preparing a code or amendments under this section, the Commissioner must have regard—
(a) to the fact that children have different needs at different ages, and
(b) to the United Kingdom’s obligations under the United Nations Convention on the Rights of the Child.
(5) A code under this section may include transitional provision or savings.
(6) Any transitional provision included in the first code under this section must cease to have effect before the end of the period of 12 months beginning with the day on which the code comes into force.
(7) In this section—
“age-appropriate design” means the design of services so that they are appropriate for use by, and meet the development needs of, children;
“information society services” has the same meaning as in the GDPR, but does not include preventive or counselling services;
“relevant information society services” means information society services which involve the processing of personal data to which the GDPR applies;
“standards of age-appropriate design of relevant information society services” means such standards of age-appropriate design of such services as appear to the Commissioner to be desirable having regard to the best interests of children;
“trade association” includes a body representing controllers or processors;
“the United Nations Convention on the Rights of the Child” means the Convention on the Rights of the Child adopted by the General Assembly of the United Nations on 20 November 1989 (including any Protocols to that Convention which are in force in relation to the United Kingdom), subject to any reservations, objections or interpretative declarations by the United Kingdom for the time being in force.”

Data Protection Bill [HL]

Baroness Kidron Excerpts
Report: 3rd sitting (Hansard - continued): House of Lords
Wednesday 10th January 2018

Lords Chamber
Amendment Paper: HL Bill 74-III Third marshalled list for Report (8 Jan 2018)
Lord Stevenson of Balmacara

My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for his introduction and for paving the way to the comments I want to make. He suggested further reading, but I might be able to shorten the reading list for the Minister, because I am going to cite a bit of what has been sent as part of that package. We went through most of the main issues and had a full response from Ministers the last time this was raised, in Committee. But since then we have of course amended the Bill substantially to provide for a significant amount of age-appropriate design work to be done to protect children who, lawfully or unlawfully as it may be, enter into contractual arrangements with processors of their data.

That data processing will almost certainly be done properly under the procedures here. We hope that, within a year of Royal Assent, we will see the fruits of that coming through. But after that, we will be in uncharted territory as far as younger persons and the internet are concerned. They will obviously be on there, using substantial quantities of data—a huge amount, as one discovers when one sees the bills, notes how much time they spend downloading material from the internet and has to find the wherewithal to provide for them. But I am pretty certain there will also be occasions where things do not work out as planned. They may well find that their data has been misused or sold in a way they do not like, or processed in a way which is not appropriate for them. In those circumstances, what is the child to do? This is why I want to argue that the current arrangements, and the decision by the Government not to take up the derogation provided for in the GDPR under article 80(2), may have unforeseen consequences.

I am grateful to the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for supporting Amendment 175A, and I look forward to her comments later, particularly in relation to children’s use. It is important to recognise that, if a derogation is available and is not taken up, there has to be a good reason for that. The arguments brought up last time were largely along the lines that it would be overcomplicated to have two types of approach and that, in any case, there was sufficient evidence to suggest that individual consumers would prefer to choose their own representation—of course, that argument falls away when we talk about children.

In Amendment 175A, we are trying to recognise two things: first, the right of adults to seek collective redress on issues taken up on their behalf by bodies that have particular skill or knowledge in the area and, secondly, the ability to do this without the need to form an association with an individual or group, or a particular body that has responsibility for it. The two parts of the amendment would provide a comprehensive regime to allow victims of data breaches to bring proceedings to vindicate rights to proper protection of their personal data, always bearing in mind that children will have the additional cover of third-party representation. We hope that there will not be serious breaches of data protection. We think that the Bill is well constructed and that in most cases it will be fine, but the possibility that such breaches will happen cannot be ignored. This parallels other arrangements, including those in the Consumer Rights Act 2015, which apply to infringements of competition law—not a million miles away from where we are here—and for which there is a procedure in place.

To anticipate where the Government will come from on this: first, I think they will say that there is a lot going on here and no evidence to suggest that it would work. I suggest to them that we would be happy with a recognition that this approach is being applied elsewhere in Europe and that there will be a discrepancy if it is not applied in Britain. Secondly, there may be a good case for waiting some time until we understand how the main provisions work out. But a commitment to keep this under review, perhaps within a reasonable time after the commencement of the procedures—particularly in relation to children and age-appropriate design—to carry out a formal assessment of the process and to consider its results would, I think, satisfy us. I accept the argument that doing too much too soon might make this difficult, but the principle is important and I look forward to the responses.

Baroness Kidron (CB)

My Lords, I too want to speak to this amendment, to which I have added my name, and I acknowledge and welcome the support of the Information Commissioner on this issue. I support collective redress for adults, but I specifically want to support the noble Lord, Lord Stevenson, on this question of children.

At Second Reading and again in Committee I raised the problem of expecting a data subject who is a child to act on their own behalf. Paragraph (b) of proposed new subsection (4B) stipulates that,

“in the case of a class consisting of or including children under the age of 18, an individual may bring proceedings as a representative of the class whether or not the individual’s own rights have been infringed”.

This is an important point about the right of a child to have an advocate who may be separate from that child and whose own rights have not been infringed. Children cannot take on the stress and responsibility of representing themselves and should not be expected to do so, nor should they be expected to police data compliance. Children whose data is processed unlawfully or who suffer a data breach may be unaware that something mischievous, harmful or simply incorrect has been attached to their digital identity. We know that data is not a static or benign thing and that assumptions are made on the basis of what is already captured in order to predict future outcomes. That creates the potential for those assumptions to act as a sort of lead boot on a child’s progress. We have to make sure that children are not left unprotected because they do not have the maturity or circumstances to protect themselves.

As the noble Lord, Lord Stevenson, said, earlier this evening the age-appropriate design code was formally adopted as part of this Bill. It is an important and welcome step, and I thank the Minister and the new Secretary of State, Matt Hancock, whose appointment I warmly welcome, for their contribution to making that happen. Children’s rights have been recognised in the Bill, but rights are not meaningful unless they can be exercised. Children make up nearly one-third of all internet users worldwide, but rarely do they or the vast majority of their parents have the skills necessary to access data protection remedies.

The amendment would ensure that data controllers worked to a higher standard of data security when dealing with children’s data in the first place. Rather than feeling that the risk of a child bringing a complaint was vanishingly low, they would know that those of us who advocate for and protect the rights of children were able to make sure that their data was treated with the care, security and respect that we all believe it deserves.

Lord Ashton of Hyde

My Lords, I am very grateful to noble Lords for their comments. Although I have to say at the outset that we have some reservations about these amendments, I think we might be able to find a way forward this evening. I have listened to the noble Lords, Lord Stevenson and Lord Clement-Jones, and taken their remarks on board; I obviously take a lot of notice of what they say but, as you know, familiarity and all that. I have listened especially to the noble Baroness, Lady Kidron, who spoke about children, and we have some experience of her input in this Bill.

The Government are sympathetic to the idea of facilitating greater private enforcement, but we continue to believe that the Bill as drafted provides significant and sufficient recourse for data subjects. In our view, there is no need to invoke article 80(2) of the GDPR, with all the risks and potential pitfalls that that entails. To recap, the GDPR provides for, and the Bill allows, data subjects to mandate a suitable non-profit organisation to represent their interests following a purported infringement. The power will, in other words, be in their hands. They will have control over which organisation is best placed to represent their interests, what action to take and what remedy to seek. The GDPR also places robust obligations on the data controller to notify the data subject if there has been a breach which is likely to result in a high risk to the data subject’s rights and freedoms. This is almost unprecedented and quite different from, say, consumer law where compulsory notification of customers is rarely proportionate or achievable.

These are very significant developments from the 1998 Act and augment a rapidly growing list of enforcement options available to data subjects. That list already includes existing provisions for collective redress, such as group litigation orders, which were used so effectively in the recent Morrisons data breach case, and the ability for individuals and organisations to independently complain to the Information Commissioner where they have concerns about how personal data is being processed.

What these initiatives have in common is that they, like the GDPR as a whole, seek to empower data subjects and ensure they receive the information they need to enforce their own data rights. By comparison, Amendments 175 and 175A would go much further. I stress that, as I have already said, we are not against greater private enforcement, and I have borne in mind the points the noble Baroness made about children. We also have reservations about the drafting and purpose of these amendments, all of which I could of course go through at length, if the House wishes, but in view of what I am about to say, I hope that will not be necessary.

Since Committee, the Government have reflected on the principles at stake here and agree it would be reasonable for a review to be undertaken, two years after Royal Assent, of the effectiveness of Clause 173 as it is currently drafted. The Government are fully prepared to look again at the issue of article 80(2) in the context of that review. We are serious about this. We will therefore amend the Bill in the other place to provide for such a review and to provide the power for the Government to implement its conclusions.

In view of that, I would be very grateful if the noble Lord would withdraw his amendment this evening and other noble Lords would not press theirs.

Baroness Kidron

Before the Minister sits down, may I have an absolute assurance from him that this is not being pushed into the future, where it will languish? Will the Government be looking to this review to actually solve the problem that we have put forward on behalf of children?

Lord Ashton of Hyde

It absolutely will not and cannot languish, because we are going to put in the Bill—so on a statutory basis—that this has to be reviewed in two years. It will not languish. As I said, if we were just going to kick it into the long grass, I would not have said what I just said, which everyone can read. We would not have put it in the Bill and made the commitments we have made tonight.