Freedom of Expression (Communications and Digital Committee Report)

Thursday 27th October 2022

Lords Chamber
Baroness O'Neill of Bengarve (CB)

My Lords, this is a rich, detailed and informative report, yet its focus on freedom of expression has perhaps pushed one underlying issue to the margins. Nowadays, we often use the term “freedom of expression” as though it were a synonym for freedom of speech, but communication involves two parties—not merely those who express themselves, the originators, but also the recipients. The shift has been a feature of 20th-century discussions: when human rights documents moved from free speech to freedom of expression, perhaps we did not notice that this marginalises the position of recipients and privileges originators. In short, there is a difference between expression and communication. Freedom of expression is not enough for a democratic culture in which free communication is respected and required.

As we well know, new communications technologies have often fundamentally disrupted communication. We need only think back to what Plato tells us of Socrates' misgivings about writing to realise how old this phenomenon is. Similar things happened with the advent of printing and then, of course, of broadcasting. The remedies were often extremely slow, which is a salutary lesson for us in contemplating the recommendations of this report. How fast could they be implemented? How much of a change would they achieve?

This time, as I mentioned, we have new technologies that privilege the originators and expand their freedom of expression—at least in theory. That is no bad thing, but it might leave the recipients in a problematic position, receiving content from they know not where or whom. That is where the problem begins: we do not know who the originators of this communication are, and very often that is a source of difficulty.

Unsurprisingly, some norms and standards that have mattered greatly for communication will be ignored if we are thinking mainly about freedom of expression. Norms that can be ignored might include—this is just a smattering; there are many others—honesty, accuracy, civility, reliability and respect for evidence. I could go on. Noble Lords will note that they are not only ethical but epistemic norms. These are the bedrock of good communication.

So, stressing the rights of originators too much is likely to land us with some difficulty. Digital communication empowers originators, and this can be at the expense of recipients. Let us remember that some of the originators are not you, me and our fellow citizens seeking to express ourselves, but tech companies, data brokers and other actors in the digital space who relish the thought that they have freedom of expression, because it enables them to do things they perhaps ought not to do.

It follows that remedying the situation will be multiply difficult and probably slow, but the one thing it must not be is a set of remedies that protect originators at the expense of recipients. Remedies must concentrate on removing the cloak of anonymity that currently protects so many originators and ensuring that what they do can be seen to be something they did. That means removing anonymity from the tech companies, the data brokers and indeed the many other sources that are polluting communication at present.

I suppose that this empowers some originators, but I doubt whether concentrating on those will get us there. The important thing is to regulate data brokers, tech companies, Governments and cartels: those who pollute the online space.

Online Harms

Monday 8th April 2019

Lords Chamber
Baroness O'Neill of Bengarve (CB)

My Lords, I too welcome this White Paper. We have heard it heralded from the Front Bench week after week, and it is great to see it arrive. However, it deals with only part of the problem. That is, it is a paper about the private harms that may be done—for example, by cyberbullying, fraud or extremist material. All of those matter, but there is another set of harms: harms to public goods, democracy, culture and the standards of the media. The Digital, Culture, Media and Sport Committee in the other place recently published an interesting report on disinformation and fake news which discussed some of those harms, including those which I can loosely indicate by referring to the Cambridge Analytica scandal.

We are beginning to understand that there are people campaigning within democracies whom our regulation cannot reach; the Electoral Commission cannot reach those harms. Is the proposal to reach those harms as well, or is that for another day? I fear that if we do not deal with them relatively soon, we will regret it. Political campaigning may be undertaken not only by legitimate, registered political parties and individuals but also by non-citizens, businesses, other states and the security apparatuses of those states. I believe these public, online harms to democracy should be of the utmost concern to us, but they are little discussed in this White Paper.

Lord Ashton of Hyde

My Lords, I agree that those are serious issues and need to be addressed. We have made clear in the White Paper which harms are in scope, but have also been very open about those that are not. We have said that we are addressing some of the really serious issues on the internet which the noble Baroness describes as private harms. We have said that we cannot deal with everything, but we are dealing with matters such as disinformation and potential assaults on democracy. We do not want to duplicate within one big White Paper, followed by legislation, all the harms connected to the internet. We have said that we are not dealing with competition law, intellectual property violation, fraud, data protection and so on, but I absolutely accept that they are very important issues. The Cabinet Office is due to report on them soon, and it is right that that department, which has responsibility for the constitution, should be dealing with them. We have not neglected those problems.

Data Protection (Charges and Information) (Amendment) Regulations 2019

Monday 18th February 2019

Lords Chamber
Baroness Ludford (LD)

My Lords, I just want to add to what my noble friend Lord McNally has said. I am glad that this matter is being cleared up, because we had very confusing advice a few months ago. I also want to note, as one of the people who was involved in the European Parliament’s proceedings on the GDPR, that it is a UK decision to impose a fee on data controllers. The mandatory requirement was removed from the GDPR, and it is a unilateral UK decision to fund the ICO in this way so that, in effect, data controllers in the UK will not feel the change which perhaps will be felt by data controllers in other EEA states, where Governments make a decision to fund their data protection authorities from, for instance, general taxation. I realise that that decision was made in the Digital Economy Act rather than in last year’s Data Protection Act, but it is imposed not by Brussels but by Whitehall and Westminster.

Baroness O'Neill of Bengarve (CB)

My Lords, these amendments represent a little island of calm in a turbulent ocean. For once, I am referring not to Brexit or the backstop but, rather, to the fact that we are in the middle of some very turbulent changes in our regimes for the protection of data and privacy and many other aspects of communication. This morning, we saw the publication of the report of the Digital, Culture, Media and Sport Committee of the other place on disinformation and “fake news”. In so far as I have got into the report—which is not very far—it is very welcome in that it represents a much broader view of the threats to democracy from the present regime for controlling the use of data. There is much more to be said, and I hope that the Minister will be able to say something about the ways in which the broader picture will be taken into account. These amendments do not need changing because of the broader picture, but it is curious to fiddle with the small stuff when such major and serious issues are happening in this domain.

Lord Griffiths of Burry Port (Lab)

My Lords, this seems a sensible measure and the issues have been well rehearsed. There was one area where there was some confusion in my mind, and I hope that noble Lords will not mind my bringing it to their attention now. I, too, am looking forward to not having to pay £40—that is good news, but in exempting Members from both Houses, candidates and so on from the need to pay that charge, we recognise that many of us have other duties and obligations not related to our being Members of this House. We are in employment, we run things and so on, and we handle people’s data other than in the sense that has been described. I guess they will have to pay their £40 or whatever it is, but my confusion lies in the hinterland between those two modes of operation: information gained in respect of activities of one kind can without too much imagination become useful in respect of those of another kind. I wonder whether some thought has been given to handling that kind of confusion and, if so, how. It would be helpful if the Minister could say something about that; otherwise, this seems like common sense and we would have no hesitation in wanting it to go forward.

Children and Young People: Digital Technology

Thursday 17th January 2019

Lords Chamber
Baroness O'Neill of Bengarve (CB)

My Lords, the topic of this debate is often understood in ways suggesting that what is at issue is either a generic problem with the use of online technologies or a more specific problem arising from the use of social media. I declare an interest as someone who does not use social media, but whose life is greatly dependent on digital technologies. We are mistaken if we focus excessively on social media.

The generic problems for children and young people are said to include too much screen time, loss of sleep, educational damage, less social life and less exercise—I agree. The specific problems of social media use by children and young people, though not by them only, are said to include a lurid list that runs from cyberbullying and loss of privacy to exposure to pornography and extremist propaganda, and much more. I agree entirely that each of these can damage young people; for that matter, they can damage older people as well. But the tech companies may, even if pretty belatedly, conclude that failure to curb these harms is damaging their reputations and commercial interests, so they will do more to prevent these harms. How successful they will be remains to be seen. So far, moves to take down harmful material have not been wholly successful.

However, these may not be the most damaging harms done by digital technologies and, more specifically, not the harms which most damage young people. The harms I have mentioned are all private harms in the economist’s sense of the term: they are harms suffered by individuals who are bullied or whose privacy is invaded, or whose education is damaged. There is a second range of less immediately visible harms that arise from digital media. These are public harms that damage public goods, notably cultures and democracy.

There is a large and growing body of knowledge about ways in which digital technologies are used to subvert democratic processes, including elections and referenda. It has happened in many jurisdictions. Such use of technology is cheap and its influence can be purchased and peddled by those who are not citizens, including corporations and states, among them hostile states and their intelligence services. Moreover, it can be done anonymously. Our electoral law, which regulates party-political expenditure during campaigns, is pathetically inadequate for dealing with hidden digital persuasion.

Equally, there is now substantial evidence of the use of digital technologies to undermine the reliability of news and information. This has often been hailed as a point of pride. When Mr Mark Zuckerberg first propounded his now infamous slogan “Move fast and break things”, one may assume that he took it that everything that would be broken would be something unjust and exclusionary that obstructed the dissemination of knowledge and information to the public. In the event, the digital revolution has swept away not only the wicked intermediaries, the censors, but essential intermediaries without whom we would have no serious journalism, no editorial judgment nor reliable ways of telling whether we were encountering fake news or the real thing. The wholesale destruction of intermediaries is a form of cultural vandalism, damaging to all but the perpetrators and the hidden persuaders, and in particular to young people.

I am all for pursuing the agenda of protecting young people from the private harms inflicted by uses of digital or of other technologies, but I think that we short-change the next generation if we do not protect them also from the public harms that such technologies enable. Protection from them will be far more difficult, I suspect, because it will not be in the commercial interest of the big tech companies that organise the data obtained from many sources—not, by the way, always social media—and package it for sale to those who pay to target specific groups for political and commercial purposes cheaply and, once again, anonymously.

In September 2018, Sir Tim Berners-Lee expressed his disappointment about what has happened to the web in these words:

“I’ve always believed the web is for everyone. That’s why I and others fight fiercely to protect it. The changes we’ve managed to bring have created a better and more connected world. But for all the good we’ve achieved, the web has evolved into an engine of inequity and division; swayed by powerful forces who use it for their own agendas”.


We have been warned.

Centre for Data Ethics and Innovation

Wednesday 21st November 2018

Lords Chamber
Lord Ashton of Hyde

The board of the centre will be able to cope with whichever way round the wording is. It will deal with the balance and the tensions between ethics and innovation—and indeed innovation and ethics.

Baroness O'Neill of Bengarve (CB)

My Lords, yesterday evening the British-American Parliamentary Group and Ditchley met to discuss these topics. It was an interesting meeting, but it did reveal how readily innovation drives ethics. I say this as an academic philosopher, and it is quite important. The innovation questions are of great importance, but they are not the only questions, and I hope that steps will be taken to ensure that there is suitable rigour in the analysis of the ethical issues. The debate is full of pitfalls and inadequacies, including phrases such as “communication ethics” and “data ethics”, which ultimately mean nothing. Ethics is about what you do: it is not about data and communication. So I hope that there will be room for that sort of rigour on this advisory—and ultimately statutory—body.

Lord Ashton of Hyde

I completely agree with the noble Baroness. In dealing with modern technology, we often forget the very important point she makes. Ethics is about how you live your life and deal with things in a way that has a moral basis. I absolutely accept that, in dealing with modern technology and especially things such as AI, ethics is a very important component. That is precisely why the centre's board includes not just technical people but parliamentarians and professional philosophers, to make sure that those aspects are given sufficient weight.

The Politics of Polling (Political Polling and Digital Media Committee Report)

Tuesday 3rd July 2018

Lords Chamber
Baroness O'Neill of Bengarve (CB)

My Lords, the report of the Select Committee on Political Polling and Digital Media, of which I had the honour to be a member, was intended to address one pretty urgent but relatively well-defined topic and then one less well-defined topic, which to me is probably even more important. The urgent topic was to inquire into why the polling organisations provided estimates which in the event turned out not to be as accurate as had been expected in two general elections and the referendum campaign. That was very well defined. The less well-defined topic concerned the role of digital media in political campaigning. So, the remit was actually quite complicated and the Select Committee rather short-lived. For that reason, I am particularly grateful to our chair—the noble Lord, Lord Lipsey—the noble Baroness, Lady Jay, and the clerks for handling a very complicated set of topics that did not entirely gel.

As the report makes clear, the committee concluded that, in the main, problems of recent political polls were probably not due to deficiencies in the conduct of polls by polling companies. That is solid and reassuring, but it is not a reason for complete satisfaction because we also reported that pollsters were encountering greater reluctance to respond, public confidence in what polls report was declining and there were considerable problems with the use of polling results by parts of the media.

The report's recommendations address some of these issues. They include greater co-ordination between the industry, the professional body—the British Polling Council—and the Market Research Society, as well as between the Electoral Commission and media regulators. They are measured and proportionate suggestions, and it is good to see that the Government are taking them fairly seriously. However, the recommendations do not address the wider issues raised by the spread and power of digital media that bear on political polling. I think that this is because we found the evidence patchy and difficult to assess in the brief time available. Indeed, in some cases witnesses suggested in evidence taken in private that matters were worse than they would, or perhaps could, say in public.

As the topic is vast, I will speak only about a few relevant matters. First, digital media include social media but not all digital media are social media. That is fundamental. Social media content is posted by individuals and controversy arises at two points. The first, better-known issue is that content posted by individuals may mislead or harm. Your Lordships’ House has had considerable opportunities to discuss some of the harm that can be done to individuals by certain uses of social media, such as fraud, cyberbullying, trolling, defamation and many more.

Of course, such action also goes on without the support of digital technologies and is usually criminalised. The difference and the difficulty with content posted on social media by individuals is that it may be posted anonymously, so sanctions are very hard to impose. There is a big debate to be had about the effects of social media use that targets individuals and the limits of arguments for permitting anonymously posted content. Anonymity is often supported with claims that it is needed for whistleblowing. That is incorrect; I think that confidentiality is much more relevant than anonymity to whistleblowing, if you want it to work. The second argument for anonymity is the need to report news under oppressive regimes. Thank God we are not facing that, so it is hardly an argument here for permitting anonymity, whatever the communication. The rise of anonymous posting is in itself a social phenomenon about which we need to think intensively and urgently.

The second way in which the use of social media can lead to harm is when posted content is organised to reach some but not others, thereby exerting some control over what individuals receive. Targeted advertising and messages may shape the content that individuals receive and can thereby add or limit content that supports—or, alternatively, seeks to undermine—a given cause. We did not obtain any solid evidence of the extent to which the content that individuals receive has been subject to control or influence. That was one of the big gaps in our evidence. Evidently, if we imagine a wide-open conversation of mankind, we can tell ourselves that the more voices are included, the better—for social life and democracy. However, if the spectrum of choices or positions that are heard is being shaped by other considerations and is often selected to support a cause, or limit support for another cause, then fundamental questions arise about the feasibility of democracy in the age of social media, and now of digital media that are not social media.

There is one more effect that social media have. Social media also monetise the data that individuals supply by using those data to organise and target advertising—by which, of course, the companies secure their revenue. Once again, there are legitimate reasons for concern. There is no reason to suppose that the content that is distributed by social media will secure any even or unbiased distribution of information or evidence to electors. In fact, we have good evidence of the contrary happening, although I think not yet evidence of the scale, the effects or the effectiveness with which this is happening. We just know from some empirical studies that there is uneven distribution of content. These, I think, are reasons why the report could not offer a more systematic account of the effects of digital media, especially social media.

However, digital media go further. Digital media include not just social media but other digital enterprises where the content is not posted by individuals; it is made available by organisations; created, we may say, by organisations; and, indeed, invented by organisations sometimes. Some of these organisations, of course, have clear political purposes, including, very frequently, undisclosed and sometimes malign purposes. It is often hard to detect the source or the allegiance of digital media. Here, a blogger may be indistinguishable from a journalist and probably calls himself or herself a journalist. Here, discipline, let alone credentials, may be wholly absent. Here, there is no editor. Yet we talk about digital media as though they consist of professional journalists who are disciplined by editors who seek to provide reliable content for others.

We talk about digital technologies as if they can be regulated. This may be the deepest of our difficulties. It is often said these days that what we need to do with digital media is to make sure that they are not treated as platforms but as publishers. If they were publishers they would, for example, be subject to the law on defamation, to take one simple example. As platforms, they are not. Nor, of course, are the individuals who post stuff anonymously subject to the law of defamation. This is an extraordinary escape from legal and regulatory discipline. Can it be remedied? Until about a year ago I thought so.

I think we face two major obstacles in addressing what digital media can do. One is the jurisdictional problem. It is extraordinarily easy for these technologies to shift their supposed location: they have very little fixed infrastructure and they can move, as we see by the fact that they pay so little tax. They can move their headquarters where they choose. If we seek to regulate them, it is quite likely that they will find more convenient jurisdictions in which to operate. The other reason why I suspect they cannot be regulated as publishers is that being a publisher is, as many of us know, pretty arduous. You have to read the stuff. There is too much, however, that is posted; they could not carry out the due diligence that is the daily work of publishers.

We talk as if we still lived in a world in which journalism can be reliably distinguished from self-expression, in which political advertising can be identified by seeing who paid the bill. I think that is given the lie by the fact that what we are actually regulating is the paid-for advertising of the political parties during election campaigns, a very narrow form of control when all sorts of other things are going on. I do not think it will prove viable for much longer to regulate only advertising by political parties during election campaigns and to turn a blind eye to all the other advertisers using the same technologies and spreading what they choose to spread. Political persuasion is now cheap and it can be done by those who have no business doing it. We are all aware that the mighty Facebook apparently did not realise that it was hosting political advertisements that had been funded from Russia. I think that is a warning call for all of us. If we are to retain democracy we have to find ways of detecting and ending practices of this sort.

Cambridge Analytica

Monday 19th March 2018

Lords Chamber
Lord Ashton of Hyde

My Lords, I want to put on the record that we absolutely agree with the noble Baroness that if these allegations—and at the moment they are allegations—are correct, that will be truly shocking. The new Data Protection Bill will bring forward stronger enforcement powers, and, as we have said, we might strengthen them even further. It is very important to consider that some people have said that the powers in the new Data Protection Bill are too burdensome. That shows exactly why we need strengthened individual data subjects’ rights and the means to protect them. The privacy of individual data subjects must be taken extremely seriously, and the Bill will do that. Of course, the Information Commissioner will certainly take seriously any links that she finds between any data breaches and elections, and I confirm to the noble Baroness that we will too.

Baroness O'Neill of Bengarve (CB)

My Lords, the Minister has, very understandably, spoken as though the problem that we are addressing is breach of privacy, and that is of course what data protection legislation is intended to address. However, does he not think that new uses of data, including personal data, by digital media and specifically by social media are circumventing the way in which we would like elections to be conducted and enabling data use that is not merely a breach of privacy but contrary to the public interest?

Social Media: News

Thursday 11th January 2018

Lords Chamber
Baroness O'Neill of Bengarve (CB)

My Lords, my noble friend Lady Kidron has introduced a debate that is not just timely but urgent. It is different from earlier debates and discussions we have had, which focused largely on social media issues—although I know there has been a great deal of discussion of social media today, and those issues are not something one can take lightly. The wanton or malicious uses of digital technologies, particularly social media, can spread content that harms other individuals. The list is very long: cyberbullying, fraud, grooming, trolling, extreme pornography and endless sorts of breaches of privacy and confidentiality.

However, today I am going to focus not on harms that individuals may do to other individuals using these technologies but on ways in which digital technologies may spread content that harms public culture, and thereby civic and civilised life—and, ultimately, democracy itself.

I have time to mention only a few examples. First, there is the harm to electoral process and public debate. In this country we regulate expenditure on advertising by political parties during elections quite closely, but advertising by others and disseminating content that is not labelled as a political advertisement—whether by individuals, corporations or foreign states—is unregulated. This used not to be a problem. Such advertising was unlikely, it was costly and it could not effectively be provided from afar—but this has changed. Some noble Lords will remember the lurid and mendacious material that was “hosted”—in the pretty vocabulary that is used—online on websites run from Macedonian villages which were provided with particularly provocative and damning content during Mr Trump’s election campaign. Digital content can be algorithmically distributed without any indication of provenance and without any means of complaint, redress or correction over any distance and at very low cost compared with traditional advertising. The present situation makes a mockery of our tight regulation of party political expenditure on elections. The committee of the noble Lord, Lord Bew, might want to look at this one.

The second example concerns the debate about publishers and platforms. It is an important debate and we can all see why those who run online platforms are not always in a good position to exercise the responsibility of publishers. That is, as it were, their get-out card. However, what they are doing is hosting a large amount of anonymously posted content, resulting in irresponsibility at two levels: at the level of the platform and at the level of the individual who posts content. Is this acceptable? Well, people always invoke the argument of free speech, which we should take seriously. There may be a good case for protecting anonymous postings on matters of public interest under repressive regimes. That is a much-cited special case, but it is just that: a special case. There is no generic case for exempting from accountability those who post content anonymously or for protecting them if they damage, defame or discredit others, reveal personal information and the rest of it.

I think that democracy will fail if we find that when we talk about public affairs, what is going on is the equivalent of hiding behind hedges in order to throw stones more effectively. There is perhaps a case for holding online platforms responsible in the way that publishers are responsible, if not for all the content they carry then at least for any content posted without a verifiable indication of its source. That verifiable indication would mean that the individual carried liability, which would be better than the present situation.

The third example is that of monopoly providers, about which others have spoken. This is a serious issue because these are enormous companies and, given that they are digital intermediaries, they can shift jurisdictions very quickly. This has wide and deep effects on public culture. We need to think very hard about anti-monopoly provisions in this area.

Data Protection Bill [HL]

Lord Stevenson of Balmacara (Lab)

My Lords, I shall speak to Amendment 153ZA in my name and that of my noble friend Lord Kennedy of Southwark. I support the amendment tabled by the noble Lords, Lord Clement-Jones and Lord Paddick, which is important. We look forward to hearing what the Minister says in response.

Our amendment is in two halves. The first probes the question of what happens in cases where the data controller relies on derogations or limitations provided for under the GDPR that have been brought, directly or indirectly, into UK law through the existence of the GDPR after 25 May 2018 or through secondary legislation, whichever is appropriate. It asks whether there is a need for a bit more guidance on the commissioner's duties, in that she may wish to look at the proportionality of such reliance by the data controller—in other words, whether it is appropriate relative to the overall aims and objectives placed on the data by the data controller—and whether it is appropriate under the GDPR or its subsequent limitation or derogation. It also asks whether adequate systems are in place to make sure the rights of data subjects are safeguarded. This may seem to be gold-plating, but it is important to understand better how the mechanics of this work in practice. These are very important issues.

The second part returns to an issue we touched on earlier in Committee, but about which there is still concern. We have again had representations on this issue. The amendment is framed as a probing amendment, but it comes back to familiar territory: what will happen in later stages of the life of the Bill as we leave the EU and are required to make sure our own legislative arrangements are in place? At present, the GDPR has an extraterritorial application so that even when companies are not established in the EU they are bound by the GDPR where they offer goods or services to EU citizens or monitor their behaviour. As well as requiring that lawful processing of data is not excessive, data controllers are required to keep data secure.

So far, so good. The important point is that under the GDPR at present—there is no derogation on this—it is necessary for such companies to make sure they have what is called a representative in the EU. This would be a physical office or body, staffed so that where EU citizens wish to take up issues that affect them, such as whether the data is being properly controlled or whether it has been processed legally, contact can be made directly. But under the Bill as I understand it, and I would be grateful if the Minister could confirm what exactly the situation is, after the applied GDPR comes in the requirement for a company to make sure it has a representative in the UK—in the GDPR, it is for a company to have a representative in the EU—will be dropped. If that is right, even if the operating company is well respected for its data protection practices or is in good standing as far as the EU is concerned, any individual based in the UK would obviously have much more difficulty if there is no representative: they would face different foreign laws and would probably have to rely on an intermediary who may not see non-nationals as a sufficiently high priority. If things do not work out, the individual may have to have recourse to law in a foreign court. This will make it very difficult to enforce new rights.

Is it right that the Government will not require foreign companies operating in the UK after Brexit to have a representative? If it is, how will they get round these problems? I look forward to hearing what the Minister says on these points.

Baroness O'Neill of Bengarve (CB)

My Lords, I have a question about proposed new subsection (2) in Amendment 153, which says that,

“personal data must not be processed unless an entry in respect of the data controller is included in the register”.

That goes a certain distance, but since enormous amounts of personal data in the public domain are not in the control of any data controller, it is perhaps ambiguous as drafted. Surely it should read, “Personal data must not be processed by a data controller unless an entry in respect of the data controller is included in the register”. If that is the intention, the proposed new clause should say that. If it is not, we should recognise that controlling data controllers does not achieve the privacy protections we seek.

Baroness O'Neill of Bengarve (CB)

Subsection (2) of Amendment 153:

“Subject to subsection (3), personal data must not be processed unless an entry in respect of the data controller is included in the register maintained by the Commissioner”.


That would be an adequate formulation if all the personal data being processed was within the control of some data controller. Since much of it is not, the drafting does not quite meet the purpose.

Lord Ashton of Hyde

My Lords, I am grateful to the noble Lords for introducing these amendments. Perhaps I may begin by referring to Amendment 153. The requirement set out in the Data Protection Act 1998 for the Information Commissioner to maintain a register of data controllers, and for those controllers to register with the commissioner, was introduced to support the proper implementation of data protection law in the UK and to facilitate the commissioner's enforcement activity. At the time when it was introduced, it was a feasible and effective measure. However, in the intervening 20 years, the use of data in our society has changed beyond all recognition. In today's digital age, in which an ever-increasing amount of data is being processed, there has been a correspondingly vast increase in the number of data controllers and the data processing activities they undertake. There are now more than 400,000 data controllers registered with the Information Commissioner, a number which is growing rapidly. The ever-increasing amount and variety of data processing means that it is increasingly difficult and time-consuming for her to maintain an accurate central register giving details of the wide range of processing activities that controllers undertake.

The Government believe that the maintenance of such an ever-growing register of the kind required by the 1998 Act would not be a proportionate use of the Information Commissioner’s resources. Rather, as I am sure noble Lords will agree, the commissioner’s efforts are best focused on addressing breaches of individuals’ personal data, seeking redress for the distress this causes and preventing the recurrence of such breaches. The GDPR does not require that a register similar to that created by the 1998 Act be maintained, but that does not mean there is a corresponding absence of transparency. Under articles 13 and 14 of the GDPR and Clauses 42 and 91 of the Bill, controllers must provide data subjects with a wide range of information about their processing activities or proposed processing activities at the point at which they obtain their data.

Nor will there be absence of oversight by the commissioner. Indeed, data controllers will be required to keep records of their processing activities and make those records available to the Information Commissioner on request. In the event of non-compliance with such a request, the commissioner can pursue enforcement action. The only material change from the 1998 Act is that the Information Commissioner will no longer have the burden of maintaining a detailed central register that includes controllers’ processing activities.

I turn now to Amendment 153ZA which would give the Information Commissioner two new duties. The Government believe that both are unnecessary. The first new duty, to verify the proportionality of a controller’s reliance on a derogation and ensure that the controller has adequate systems in place to safeguard the rights of data subjects, is unnecessary because proportionality and adequate safeguards are core concepts of both the GDPR and the Bill. For example, processing is permissible only under a condition listed in Schedule 1 if it is necessary for a reason of substantial public interest. Any provision to require the commissioner to enforce the law is at best otiose and at worst risks skewing the commissioner’s incentives to undertake enforcement action. Of course, if the noble Lord feels that the Bill would benefit from additional safeguards or proportionality requirements, I would be happy to consider them.

The second new duty, to consult on how to support claims taken by UK residents against a data controller based in another territory who has breached their data protection rights, is in our view also unnecessary. As made clear in her international strategy, which was published in June, the Information Commissioner is very aware of the need for international co-operation on data protection issues, including enforcement. For example, she is an active member of the Article 29 Working Party and the Global Privacy Enforcement Network, and her office provides the secretariat for the Common Thread Network, which brings together Commonwealth countries’ supervisory authorities. Only last month, her office led an international sweep of major consumer websites, in which 23 other data protection regulators from around the world participated. Clause 118 of the Bill and article 50 of the GDPR require her to continue that important work, including through engaging relevant stakeholders in discussion and activities for the purpose of furthering international enforcement. Against this background, the Government do not feel that additional prescriptive requirements would add value.

--- Later in debate ---
Lord Puttnam (Lab)

My Lords, I support this amendment and identify myself totally with the remarks of the noble Lord, Lord Clement-Jones. I am trying to be practical, and I am possibly even pushing at an open door here. I have a facsimile of the 1931 Highway Code. The introduction by the then Minister says:

“By Section 45 of the Road Traffic Act, 1930, the Minister of Transport is directed to prepare a code of directions for the guidance of road users … During the passage of the Act through Parliament, the opinion was expressed almost universally … that much more could be done to ensure safety by the instruction and education of all road users as to their duties and obligations to one another and to the community as a whole”.


Those last few words are very important. This must be, in a sense, a citizens’ charter for users—a constantly updated notion—of the digital environment to be sure of their rights and of their rights of appeal against misuse. This is exactly where the Government have a duty of care to protect people from things they do not know about as we move into a very difficult, almost unknown digital environment. That was the thinking behind the 1931 Highway Code, and we could do a lot worse than do something similar. That is probably enough for now, but I will undoubtedly return to this on Report.

Baroness O'Neill of Bengarve (CB)

My Lords, I support the spirit of this amendment. I think it is the right thing and that we ultimately might aspire to a code. In the meantime, I suspect that there is a lot of work to be done because the field is changing extremely fast. The stewardship body which the noble Lord referred to, a deliberative body, may be the right prelude to identifying the shape that a code should now take, so perhaps this has to be taken in a number of steps and not in one bound.

Baroness Hamwee (LD)

My Lords, I too support the amendment. Picking up this last point, I am looking to see whether the draft clause contains provisions for keeping the code under review. A citizens’ charter is a very good way of describing the objective of such a code. I speak as a citizen who has very frequently, I am sure, given uninformed consent to the use of my data, and the whole issue of informed consent would be at the centre of such a code.

Data Protection Bill [HL]

Baroness O'Neill of Bengarve (CB)

I thank the Minister for giving way. Is he suggesting that the aim should be to adapt children to the realities of the online world and the internet service providers, rather than to adapt the providers to the needs of children?

Lord Ashton of Hyde

I am not an expert on education, but I do not think that “adapting” children is a recognised educational aspiration. We are trying to make children aware of the issues involved in the online world. We all accept that they are technically skilful, but they may not have the maturity to make the right decisions at certain times in their lives. As I said, we are trying to pitch it so that, as children develop, they are introduced to different things along the way. I hope that that answers the noble Baroness.

We are working with social media and technology companies, subject experts, law enforcement, English schools and teaching bodies to ensure these subjects are up to date with how children and young people access content online and the risks they face. We will also consider how best to support schools in the delivery of these new subjects. It is important to note that education on data processing does not exist in a vacuum but is viewed as a part of a wider programme of digital learning being promoted to improve user awareness of online safety and build digital capability. As such, we think that legislation focusing solely on data processing would risk detracting from the broader issues being tackled.

I am grateful to noble Lords for their amendment: it has prompted an interesting debate and raised issues which have gone beyond data protection, on which of course we are concentrating in the Bill. I hope that I have reassured the noble Lord that the Government take the issue of educating young people seriously, particularly in data protection matters. Not only do they already feature in the curriculum but we are considering how we might strengthen this teaching as a key part of our wider online safety work. With that reassurance, I hope that the noble Lord will feel able to withdraw the amendment.