Debates between Lord Whitty and Lord Ashton of Hyde during the 2017-2019 Parliament

Monday 13th November 2017
Data Protection Bill [HL]
Lords Chamber

Committee: 3rd sitting (Hansard): House of Lords

Debate between Lord Whitty and Lord Ashton of Hyde
Lord Ashton of Hyde

My Lords, I begin by repeating, almost word for word, the point made by the noble Lord, Lord Kennedy: engaging voters is important in a healthy democracy. In order to do that, political parties, referendum campaigners and candidates will campaign using a variety of communication methods. However, they must comply with the law when doing so, and this includes the proper handling of the personal data they collect and hold.

Noble Lords will be aware that the Information Commissioner recently announced that she was conducting an assessment of the data protection risks arising from the use of data analytics, including for political purposes. She recognises that this is a complex and rapidly evolving area where organisations use a person’s internet or public profile to target communications or messaging. The level of awareness among the public about how data and analytics work and how their personal data is collected, shared and used through such tools is low. What is clear is that these tools have a significant potential impact on an individual’s privacy, and the Government welcome the commissioner’s focus on this issue. It is against this backdrop that we considered the amendments of the noble Lord.

The amendments seek to amend a processing condition relating to political parties in paragraph 17. The current clause permits political parties to process data revealing political opinions, provided that it does not cause substantial damage or substantial distress. This replicates the existing wording in the Data Protection Act 1998. I have said that political campaigning is a vital democratic activity, but it can also generate heated debate. Removal of the word “substantial” could mean that data processing for political purposes which caused even mild offence or irritation would become unlawful. I am sure noble Lords would agree that it is vital that the Bill, while recognising the importance of adequate data protection standards, does not unduly chill such an important aspect of the UK’s democracy. For that reason I ask the noble Lord to withdraw the amendments.

I thank the noble Lord for allowing me to reply later to his list of questions. I found it difficult to copy them down, let alone answer them all, but I take the point. In many instances we are all in the same boat on this, as far as political parties are concerned. I shall of course be happy to meet with him, and I take the point about who should attend. I am not sure it will be next week, when we have two days in Committee, but we will arrange it as soon as possible. I will have to get a big room because my office is too small for all the people who will be coming. I take the points the noble Lord made in his questions and will address them in the meeting.

The noble Baroness, Lady Hamwee, asked whether the Electoral Commission had been consulted. It did not respond to the Government’s call for views which was published earlier this year, and we have not solicited any views explicitly from it beyond that.

The noble Baroness also asked about the provision, acquisition and use of a marked electoral register within paragraph 17 of Schedule 1. As she explained, the marked register shows who has voted at an election but does not show how they voted. As such, it does not record political views and does not contain sensitive data—called special categories of data in the GDPR—and, as the protections for sensitive data in article 9 of the GDPR are not relevant, Schedule 1 does not apply.

Lastly, the noble Baroness asked why Members of the House of Lords are not within the definition of elected representatives. Speaking as an elected Member of the House of Lords—albeit with a fairly small electorate—I am obviously interested in this. I have discovered that none of us, I am afraid, are within the definition of elected representatives in the Bill. We recognise that noble Lords may raise issues on an individual’s behalf. Most issues will not concern sensitive data but, where they do, in most cases we would expect noble Lords to rely on the explicit consent of the person concerned. This arrangement has operated for the past 20 years under the current law, and that is the position at the moment.

I hope I have tackled the specific items relating to the amendments. I accept the points made by the noble Lord, Lord Kennedy, about the electoral issues that need to be raised in general.

Lord Whitty (Lab)

I fully support my noble friend’s assertions and the Minister’s response. It is very important that registered political parties can operate effectively. I wonder whether, in the discussions he is proposing to undertake, the Minister will also address the issue of other organisations and political parties attempting to influence the political process. I do not think I need to spell it out, in view of recent news, but organisations that use social media and are not covered by our electoral law or registered as political parties must not enjoy the same provisions that registered political parties would have under the Bill or my noble friend’s amendments. I wonder if that could be addressed directly in these discussions.

Data Protection Bill [HL]

Debate between Lord Whitty and Lord Ashton of Hyde
Monday 13th November 2017

Lords Chamber
Lord Whitty (Lab)

My Lords, my name is attached to two of these amendments. This is a very difficult subject, in that we are all getting used to algorithmic decisions; not many people call them that, but they in effect decide major issues in people’s lives and entice them into areas they had not previously chosen to enter. Their profile, based on a number of inter-related algorithms, suggests that they may be interested in a particular commercial product or lifestyle move. It is quite difficult for those of my generation to grasp that, and difficult also for the legislative process to grasp it. So some of these amendments go back to first principles. The noble Baroness, Lady Hamwee, said that the issue of human rights trumps everything. Of course, we all agree with that, but human rights do not work unless you have methods of enforcing them.

In other walks of life, there are precedents. You may not be able to identify exactly who took a decision that, for example, women in a workforce should be paid significantly less than men for what were broadly equivalent jobs; it had probably gone on for decades. There was no clear paper trail to establish that discrimination took place but, nevertheless, the outcome was discriminatory. With algorithms, it is clear that some of the outcomes may be discriminatory, but you would not be able to put your finger on why they were discriminatory, let alone who or what decided that that discrimination should take place. Nevertheless, if the outcome is discriminatory, you need a way of redressing it. That is why the amendments to which I have added my name effectively say that the data subject should be made aware of the use being made of their data and that they should have the right of appeal to the Information Commissioner and of redress, as you would in a human-based decision-making process that was obscure in its origin but clear in its outcome. That may be a slightly simplistic way to approach the issue, but it is a logical one that needs to be reflected in the Bill, and I hope that the Government take the amendments seriously.

Lord Ashton of Hyde

My Lords, I thank the noble Lord, Lord Clement-Jones, who introduced this interesting debate; of course, I recognise his authority and his newfound expertise in artificial intelligence from being chairman of the Select Committee on Artificial Intelligence. I am sure that he is an expert anyway, but it will only increase his expertise. I thank other noble Lords for their contributions, which raise important issues about the increasing use of automated decision-making, particularly in the online world. It is a broad category, including everything from personalised music playlists to quotes for home insurance and far beyond that.

The noble Lord, Lord Stevenson, before speaking to his amendments, warned about some of the things that we need to think about. He drew a contrast with human embryology and fertility research and the HFEA, which is not an exact parallel because, of course, in this case the genie is already out of the bottle, whereas there things were prevented from happening at least until the matter had been debated. But I take what the noble Lord said and agree with the issues that he raised. I think that we will discuss in a later group some of the ideas about how we debate those broader issues.

The noble Baroness, Lady Jones, talked about how she hoped that the repressive bits would be removed from the Bill. I did not completely understand her point, as this Bill is actually about giving data subjects increased rights, both in the GDPR and the law enforcement directive. That will take direct effect, but we are also applying those GDPR rights to other areas not subject to EU jurisdiction. I shall come on to her amendment on the Human Rights Act in a minute—but we agree with her that human beings should be involved in significant decisions. That is exactly what the Bill tries to do. We realise that data subjects should have rights when they are confronted by significant decisions made about them by machines.

The Bill recognises the need to ensure that such processing is correctly regulated. That is why it includes safeguards, such as the right to be informed of automated processing as soon as reasonably practicable and the right to challenge an automated decision made by the controller. The noble Lord, Lord Clement-Jones, alluded to some of these things. We believe that Clauses 13, 47, 48, 94 and 95 provide adequate and proportionate safeguards to protect data subjects of all ages, adults as well as children. I can give some more examples, because it is important to recognise data rights. For example, Clause 47 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly and adversely impacts on them, either legally or otherwise, unless required by law. If that decision is required by law, Clause 48 specifies the safeguards that controllers should apply to ensure the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and giving them 21 days within which to ask the controller to reconsider the decision or retake it with human intervention.

I turn to Amendments 74, 134 and 136, proposed by the noble Lord, Lord Clement-Jones, which seek to insert into Parts 2 and 3 of the Bill a definition of the term,

“based solely on automated processing”,

to provide that human intervention must be meaningful. I do not disagree with the meaning of the phrase put forward by the noble Lord. Indeed, I think that that is precisely the meaning that the phrase already has. The test here is the type of processing on which the decision having legal or significant effects is based. Mere human presence or token human involvement will not be enough. The purported human involvement has to be meaningful; it has to address the basis for the decision. If a decision was based solely on automated processing, it could not have had meaningful input from a natural person. On that basis, I am confident that there is no need to amend the Bill to clarify this definition further.

In relation to Amendments 74A and 133A, the intention here seems to be to prevent any automated decision-making that impacts on a child. By and large, the provisions of the GDPR and of the Bill, Clause 8 aside, apply equally to all data subjects, regardless of age. We are not persuaded of the case for different treatment here. The important point is that the stringent safeguards in the Bill apply equally to all ages. It seems odd to suggest that the NHS could, at some future point, use automated decision-making, with appropriate safeguards, to decide on the eligibility for a particular vaccine—