The Earl of Lytton (CB)

I realise that, in rising to speak on this particular part of the Bill, I depart slightly from the purpose of the noble Lord, Lord Stevenson—but I thank him for raising the issue all the same.

Of course, we are dealing with the overview of the Bill. The noble Lord, Lord McNally, almost wrote my introduction. What has worried me for some considerable time, notwithstanding the Bill’s provisions allowing data subjects to have errors corrected, is what can enter the data processing function, which is broadly drawn: namely, information that is knowingly false, or is recklessly included in that process, and which can affect the life chances of individuals. We know of significant and high-profile circumstances in which false information has been included and has either affected a significant class of people or seriously damaged the life prospects of individuals.

Given that the collection of data is part of the processing function, it seems to me that very little is being said about responsibility for those sorts of errors—in other words, the things that one could or should have realised were incorrect or where there was a disregard for the norms of checking information before it got into data systems. We heard at Second Reading how difficult it is to excise that information from the system once it has got in there and been round the virtual world of information technology.

Could the noble Lord, Lord Stevenson, or the Minister in replying, say whether there is anything, apart from the Bill (I do not see it there at the moment), that provides some sort of sanction, for want of a better word, against knowingly or recklessly including data that is false, that can be identified with individuals and that can seriously damage their life chances and prospects? That is something that we may need to look at further down the line. If I am speaking in error, I shall stand corrected.

Baroness Hamwee (LD)

My Lords, I say to my noble friend Lord McNally that it is even worse having people say to you, “You’re a lawyer, you must understand this”, when too often you do not.

I have a question for the Minister. Am I right in thinking that the Charter of Fundamental Rights will continue to apply to all member states after Brexit? Is it not the objective that we should be on all fours with them as fellow users of data? If so, and if no provision such as the ones we have been debating is contained in the Bill, how will that affect the adequacy arrangements?

The Earl of Erroll (CB)

My Lords, I want to say a couple of words about privacy. A very important basic point has been raised here. I am not going to argue with lawyers about whether this is the right way in which to do it, but the right to privacy is something about which people feel very strongly, and you will also find that the Open Rights Group and others will be very vociferous and worried about it, as should all of us here. When we go out and do things on the internet, people can form some interesting conclusions just from what we chance to browse on out of interest, if they can record that and find it out. I became very aware of this because I have been chairing a steering group that has been producing, along with the British Standards Institution, a publicly available specification, PAS 1296, on age verification. It is designed to help business and regulators to comply with Part 3 of the Digital Economy Act, which we passed just the other day and which is about protecting children online. The point is to put age verification at the front of every website that could be a problem. We want it to be anonymous, because it is not illegal for an adult to visit sites like that; if such visits were recorded, it could, for people in certain jobs, destroy their careers, so it must be anonymous. So a question arises about whether to put a right to privacy into the specification.

One thing that we have to be very careful about is not to interpret laws or regulations, or to tread on the toes of other standards. Therefore, when this Bill and the GDPR are in force, we must make sure that people processing any of that material keep the data completely secure, or anonymised, or anonymous in the first place. Websites, first of all, should not know the identity of a temporary visitor when that visitor is verified; there are ways of doing that, so that there are rights to privacy. The thing about the right to privacy is that it is a right that you, the individual, should have. The GDPR and this Bill are about how you process data; in other words, they are about what you do with the data once you have it. The legislation builds in lots of safeguards, but there is nothing that says, when you decide what data to keep, that people should have a right to know that it will not be revealed to the world at large.

The question is where we should put it in. People used to think that Article 8 of the European Convention on Human Rights covered them, but I realised just now that it covers only your relationship with Governments. What about your relationship with other corporates, other individuals or ordinary websites? It should cover everybody. So there is an issue here that we should think about. How do we protect ourselves as individuals, and is this the right place to do it? I think that this is probably the only place where we can put something in—but I leave that to the very bright lawyers such as the noble Lord, Lord Pannick, to think about.