Lord Kamall (Con)

My Lords, it is a pleasure to follow the previous speakers, including my noble friend the Minister, the other Front-Benchers and the noble Baroness, Lady Kidron.

I start by thanking the House of Lords Library for its briefing—it was excellent, as usual—and the many organisations that wrote to noble Lords so that we could understand and drill down into some of the difficulties and trade-offs we will have to look at. As with most legislation, we want to get the balance right between, for example, a wonderful environment for commerce and the right to privacy and security. I think that we in this House will be able to tease out some of those issues and, I hope, strike a more appropriate balance.

I refer noble Lords to my interests as set out in the register. They include the fact that I am an unpaid adviser to the Startup Coalition and have worked with a number of think tanks that have written about tech and privacy issues in the past.

When I look at the Bill at this stage, I see bits to be welcomed, bits that need to be clarified and bits that raise concern. I want to touch on a few of them, but I will not drill down into all of them, because other noble Lords have spoken or will speak on them, and we will have ample opportunity for further debate.

I welcome Clause 129, which requires social media companies to retain information linked to a child suicide. However, I understand and share the concern of the noble Baroness, Lady Kidron, that this seems to break a promise: the provision was supposed to cover far more of the data about harms to children and how we can protect our children. In some ways, we must remember the analogy about online spaces: when we were younger, before the online age, our parents were always concerned about us when we went beyond the garden gate; nowadays, we must see the internet and the computers on our mobile devices as that garden gate. When children go through that virtual garden gate into the online world, we must ask whether they are safe, just as my parents worried about us when, as children, we went through our garden gate to go out and play with others.

Clauses 138 to 141, on a national underground asset register, are obviously very sensible; that proposal is probably long overdue. I have questions about the open electoral register, in particular the impact on the direct marketing industry. Once again, we want to get the balance right between commerce and ease of doing business, as my noble friend the Minister said, and the right to privacy.

I have concerns about Clauses 147 and 148 on abolishing the offices of the Biometrics Commissioner and the Surveillance Camera Commissioner. I understand that the responsibilities will be transferred but, thinking about the legislation that we have been debating in this place—such as the Online Safety Act—I wonder about the extent of the powers we are giving to these regulators and whether they will have the bandwidth to exercise them. Is there really a good reason for abolishing these two commissioners?

I share the concerns of the noble Lord, Lord Knight, about access to bank accounts. Surely people should have the right to know why their bank account has been accessed, and some protection so that not just anyone can access it. I know that it is not in fact just anyone, but there are concerns here, and the rules need to be made clearer for people.

I have talked to the direct marketing industry. It sees the open electoral register as a valuable resource for businesses in understanding and targeting customers. However, it tells me that a recent court case between Experian and the ICO has introduced some confusion about the use of the register for business purposes. It is concerned that the Information Commissioner’s Office’s interpretation, requiring notification to every individual for every issue, presents challenges that could cost the industry millions and make the open electoral register unusable for the industry, perhaps pushing businesses to rely more on large tech companies. At the same time, however, I understand that there may well be clear privacy concerns here.

Where there is no harm, I would like to understand the Government’s thinking: is this interpretation going too far, or is some clarification needed in this area? Companies say they will be unable to target prospective customers; some of us may like that, but we should also remember Clause 116 on unlawful direct marketing. The point for many of us is that, while direct marketing is junk when we do not want it, sometimes we do respond to it. I wonder how we get that balance right; I hope we can tease some of that out. If the Government agree with this interpretation and the restrictions on the direct marketing industry, I wonder whether they can explain some of the reasons behind it; there may very well be good ones.

I also want to look at transparency and data usage, not just for AI but more generally. The Government’s own AI White Paper makes clear that they want a pro-innovation approach to regulation, but it also calls for transparency at a number of levels: of datasets and of algorithms. To be honest, even if we are given that transparency, do we have the ability to understand those algorithms and datasets? We still need that transparency, and I am concerned about anything that undermines the principle, particularly the weakening of subject access requests.

I am also interested in companies that have used your data to refuse an application and then tell you that they do not have to explain why they refused it. Perhaps this is too much of a burden on companies, but I wonder whether we have a right to know what data was used when such a decision was made. I will give a personal example. About a year ago, I applied for an account with a very clever online bank and was rejected. It told me I would have a decision within 48 hours; I did not get one. Two weeks later, I got a message on the app that said I had been rejected and that, under the law, the bank did not have to tell me why. I wrote to it and said, “Okay, you don’t have to tell me why, but could you delete all the data you have on me—what I put in?”. It said, “Oh, we don’t have to delete it until a certain time”. If we really own our data, I wonder whether there should be more of an expectation on companies to explain what data and information they used to make those decisions, which can be life changing for many people. We have heard all sorts of stories about access to bank accounts and concerns about digital exclusion.

We really have to think about how much access individuals can have to the data that is used to refuse them, and to the data held on them once they leave a service or stop being a user. I also want to make sure that there is accountability. On Clause 12, I want to know what a “reasonable and proportionate search” means, particularly when data is processed by law enforcement and the intelligence services. I think we need further clarification on some of this for our reassurance.

We also have to recognise that, looking at the online environment of the last 10, 15 or 20 years, at first we were very happy to give our data away to social media companies because we thought we were getting a free service, connecting with friends across the world et cetera. Only later did we realise that the companies were using this data and monetising it for commercial purposes. There is nothing wrong with that in itself, but we have to ask whose data it is. Is it my data? Does the company own it? For those companies that think they own it, why do they think that? We need more accountability, to make sure that we understand which data we own and which we give away. Once again, the same issue arises: you might stop being a user or customer of a service, or you might be rejected, yet the data is still held.

As an academic, I recognise the need for greater access to data, particularly for online research, and I welcome some of the mechanisms in the Online Safety Act that we debated. Does my noble friend the Minister believe that the Bill sufficiently addresses the requirements and incentives for large data holders to make data available for academic research, with all the appropriate safeguards in place? I wonder whether the Minister has looked at some of the proposals to allow this to happen more, perhaps with the Information Commission acting as an intermediary for datasets et cetera. Once again, though, I am concerned about giving even more power to the Information Commission and about whether it will have the bandwidth for all of this, on top of the other powers we are giving it.

On cookie consent, I understand the annoyance of cookie banners. I remember the debates about cookie consent when I was in the European Parliament; at the time, we supported it because we thought it was important for users to be told what was being done with their information. It has become annoying, though, just like the text messages we receive when we go roaming. I supported those during the roaming debates in the European Parliament because I did not want users to say they had not been warned about the cost of roaming. The problem is that such warnings become annoying; people ignore them and tick the box on terms and conditions without having read them because they are too long.

When it comes to cookies, I like the idea of exemptions from prior consent—a certain opt-out where there is no real harm—but I wonder whether it could be extended, for example, so that cookies used to measure the performance and effectiveness of advertising are exempt from the consent requirements. I do not think this would fundamentally change the structure of the Bill, but I wonder whether we have the right balance here between harm, safety and the ability of companies to test the effectiveness of some of their direct marketing. Again, I am interested in the Government’s thinking on the balance between privacy and commerce.

Like other noble Lords, I share concerns about the powers granted to the Secretary of State. I think they lack the necessary scrutiny and safeguards, and that there is a risk of undermining the operations of online content and service providers that rely on these technologies. We need to see some strengthening here and more assurances.

I have one or two other concerns. The Information Commissioner has powers to require people to attend interviews as part of an investigation; that seems rather Big Brother-ish to me, and I am not sure whether the Information Commissioner would even want these powers, though there might be good reasons for them. I just want to understand the Government’s thinking on this.

I know that on Report in the other place, both Dawn Butler MP and David Davis MP raised concerns about retaining the right to use non-digital verification systems. We all welcome verification systems, but the committee I sit on—the Communications and Digital Committee—recently wrote a report on digital exclusion. We are increasingly concerned about digital exclusion and people having a different level of service because they are digitally excluded. I wonder what additional assurances the Minister can give us on some of those issues. The Minister in the other place said:

“Individual choice is integral … digital verification services can be provided only at the request of the individual”.—[Official Report, Commons, 29/11/23; col. 913.]

I think that any further assurance on that point would be really important.

The last point I turn to is EU adequacy. Let me be quite clear: I do not believe in divergence for the sake of divergence, but neither do I believe in convergence or harmonisation for their own sake. We used to have these debates in the European Parliament all the time. Those expressing concerns about EU data adequacy fall into two groups: those who really still wish we were members of the EU, and those for whom that question is irrelevant and whose concern really is the privacy and security of our users. If the EU is raising these issues in its agreements, we can thank it for doing so.

I was, obviously, involved in the debates on the safe harbour and the privacy shield. As noble Lords have said, we thought we had the right answer, and the Commission thought we had the answer, but both were successfully challenged in the courts. I suspect this area will be tested again. Are we diverging just for the sake of divergence, or is there a good reason to diverge here, particularly when concerns have already been raised about security and privacy?

I end by saying that I look forward to the maiden speech of the noble Lord, Lord de Clifford. I thank noble Lords for listening to me, and I look forward to working with noble Lords across the House on some of the issues I have raised.