Data Protection Bill [HL]

2nd reading (Hansard): House of Lords
Tuesday 10th October 2017


Lords Chamber
Second Reading
15:54
Moved by
Lord Ashton of Hyde

That the Bill be now read a second time.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, I am delighted to be moving the Second Reading today and look forward gratefully to the help of my right honourable friend the Minister of State at the Home Office and my noble friends Lady Chisholm and Lady Vere.

New technologies have started innumerable economic revolutions, and the pace of change continues to accelerate. It is 20 years since we passed the last Data Protection Act, and since then we have seen the explosive growth of the world wide web, the rise of social media and faster and faster connectivity, powering new devices like the smartphone. The nature of developing technologies such as artificial intelligence and machine learning suggests that continuing transformation and change is the norm.

This has not escaped the notice of your Lordships’ House. Earlier this year we debated many of these issues in the new Digital Economy Act. We have a new Select Committee to examine artificial intelligence, chaired by the noble Lord, Lord Clement-Jones, who is not able to be in his place today as the committee is hearing evidence this afternoon. In March, the Communications Committee published a timely report on growing up with the internet, and just before the Summer Recess the EU Select Committee gave us a very helpful report on data protection. Just yesterday I moved the Second Reading of the Telecommunications Infrastructure (Relief from Non-Domestic Rates) Bill, which will help pave the way for a full-fibre future and 5G. Personal data is the fuel of all these developments. Data is not just a resource for better marketing, better service and delivery. Data is used to build products themselves. It has become a cliché that data is the new oil.

Twenty years ago data protection rights were used to obtain a copy of your credit record or to find out what information about you a public authority had collected. Today we worry daily about cyberattacks, identity theft and online crime. But we are fortunate that our existing laws have protected us well. For all the technological change I have described, we have successfully preserved our rights and freedoms, and we have strong oversight in the shape of an internationally respected Information Commissioner.

Looking ahead, we have three objectives. First, with all this change we need to maintain trust. Data must be secure, with transparency over how they are used and a proportionate but rigorous enforcement regime in place. Secondly, we must support future trading relationships. The free flow of data across international boundaries, subject to safeguards, must be allowed to continue. Thirdly, we must ensure that we can continue to tackle crime in all its guises and protect national security, making sure that our law enforcement agencies can work in partnership domestically as well as internationally.

The Data Protection Bill meets these objectives. It will empower people to take control of their data, support UK businesses and organisations through the change, ensure that the UK is prepared for the future after we have left the EU, and, most importantly, it will make our data protection laws fit for the digital age in which an ever increasing amount of data is being processed. The Bill meets and exceeds international standards, and, with its complete and comprehensive data protection system, will keep the UK at the front of the pack of modern digital economies.

The Bill makes bespoke provision for data processing in three very different situations: general data processing, which accounts for the vast majority of data processing across all sectors of the economy and the public sector; law enforcement data processing, which allows the effective investigation of crime and operation of the criminal justice system while ensuring that the rights of victims, witnesses and suspects are protected; and intelligence services data processing, which makes bespoke provision for data processed by the three intelligence agencies to protect our national security.

The reform of protections for the processing of general personal data will be of greatest interest to individuals and organisations. We are setting new standards for protecting this data in accordance with the general data protection regulation, known as the GDPR. Individuals will have greater control over and easier access to their data. They will be given new rights and those who control data will be more accountable.

In our manifesto at the general election we committed to provide people with the ability to require major social media platforms to delete information held about them, especially when that information related to their childhood. The new right to be forgotten will allow children to enjoy their childhood without having every personal event, achievement, failure, antic or prank that they posted online digitally recorded for evermore. Of course, as new rights like this are created, the Bill will ensure that they cannot be taken too far. It will ensure that libraries can continue to archive material, that journalists can continue to enjoy the freedoms that we cherish in this country, and that the criminal justice system can continue to keep us safe.

The new right to data portability—also a manifesto commitment—should bring significant economic benefits. This will allow individuals to transfer data from one place to another. When a consumer wants to move to a new energy supplier, they should be able to take their usage history with them rather than guess and pay over the odds. When we do the weekly supermarket shop online, we should be able to move our shopping list electronically. In the digital world that we are building, these are not just nice-to-haves; they are the changes that will drive innovation and quality, and keep our economy competitive.

The Bill will amend our law to bring us these new rights and will support businesses and others through the changes. We want businesses to ensure that their customers and future customers have consented to having their personal data processed, but we also need to ensure that the enormous potential for new data rights and freedoms does not open us up to new threats. Banks must still be allowed to process data to prevent fraud; regulators must still be allowed to process data to investigate malpractice and corruption; sports governing bodies must be allowed to process data to keep the cheats out; and journalists must still be able to investigate scandal and malpractice. The Bill, borrowing heavily from the Data Protection Act that has served us so well, will ensure that essential data processing can continue.

Having modernised our protections for general data, in Part 3 the Bill then updates our data protection laws governing the processing of personal data by the police, prosecutors and other criminal justice agencies. The Bill will strengthen the rights of data subjects while ensuring that criminal justice agencies can continue to use and share data to investigate crime, bring offenders to justice and keep communities safe. The Bill does not just implement the recent directive on law enforcement data protection; it ensures that there is a single domestic and transnational regime for the processing of personal data for law enforcement purposes across the whole of the law enforcement sector.

People will have the right to access information held about them, although there are carefully constructed exemptions to ensure that investigations, prosecutions and public safety are not compromised. People will always have the right to ensure that the data held about them is fair and accurate, and consistent with the data protection principles.

Part 4 protects personal data processed by our intelligence agencies. We live in a time of heightened and unprecedented terrorist threat. We are all grateful for the work done to protect us, especially by those whom we see every day protecting us in this House. The intelligence services already comply with robust data-handling obligations and, under the new Investigatory Powers Act, are subject to careful oversight. My noble friend Lady Williams signed the latest commencement order in August to bring into force provisions relating to the oversight of investigatory powers by the Investigatory Powers Commissioner and the other judicial commissioners.

Data processing by the intelligence agencies requires its own bespoke data protection regime, not least because the GDPR standards were not designed for this kind of processing and data processing for national security purposes is outside the scope of EU law. That is why this part of the Bill will instead be aligned with the internationally recognised data protection standards found in the draft modernised Council of Europe Convention for the Protection of Individuals with Regard to the Processing of Personal Data.

Noble Lords will be familiar with the role of the Information Commissioner, whose role is to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals. The Bill provides for her to continue to provide independent oversight, supervising our systems of data protection, but we are also significantly enhancing her powers. Where the Information Commissioner gives notices to data controllers, she can now secure compliance, with the power to issue substantial administrative penalties of up to 4% of global turnover. Where she finds criminality, she can prosecute.

The Bill modernises many of the offences currently contained in the Data Protection Act, as well as creating two new offences. First, as recommended by Dame Fiona Caldicott, the National Data Guardian for Health and Care, the Bill creates a new offence of the unlawful re-identification of de-identified personal data. To elaborate, huge datasets are used by researchers, as well as by those developing new methods of machine learning, and these are often pseudonymised to protect individual privacy. We need to ensure that those who seek to gain through re-identification are clear that we will not tolerate assaults on individual privacy, nor on the valuable data assets that are fuelling our innovative industries.

Secondly, the Bill creates a new offence of altering or destroying personal data to prevent individuals accessing it. Such an offence is already in place in relation to public authorities, but now it will apply to data controllers more generally. We are equipping the commissioner with the powers to deal with a wider range of offending behaviour.

Cybersecurity is not just a priority for the Government but a deep running concern of this House. Effective data protection relies on organisations adequately protecting their IT systems from malicious interference. Our new data protection law will require organisations that handle personal data to evaluate the risks of processing such data and implement appropriate measures to mitigate those risks. Generally, that means better cybersecurity controls.

Under the new data protection framework, if a data breach risks the rights and freedoms of an individual, data controllers—both for general data and law enforcement purposes—are required to notify the Information Commissioner within 72 hours of the breach taking place. In cases where there is a high risk, businesses must notify the individuals concerned. This landmark change in the law will put the need for serious cybersecurity at the top of every business priority list and ensure that we are safer as a nation.

As we move into the digital world of the future, the Data Protection Bill will both support innovation and provide assurance that our data is safe. It will upgrade our legislation, allowing the UK to maintain the gold standard in this important field. Of critical importance, strong protections of personal data are the key to allowing free flows of data to continue between the EU and UK as we build a new partnership. I look forward to hearing noble Lords’ comments on the Bill. I beg to move.

16:08
Lord Stevenson of Balmacara (Lab)

My Lords, I thank the Minister for his comprehensive introduction to the Bill. I look forward to working with him, in what seems to be a never-ending stream of legislation from the previously rather quiescent DCMS. This is our sixth Bill together, and long may it continue.

The Minister mentioned his talented team joining him on the Front Bench—this is a joint venture between the DCMS and the Home Office. On my side, I am joined by my noble friend Lord Kennedy and supported by my noble friends Lord Griffiths and Lord Grantchester.

I congratulate the Bill team on the excellence of the paperwork that we have received—I am sure everybody has read it, word for word, all the way through; it is worth it. They are obviously ahead early in the “Bill team of the year” stakes, a prize which they won easily last time on the Digital Economy Bill, and they are building on that.

We also welcome the chance to debate the excellent House of Lords EU Committee report, not least because of the substantial weight of evidence that it has brought to this debate, which I will refer to later.

This is a tricky Bill to get hold of, first because of its size and volume. It is a bulky package and it is not even complete because we are told to expect a large number of amendments still being processed and not yet available which may—who knows?—change it substantially. Even without that, it has 300 paragraphs and 18 schedules, one of which helpfully signposts the way that the Government intend to make changes to the Bill so that the GDPR becomes domestic law when we leave the EU, even though the amendments to make that happen will actually be made by secondary legislation. This is “Hamlet” without the prince.

The GDPR itself, which runs to 98 paragraphs—or articles, as it calls them—and which will be the new data-processing law that comes into force in May 2018 whether or not we in Parliament have agreed it, is not actually printed in the Bill. That therefore raises the concern that—post Brexit, courtesy of another, separate Bill, probably by secondary legislation—the regulations will become UK law without ever having been scrutinised by either House of Parliament. I wonder if other noble Lords share my feeling that this is a bad precedent and, if so, what we might do about it. I suspect that this decision might have made sense were we to stay in the EU but we are going to leave, so there is a gap in our procedures here. That is compounded by the fact that this is a Lords starter Bill that comes to us without the benefit of consideration in the other place, and particularly without the usual evidence-taking sessions that ensure that a Bill meets the needs of those affected by it.

I have a suggestion: given the expertise displayed in the EU Committee report HL Paper 7 that we are debating in parallel today, could the authorities arrange for that committee to look carefully at the Bill and at the GDPR in its printed form and arrange for that committee to bring forward either a report or simply a testimony about what the GDPR contains, how it is reflected in the Bill and how it works? It would help the House to do the job that we ought to be doing of scrutinising this legislation. I gather that the committee is due to meet shortly and perhaps the noble Lord, Lord Jay, who speaks in a few minutes, might respond if he can. I am sorry for embarrassing him if he is not prepared for that.

The Government claim that the Bill,

“will bring our data protection laws up to date”,

and,

“ensure that we can remain assured that our data is safe as we move into a future digital world”.

We will probe that rather florid assertion in Committee over the next few weeks, paying particular reference to the needs of business to have certainty about the rules that will be applied in this key sector of our economy in the medium and long term and the need for consumers, particularly vulnerable people and children, to be better supported and protected in this brave new digital world. What we are embarking on here is the precursor to the legislative nightmare that will accompany all our Brexit discussions. As we will hear from the noble Lord, Lord Jay, and others from the EU Committee who considered this, the key issues are what will happen if we leave the Common Market and the customs union, and whether there are any ways in which the Government can secure unhindered and uninterrupted flows of data between the UK and EU post Brexit. The report concludes that,

“any arrangement that resulted in greater friction around data transfers between the UK and the EU post-Brexit could hinder police and security cooperation. It could also present a non-tariff barrier to trade, particularly in services, putting companies operating out of the UK at a competitive disadvantage”. 

In his opening remarks, the Minister said all the right things about the Government’s commitment to unhindered and uninterrupted flows of data post Brexit, but the Bill comprehensively fails to set out how they plan to deliver that outcome. Worse, it may contain measures in Parts 3 and 4 that make it impossible to achieve the “adequacy” agreement, which is the only card that they have left to play post Brexit. You could not make it up.

Some 43% of EU tech companies are based in the UK and 75% of the UK’s data transfers are with EU member states. Even if the Bill successfully aligns UK law with the EU data protection framework as at 25 May 2018, that does not mean that the Bill makes proper provision for the future. On the UK’s exit from the EU, the UK will need to satisfy the European Commission that our legislative framework ensures an “adequate level of protection”, but achieving a positive adequacy decision for the UK is not as uncontentious as the Government think. Under article 45, the GDPR requires the European Commission to consider a wide array of issues such as the rule of law, respect for fundamental rights, and legislation on national security, public security and criminal law when it makes its decision. As has already been pointed out by several commentators, the current surveillance practices of the UK intelligence services may jeopardise a positive adequacy decision, as the UK’s data protection rules do not offer an equivalent standard of protection to that available in the rest of the EU. We will need to pursue this disjuncture in Committee.

The Government seem to have lost sight of the need to ensure continuity during the transition period and afterwards. Surely they must have measures in place to reassure businesses that they will pass the adequacy test and ensure “stability and certainty”, particularly for SMEs, as pointed out by the European Union Committee. If there was any doubt about the importance of this, I draw the attention of your Lordships to a briefing from the ABI which states that the ability to transfer data between firms in different jurisdictions is of particular importance to our insurance and long-term saving providers, who rely on data to provide their customers with the best products at the best price. The association goes on to say that:

“Losing the ability to access, and make use of, European and international data flows risks isolating the UK from the increasingly globalised market. Creating a system where UK insurers have to abide by dual or multiple regulatory systems in order to transfer data internationally will create inefficiencies, legal uncertainty, and risks damaging the global competitiveness of UK insurance”.


My second point was also raised by the European Union Committee. It is about how to establish sustainable longer-term arrangements, about which the Bill is remarkably silent. Even if the UK’s data protection rules are aligned with the EU regime to the maximum extent possible at the point of Brexit, once we leave the EU, policies will be developed within the EU 27 without our input. The EU will inevitably amend or update its rules either by new regulations or by case law derived from ECJ/EU decisions. This is of course a toxic issue for Brexiteers, but it needs to be addressed in the Bill and, no doubt, in many other areas. Perhaps a way forward here would be for the Information Commissioner to have a duty placed on her to make regulations which reflect the changes taking place in the EU, or the Bill could provide for some form of lock-step arrangement under which statutory instruments would be triggered when UK laws need to be amended. We will look at this again in Committee.

I turn now to data protection. Effective, modern data protection laws with robust safeguards are central to securing the public’s trust and confidence in the use of personal information within the digital economy, the delivery of public services and the fight against crime. Ensuring that the public can trust that their data is handled safely, whether in the public or the private sector, is important for everyone. If we cannot get this right in the Bill, people will not benefit to the fullest extent possible from the new data-handling services which are coming on stream now and in the future. We welcome the Government’s decision—a rather surprising one—to gold-plate some of the requirements of the law enforcement directive, particularly the fact that the Bill will ensure that for the first time the data protection regime applies to the intelligence services. Indeed, as the Information Commissioner has observed, including these provisions in a single piece of primary legislation is welcome, although there is a need for much more detail about how this will work in practice.

My point on this is that there seems to be an imbalance in the Bill, with much more consideration being given to the rights of data subjects. At a time of increasing concern about the use and misuse of personal data, is there not a need for a broader and far more ambitious set of regulatory structures for data capitalism, as it is now called? The big tech companies have for far too long got away with the conceit that they are simply neutral platforms. They are not; they are active media and information companies, and their stock market valuations are based on the data flows they generate and how they can be monetised. With that role surely should come broader societal responsibilities, but the Bill does not go into this area at all. There is nothing about regulating fake news, no attempt has been made to ensure that data companies are covered by competition and other regimes which apply to media companies, and there are no proposals to deal with the allegations being made about undue influence by social media companies and others on politics and elections both here and in the US. We will certainly table amendments in this area.

On more concrete issues about the rights of data subjects, we have a number of issues to pursue, although today I shall concentrate on only three: children and the “age of consent”, the rights of data subjects in relation to third-party use of their data, and the proper representation of data subjects. I shall end with some thoughts on the Leveson report and its implications for this Bill.

The Bill proposes to set the age at which children can consent to the processing of their data through “information society services”—which include websites and social media platforms—at 13 years. That is a surprising decision and no credible evidence has been adduced to support it. Understandably, there is much concern about this low age limit, particularly as the general data protection regulation gives discretion in a range up to 16 years of age. Last month, the Children’s Commissioner for England said:

“The social media giants have … not done enough to make children aware of what they are signing up to when they install an app or open an account”.


These are often the first contracts a child signs in their life, yet,

“terms and conditions are impenetrable, even to most adults”.

I think we can all say “Hear, hear” to that. The commissioner also said:

“Children have absolutely no idea that they are giving away the right to privacy or the ownership of their data or the material they post online”.


Setting an age limit of 13, or even 16, would almost certainly be illegal under the UN Convention on the Rights of the Child, to which the UK is a signatory. Perhaps the Government could respond on that point.

The Children’s Society argues that if companies continue to rely on their current practices—whereby they allow only over-13s to have an account but have no age verification process to check that children who are consenting are the age they state themselves to be—then there will continue to be widespread breaches of both the companies’ own rules and this new Data Protection Act. In the Bill, it is unclear how breaches will be handled by the Information Commissioner and what penalties will be put in place for those companies failing to verify age properly.

There is also no consideration in the Bill about capacity, rather than simply age, or protection for vulnerable children. Although there are arguments for setting the age limit higher—or indeed lower—there is surely a need both for proper evidence to be gathered and for a minimum requirement for companies to have robust age verification systems and other safeguards in place before any such legislation is passed. We will pursue that. There is also the question of the overlap this derogation has with the right to be forgotten, which the Minister mentioned. That right kicks in only at age 18; we need to probe why that is the case and how that will work in practice.

During Committee, we want to check that the current rules affecting data subjects’ personal data are unchanged by the new laws. Taking the data of workers and prospective workers as an example, there are concerns about where personal data has been collected: it should be gathered, used and shared by employers only following affirmative, meaningful consent. The recent disgraceful cases of blacklisting come to mind in that respect, and we are also concerned about whistleblowers’ rights. The House has been very strong on that point.

Concern about the increasing use of algorithms and automatic data processing needs to be addressed, perhaps requiring recording, testing and some level of disclosure about the use of algorithms and data analysis, particularly when algorithms might affect employment or are used in a public policy context. Related to that is the question of the restriction on data subjects’ rights in relation to processing data contained in documents relating to criminal investigations. Here, we agree with the Information Commissioner that the provision, as drafted, restricts not just access rights but the right to rectification, the right to erasure and the restriction of processing. We welcome greater clarification on the policy intent behind this as we go into Committee.

We welcome the Government’s proposal for an offence of knowingly or recklessly re-identifying de-identified personal data without the data controller’s consent. The rapid evolution of technology and growth in the digital economy has led to a vast increase in the availability and value of data. There is a clear need for robust safeguards against misuse in this area.

On representation, we welcome the provision in article 80(1) of the GDPR which gives greater ability for civil society and other representative bodies to act on behalf of citizens and mirrors consumer rights in goods and services. However, article 80(2) contains a provision that the Government have chosen not to implement, under which consumer groups that operate in the privacy field can act on behalf of data subjects without a particular complainant. We think that this super-complainant system would help to protect anonymity and create a stronger enforcement framework. We know we are supported in that belief by the Information Commissioner.

The wider question here is perhaps whether data subjects in general, particularly vulnerable ones, have sufficient support in relation to the power of media companies that want to access and use their data. Does any of us know what really happens to our data? The Information Commissioner’s Office already has a huge area of work to cover and may struggle to cover all its new responsibilities. Having a better system for dealing with complaints submitted by civil society bodies may be a good first step, but I wonder whether we might think harder about how this will be organised—perhaps modelled on the Caldicott data guardians.

Finally, there has been a lot of debate since the publication of the Leveson report on the cultural practices and ethics of the press, particularly on the role of a future regulatory framework. There has been far less discussion on Lord Leveson’s recommendations to extend data protection regulation. I reassure the Government that we do not see this Bill as an opportunity to rerun many of the excellent debates or table amendments that we have already considered in your Lordships’ House in recent years. Of course, much remains to be done in this field, and the Government’s lack of action is a national disgrace and a flagrant betrayal of the victims who trusted them and gave them a once-in-a-generation chance to sort out the situation, which they have comprehensively failed to take. However, if amendments of this type come forward, we will consider them on their merits, although a better approach would be for an all-party consensus to try to bridge the gap once and for all between the press and Parliament. I hope to have further discussions on this point.

I give notice that we will table amendments which probe why the Government have decided not to bring forward the Leveson recommendations covering: exemptions from the Data Protection Act 1998, available for investigative newsgathering by journalists; extending the scope for statutory intervention over the press by the Information Commissioner; and changes to the power, structure, functions and duties of the ICO relevant to the press. We will also probe whether the Government intend to implement amendments previously made to Section 55 of the Data Protection Act by virtue of Section 77 of the Criminal Justice and Immigration Act 2008, which would allow terms of imprisonment of up to two years to be imposed for offences of unlawfully obtaining disclosure of personal data. As the Information Commissioner has previously noted, this has much wider application than just to the press, because there is an increasing number of cases of blagging and unauthorised use of personal data which must be stopped.

The Government have set themselves a very tight timetable to pass this Bill into law before the end of April 2018. We will support the main principles of the Bill, but, as indicated above, many areas need to be scrutinised in depth before we can agree to them. I hope that we can gather more evidence and find a way of bringing Hamlet back into the play by looking in detail at the GDPR before it becomes the law of the land. If data is the new oil, we owe it to the country and particularly our children to get this right and to get our laws fit for the digital age.

16:26
Lord McNally (LD)

My Lords, I am delighted to follow the noble Lord, Lord Stevenson, in this debate. I am a little puzzled, because some months ago I took part in a rather emotional debate where we said farewell to him on the Front Bench and, since then, they seem to have been working him harder than ever. As the Minister will already have gathered from his intervention, although he can look to the noble Lord’s support for the Bill, in many parts it will be like Lenin’s support for the social democrats: like a rope supports the hanging man. We will look forward to working with the noble Lord, Lord Stevenson, on many of the points that he has raised, not least on part 2 of Leveson.

I open this debate for the Liberal Democrats because, as the Minister has already explained, my noble friend Lord Clement-Jones is chairing the Committee on Artificial Intelligence this afternoon. He will return to the fray later in the Bill’s passage to do a lot of the heavy lifting with my noble friend Lord Paddick.

While wishing the Bill well, our approach will be to try to ensure that individuals have, to the maximum extent possible, control of their own data and that data are used responsibly and ethically by individuals and by both public and private bodies. This will be of particular concern in law enforcement areas where, for example, the use of algorithms throws up concerns about profiling and related matters.

It is clear that the Brexit decision and timetable will cast a long shadow as we debate the Bill. The Information Commissioner, Elizabeth Denham, has already warned that data adequacy status with the EU will be difficult to achieve within the Government’s Brexit timetable and a major obstacle has been erected by the Government themselves. The European Union (Withdrawal) Bill makes it clear that the EU Charter of Fundamental Rights will not become part of UK law as part of the replication process, yet Article 8 of the charter relating to personal data underpins the GDPR. How then will we secure adequacy without adhering to the charter?

As the noble Lord, Lord Stevenson, indicated, there are many other issues relating to the GDPR and Brexit, particularly the need to examine and test the derogations in the Bill, which I am sure will be raised by colleagues and others and which we will probe further in Committee.

While referring to the Information Commissioner, I put on record our view that the Information Commissioner’s Office must continue to be adequately funded and staffed during this period of great uncertainty. The biggest change since our debates on the Data Protection Act 1998, or even the early stages of the GDPR, which I was involved in as a Minister at the MoJ from 2010 to 2013, is that the threat to civil liberties and personal freedoms now comes not only from agencies of the state but from corporate power as well.

A week today, on 17 October, the Royal Society of Arts will host a discussion entitled “The Existential Threat of Big Tech”. The promotion for this event says:

“The early 21st century has seen a revolution in terms of who controls knowledge and information. This rapid change has profound consequences for the way we think. Within a few short decades the world has rushed to embrace the products and services of four giant corporations: Amazon, Facebook, Apple and Google. But at what cost?”.


That question prompts an even more fundamental question. We have become accustomed to the idea that some financial institutions are too big to fail. Are we approaching a situation where these global tech giants are too big to regulate? As a parliamentarian and democrat, every fibre of my being tells me that that cannot be so. We have to devise legislation and have the political courage to bring the global tech giants within the compass of the rule of law, not least in their roles as media operators, as the noble Lord, Lord Stevenson, indicated.

These modern tech giants operate in a world where the sense of privacy which was almost part of the DNA of my own and my parents’ generation is ignored with gay abandon by a generation quite willing to trade their privacy for the benefits, material and social, that the new technology provides. That is why we are so indebted to the noble Baroness, Lady Lane-Fox. Her speech in the debate she initiated in this House on 7 September is required reading in approaching the Bill. That speech contains her oft-repeated warning about sleepwalking to digital disaster, but it also robustly champions the opportunities open to a digitally literate society. I know that she will have an ally in my noble friend Lord Storey in championing better and earlier digital education in schools. The noble Lord, Lord Puttnam, recently pointed out that Ofcom already has a statutory duty to promote digital education. It will be interesting to learn how Ofcom intends to fulfil that obligation.

The elephant in the room in discussing a Bill such as this is always how we get the balance right between protecting the freedoms and civil liberties that underpin our functioning liberal democracy while protecting that democracy from the various threats to our safety and well-being. The sophisticated use of new technologies by terrorist groups and organised crime means that we have to make a sober assessment of exactly what powers our police and security services need to combat the terrorist attack and disrupt the drug or people trafficker or the money launderer. The fact that those threats are often overlapping and interconnected makes granting powers and achieving appropriate checks and balances ever more difficult.

On the issue of crime fighting, I recently attended a conference in the Guildhall, sponsored by the City of London Corporation, the Atlantic Council and Thomson Reuters. Its title was “Big Data: A Twenty-First Century Arms Race”. It could have been called “Apocalypse Now”, as the threats to business, the state and the individual, from existing technologies and from those fast approaching, were outlined and identified. I was encouraged that there seemed to be an appetite in the private sector to co-operate with the police and government to ensure that big data can be effectively tamed to ensure better compliance, improve monitoring and reporting and prevent illicit financial flows. I will be interested to know whether the Government have a similar appetite for public/private co-operation in this area.

One point was made with particular vigour by Thomson Reuters. With offerings such as World-Check, it plays a key role in Europe and globally in helping many private sector firms and public authorities identify potential risks in their supply chains, customers and business relationships. It made it clear that it will need a number of clarifications in the Bill so that it will be able to continue to provide its important services, and we will probe those concerns and the concerns of others in the private sector in Committee.

In Committee we will also seek to raise concerns brought to us by Imperial College London and others about the efficacy of Clause 162 on the re-identification of de-identified personal data. We will need to probe whether the clause is the best way of dealing with the problem it seeks to address. I notice that the noble Lord, Lord Stevenson, gave it his approval, as did the Information Commissioner, but it is a legitimate question.

There is no doubt that the greater transparency and availability of data provided by government has contributed to citizens’ better understanding of and access to government information and services, but public concerns remain about the use of data in certain sectors. For example, although there are clear benefits to medical research from giving researchers access to anonymised medical data, it remains a matter of concern to the public, the media and the profession itself. Your Lordships will have received a briefing from the BMA on the matter and I am sure probing amendments will be required in Committee.

I am by nature an optimist, so I believe the noble Baroness, Lady Lane-Fox, when she tells us, as she did in this House a month ago, that,

“we can harness the power of these technologies to address the other great challenges we face”.—[Official Report, 7/9/17; col. 2110.]

In my youth I read Robert Tressell’s The Ragged Trousered Philanthropists, a parable about how working men were complicit in their own exploitation. We are in danger of becoming the 21st century’s ragged trousered philanthropists if we do not have a framework of law by which we can constrain big data from misusing the information we so profligately provide every day in every way.

I do not believe that sprinkling Bills with Henry VIII clauses is an answer to the challenge of future-proofing. Perhaps there is a case for expanding the remit of the National Data Guardian to act as an early warning system on wider data abuse—or that of the Information Commissioner or our own Select Committee—but there is a need. I fear that without some permanent mechanism in place, we will be for ever running up the down escalator trying to match legal protections to technical capacity. But that is no excuse for not trying to improve the Bill before us. We will work with others so to do. Looking at the speaking list, the Minister is not going to be short of good and expert advice on how to do that.

16:37
Lord Jay of Ewelme (CB)

My Lords, it is always a pleasure to follow the noble Lord, Lord McNally. It is always a good thing when one optimist follows another. As chairman of the EU Home Affairs Sub-Committee, I will speak mainly about the EU Committee’s report on the EU data protection package, which we are debating alongside the Second Reading of the Data Protection Bill.

I understand that it is an unusual procedure to debate a committee report alongside a Bill but I believe that it makes sense on this occasion. As the noble Lord, Lord Stevenson, said, the committee meets shortly—indeed, tomorrow—and I am sure it will consider his proposal, but taking into account how that would fit in with the traditional role of the committee and the programme we already have before us, I am sure the noble Lord will forgive me if I do not go further than that at this stage. We have not yet received a response to our report from the Government, which we await with keen anticipation, but we are pleased that this Second Reading debate has given us an opportunity to bring the EU Committee’s findings to the attention of the House.

In their recent Brexit position paper, The Exchange and Protection of Personal Data—A Future Partnership Paper, the Government said that they wanted to maintain free and uninterrupted data flows with the EU after we leave; and in proposing a new security and criminal justice treaty between the UK and the EU in her recent Florence speech, the Prime Minister laid out her ambition for a model underpinned by, among other things, high standards of data protection. Our report supports this objective: free and uninterrupted data flows matter to us all. But the committee was struck by the absence of clear and concrete proposals for how the Government plan to deliver that objective. The stakes are high, not least because the introduction of greater friction in data transfers could present a real barrier to future trade. It is hard to overstate the importance of cross-border data flows to the UK economy. Getting on for half of all large EU digital companies are based in the UK, and three-quarters of the UK’s cross-border data flows are with EU countries. What is more, any impediments to data flows following our withdrawal from the EU could seriously hinder police and security co-operation, and that means that lives, not just money, are at stake.

In our report, we considered four elements of the EU’s data protection package: the general data protection regulation—the GDPR—which the Data Protection Bill seeks to transpose into UK law; the police and criminal justice directive; the EU-US privacy shield; and the EU-US umbrella agreement. Both the regulation and the directive will enter into force in May 2018, while we are still a member of the EU. The agreements with the US are already in force, but will cease to apply to the UK after our withdrawal. Our report considers the Government’s policy options both short and long term.

The committee wanted first to look at possible data protection arrangements once the UK becomes a third country outside the EU, and we heard evidence on two broad options. The first option is for the UK Government to secure a so-called adequacy decision from the European Commission which would certify that the UK offered a standard of protection that was “essentially equivalent” to EU data protection standards. To date, the Commission has adopted 12 such decisions. The second option would be for individual data controllers and processors to adopt their own safeguards using tools such as standard contractual clauses and binding corporate rules. Our report comes to a clear conclusion that this second option would be less effective. The tools available to individual data controllers, including small businesses, are bureaucratic and would be vulnerable to legal challenges. We therefore agree with the Information Commissioner that the Government should seek an adequacy decision for the UK as a whole. This should offer certainty for businesses, particularly SMEs. It would also follow the approach taken by Switzerland, which has secured an adequacy decision from the EU. I am therefore pleased that the Government’s position paper also calls for a future relationship that builds on the adequacy model.

But there is a fly in this particular ointment. The general data protection regulation only provides for adequacy decisions for third countries, not countries leaving the EU. Decisions also follow a lengthy procedure, so the chances of having an adequacy decision in place by March 2019 are small. So to avoid a cliff edge, we will need transitional arrangements. The Government’s position paper acknowledges this but lacks detail. I hope that in responding to this debate the Minister will update us on the Government’s thinking on transition and perhaps provide some more of that detail. In particular, I hope that as a Home Office Minister she can comment on the risks facing law enforcement. One of the most striking findings in our inquiry was that as a third country the UK could find itself held to higher standards of data protection than as a member state. This will be the case both when the European Commission considers an adequacy decision and when the UK’s data retention and surveillance regime is tested before the Court of Justice, at which point we will no longer be able to rely on the national security exemption enjoyed by member states under the EU treaties. The United States has fallen foul of EU data protection law in the past, and it is not impossible that the United Kingdom will do the same when it is no longer a member state.

On a related theme, the committee also considered whether the UK’s data protection regime would continue to be influenced by EU legislation after withdrawal. What we found was that the general data protection regulation will continue to apply to transfers of personal data from the EU to the UK, significantly affecting UK businesses that handle EU data. If we obtain an adequacy decision, the rulings of the new European Data Protection Board and the Court of Justice will have an effect, albeit indirectly, by altering the standards that the UK will need to maintain an adequate level of protection. This means that there will be no clean break. We will also continue to be affected by EU rules on the onward transfer of personal data to third countries. This could be a particular problem in the field of security, whereby our approach to sharing personal data with, say, the United States could put any adequacy decision at risk. In summary, it seems likely that EU and UK data protection practices will need to remain aligned long after we leave the EU.

The Bill that we are debating today reflects a comprehensive EU data protection regime which has been heavily influenced over the years by the United Kingdom. Withdrawal from the EU means that we stand to lose the institutional platform from which we have exercised that influence. The committee’s report therefore concludes that the Government must aim to retain the UK’s influence wherever possible, starting by securing a continuing role for the Information Commissioner’s Office on the European Data Protection Board. I am glad that the Government’s data protection position paper spells out our aim to do just that, but in the longer term, the Government will also need to find a way to work in partnership with the EU to influence the development of data protection standards at both the EU and the global level. The continued success of our commercial and security relations with the EU will depend on that.

16:47
The Lord Bishop of Chelmsford

My Lords, I thank the noble Lord, Lord Jay, for enabling us to discuss the EU data protection package alongside the Data Protection Bill, but I will address my comments to the Bill.

Although I also welcome the rights and protections for children that the Bill offers, not least the right to be forgotten, there is one very important point of detail where reconsideration is urgently needed, which has already been mentioned by the noble Lord, Lord Stevenson, namely the age of consent for children to give their personal information away online in exchange for products and services without a parent or guardian needing to give their permission. The proposals in Clause 8, as we have already heard, set this age of consent at 13. However, a recent YouGov survey of the public commissioned by the BCS, the Chartered Institute for IT, shows very little support for this. Indeed, a whopping majority of 81% thought the age should be set at either 16 or 18. The Bill’s Explanatory Notes state that the Government have chosen this age—the youngest possible allowed under the incoming GDPR rules—because it is,

“in line with the minimum age set as a matter of contract by some of the most popular information society services which currently offer services to children (e.g. Facebook, Whatsapp, Instagram)”.

In other words, a de facto standard age of consent for children providing their personal information online has emerged, and that age has been set by the very companies that profit from providing these services to children. It might be that 13 is an appropriate age for consent by children to give their information away online, but surely that should be decided in other ways and with much greater reference to the public, and I do not think this has happened. It is certainly at odds with the results of this recent survey.

Moreover, Growing Up with the Internet, the recently published report of the Select Committee on Communications, on which I am privileged to serve, examined the different ways in which children use the internet through the different stages of childhood. We received lots of evidence that lumping together all young people between the ages of 13 and 18 was really not helpful, and that much more research was needed. To bow to the commercial interests of Facebook and others therefore feels at the very least premature, and the example of its usefulness given in the Explanatory Notes—that this would somehow ease access to,

“educational websites and research resources”,

so that children could “complete their homework”—somewhat naïve, particularly in the light of other conclusions and recommendations from the Growing Up with the Internet report, not least that digital literacy, alongside reading, writing and arithmetic, should be considered a “fourth R”; that the Government should establish the post of a children’s digital champion at the centre of government; that children must be treated online with the same rights, respect and care that has been established through regulation offline; and that all too often commercial considerations seem to be put first. So 13 might be the right age but it might not, and at the very least, further consultation with the public and with parents is needed.

16:52
Baroness Neville-Jones (Con)

My Lords, it is a great pleasure to follow the right reverend Prelate, who has touched on one of the points that have attracted most attention since the Bill was published and began to generate comment. I also hope that the committee of the noble Lord, Lord Jay, might be able to give us some kind of report and assessment on GDPR because, while I think the Bill is important in its own right, it is quite awkward to discuss it in the absence of a very important part of the regulations that will apply in this country or any assessment of the linkages or potential disparities that may exist between the two. I beg that the committee might consider this a priority.

I think the House will agree that this is an important piece of legislation, and its scope is—necessarily, I think—very large. There is no real activity in society these days that does not generate data that is processed in some way. Because of the scale of data creation—the figures are extraordinary—usage continues to grow exponentially and personal data is extremely bound up in all that. All of us are affected by the data world. It is increasingly obvious that the functioning of the economy and of public services depends on the availability, accuracy and security of data. It is also key to wealth creation. It has become very clear in the series of strategies that the Government are producing at the moment that data lies absolutely at the heart of the way in which this country will be able to make its way forward and remain a prosperous society, and therefore that we have to get the regulation of data right. It is the basis on which we will advance general knowledge and welfare in society.

The Government have produced a Bill that enables us to tackle detail, and it is the detail on which this House will focus in later stages. It is impossible in a discussion of this kind to do justice to all the angles. I shall in later stages want to focus on the cyber and national security elements, but today I shall focus on what I regard as a potential opportunity, provided we get the regulatory framework right. That is research, which has not featured much so far in our deliberations.

The abundance of datasets that society simply has not had before opens up to us the possibility of types of research which can lead us to enormous discovery and greater beneficial activity and welfare. For instance, it will enable medicine to be put on an essentially personalised rather than generic basis, and the UK should have a huge advantage in the longitudinal data that the NHS possesses, which no other country can rival. It ought to be something where we can make a real pitch for both advancing welfare and increasing wisdom, knowledge and wealth in our society. Obviously, that depends on the use of data being proper and the regulation of it not getting in the way, which is not a theoretical issue. Existing legislation, which comes largely from the EU, combined with the way in which the precautionary principle has sometimes been applied, means that some kinds of trials in some fields in this country have now become so difficult to conduct within the EU that companies engaging in them have decamped elsewhere—often to the United States—to the intellectual and commercial impoverishment of Europe. That is a practical illustration of how important it is to get the balance between trying to regulate against abuse and the opportunities that you should leave open.

As the UK leaves the EU, it will be essential—I use the word “essential”—for the UK to be able to demonstrate adequacy. I hope the Government will assure us on that point and produce the necessary regulatory framework to enable it to happen. Some very big issues here have already been mentioned and I will not repeat them. Adequacy does not mean that the UK should simply cut and paste all EU legal provisions where reliance on national law and derogations are real options in front of us. There are some where we should be availing ourselves of them. Nor do we need to make privacy safeguards—which are very important—so demanding that they become self-defeating, standing in the way of benefiting patients, in the case of medicine, and the community more generally.

The Government have made it clear that they want the Bill to support research, which is extraordinarily welcome. I hope that when she replies, the Minister will be able to say something about how the Government will approach the changes that will be needed to deal with research issues in the UK. The Bill classes universities as public bodies, and universities lie at the core of the research community. It is fair enough for universities to be classed as public bodies—that is what they are—but the legislation then denies them the right to invoke public interest, or even legitimate interest, as a basis for their research, and thus obliges them to seek explicit consent when using data at every stage of processing. This becomes very onerous if you are doing a long study. That may on the face of it seem reasonable but, in practice, it can do real harm. The whole point of research is that often at the outset one cannot be 100% certain where it may lead or whether further processing or trials may be necessary. You can get a situation in which unexpected and unplanned-for avenues of research open up and could yield real dividends. That is especially true of interventional research. If, as a result of wanting to take it to a further stage, the data processing demands that there should be another round of explicit consent, you get into a situation whereby universities—unlike some of the public bodies in government, which do not have to follow this procedure—have to go round again to all those who offered their personal data in the first place. Seeking the consent of holders of the data anew may simply not be possible, especially in long-term research projects. People move house or become incapable; they also die.

Even if those problems can be overcome—and I think they are real—there is a question of proportionality. Why make consent so onerous that it makes research too difficult in practice and too costly to engage in? There needs to be greater proportionality on this issue and greater alignment between the various bodies that use data in this way, and there needs to be some alternative to consent as the basis for engaging in some kinds of research. Numerous government mechanisms are available, not least ethics committees, which are a key component of modern research and could provide the necessary safeguards against abuse. I recognise that there need to be safeguards, but I suggest that we should use some imagination in how they could be brought about.

In this country, we are very rich in research conducted by voluntary, not-for-profit and charitable bodies. They often supplement what the public sector and universities are unable or unwilling to do, but they do not find a place in this legislation, which posits that all research of value is conducted by “professional bodies”—a definition that excludes many organisations doing valuable work under the terms of the existing law. That law is to be tightened up, which may create difficulties. I am associated with one such organisation, and I want to give a tiny illustration of the problems that arise as a result of being outside the field of professional bodies.

I am involved with an organisation called Unique, which deals with rare genetic disorders, whereby datasets, to be useful, have to be gathered globally. The number of people with those afflictions is so tiny in any given population that you have to go across the globe to connect useful datasets, which means in turn that you come up against some of the provisions that govern transnational transmission of data. However, the rarity of such individual disorders also makes every patient’s data precious to other affected individuals, because it is potentially a very tight community. No other organisation is dealing with that affliction in that way, and Unique can give support and advice to otherwise lonely parents and their equally isolated medics, who turn to Unique for information about like cases. There is a network there.

By insisting on onerous consent regimes, we are in danger of disabling such organisations from continuing their pioneering work. In Unique, it is not uncommon for parents who have not been in touch for a long time suddenly to turn to it with a request for help. Try telling families, many of whom are not in the UK but are in third countries, who are coping with the daily stress of caring for a disabled child or adult, that they must be sure to keep up online with the stringent requirements of UK data legislation and that failing to do so will mean that they run the severe risk of no longer being able to get the kind of individualised attention and support that they seek from the very organisations set up to help them. The problem is that the law will lay down the need for the regular reconsultation and re-consent of individuals in very precise ways, and that such individuals might not reply, not understanding the potential hazards involved in failing to do so. One might say that data anonymisation might solve the problem. It solves some problems, but it creates new ones in an organisation set up for certain purposes where the idea is that one fellow sufferer can help another. So piling difficulties on small organisations—there are other difficulties that I have not even mentioned—might lead ultimately to an unwanted outcome, which will be a reduction in effectiveness.

I am not pleading for essential provisions on privacy to be disregarded. That would not be a sensible plea. However, I suggest that we are still in the foothills of the data-driven world and, while it is right to demand rigorous standards and strict enforcement, that is not the same as passing narrow and inflexible legislation that will have unwanted and unnecessary side-effects. The research base of this country needs a wider base for lawful consent and this legislation should recognise that not all valuable research fits into normal categories. I would like the Government to think about the possibility that they should allow for the creation of governance and accountability regimes that will fit special circumstances—and I am sure that we will come across others as we go through this legislation. The existence of the Information Commissioner should not result just in enforcing the law effectively and well; it should provide an opportunity for creativity under her auspices and the ability to create variations on governance regimes where they are needed.

17:06
Baroness Ludford (LD)

My Lords, I welcome the modernisation of data protection law that the Bill represents and the intention to comply with EU law in the regulation and directive—which of course we must do while we are still in the EU. I am particularly concerned with the future and the prospects for an adequacy decision from the Commission if we find ourselves outside both the EU and the EEA. A failure to get such a decision would be extremely harmful for both businesses and other organisations and for law enforcement.

I will look briefly at the past. In 2013 in the European Parliament I was one of the lead MEPs establishing the Parliament’s position on the regulation. I believe that we did a decent job—that was before the negotiations with the Council, which watered it down somewhat. The Government rightly acknowledge that the new system will build accountability with less bureaucracy, alleviating administrative and financial burdens while holding data controllers more accountable for data being processed—backed up by the possibility of remedies for abuse including notable fines. But the purpose is to provide incentives to build in privacy from the beginning through such instruments as data protection impact assessments and having a data protection officer, through data protection by design and default—thereby avoiding getting to the point of redress being necessary. As an aside, the routine registration with the Information Commissioner’s Office will be abolished, and I am not aware of how the ICO will be funded in future, because that was a revenue stream.

I will say briefly that the new rights that are in the regulation include tougher rules on consent, so we should see the end of default opt-ins or pre-selected tick boxes. That will probably be one of the most visible things for consumers; I hope that it does not become like the cookies directive, which has become a bit of a joke. The need for explicit consent for processing sensitive data is important, as is the tightening of conditions for invoking legitimate interests.

There are several matters which will give improved control over one’s own data, which is very important. There is also the right to be told if your data has been hacked or lost—so-called data breach notification—and a strengthened ability to take legal action to enforce rights. All these are considerable improvements. However, I am rather concerned about the clarity of this very substantial Bill. It is explained that the format is chosen to provide continuity with the Data Protection Act 1998, but whether or not as a result of this innocent, no doubt valuable, choice, it seems to me that some confusion is thereby created.

First, there is the fact that the GDPR is the elephant in the room—unseen and yet the main show in town. You could call it Macavity the cat. The noble Lord, Lord Stevenson, dubbed the Bill Hamlet without the Prince. Traces exist without the GDPR being visible. Is the consequent cross-referencing to an absent document the best that can be done? I realise that there are constraints while we are in the EU, but it detracts from the aims of simplicity and coherence. Apparently, things are predicted to be simpler post Brexit, at least in this regard, when the GDPR will be incorporated into domestic law under the withdrawal Bill in a “single domestic legal basis”, according to the Explanatory Memorandum. Does that mean that this Bill—by then it will be an Act—will be amended to incorporate the regulation? It seems odd to have more clarity post Brexit than pre-Brexit. It would no doubt be totally unfair to suggest any smoke-and-mirrors exercise to confuse the fact of the centrality of EU law now and in the future.

Secondly, we seem to have some verbal gymnastics regarding what “apply” means. The departmental briefing says that the Bill will apply GDPR standards, but then we have the so-called “applied GDPR” scheme, which is an extension of the regulation in Part 2, Chapter 3. Can the Minister elaborate on precisely what activities Part 2, Chapter 3 covers? The Bill says that manual unstructured files come within that category. I do not know how “structured” and “unstructured” are defined, but what other data processing activities or sectors are outside the scope of EU law and the regulation, and are they significant enough to justify putting them in a different part?

Looking forward, I want to mention some of what I see as the possible weaknesses in the Bill which might undermine the potential for an adequacy decision for data transfers to the EU and the EEA. The future partnership paper published in August, which has already been mentioned by the noble Lord, Lord Jay, referred to a UK-EU model which could build on the existing adequacy model. Can the Minister explain what that really means? As the noble Lord, Lord Jay, said, while national security is outside EU law, when it comes to assessing the adequacy of our level of data protection as a third country, we could find ourselves held to a higher standard because the factors to be taken into account include the rule of law and respect for human rights, fundamental freedoms and relevant legislation, including concerning public security, defence, national security, criminal law and rules for the onward transfer of personal data to another third country. Therefore, our data retention and surveillance regime, such as the bulk collection of data under the Investigatory Powers Act, will be exposed to full, not partial, assessment by EU authorities. This will include data transfers, for instance to the United States, which I would expect to be very much under the spotlight, and could potentially lead to the same furore as other transatlantic transfers. I lived through a lot of that. I remember that in 2013 there was a lot of flak about the actions of the UK, but nothing could be done about it because we were inside the EU. However, in the future it could be.

There are also a number of aspects in the Bill in which the bespoke standards applied to intelligence agencies are less protective than for general processing, such as data breach reporting and redress for infringement of rights. We will need to give serious thought to the wisdom of these, looking to the future. This will not just be a snapshot on Brexit day or even on future relationship day, because at issue will be how our standards are kept up to scratch with EU ones. The fact that with another part of their brain the Government intend to decline to incorporate the European Charter of Fundamental Rights into UK domestic law, with its Article 8 on data protection, will not help the part of the governmental brain which looks forward to the free flow of data exchange with the EU. Our Government seem to be somewhat at cross purposes on what their future intentions are.

I will highlight, rather at random, some other examples which need reflection. We may need seriously to look at the lack of definition of “substantial public interest” as a basis for processing sensitive data, or even of public interest. I think the noble Lord, Lord Stevenson, mentioned the failure to take up the option under Article 80(2) of the regulation to confer on non-profit organisations the right to take action pursuing infringements with the regulator or court. This omission is rather surprising given that a similar right exists for NGOs, for instance, for breach of other consumer rights, including financial rights. Perhaps the Minister could explain that omission.

There is also concern that the safeguards for profiling and other forms of automated decision-making in the Bill are not strong enough to reflect the provisions of Article 22 of the GDPR. There is no mention of “similar effects” to a legal decision, which is the wording in the regulation, or of remedies such as the right of complaint or judicial redress.

Very significant is the power for the Government under Clause 15 to confer exemptions from the GDPR by regulation rather than put them in primary legislation. That will need to be examined very carefully, not only for domestic reasons but also because it could undermine significantly an adequacy assessment in the future.

I will make one or two points in the health and research area. The Conservative manifesto commitment to,

“put the National Data Guardian for Health and Social Care on a statutory footing”,

is not fulfilled in the Bill; perhaps the Minister could explain why not. I would also expect clarification as the Bill proceeds on whether Clauses 162 and 172 sufficiently protect patients’ rights in the use or abuse of medical records. We know this is a sensitive issue given the history in this area, particularly of care.data and other attempts to inform patients.

As a final point, I am glad that the research community was broadly positive about the compromises reached in the GDPR, although they were less explicit than the Parliament’s position. That leads to some uncertainty. I took note of what the noble Baroness, Lady Neville-Jones, said. Therefore, close examination will be merited of whether the Bill provides a good legal framework with sufficient legal basis for research, which many of us have all sorts of interests in promoting, balanced with a respect for individual rights. I very much hope this will be explored carefully at future stages.

17:18
Lord Patel Portrait Lord Patel (CB)
- Hansard - - - Excerpts

My Lords, many of my comments on the Bill are about data collection, usage and storage, particularly as they apply to research and, in particular, health research. In that respect, I will reference many of the comments on research made by the noble Baroness, Lady Neville-Jones, including health research generally and health research for people with rare conditions and how that data might be collected.

Given the rapid advances of data science and our capacity to collect, process and store vast quantities of data, such as genomic data for individuals, ensuring that data subjects have clear rights regarding how their data is used is vital. The recently published life sciences industrial strategy acknowledges both that fact and the significant potential of the data held within the healthcare system, especially for delivering better care and for the research sector.

The importance of getting the governance of personal data right is increasingly being recognised. The Royal Society and the British Academy recently published a report on data governance, calling for careful stewardship of data to ensure that the power and value of data are harnessed in such a way as to promote better human health and human benefit.

The Government have indicated that they recognise the importance of maintaining data flows across borders post Brexit, and that is positive. For instance, three-quarters of the health-related data flow from the UK is to the EU. As far as research is concerned, the relevant provisions of the Data Protection Bill mirror the GDPR and so should not generate problems for international collaborative research as it stands. However, it is imperative that international research that requires the transfer of personal data can continue without disruption post Brexit, and the example of rare diseases used by the noble Baroness, Lady Neville-Jones, is absolutely appropriate. In such situations, research often has to be co-ordinated and conducted across many countries, as there are few individuals with a particular condition in each country. My noble friend Lord Jay referred to the need for adequacy arrangements, and I think that that applies particularly in this area. Therefore, my question to the Minister is: will the UK, as a third country, seek an adequacy decision from the EU for data transfers in this respect?

I now come to Clause 7, which refers to alternatives to consent. The noble Baroness, Lady Neville-Jones, referred briefly to the problems that arise. For many uses of personal data, explicit consent is absolutely the right legal basis for processing that data, and it is positive that, with the GDPR, data subjects’ rights have been strengthened. Medical research will usually rely on a person providing informed consent for ethical reasons, but it is essential that there are alternatives to consent as a legal basis. That is because GDPR-compliant explicit consent sets a high bar for information provision that it may not always be feasible to meet. In many research resources, such as biobanks—I hope that my noble friend Lady Manningham-Buller will refer to that as the chairman of the Wellcome Trust, which is responsible for initiating the UK Biobank—the participants give consent for their pseudonymised data to be used.

In some studies it is not possible to seek consent, either because a very large sample size is needed to generate a robust result, and that would be practically difficult to obtain, or because seeking consent would introduce bias. The use of personal health data without specific explicit consent is sometimes essential for research for the health of the population. If researchers could not process medical records for research without specific explicit patient consent, they could not run cancer registries, which are extremely important in recording all cases of cancer; they could not monitor the hazards of medical procedures, such as the recently discovered implications of CT scans for long-term disease development; they could not assess the unexpected side-effects of routinely prescribed medicines; and they could not identify sufficiently large numbers of people with a particular disease to invite them to take part in trials for the treatment of that disease. The example I would give is the recruitment of 20,000 suitable people for the Heart Protection Study on statins, which has helped transform medical practice throughout the world. I am sure that many noble Lords use statins. This began with the identification of 400,000 patients with a hospital record of arterial disease, an exercise that could not have been carried out if specific consent had been required from each of them. There are good examples of how this provision would cause a problem as it is enunciated in Clause 7.

We have a well-established, robust system of governance and oversight for non-consensual medical research in the UK; for example, through the Health Research Authority and its Confidentiality Advisory Group, which advises on Section 251 approvals to override the common law duty of confidentiality. Patient groups actively advocated for research exemptions during the passage of the GDPR—for example, through the Data Saves Lives campaign. I hope that, in Committee, we might get an opportunity to explore this further to see whether we can somehow modify the Bill to make this possible.

I come now to the public interest issues in the same clause. I understand that the Government intend the functions listed in Clause 7 not to be exhaustive, and to allow, for example, research conducted by universities or NHS trusts to use the public interest legal basis. Again, the noble Baroness, Lady Neville-Jones, briefly touched on that. It would provide much-needed clarity and assurance to the research community, particularly to those in the universities, if this could be made explicit in the Bill. A huge amount of research will rely on public interest as a legal basis. The Government have recognised the value of making better use of data for research, and the recent life sciences industrial strategy confirms the tremendous potential benefits for patients and the public if we can unlock the value of data held by public authorities and promote its use in the public interest.

There is currently a highly risk-averse culture in data protection, driven in part because people are unclear about the rules and what they can or cannot do with data for their purposes—hence I referred to the need for better governance of the data. This is why the public interest legal basis matters so much for research. The Data Protection Bill is an opportunity to set out very clearly what the legitimate basis for processing personal data can be. Setting out a clear public interest function for research will give researchers confidence to know when they are operating within the law. If necessary, any specification of research in Clause 7 could be qualified by safeguards to ensure that the legal basis is used only when appropriate.

Can the Minister confirm that research conducted by, for example, universities or hospitals could use the public interest legal basis for processing personal data? Again, we may have an opportunity to explore this further in Committee.

I come now briefly to Clause 18 and the issue of safeguards. Where exemptions from data subject rights exist for research, robust safeguards to protect data subjects’ rights and interests are essential. Clause 18 transposes Section 33 of the Data Protection Act into the new Bill, but it will have wider application than it did in the Data Protection Act. Under the Data Protection Bill, all medical research undertaken without consent as the legal basis will be subject to the safeguards of Clause 18. Clause 18 prohibits the processing of personal data to support measures or decisions with respect to particular individuals. This is clearly problematic for any research that involves an intervention for an individual, which forms the bedrock of our understanding of a vast range of treatment for diseases.

Let me give the House some brief examples. Clinical trials and other interventional research will be undertaken with the consent of patients, which is ethically essential. However, the standard of consent may not be GDPR compliant as it is not always possible to specify how the data might be used beyond the purpose of the trial itself. Consent is therefore not the appropriate legal basis for much interventional research. This means that the safeguards built into the Data Protection Bill for processing for research purposes will apply. Clause 18 should not apply to interventional research. That research requires the processing of personal data to make decisions about the data subject as that is part of the necessary research design and oversight. If researchers cannot process data in that way, they will not be able to process information about a patient’s condition to assess whether they are eligible to participate in a clinical trial. They will not be able to process information about a patient’s condition to determine to which arm of the trial they should be allocated. They will not be able to remove individuals from a clinical trial if evidence arises of potential adverse effects during the course of the trial. There are significant implications.

A potential solution to this problem would be to modify Clause 18 to exempt research that has been approved by an ethics committee or some other such established safeguard. Implementation of the GDPR through the Data Protection Bill is an opportunity to provide clarity for researchers about the legal basis for processing personal data and the requirements of accountability, transparency and safeguards. At present, there is a great deal of conflicting advice about the implications of the GDPR and there is a risk that organisations will adopt an unnecessarily conservative approach to data protection for fear of committing breaches.

I should like to make two minor points. The Government have committed themselves in their response to Caldicott 3 to putting the National Data Guardian on a statutory footing by 2019. Do the Government intend to table an amendment to do that in this Bill? If they do not, the opportunity will be lost.

Lastly, the noble Lord, Lord Stevenson of Balmacara, mentioned the age of consent for children. The age of 13 seems a ridiculously low age for consent and I would support any amendments that he might introduce.

17:31
Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to follow the noble Lord and listen to his important comments on health data and particularly consent. I thought how brave he was with his data machine. I would worry that my pearls of wisdom would disappear somewhere into the ether, but luckily that did not happen to him.

This is a welcome and necessary Bill. It is not perfect, but I leap to its defence in at least one respect—namely, the absence of the text of the GDPR itself from the Bill. On the Government’s website, there is a truly helpful document, the Keeling schedule, which sets out how the GDPR intersects with the text of this Bill. After noble Lords have read it a few times, it comes close to being comprehensible.

I will touch on one or two of the imperfections of the Bill that have been drawn to noble Lords’ attention by bodies such as ISACA, techUK, Citibank, Imperial College and others, and I am grateful to them for doing that. I declare my interest as chairman of the Information Assurance Advisory Council and my other interests as in the register. While the Bill has its flaws, I am sure that in Committee and on Report we shall be able to see whether improvements might be made.

The Commission says that the aim of the new rules is to,

“give citizens back control over their personal data, and to simplify the regulatory environment for business”.

The Commission has estimated that this would lead to savings of around €2.3 billion a year for businesses. But while the rules might make things simpler for businesses in that respect, it is possible that they will also make it easier for citizens to demand to know what information is held on them in paper form as well as in digital form. In fact, that is one of the main purposes of the Bill. So we might find that businesses have more rather than less to do. I wonder whether that has been costed. It is a good thing that citizens should find out what information people hold on them, but we should not pretend that the exercise will be free of cost to businesses. The Federation of Small Businesses estimates an additional cost of £75,000 per year for small businesses, and obviously much more for larger ones.

The Bill contains a bespoke regime for the processing of personal data by the police, prosecutors and other criminal justice agencies for law enforcement purposes. The aim of this, which is laudable, is to,

“ensure that there is a single domestic and trans-national regime for the processing of personal data for law enforcement purposes across the whole of the law enforcement sector”,

but what is the law enforcement sector? To what extent do banks, for example, fall into the law enforcement sector? They have obligations under the anti-money laundering rules to pull suspicions together and to share those across borders—not just across European borders but globally. How are those obligations tied in with the GDPR obligations in the Bill? Businesses, especially banks, will need to understand the interplay between the GDPR regulations, the anti-money laundering regulations and all of the others. The Government would not, I know, want to create the smallest risk that by obeying one set of laws you disobey another.

That sort of legal understanding and pulling things together will take time. It will take money and training for all organisations. There is a real concern that too many organisations are simply hoping for the best and thinking that they will muddle through if they behave sensibly. But that is not behaving sensibly. They need to start now if they have not started already. The Federation of Small Businesses says that:

“For almost all smaller firms, the scope of the changes have not even registered on their radar. They simply aren’t aware of what they will need to do”.

Yet it goes on to say that,

“full guidance for businesses will not be available until next year, potentially as late as spring. The regulator cannot issue their guidance until the European Data Protection Board issue theirs”,

so there is a lot of work to be done.

I shall touch on three other issues at this stage of the Bill. The first is Clause 15, which would allow the alteration of the application of the GDPR by regulations subject to affirmative resolution; that could include the amendment or repeal of any of the derogations contained in the Bill. I share the concern expressed by the noble Baroness, Lady Ludford, on that, and we will need to look at it.

Secondly, there are various issues around consent. The only one that I will mention is that the Bill provides that the age of consent for children using information society services should be 13. The right reverend Prelate the Bishop of Chelmsford mentioned the YouGov survey about that. I actually believe that the Government have this right. It recognises the reality of today’s social media and the opportunities that the digital world brings, and the Bill also protects young people to some extent by the right to have information deleted at the age of 18. TechUK agrees and so does the Information Commissioner. But if the public do not—and from the sounds of the YouGov survey they do not—there is a lot of work to be done in explaining to people why the age of 13 is the right one.

There is a technical issue that I simply do not understand. The GDPR rules state that the minimum age a Government can set for such consent is 13, and in this Bill, as we know, the Government have gone for the minimum, except in Scotland. Scotland is dealt with in Clause 187 of the Bill and there it seems that the minimum age is 12, unless I have this completely wrong. How do the Government square that with the GDPR minimum of 13?

My final point echoes one raised by the noble Lord, Lord McNally, relating to the issue of the re-identification of personal data which has been de-identified, as set out in Clause 162. The clause makes it a crime to work out to whom the data refers. The very fact that this clause exists tells us something: namely, that whatever you do online creates some sort of risk. If you think that your data has been anonymised, according to the Computational Privacy Group at Imperial College, you will be wrong. It says:

“We have currently no reason to believe that an efficient enough, yet general, anonymization method will ever exist for high-dimensional data, as all the evidence so far points to the contrary”.

If that is right, and I believe it is, then de-identification does not really exist. And if that is right, what is it in terms of re-identification that we are criminalising under this clause? In a sense, it is an oxymoron which I think needs very careful consideration. The group at Imperial College goes on to suggest that making re-identification a criminal offence would make things worse because those working to anonymise data will feel that they do not have to do a particularly good job. After all, re-identifying it would be a criminal offence, so no one will do it. Unfortunately, in my experience that is not entirely the way the world works.

We can come back to all of these issues in Committee and consider them further, and I look forward to the opportunity of doing so. This is not just a worthwhile Bill; it is an essential and timely one, and I wish it well.

17:41
Baroness Howe of Idlicote Portrait Baroness Howe of Idlicote (CB)
- Hansard - - - Excerpts

My Lords, I have spoken extensively about the imperative to maximise online safety for children and of the need to provide the right tools to empower parents to help keep their children safe online. This will continue to be my priority as we discuss the Data Protection Bill at all its stages. Parents often feel that their children know rather more about accessing the technology than they do, but they still have a role and responsibility to guide their children, and this extends to the topic before us today—the child’s personal data.

During the extensive debates in this House on the Digital Economy Bill, we discussed what young people below the age of 18 should and should not see, and we voted to require a code of practice for the providers of online social media platforms, which is now Section 103 of the Act. In all our discussions about children during those debates, we were referring to individuals under the age of 18, and there was no dispute on the point. I am disappointed that nowhere in the Data Protection Bill’s 208 pages is a child defined as a person under the age of 18.

This Bill puts before us another dividing line between childhood and the influence of parents, the effect of which is nothing if not confusing. Clause 8 states that a child of 13 years can consent to providing data to information services; that is, they can sign up to social media sites and so on. By contrast, the default in the European General Data Protection Regulation is that a child should be 16 years old to be able to give “digital consent”.

The Explanatory Notes state of the age of 13:

“This is in line with the minimum age set as a matter of contract by some of the most popular information society services which currently offer services to children”.

These are contracts driven by decisions under United States federal law in the form of the Children’s Online Privacy Protection Act of 1998. However, the world of technology and what is at our children’s fingertips has changed significantly since 1998. What might have seemed appropriate then is not necessarily appropriate now.

Furthermore, given all the concerns expressed over recent months about the actions of social media sites, the current contracts of these sites should not be driving government policy; rather, the primary factor should be what is best for children and young people, and what is best should be established through a solid evidence base. I hope that the Minister will set out the Government’s evidence-based reasoning for using the age of 13 and tell us what evidence has been collected by the DCMS from children’s charities and those representing parents and others with an interest in these matters.

Choosing the right age for children to consent to signing up to these websites is far from a straightforward issue. I am aware that there is concern among children’s charities that setting the age of digital consent at 16 could lead to an increase in the grooming of young people by abusers, something that none of us in this House would wish to see. The Children’s Society has said that, if Parliament sets the age in Clause 8 at 16, significant changes should be made to the grooming and sexual offences legislation.

I have also received briefing material from BCS, The Chartered Institute for IT, which suggests that there is significant public support for the age being 16 or 18 and very little support for the age being 13. I understand that parents firmly favour the age of 18, so clearly there is a lot of room for discussion, and no doubt we will have it during Committee. In this context, I would like to suggest that the Government should launch an immediate public consultation on this point so that the House can make a fully informed decision before the Bill moves to the other place. Right now, either end of the age spectrum looks like it has dangers.

I also hope that the Minister will set out some clarification of the intentions of the Bill in relation to the consent of children. Paragraph (6) in Clause 8 includes an exemption for “preventive or counselling services”. Does that mean that a child could give their consent to these websites before the age of 13 or not at all? What is defined as a “preventive or counselling service”?

Clause 187 gives further criteria for the consent of children, but only for children in Scotland, where a child’s capacity to exercise their consent should be taken into account, with the expectation that a child aged 12 or over is,

“presumed to be of sufficient age and maturity to have such an understanding”.

The Explanatory Notes to the Bill state that this clause must be read with Clause 8, which provides that the age limit is 13. Is Clause 187 intended to say that the age of digital consent cannot go below 13, which is the position of Article 8(1) of the GDPR, or that there might be circumstances when a child who is 13 cannot consent for genuine reasons? Either of these scenarios seems to give rise to confusion for children, parents and the websites that children access.

After all the detailed discussions about age verification that we had earlier in the year, there is an argument for age verification to apply to Clause 8. How will websites that require a child to verify that they are 13 years old ensure that the child is the age that they say they are without some requirement for the site to prove the age of the child? This is surely a meaningless provision. I hope that when the Minister comes to reply, he will set out the Government’s position on this matter and explain what penalties a website which breaches this age requirement will face.

Finally, I hope that the Minister will give us an update on the publication of the Green Paper on internet safety and how the digital charter that was announced in the Queen’s Speech will play into this Bill during its passage through this House and on to the other place.

17:49
Baroness Jay of Paddington Portrait Baroness Jay of Paddington (Lab)
- Hansard - - - Excerpts

My Lords, it is a pleasure to follow the noble Baroness, Lady Howe, and to recognise her expertise in discussing the issues around children’s protection. I share many of her ideas. I welcome the Bill, and echo other noble Lords in recognising that it has enormous significance and is very timely. I am grateful for the clear explanation of the EU Committee’s report, which showed the complexities of the continuing interrelationships between this country’s legislation and that of Europe and the way in which we will have to deal with that for many years to come.

At this stage, it is worth reminding ourselves—or at least reminding myself—that we are talking about so many areas of our society today and so many aspects of 21st century life which we are aware that not all of us understand. I know there are many experts in this field. I refer in particular to the noble Baroness, Lady Lane-Fox, who will speak after me, when I say that there are people who clearly understand all the implications of the wider digital economy. However, I put myself among the majority of the population when I say that, although I am aware of the vast number of ways in which the digital revolution impacts on and, perhaps somewhat frighteningly, dominates our everyday lives, it is almost impossible for most of us to know how and by whom our personal data is being collected, with whom it is shared and to whom it is probably sold. Therefore, robust protection of privacy and the ethical regulation of data are essential if we are to continue with our democratic principles.

My noble friend on the Front Bench, Lord Stevenson, has already referred to some of the gaps that he sees in this legislation; no doubt those will be returned to at a later stage. I am concerned that the way some of the Bill is drafted already suggests that we are once again moving into an area where the role of this House and the other place is diminished by so much secondary legislation being proposed. I do not apologise for raising yet again, as I have in previous debates, what I see as a paradox: so much of the support for Brexit depended on the restoration of parliamentary sovereignty to Westminster, yet when we come to look at the detail of some of the Bills implementing Brexit—particularly in this kind of complex area—we find that the presentation often rests on secondary legislation, under which the role of this House, particularly in scrutinising and revising, and that of the other place, is somewhat diminished. It seems an extraordinary paradox to me.

Noble Lords have already referred to Clause 15, which is particularly worrying in this area. It would clearly permit alterations by the affirmative resolution procedure. It will be important, when we debate the detail of the Bill, to recognise that professional bodies are already raising that as a concern. As was mentioned briefly by a previous speaker—I think it was the noble Lord, Lord McNally—the British Medical Association has drawn particular attention to the potential problem of regulations being altered in this way. Noble Lords will be aware that the security of sensitive healthcare information is clearly essential to good medical practice. The BMA is now concerned that the centrally important trust in the doctor/patient relationship may be threatened in future if changes in data sharing can be fast-tracked without proper scrutiny through the secondary legislation process. Again, the House will be aware that, as the law stands, healthcare information has special protection through the common-law duty of confidentiality. I hope it will be possible for the Government to assure the House, at the earliest opportunity, that the proposed regulatory powers will not override that crucial safeguard, and in particular that it will continue to exist. It may be possible to give a general assurance on the general procedures on regulation.

I turn to some of the questions which arise from what I describe as general ignorance about the uses and abuses of personal data in the global digital economy. My noble friend Lord Puttnam, who is unavoidably away today—and who is a greater expert and far more authoritative in this field than me—wanted to contribute to the debate by suggesting some ways of improving the situation of so-called digital literacy by means of the Bill. With his permission, I will mention his proposals, which I am sure he will return to at the later stages. It is, of course, completely extraordinary to me that when my noble friend Lord Puttnam and I worked together in 2003 on the Communications Bill, the resulting Act contained no reference to the internet. In the 14 years since, we have all become familiar with so many digital concepts: standardised algorithms, bots, big data and what is increasingly referred to as “data capitalism”. We are familiar with the words, but I am not sure that we all understand their implications for privacy and personal data.

It has been said this afternoon that national Governments now face the legal and technical challenge of trying to regulate international communication and information flows, which are largely controlled by a handful of American-based internet corporations. In this parliamentary Session, I have the privilege of sitting on your Lordships’ Select Committee on Political Polling and Digital Media. We are investigating the questions of accuracy and transparency thrown up by using internet data in politics. We are only beginning to uncover the complexities and threats that the new systems create. Again, in this context, in the last year we have all heard about so-called fake news and possibly even Kremlin-inspired online intervention in western democracies. Only yesterday, there were reports of operatives using individual Facebook accounts to generate support for President Trump; but is it possible to influence effectively, or control, any of that in the public interest? As a good democrat, the noble Lord, Lord McNally, remains optimistic, but I find it very hard to see how an individual Government can act legislatively to moderate the growing tsunami of online data exchange—and how through the law we can protect individuals from manipulation and exploitation.

A possible route that, optimistically, could influence behaviour and protect citizens from the most egregious breaches of their privacy is through public education. That is obviously a long-term project. Creating better-informed consumers who understand how their shared personal data may be used, and what may happen to data when it is passed on, would clearly be an advantage. That is important when we are talking—as the noble Baroness, Lady Howe, and other contributors did before me—about young people growing up with the internet. They are the greatest users of every type of social media but, although they may be technically adept, they are often the most ignorant about what they are signing up to or giving away when they use seductive sites or post so much information online.

I welcome the provision in the Bill that allows young people to remove content—the right to be forgotten. However, I share the concerns of the noble Baroness, Lady Howe, the right reverend Prelate and others about the age of consent being 13. As a grandmother, as they say, I would be very happy to see that age raised. As referred to by the right reverend Prelate, who is not in his place, it is interesting that, when surveyed, 81% of the general public wanted to raise that age. I hope we will return to this issue at a later stage.

It is important to look at some of the fundamental issues about how we can achieve better public education in this field. Do we need to think again about how to achieve a digitally literate population in the true sense, which in turn could hopefully influence the attitudes and actions of the big tech companies and change the opinion of the world? That may be a more sensible way to proceed than continuing to make what may be vain attempts to regulate the ever-expanding web. The House will remember, as the noble Lord, Lord McNally, has already said, that in the original Communications Act 2003, Ofcom was given the specific duty of promoting “media literacy”. In that Act—perhaps I may quote from it—the duty is very broadly based. First, it is,

“to bring about, or to encourage others to bring about, a better public understanding of the nature and characteristics of material published by means of the electronic media”.

Secondly, it is,

“to bring about, or to encourage others to bring about, a better public awareness and understanding of the processes by which such material is selected, or made available, for publication by such means”.

However, since the passage of that Act, Ofcom seems largely to have interpreted these responsibilities in rather a narrow and perhaps pragmatic way. For example, it has asked how we can ensure that the elderly population has appropriate access to digital technology and how internet drop-out areas, or areas where it is difficult to achieve broadband, can be improved.

My noble friend Lord Puttnam is therefore proposing that in Part 5 of the Bill, which covers the Information Commissioner, a wider duty be placed on the commissioner to act with Ofcom, and indeed with the Department for Education and the DCMS, on the use and abuse of personal data. He sees this as something that could be included by amendment in the “general functions” of the commissioner or established under a separate code of practice. He suggests that a code of practice could, for example, confer special responsibilities on the big technology giants to engage in the collaborative development of digital media skills. It does not seem naively optimistic to think that this type of statutory leverage could be influential. It could be a useful exercise of “soft power” to achieve more informed and responsible internet use by both providers and consumers. Effective and proper digital literacy is an approach that would avoid the continuing search for a national regulatory solution to some of the problems of the global digital economy—it may be long-term but it seems worth undertaking. I am sure my noble friend Lord Puttnam will table amendments in Committee.

I welcome the Government’s intention to update and strengthen a robust system of data protection. It is certainly an ambition that has recently been made more difficult both by corporately owned global technology giants which transcend the authority of national Governments and by the huge expansion of internet technology. I am glad that the Bill has started in this House, as I am sure it will, as always, be improved by your Lordships’ scrutiny and revision.

18:02
Baroness Lane-Fox of Soho Portrait Baroness Lane-Fox of Soho (CB)
- Hansard - - - Excerpts

My Lords, happy Ada Lovelace Day. How prescient of the Whips and the Minister to pick today for Second Reading. To remind colleagues who might be wondering: she was one of the great innovators of computing in the 19th century. She worked with Charles Babbage on his Analytical Engine, she was the first to recognise that the machine had applications beyond pure calculation and, in fact, she probably created the first algorithm intended to be carried out by that machine. As part of that, she is often regarded as the first to recognise the full potential of computing, so it could hardly be more apt to pick today for this Second Reading debate, in which we are probably looking at the consequences of the work that she started all those years ago.

The Government’s ambition is to,

“make Britain the best place to start and run a digital business; and … the safest place in the world to be online”,

as detailed in the Conservative manifesto. This Bill is intended to,

“ensure that our data protection framework is suitable for our new digital age, and cement the UK’s position at the forefront of technological innovation, international data sharing and protection of personal data”.

This aspiration to be the best, to make the UK a world leader and set a precedent for good regulation of our digital worlds, is admirable, but that means that the Bill must set the bar high. It must be the very best it can be, especially as we head towards Brexit, where having the highest standards around the collection and use of data will be vital not just to digital businesses but to our continued ability to trade. This Bill must be the foundation for that. There is much that is good in the Bill, but I do not believe that it is yet the best that it can be.

I must start with a confession. Despite the kind references today to my career and supposed expertise, I found this Bill incredibly hard to read and even harder to understand. I fear that we will not do enough to stop the notion, referred to by the noble Lord, Lord McNally, that we are sleepwalking into a dystopian future if we do not work hard to simplify the Bill and make it accessible to more people, the people to whom I feel sure the Government must want to give power in this updated legislation. Let us ensure that the Bill is a step forward for individual power in the rapidly changing landscape in which we sit, a power that people understand and, importantly, use. Let us make it an indicator to the world that the UK balances the importance of tech start-ups, innovation, foreign investment and big businesses with consumer and citizen rights.

The Government should be commended for getting ahead of movements that are growing all over the world to free our data from the tech giants of our age. As data becomes one of our most valuable resources—as we have heard, the new oil—individuals have begun to want a stake in determining for themselves when, how and to what extent information about them is held and communicated to others. So I welcome the clear data frameworks, which are important not only for the best digital economy but for the best digital society.

I agree with much that has been said today but want to make three specific points on the Bill. First, from any perspective, the GDPR is difficult to comprehend, comprising sweeping regulations with 99 articles and 173 recitals. The Bill contains some wonderful provisions, of which my favourite is:

“Chapter 2 of this Part applies for the purposes of the applied GDPR as it applies for the purposes of the GDPR … In this Chapter, “the applied Chapter 2” means Chapter 2 of this Part as applied by this Chapter”.

Giving people rights is meaningful only if they know that they have them, what they mean, how to exercise them, what infringement looks like and how to seek redress for it. There are questions about the practical workability of a lot of these rights. For example, on the right to portability, how would the average person know what to do with their ported data? How would they get it? Where would they keep it? There was a funny example in a newspaper recently where a journalist asked Facebook to send them all the data that it had collected over the previous eight years and received a printed copy of 800 pages of data—extremely useful, as I think you will agree. What about your right to erase your social media history? I should declare my interest as a director of Twitter at this point. How can you remove content featuring you that you did not post and in which people may have mentioned you? What happens as the complexity of the algorithm becomes so sophisticated that it is hard to separate out your data? How does the immense amount of machine learning deployed already affect your rights, let alone in the future?

Awareness among the public about the GDPR is very low—the Open Data Institute has done a lot of work on this which is soon to be published. It is very unlikely that ordinary people understand this legislation. They will have no understanding of how their rights affect them. A lot of education work needs to be done.

For businesses, too, the learning curve is steep, especially for foreign investors in European companies. Some are betting that the sheer scope of the GDPR means that the European regulators will struggle to enforce it. When the GDPR came up at a recent industry start-up event, one industry source said that none of the people to whom they had spoken could confidently say that they had a plan. Every online publisher and advertiser should ensure that they do, but none of them is taking steps to prepare.

So much has been done by this Government on building a strong digital economy that it is important to ensure that small and start-up businesses do not feel overwhelmed by the changes. What substantial help could be planned and what education offered? What help is there with compliance? By way of example, under Clause 13, companies have 21 days to show bias from algorithms, but what does this mean for a small AI start-up which may be using anonymised intelligence data to build a new transport or health app? What do they have to think about to make good legal decisions? As my noble friend Lord Jay so brilliantly argued, how can we ensure post-Brexit legislative certainty for them in building globally successful businesses?

This brings me to my second question: why has the right of civil groups to take action on behalf of individuals been removed from the UK context for the GDPR? Instead, the Bill places a huge onus on individuals, who may lack the know-how and the ability to fight for their rights. As has been mentioned, article 80(1) of the GDPR allows representative bodies—for example, consumer groups—to bring complaints at the initiation of data subjects. Article 80(2) allows those groups to bring complaints where they see infringements of data rights without an individual having to bring the case themselves. These provisions give consumers power: they support their rights without consumers having to understand specifically that the rights exist, or how to exercise them. Unfortunately, article 80(2) is an optional clause and the UK has omitted it. This omission is worrying, given how stretched the ICO’s resources are and the impact this could have on its support for the public. Granting rights over data to individuals is meaningless if individuals lack the understanding to exercise those rights and there is no infrastructure within civil society to help them do so. However, we have many organisations in this country—Citizens Advice, Which?—which have these kinds of rights of free-standing action in relation to other regulations. There does not seem to be any good reason why the UK has chosen not to take up the option in EU law to allow consumer privacy groups to lodge independent data protection complaints, as they can currently do under consumer rights laws.

Citizens face complex data trails. It is impossible for the average person to know which organisations hold their personal data. Enabling privacy groups to take independent action will ensure that these rights are enforced. As the Bill stands, the ICO is the main recourse for this.

Resourcing the ICO, covered in Part 5 of the Bill, is essential and is my third main area of interest. The ICO has considerable responsibilities and duties under the Bill towards both business and individuals: upholding rights, investigating reactively, informing and educating to improve standards, educating people and consumer groups, and maintaining international relationships. I feel exhausted thinking about it. The ICO’s workload is vast and increasing, and it currently lacks sufficient resources. In March 2017, the Information Commissioner asked Parliament whether it could recruit 200 more staff, but the salaries it offers are significantly below those offered by the private sector for roles requiring extremely high levels of skill and experience. These staff are going to become ever more important and more difficult to recruit in the future.

The ICO currently funds its data protection work by charging fees to data controllers. It receives ring-fenced funding from the Government for its freedom of information work. This income can increase only as the number of data controllers increases: it is not in line with the volume or complexity of the work, and certainly not with that in the Bill. Perhaps it is time for another method of funding, such as statutory funding.

Finally, I would like briefly to add my thoughts on how the Bill affects children. As many noble Lords have said, the YouGov poll does indeed say that 80% of the public support raising the age to 18—currently it is 13, as detailed by the Government. However, there are many other surveys, particularly one by the Children’s Society, which show that 80% of 13 year-olds currently have a social media account and 80% of people under 13 have lied about or twisted their age in order to establish one. This is the realpolitik in the war of understanding the internet with our children. I respectfully disagree with the noble Baroness, Lady Howe, and others in the Chamber: I feel strongly that it is wrong to place policing at the heart of how we deal with relationships between children and the internet. We need to take a systems-based approach. I have seen my godchildren set up fake accounts and whizz around the internet at a speed I find alarming. We have to deal on their terms. We have to help educators, parents and people supporting children, not use the long arm of the law.

There are many anomalies, as has already been detailed, as well as discrepancies with Scotland, differences between parental oversight and explicit giving of consent, problems with data collection and how the digital charter will work, and so on, and those are all important. However, I am optimistic too—I always am—and there is much to welcome in the Bill. I am particularly optimistic if we can work in tandem on the wider digital understanding of our society, as so brilliantly detailed by the noble Baroness, Lady Jay. I wish I could discuss the important themes in the Bill with Ada Lovelace, but in her absence I will have many good discussions with people in this Chamber so that we can all work hard to ensure that citizens and consumers reap the benefits of the Bill.