Margot James debates involving the Home Office during the 2017-2019 Parliament

Data Protection Bill [Lords] (Eighth sitting)

Thursday 22nd March 2018


Public Bill Committees
The Secretary of State vividly and colourfully said in The Times, and in his podcast with Nick Robinson, which comes out tomorrow, that the wild west is over and a new order will descend. The new clause urges the Government to put some deeds behind those grand words.
The Minister of State, Department for Digital, Culture, Media and Sport (Margot James)

I agree with everything the right hon. Gentleman has said, except that I do not think the Bill is the place for his proposals. The e-commerce directive and the Electronic Commerce (EC Directive) Regulations 2002, which transpose it into UK law, regulate services that are

“normally provided for remuneration, at a distance, by means of electronic equipment…and at the individual request of a recipient of a service”.

Those services are known as information society services.

However, questions relating to the processing of personal data by information society services are excluded from the scope of the e-commerce directive and hence excluded from the scope of the 2002 regulations. That is because the processing of personal data is regulated by other instruments, including, from May, the GDPR. The review of the application and operation of the 2002 regulations solely in relation to the processing of personal data, as proposed by new clause 13, would therefore be a speedy review to undertake.

However, that does not address the substance of the right hon. Gentleman’s concern, which we have already discussed in a delegated legislation Committee earlier this month. As I said then, the Government are aware of his concern that the e-commerce directive, finalised in 2000, is now outdated, in particular with regard to its liability provisions.

Those provisions limit, in specified circumstances, the liability that service providers have for the content on their sites. That includes social media platforms where they act as hosts. Social media companies have made limited progress on a voluntary basis, removing some particularly harmful content quickly and, in recent years, consistently. However, as we have seen in the case of National Action and its abhorrent YouTube videos, and many other lower-profile cases, there is a long way to go. We do not rule out legislation.

The Government have made it clear through our digital charter that we are committed to making the UK the safest place to be online, as well as the best place to grow a digital business. As the Prime Minister has said, when we leave the EU we will be leaving the digital single market, including the e-commerce directive. That gives us an opportunity to make sure that we get matters corrected for the modern age: supporting innovation and growth, and the use of modern technology, but doing so in a way that commands the confidence of citizens, protects their rights and makes their rights as enforceable online as they currently are offline.

The UK will be leaving the digital single market, but we will continue to work closely with the EU on digital issues as we build up our existing strong relationship in the future economic partnership. We will work closely with a variety of partners in Europe and further afield. Alongside that, our internet safety strategy will tackle the removal of harmful but legal content. Through the introduction of a social media code of practice and annual transparency report, we will place companies under an obligation to respond quickly to user reports and to ensure that their moderation processes are fit for purpose, with statutory backing if required. We have demonstrated that in the example of the introduction of age verification for online pornography.

There is an important debate to be had on the e-commerce directive and on platform liability, and we are committed to working with others, including other countries, to understand how we can make the best of existing frameworks and definitions. Consideration of the Bill in Committee and on Report are not the right places for that wide debate to be had. For those reasons, I request that the right hon. Gentleman withdraw the clause.

Liam Byrne

I admire the Minister’s concern and ambition for administrative tidiness. She reminds me of an old quote by Bevin, who said once, “If you are a purist, the place for you is not a Parliament; it is a monastery.”

Margot James

A nunnery.

Liam Byrne

In the case of the Minister, a nunnery, although Bevin was less enlightened than the hon. Lady. Here is a Bill; here is a new clause; the new clause is within scope. The object of the new clause is to deliver a Government objective, yet it is rejected. That is hard logic to follow. We have had the tremendous assurance, however, that there will be nothing less than a code of practice, so these huge data giants will be shaking in their boots in California, when they wake up. They will be genuinely concerned and no doubt already planning how they can reform their ways and stop the malpractice that we have grown all too used to. I am afraid that these amount to a collection of warm words, when what the country needs is action. With that in mind, I will push the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Darren Jones

I beg to move, That the clause be read a Second time.

New clause 17 is in my name and that of my right hon. Friend the Member for Birmingham, Hodge Hill. I do not take it personally that my other hon. Friends have not signed up to it; that was probably my fault for not asking them to do so in advance.

The new clause would bring a statutory footing to the data and artificial intelligence ethics unit, which I am very pleased that the Government have now funded and established, through the spring statement, in the Minister’s Department. It comes off the back of conversations with the Information Commissioner in Select Committee about the differing roles of enforcing legislation and of having a public debate about what is right and wrong and what the boundaries are in this ever-changing space. The commissioner was very clear that we need to have that debate with the public, but that it is not for her to do it. The ICO is an enforcer of legislation. The commissioner has a lot on her plate and is challenged by her own resource as it is. She felt that the new unit in the Department would be a good place to have the debate about technology ethics, and I support that assertion.

With no disrespect to any colleagues, I do not think that the House of Commons, and perhaps even the Select Committees to a certain extent, necessarily has the time, energy or resource to get into the real detail of some of the technology ethics questions, nor to take them out to the public, who are the people we need to be having the debate with.

The new clause would therefore establish in law that monitoring, understanding and public debate obligation that I, the ICO and others agree ought to exist in the new data ethics unit, but make it clear that enforcement was reserved for the Information Commissioner. I tabled the new clause because, although I welcome the Government’s commitment to the data and AI ethics unit, I feel that there is potential for drift. The new clause would therefore put an anchor in the technology ethics requirement of the unit so that it understands and communicates the ethical issues and does not necessarily get sidetracked into other issues, although it may seek to do that on top of this anchor. However, I think this anchor needs to be placed.

Also, I recognise that the Minister and the Secretary of State supported the recommendation made previously under the Cameron Government and I welcome that, but of course, with an advisory group within the Department, it may be a future Minister’s whim that they no longer wish to be advised on these issues, or it may be the whim of the Treasury—with, potentially, budget cuts—that it no longer wishes to fund the people doing the work. I think that that is not good enough and that putting this provision in the Bill would give some security to the unit for the future.

I will refer to some of the comments made about the centre for data ethics and innovation, which I have been calling the data and AI ethics unit. When it was first discussed, in the autumn Budget of November 2017, the Chancellor of the Exchequer said that the unit would be established

“to enable and ensure safe, ethical and ground-breaking innovation in AI and data-driven technologies. This world-first advisory body will work with government, regulators and industry to lay the foundations for AI adoption”.

Although that is a positive message, it says to me that its job is to lay the foundations for AI adoption. I agree with that as an aim, but it does not mean that at its core is understanding and communicating the ethical challenges that we need to try to understand and legislate for.

I move on to some of the documents from the recruitment advertising for personnel to run the unit from January of this year, which said that the centre will be at the centre of plans to make the UK the best place in the world for AI businesses. Again, that is a positive statement, but one about AI business adoption in this country, not ethical requirements. It also said that the centre would advise on ethical and innovative uses of data-driven tech. Again, that is a positive statement, but I just do not think it is quite at the heart of understanding and communicating and having a debate about the ethics.

My concern is that while all this stuff is very positive, and I agree with the Government that we need to maintain our position as a world leader in artificial intelligence and that it is something we need to be very proud of—especially as we go through the regrettable process of leaving the European Union and the single market, we need to hold on to the strengths we have in the British economy—this week has shown that there is a need for an informed public debate on ethics. As no doubt all members of the Committee have read in my New Statesman article of today, one of the issues we have as the voice of our constituents in Parliament is that in order for our constituents to understand or take a view on what is right or wrong in this quickly developing space, we all need to understand it in the first place—to understand what is happening with our data and in the technology space, to understand what is being done with it and, having understood it, then to take a view about it. The Cambridge Analytica scandal has been so newsworthy because the majority of people understandably had no idea that all this stuff was happening with their data. How we legislate for and set ethical frameworks must first come from a position of understanding.

That is why the new clause sets out that there should be an independent advisory board. The use of such boards is commonplace across Departments and I hope that would not be a contentious question. Subsection (2) talks about some of the things that that board should do. The Minister will note that the language I have used is quite careful in looking at how the board should monitor developments, monitor the protection of rights and look out for good practice. It does not seek to step on the toes of the Information Commissioner or the powers of the Government, but merely to understand, educate and inform.

The new clause goes on to suggest that the new board would work with the commissioner to put together a code of practice for data controllers. A code of practice with a technology ethics basis is important because it says to every data controller, regardless of what they do or what type of work they do, that we require ethical boundaries to be set and understood in the culture of what we do with big data analytics in this country. In working with the commissioner, this board would add great value to the way that we work with people’s personal data, by setting out that code of practice.

I hope that the new clause adds value to the work that the Minister’s Department is already doing. My hope is that by adding it to the Bill—albeit that current Parliaments cannot of course bind their successors and it could be legislated away in the future—it gives a solid grounding to the concept that we take technology ethical issues seriously, that we seek to understand them properly, not as politicians or as busy civil servants, but as experts who can be out with our stakeholders understanding the public policy consequences, and that we seek to have a proper debate with the public, working with enforcers such as the ICO to set, in this wild west, the boundaries of what is and is not acceptable. I commend the new clause to the Committee and hope that the Government will support it.

Margot James

I thank the hon. Gentleman for raising this very important subject. He is absolutely right. Data analytics have the potential to transform whole sectors of society and the economy—law enforcement and healthcare, to name but two. I agree with him that a public debate around the issues is required, and that is one of the reasons why the Government are creating the centre for data ethics and innovation, which he mentioned. The centre will advise the Government and regulators on how they can strengthen and improve the way that data and AI are governed, as well as supporting the innovative and ethical use of that data.

--- Later in debate ---
It is clear that the law is hopelessly outdated. I hope this is a subject on which we can agree. We are now at the receiving end of a new generation of active measures, which are one of the greatest threats to us since the emergence of al-Qaeda at the beginning of the century. We must redouble our defences, so the new clause would give the Electoral Commission the power to issue targeted disclosure notices that require those who seek to influence a political campaign to share with the world information about who is being targeted with what and—crucially—who is writing the cheques.
Margot James

I will be brief in answering some of the serious matters raised by the right hon. Gentleman. The Information Commissioner, as the data regulator, is investigating alleged abuses as part of a broader investigation into the use of personal data during political campaigns. I have said many times that the Bill will add significantly to the commissioner’s powers to conduct investigations, and I have confirmed that we keep an open mind and are considering actively whether further powers are needed in addition to those set out in the Bill.

The Electoral Commission is the regulator of political funding and spending. The commission seeks to bring transparency to our electoral system by enforcing rules on who can fund and how money can be spent, but new clause 21 is about sending the commission into a whole new field: that of personal data regulation. That field is rightly occupied by the Information Commissioner. We can debate whether she needs more powers in the light of the current situation at Cambridge Analytica, and as I have said we are reviewing the Bill.

While the Electoral Commission already has the power to require the disclosure of documents in relation to investigations under its current remit, new clause 21 would provide the commission with new powers to require the disclosure of the settings used to disseminate material. However, understanding how personal data is processed is outside the commission’s remit.

The right hon. Gentleman suggested that his amendment would help with transparency on who is seeking to influence elections, which is very much needed in the current climate. The Government take the security and integrity of democratic processes very seriously. It is absolutely unacceptable for any third country to interfere in our democratic elections or referendums.

On new clause 22, the rules on imprints in the Political Parties, Elections and Referendums Act 2000 are clear. The current rules apply to printed election material no matter how it is targeted. However, the Secretary of State has the power under section 143 to make regulations covering imprints on other types of material, including online material. New clause 22 would therefore not extend the type of online material covered by such regulations. We therefore believe the new clause is unnecessary. The law already includes printed election material disseminated through the use of personal data gathered by whatever means, and the Government will provide further clarity on extending those rules to online material in due course by consulting on making regulations under the power in section 143(6).

On that basis, I ask the right hon. Gentleman to withdraw his new clause.

Liam Byrne

That is a deeply disappointing answer. I was under the impression that the Secretary of State said in interviews today that he is open-minded about the UK version of the Honest Ads Act that we propose. That appears to be in some contrast to the answer that the Minister offered.

What this country has today is an Advertising Standards Authority that does not regulate political advertising; Ofcom, which does not regulate video when it is online; an Electoral Commission without the power to investigate digital campaigning; and an Information Commissioner who cannot get a search warrant. Worse, we have a Financial Conduct Authority that, because it does not have a data sharing gateway with the Electoral Commission, cannot share information about the financial background of companies that might have been laundering money going into political and referendum campaigns. The law is hopelessly inadequate. Through that great hole, our enemies are driving a coach and horses, which is having a huge impact on the health and wellbeing of our democracy.

That is not a day-to-day concern in Labour constituencies, but it is for the Conservative party. Voter Consultancy Ltd took out targeted dark social ads aimed at Conservative Members, accusing some of them of being Brexit mutineers when they had the temerity to vote for common sense in a vote on Brexit in this House. Voter Consultancy Ltd, for those who have not studied its financial records at Companies House, as I have, is a dormant company. It has no accounts filed. There is no cash flowing through the books. The question that provokes is: where does the money come from for the dark social ads attacking Conservative Members? We do not know. It is a matter of public concern that we should.

The law is out of date and needs to be updated. I will not press the matter to a vote this afternoon because I hope to return to it on Report, but I hope that between now and then the Minister and the Secretary of State reflect on the argument and talk to Mark Sedwill, the National Security Adviser, about why the national security strategy does not include an explicit objective to defend the integrity of our democracy. I hope that that change is made and that, as a consequence, further amendments will be tabled to ensure that our democracy is protected against the threats we know are out there.

I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

Question proposed, That the Chair do report the Bill, as amended, to the House.

Margot James

On a point of order, Mr Streeter. I wanted to thank you, and Mr Hanson in his absence, as well as, in the House of Lords, my noble Friends Lord Ashton, Baroness Williams, Lord Keen, Baroness Chisholm and Lord Young, and the Opposition and Cross-Bench peers. I also thank the Under-Secretary of State for the Home Department, my hon. Friend the Member for Louth and Horncastle, and the Opposition Front Bench Members—the right hon. Member for Birmingham, Hodge Hill, with whom it has been a pleasure debating in the past two weeks, and the hon. Member for Sheffield, Heeley, who was not able to be in her place this afternoon.

I offer great thanks to both Whips. It was the first Bill Committee for my hon. Friend the Member for Selby and Ainsty in his capacity as Whip, and my first as Minister, and it has been a pleasure to work with him. I also thank the hon. Member for Ogmore. My hon. Friend the Under-Secretary and I are grateful to our Parliamentary Private Secretary, my hon. Friend the Member for Mid Worcestershire, who has worked terribly hard throughout the proceedings, as indeed have the Clerks, the Hansard writers, the Doorkeepers and the police. Without the officials of my Department and, indeed, the Home Office, we would all have been bereft, and I am most grateful to all the officials.

Question put and agreed to.

Bill, as amended, accordingly to be reported.

Data Protection Bill [Lords] (Fifth sitting)

Tuesday 20th March 2018


Public Bill Committees
Other general functions of the Commissioner
The Minister of State, Department for Digital, Culture, Media and Sport (Margot James)

I beg to move amendment 122, in schedule 13, page 194, line 36, leave out from beginning to end of line 4 on page 195.

This amendment is consequential on the omission of Clause 121 (see Amendment 47).

The Chair

With this it will be convenient to discuss clause 121 stand part.

Margot James

Amendment 122 and clause 121 deal with measures inserted into the Bill with the intention of protecting and valuing certain personal data held by the state—an issue championed by Lord Mitchell, to whom I am grateful for taking the time to come to see me to further explain his amendments, and for giving me the opportunity to explain how we plan to address the issues he raised.

Lord Mitchell’s amendments require the Information Commissioner to maintain a register of publicly controlled data of national significance and to prepare a code of practice that contains practical guidance in relation to personal data of national significance, which is defined as data that, in the Commissioner’s opinion,

“has the potential to further…economic, social or environmental well-being”

and

“financial benefit…from processing the data or the development of associated software.”

Lord Mitchell has made it clear that his primary concern relates to the sharing of health data by the NHS with third parties. He believes that some information sharing agreements have previously undervalued NHS patient data, and that the NHS, along with other public authorities, needs additional guidance on optimising the benefits derived from such sharing agreements.

We agree that the NHS is a prime state asset, and that its rich patient data records have great potential to further medical research. Its data could be used to train systems using artificial intelligence to diagnose patients’ conditions, to manage risk, to target services and to take pre-emptive and preventive action—all developments with huge potential. I have discussed this matter with ministerial colleagues; not only do we want to see these technological developments, but we want the NHS, if it is to make any such deals, to make fair deals. The benefits of such arrangements are often not exclusively monetary.

NHS patient data is only ever used within the strict parameters of codes of practice and the standards set out by the National Data Guardian and other regulatory bodies. We of course recognise that we must continue in our efforts to make the best use of publicly held data, and work is already being carried out to ensure that the value of NHS patient data is being fully recognised. NHS England and the Department of Health and Social Care have committed to working with representatives of the public and of industry to explore how to maximise the benefits of health and care data for patients and taxpayers.

Lord Mitchell’s provision in clause 121 proposes that the commissioner publish a code of practice. However, if there is a problem, a code would seem to be an unduly restrictive approach. Statutory codes are by necessity prescriptive, and this is an area where the public may benefit from a greater degree of flexibility than a code could provide in practice, especially to encourage innovation in how Government use data to the benefit of both patients and taxpayers.

The Government are releasing public data to become more transparent and to foster innovation. We have released more than 40,000 non-personal datasets. Making the data easily available means that it will be easier for people to make other uses of Government-collected data, whether to exploit it commercially, to understand better how government works or to hold the Government to account. The benefits of each data release are quite different, and sometimes they are unknown until later. Lord Mitchell’s primary concern is health data, but can guidance on how that is used be equally applicable to the vast array of data we release? Such guidance would need to be so general that it would be useless.

Even if we stay focused on NHS data and what might help to ensure that the value of it is properly exploited, Lord Mitchell’s proposal has some significant problems. First, by definition, data protection legislation deals with the protection of personal data, not general data policy. Companies who enter into data sharing agreements with the NHS are often purchasing access to anonymised patient data—that is to say, not personal data. Consequently, the code in clause 121 cannot bite. Secondly, maintaining a register of data of national significance is problematic. In addition to the obvious bureaucratic burden of identifying the data that would fall under the definition, generating a list of data controllers who hold data of national significance is likely to raise a number of security concerns. The NHS has been the victim of cyber-attacks, and we do not want to produce a road map to assist those who want to harm it.

Thirdly, we do not believe that the proposed role is a proper one for the Information Commissioner, and nor does she. It is not a question of legislative enforcement and, although she may offer valuable insight on the issues, such responsibilities do not comfortably fit with her role as regulator of data protection legislation. We have consulted the commissioner on the amendments and she agrees with our assessment. In her own terms, she considers herself not to be best placed to advise on value for money and securing financial benefits from the sharing of such personal data with third parties. Those matters are far removed from her core function of safeguarding information rights. She adds that others in Government or the wider public sector whose core function it is to drive value from national assets may be a more natural home for providing such best practice advice.

Ian Murray (Edinburgh South) (Lab)

I have the great pleasure of representing a constituency with one of the best medical research facilities in the world. One of the greatest impediments for that facility is getting access to anonymised NHS data for its research. Is the Minister saying that her amendment, which would remove the Lords amendment, would make it easier or more difficult for third parties to access that anonymised data?

Margot James

I am ill-qualified to answer the hon. Gentleman’s question. Hypothetically, it would probably make it more difficult, but that is not our purpose in objecting to clause 121, which we do not see as being consistent with the role of the Information Commissioner, for the reasons I set out. However, he raises an interesting question.

I agree with Lord Mitchell that the issues that surround data protection policy, particularly with regard to NHS patient data, deserve proper attention both by the Government and by the National Data Guardian for Health and Care, but we have not yet established that there is any evidence of a problem to which his provisions are the answer. We are not resting on our laurels. As I have already said, NHS England and the Department of Health and Social Care are working to ensure that they understand the value of their data assets. Further work on the Government’s digital charter will also explore this issue. When my right hon. Friend the Prime Minister launched the digital charter on 25 January, she made it clear that we will set out principles on the use of personal data.

Amendment 122 removes Lord Mitchell’s amendment from schedule 13. We do this because it is the wrong tool; however, we commit to doing everything we can to ensure that we further explore the issue and find the right tools if needed. [Interruption.] I have just received advice that the amendments will make no difference in relation to the hon. Gentleman’s question, because anonymised data is not personal data.

I commend amendment 122 and give notice that the Government will oppose the motion that clause 121 stand part of the Bill.

Liam Byrne

I am grateful that the Minister made time to meet my former noble Friend Lord Mitchell. These are important amendments and it is worth setting out the background to why Lord Mitchell moved them and why we give such priority to them.

In 2009-10, we began to have a debate in government about the right approach to those agencies which happen to sit on an enormous amount of important data. The Government operate about 200 to 250 agencies, and some are blessed with data assets that are more valuable than those of others—for example, the Land Registry or Companies House sit on vast quantities of incredibly valuable transactional data, whereas other agencies, such as the Meteorological Office, the Hydrographic Office and Ordnance Survey, sit on sometimes quite static data which is of value. Some of the most successful American companies are based on Government data—for example, The Weather Channel is one of the most valuable and is based on data issued from, I think, the US meteorological survey. A number of Government agencies are sitting on very valuable pots of data.

The debate that we began to rehearse nearly 10 years ago was whether the right strategy was to create public-private partnerships around those agencies, or whether more value would be created for the UK economy by simply releasing that data into the public domain. I had the great pleasure of being Chief Secretary to the Treasury and the Minister for public service reform. While the strong advice inside the Treasury was that it was better to create public-private partnerships because that would release an equity yield up front, which could be used for debt reduction, it was also quite clear to officials in the Cabinet Office and those interested in public service reform more generally that the release of free data would be much more valuable. That is the side of the argument on which we came down.

After the White Paper, “Smarter Government”, that I brought to the House, we began the release of very significant batches of data. We were guided by the arguments of Tim Berners-Lee and Professor Nigel Shadbolt, who were advising us at the time, that this was the right approach and it was very good to see the Government continue with that.

There are still huge data pots locked up in Government which could do with releasing, but the way in which we release them has to have an eye on the way we create value for taxpayers more generally. Beyond doubt, the area of public policy and public operations where we have data that is of the most value is health. The way in which, in the United States, Apple and other companies have now moved into personal health technology in a substantial way betrays the reality that this is going to be a hugely valuable and important market in years to come. If we look at the US venture industry we can see significant investment now going into health technology companies.

--- Later in debate ---
The precedent we have is back in, I think, 1998-99, when the last Labour Government put together what came to be called the Domesday book of Government assets. We are now looking for a similar kind of catalogue assembled for significant data assets. Rather unfashionably for a Labour MP, at that time I was an investment banker working for a small bank called Rothschild & Co. in London. I know that will ruin my pro-Corbyn credentials.
Margot James

They were never very impressive.

Liam Byrne

The Minister is very generous. From that vantage point in the City, I was able to watch the level of ingenuity, creativity and innovation that was unlocked simply by the Government telling the world, “Here are the assets that are in public hands.” All sorts of ideas were floated for using those assets in a way that was better for taxpayers and public service delivery.

To the best of my knowledge, we do not have a similar data catalogue today. What Lord Mitchell is asking is for Ministers to do some work and create one. They can outsource that task to the Information Commissioner. Perhaps the Information Commissioner is not the best guardian of that particular task, but I am frustrated and slightly disappointed that the Minister has not set out a better approach to achieving the sensible and wise proposals that Lord Mitchell has offered the Government.

The reason why it is so important in the context of the NHS is that the NHS is obviously a complicated place. It is an economy the size of Argentina’s. The last time I looked, if the NHS were a country, it would be the 13th biggest economy on earth. It is a pretty complicated place and there are many different decision makers. Indeed, there are so many decision makers now that it is impossible to get anything done within the NHS, as any constituency MP knows. So how do we ensure that, for example, in our neck of the woods, Queen Elizabeth Hospital Birmingham does not strike its own data sharing agreement with Google or DeepMind? How do we ensure that the NHS in Wales does not go in a particular direction? How do we ensure that the trust across the river does not go in a particular direction? We need to bring order to what is potentially an enormous missed opportunity over the years to come.

The starting point is for the Government, first, to ensure we have assembled a good catalogue of data assets. Secondly, they should take some decisions about whether the organisations responsible for those data assets are destined for some kind of public-private partnership, as they were debating in relation to Companies House and other agencies a couple of years ago, or whether—more wisely—we take the approach of creating a sovereign wealth fund to govern public data in this country, where we maximise the upside for taxpayers and the opportunities for good public service reform.

The example of Hinkley Point and the unfortunate example of the Google partnership with DeepMind, which ran into all kinds of problems, are not good precedents. In the absence of a better, more concrete, lower risk approach from the Government, we will have to defend Lord Mitchell’s wise clause in order to encourage the Government to come back with a better solution than the one set out for us this morning.

Margot James

I enjoyed the right hon. Gentleman’s speech, as it went beyond some of the detail we are debating here today, but I was disappointed with the conclusion. I did not rest my argument on it being just too difficult to organise such a database as proposed by Lord Mitchell; there are various reasons, chief among them being that we are here to debate personal data. A lot of the databases the right hon. Gentleman referred to as being of great potential value do not contain personal data. Some do, some do not: the Land Registry does not, Companies House does, and so forth. Also, the Information Commissioner has advised that this is beyond her competence and her remit and that she is not resourced to do the job. Even the job of defining what constitutes data of public value is a matter for another organisation and not the Information Commissioner’s Office. That is my main argument, rather than it being too difficult.

Liam Byrne

Happily, what sits within the scope of a Bill is not a matter for Ministers to decide. First, we rely on the advice of parliamentary counsel, which, along with the Clerks, was clear that this amendment is well within the scope. Secondly, if the Information Commissioner is not the right individual to organise this task—heaven knows, she has her hands full this week—we would have been looking for a Government amendment proposing a better organisation, a better Ministry and a better Minister for the work.

Margot James

I can only be the Minister I am. I will try to improve. I was not saying that Lord Mitchell’s amendment is not within the scope of the Bill; I was making the point that some of the databases and sources referred to by the right hon. Gentleman in his speech went into the realms of general rather than personal data. I therefore felt that was beyond the scope of the Information Commissioner’s remit.

I share the right hon. Gentleman’s appreciation of the value and the uniqueness of the NHS database. We do not see it just in terms of its monetary value; as the hon. Member for Edinburgh South made clear in his intervention, it has tremendous potential to improve the care and treatment of patients. That is the value we want to realise. I reassure the right hon. Gentleman and put it on record that it is not my place as a Minister in the Department for Digital, Culture, Media and Sport, or the place of the Bill, to safeguard the immensely valuable dataset that is the NHS’s property.

Louise Haigh

Before the Minister concludes, given that she has focused so much on NHS data, can she update the Committee on the Government’s progress on implementing Dame Fiona Caldicott’s recommendations about health and social care data?

Margot James

I cannot give an immediate update on that, but I can say that Dame Fiona Caldicott’s role as Data Guardian is crucial. She is working all the time to advise NHS England and the Secretary of State for Health and Social Care on how best to protect data and how it can deliver gains in the appropriate manner. I do not feel that that is the place of the Bill or that it is my role, but I want to reassure the Committee that the Secretary of State for Health and Social Care, to whom I am referring Lord Mitchell, is alive to those issues and concerns. The NHS dataset is a matter for the Department of Health and Social Care.

Amendment 122 agreed to.

Schedule 13, as amended, agreed to.

Clauses 117 and 118 ordered to stand part of the Bill.

Schedule 14 agreed to.

Clauses 119 and 120 ordered to stand part of the Bill.

Clause 121

Code on personal data of national significance

--- Later in debate ---
Liam Byrne

The debate rehearsed in the other place was whether we should acquiesce in a derogation that the Government have exercised to set the age of consent for personal data sharing at 13, as opposed to 16, which other countries have adopted. There was widespread concern that 13 was too young. Many members of the Committee will have experienced pressing the agree button when new terms and conditions are presented to us on our updates to software on phones, or privacy settings presented to us by Facebook; privacy settings, it is now alleged, are not worth the paper that they were not written on.

Debates in the other place centred on what safeguards could be wrapped around children if that derogation were exercised and the age of consent left at 13. With Baroness Kidron, we were keen to enshrine in legislation a step towards putting into operation the objectives of the 5Rights movement. Those objectives, which Baroness Kidron has driven forward over the past few years, are important, but the rights therein are also important. They include not only rights that are enshrined in other parts of the Bill—the right to remove, for example—but important rights such as the right to know. That means that someone has the right to know whether they are being manipulated in some way, shape or form by social media technologies.

One of the most interesting aspects of the debate in the public domain in the past few months has been the revelation that many of the world’s leading social media entrepreneurs do not allow their children to use social media apps, because they know exactly how risky, dangerous and manipulative they can be. We have also heard revelations from software engineers who used to work for social media companies about the way they deliberately set out to exploit brain chemistry to create features of their apps that fostered a degree of addiction. The right to know is therefore very powerful, as is the right to digital literacy, which is another important part of the 5Rights movement.

It would be useful to hear from the Minister of State, who—let me put this beyond doubt—is an excellent Minister, what steps she plans to take to ensure that the age-appropriate design code is set out pretty quickly. We do not want the clause to be passed but then find ourselves in a situation akin to the one we are in with section 40 of the Crime and Courts Act 2013 where, five years down the line, a misguided Secretary of State decides that the world has changed completely and that this bit of legislation should not be commenced.

We would like the Minister to provide a hard timetable— she may want to write to me if she cannot do so today—setting out when we will see an age-appropriate design code. We would also like to hear what steps she will take to consult widely on the code, what work she will do with her colleagues in the Department for Education to ensure that the code includes some kind of ventilation and education in schools so that children actually know what their rights are and know about the aspects of the code that are relevant to them, and, crucially, what steps she plans to take to include children in her consultation when she draws up the code.

This is an important step forward, and we were happy to support it in the other place. We think the Government should be a little more ambitious, which is why we suggest that the rights set out by the 5Rights movement should become part of a much broader and more ambitious digital Bill of Rights for the 21st century, but a start is a start. We are pleased that the Government accepted our amendment, and we would all be grateful if the Minister told us a little more about how she plans to operationalise it.

Margot James

I thank the right hon. Gentleman for his generous remarks. To recap, the idea that everyone should be empowered to take control of their data is at the heart of the Bill. That is especially important for groups such as children, who are likely to be less aware of the risks and consequences associated with data processing. Baroness Kidron raised the profile of this issue in the other place and won a great deal of support from peers on both sides of that House, and the Government then decided to introduce a new clause on age-appropriate design to strengthen children’s online rights and protections.

Clause 124 will require the Information Commissioner to develop a new statutory code that contains guidance on standards of age-appropriate design for online services that are likely to be accessed by children. The Secretary of State will work in close consultation with the commissioner to ensure that that code is robust, practical and meets children’s needs in relation to the gathering, sharing and storing of their data. The new code will ensure that websites and apps are designed to make clear what personal data of children is collected, how it is used and how both children and parents can stay in control of it. It will also include requirements for websites and app makers on privacy for children under 18.

The right hon. Gentleman cited examples of the consultation he hopes to see in preparation for the code. In developing the code, we expect the Information Commissioner to consult a wide range of stakeholders, including children, parents, persons who represent the interests of children, child development experts and trade associations. The right hon. Gentleman mentioned the Department for Education, and I see no reason why it should not be included in that group of likely consultees.

The commissioner must also pay close attention to the fact that children have different needs at different ages, as well as to the United Kingdom’s obligations under the United Nations Convention on the Rights of the Child. The code interlocks with the existing data protection enforcement mechanism found in the Bill and the GDPR. The Information Commissioner considers many factors in every regulatory decision, and non-compliance with that code will weigh particularly heavily on any organisation that is non-compliant with the GDPR. Organisations that wish to minimise their risk will apply the code. The Government believe that clause 124 is an important and positive addition to the Bill.

Liam Byrne

Will the Minister say a word about the timetable? When can we expect the consultation and code of practice to be put into operation?

Margot James

There should be no delay to the development of the code and the consultation that precedes it. If I get any additional detail on the timetable, I will write to the right hon. Gentleman.

Question put and agreed to.

Clause 124, as amended, ordered to stand part of the Bill.

Clause 125

Approval of data-sharing, direct marketing and age-appropriate design codes

Amendment made: 49, in clause 125, page 69, line 9, leave out “with the day on which” and insert “when” —(Margot James.)

This amendment is consequential on Amendment 71.

Clause 125, as amended, ordered to stand part of the Bill.

Clauses 126 to 130 ordered to stand part of the Bill.

Clause 131

Disclosure of information to the Commissioner

Question proposed, That the clause stand part of the Bill.

Liam Byrne

Clause 131 deals with disclosure of information to the Information Commissioner, and this is probably a good point at which to ask whether the Information Commissioner has the right level of power to access information that is pertinent to her investigations into the misuse of information. Thanks to The Guardian, The New York Times, and particularly the journalist Carole Cadwalladr, we have had the most extraordinary revelations about alleged misbehaviour at Cambridge Analytica over the past couple of years. Indeed, Channel 4 News gave us further insight into its alleged misdemeanours last night.

We have a situation in social media land that the Secretary of State has described as the “wild west”. Some have unfairly called the Matt Hancock app one of the features of that wild west, but I would not go that far, despite its slightly unusual privacy settings. None the less, there is now cross-party consensus that the regulatory environment that has grown up since the 2000 e-commerce directive is no longer fit for purpose. Yesterday, the Secretary of State helpfully confirmed that that directive will be modernised, and we will come on to discuss new clauses that suggest setting a deadline for that.

One deficiency of today’s regulatory environment is the inadequate power that the Information Commissioner currently has to access information that is important for her investigations. We have a wild west, we have hired a sheriff, but we have not given the sheriff the power to do her job of keeping the wild west in order. We now have the ridiculous situation that the Information Commissioner must declare that she is going to court to get a warrant to investigate the servers of Cambridge Analytica, and to see whether any offence has been committed.

--- Later in debate ---
Liam Byrne

If I wanted to hide something from a newspaper and I thought that the newspaper was going to print it inappropriately, I would apply for an emergency injunction to stop the newspaper running it. I do not understand why the Information Commissioner has had to broadcast her intentions to the world, because that has given Cambridge Analytica a crucial period of time in which to do anything it likes, frankly, to its data records. The quality of the Information Commissioner’s investigation must be seriously impaired by the time that it has taken to get what is tantamount to a digital search warrant.

Is the Minister satisfied in her own mind that clause 131 and its associated clauses are powerful enough? Will she say more about the Secretary of State’s declaration to the House last night that he would be introducing amendments to strengthen the Commissioner’s power in the way that she requested? When are we going to see those amendments? Are we going to see them before this Committee rises, or at Report stage? Will there be a consultation on them? Is the Information Commissioner going to share her arguments for these extra powers with us and with the Secretary of State? We want to see a strong sheriff patrolling this wild west, and right now we do not know what the Government’s plan of action looks like.

Margot James

I just want to recap on what clause 131 is about. It is intended to make it clear that a person is not precluded by any other legislation from disclosing to the commissioner information that she needs in relation to her functions, under the Bill and other legislation. The only exception relates to disclosures prohibited by the Investigatory Powers Act 2016 on grounds of national security. It is therefore a permissive provision enabling people to disclose information to the commissioner.

However, the right hon. Member for Birmingham, Hodge Hill has taken the opportunity to question the powers that the Information Commissioner has at her disposal. As my right hon. Friend the Secretary of State said yesterday in the Chamber, we are not complacent. I want to correct something that the right hon. Member for Birmingham, Hodge Hill said. My right hon. Friend did not say that he would table amendments to the Bill on the matter in question. He did say that we were considering the position in relation to the powers of the Information Commissioner, and that we might table amendments, but we are in the process of considering things at the moment. I presume that that goes for the right hon. Gentleman as well; if not, he would surely have tabled his own amendments by now, but he has not.

Liam Byrne

The Minister will notice that I have tabled a number of new clauses that would, for example, bring election law into the 21st century. I think that the Secretary of State left the House with the impression yesterday that amendments to strengthen the power of the Information Commissioner would be pretty prompt. It is hard to see another legislative opportunity to put that ambition into effect, so perhaps the Minister will tell us whether we can expect amendments soon.

Margot James

I can certainly reassure the right hon. Gentleman that we are looking at the matter seriously and, although I cannot commit to tabling amendments, I do not necessarily rule them out. I have to leave it at that for now.

On a more positive note, we should at least acknowledge that, although the Bill strengthens the powers of the Information Commissioner, her powers are already the gold standard internationally. Indeed, we must bear it in mind that the data privacy laws of this country are enabling American citizens to take Cambridge Analytica to court over data breaches.

I want to review some of the powers that the Bill gives the commissioner, but before I do so I will answer a point made by the right hon. Member for Birmingham, Hodge Hill. He said that the commissioner had had difficulties and had had to resort to warrants to pursue her investigation into a political party in the UK and both the leave campaigns in the referendum. She is doing all that under existing data protection law, which the Bill is strengthening. That is encouraging.

Liam Byrne

I did not want to intervene, but I have been struggling with the matter myself. There are allegations that a significant donor to Leave.EU was supported in that financial contribution by organisations abroad. As I spoke to the Financial Conduct Authority and tabled questions to the Treasury, it was revealed that there were no data sharing gateways between the Electoral Commission and the FCA.

Margot James

I shall come back to the right hon. Gentleman on the relationship between the Information Commissioner and the FCA. I am sure that the information that he has already ascertained from the Treasury is correct, but there may be other ways in which the two organisations can co-operate, if required. The allegations are very serious and the Government are obviously very supportive of the Information Commissioner as she grapples with the current investigation, which has involved 18 information notices and looks as if it will be backed up by warrants as well. I remind the Committee that that is happening under existing data protection law, which the Bill will strengthen.

Question put and agreed to.

Clause 131 accordingly ordered to stand part of the Bill.

Data Protection Bill [Lords] (Second sitting)

Tuesday 13th March 2018


Public Bill Committees
Louise Haigh

Given that the Minister asked so nicely, I will. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendments made: 87, in schedule 1, page 127, line 30, at end insert—

“( ) The reference in sub-paragraph (4)(b) to a data subject withholding consent does not include a data subject merely failing to respond to a request for consent.”

This amendment clarifies the intended effect of the safeguard in paragraph 15(4) of Schedule 1 (processing necessary for an insurance purpose).

Amendment 88, in schedule 1, page 127, line 39, at end insert—

“( ) is of data concerning health which relates to a data subject who is the parent, grandparent, great-grandparent or sibling of a member of the scheme,”.

This amendment provides that the condition in paragraph 16 of Schedule 1 (occupational pension schemes) can only be relied on in connection with the processing of data concerning health relating to certain relatives of a member of the scheme.

Amendment 89, in schedule 1, page 128, line 6, at end insert—

“( ) The reference in sub-paragraph (2)(b) to a data subject withholding consent does not include a data subject merely failing to respond to a request for consent.”

This amendment clarifies the intended effect of the safeguard in paragraph 16(2) of Schedule 1 (processing necessary for determinations in connection with occupational pension schemes).

Amendment 90, in schedule 1, page 131, line 14, at end insert—

“( ) If the processing consists of the disclosure of personal data to a body or association described in sub-paragraph (1)(a), or is carried out in preparation for such disclosure, the condition in sub-paragraph (1) is met even if, when the processing is carried out, the controller does not have an appropriate policy document in place (see paragraph 5 of this Schedule).”

This amendment provides that when processing consists of the disclosure of personal data to a body or association that is responsible for eliminating doping in sport, or is carried out in preparation for such disclosure, the condition in paragraph 22 of Part 2 of Schedule 1 (anti-doping in sport) is met even if the controller does not have an appropriate policy document in place when the processing is carried out.

Amendment 91, in schedule 1, page 133, line 17, leave out from “interest” to end of line 21.—(Margot James.)

This amendment removes provisions from paragraph 31 of Schedule 1 (extension of conditions in Part 2 of Schedule 1 referring to substantial public interest) which are unnecessary because they impose requirements which are already imposed by paragraph 5 of Schedule 1.

Margot James Portrait The Minister of State, Department for Digital, Culture, Media and Sport (Margot James)
- Hansard - -

I beg to move amendment 92, page 134, line 18 [Schedule 1], leave out “on the day” and insert “when”.

This amendment is consequential on Amendment 71.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendments 107, 108, 111, 113, 114, 21, 29 to 40, 43 to 46, 118 to 121, 48, 49, 53, 55, 56, 123 to 125, 59 and 71.

Margot James Portrait Margot James
- Hansard - -

Following engagement with local government stakeholders, we have recognised that the maximum time period permitted for responses to subject access requests set out in parts 3 and 4 of the Data Protection Bill subtly differs from that permitted under the GDPR and part 2 of the Bill. That is because the GDPR and, by extension, part 2 rely on European rules for calculating time periods, whereas parts 3 and 4 implicitly rely on the more usual domestic approach. European law, which applies to requests under part 2, says that when one is considering a time period in days, the day on which the request is received is discounted from the calculation of that time period. In contrast, the usual position under UK law, which applies to requests under parts 3 and 4 of the Bill, is that a period of, say, seven days to respond would begin on the day on which the request was received. In a data protection context, that has the effect of providing those controllers responding to requests under parts 3 and 4 with a time period that is one day shorter in which to respond.
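
To illustrate the difference the Minister describes, here is a minimal sketch of the two day-counting conventions. The dates and the seven-day period are purely hypothetical and chosen for illustration; the Bill itself specifies the applicable periods.

from datetime import date, timedelta

def deadline_eu(received: date, days: int) -> date:
    # European approach: the day of receipt is discounted, so the
    # period starts running on the following day.
    return received + timedelta(days=days)

def deadline_uk(received: date, days: int) -> date:
    # Traditional domestic approach: the period begins on the day of
    # receipt, leaving the controller one day less in which to respond.
    return received + timedelta(days=days - 1)

request_received = date(2018, 3, 13)           # hypothetical receipt date
print(deadline_eu(request_received, 7))        # 2018-03-20
print(deadline_uk(request_received, 7))        # 2018-03-19

Aligning the whole Bill on the European approach means the second calculation falls away, so the same deadline applies whichever part of the Bill governs the request.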

To provide consistency across the Bill, we have decided to include a Bill-wide provision that applies the European approach to all time periods in the Bill, thus ensuring alignment with the directly applicable GDPR. Having a uniform approach to time periods is particularly helpful for bodies with law enforcement functions, which will process personal data under different regimes under the Bill. Without these amendments, different time periods would apply, depending on which regime they were processing under. Ensuring consistency in calculating time periods will also assist the Information Commissioner with her investigatory activities and enforcement powers, for example by avoiding the confusion and potential disputes that could arise relating to her notices or requests for information.

Amendment 71 provides for a number of exemptions to the European approach where deviating from our standard approach to time periods would be inappropriate. For example, where the time period refers to the process of parliamentary approval of secondary legislation, it would clearly not be appropriate to deviate from usual parliamentary time periods. The unfortunately large number of amendments in this group stems from the need to modify existing language on time periods, currently worded for compliance with the usual UK approach, so that it applies the EU rules instead. I hope that this has provided the Committee with sufficient detail on the reasons for tabling this group of amendments.

Amendment 92 agreed to.

Question proposed, That the schedule, as amended, be the First schedule to the Bill.

Liam Byrne Portrait Liam Byrne
- Hansard - - - Excerpts

We had a useful debate this morning about the whys and wherefores of whether the article 8 right to privacy should be incorporated into the Bill. Although we were disappointed by the Minister’s reply, what I thought was useful in the remarks she made was a general appreciation of the importance of strong data rights if the UK is to become a country with a strong environment of trust within which a world of digital trade can flourish.

I will briefly alert the Minister to a debate we want to have on Report. The reality is that we feel schedule 1 is narrowly drawn. It is an opportunity that has been missed, and it is an opportunity for the Minister to come back on Report with a much more ambitious set of data rights for what will be a digital century. When we look around the world at the most advanced digital societies, we can see that a strong regime of data rights is common to them all.

I was recently in Estonia, which I hope the Minister will have a chance to visit if she has not done so already. Estonia likes to boast of its record as the world’s most advanced digital society; it is a place where 99% of prescriptions are issued online, 95% of taxes are paid online and indeed a third of votes are cast online. It is a country where the free and open right to internet access is seen as an important social good, and a good example of a country that has really embraced the digital revolution and translated that ambition into a set of strong rights.

The Government are not averse to signing declaratory statements of rights that they then interpret into law. They are a signatory to the UN universal declaration of human rights and the UN convention on the rights of the child; the Human Rights Act 1998 is still in force—I have not yet heard of plans to repeal it—and of course the Equality Act 2010 was passed with cross-party support. However, such statements of rights, in a tradition dating back to 1215, were basically designed to correct and guard against dangerous imbalances of power. Things have moved on since 1215 and the worries that the barons had about King John. We are no longer as concerned as people were in 1215 about taking all the fish weirs out of the Thames, for example.

--- Later in debate ---
None Portrait The Chair
- Hansard -

To make matters clear to hon. Members and in particular those who are new to the Committee, the right hon. Member for Birmingham, Hodge Hill tabled a number of amendments—171 to 175 and 177 to 178—that were not selected because they were tabled only yesterday. We need to have several days’ notice before selection can be considered. Had they been tabled earlier, we could have debated and voted on those amendments now. I have given the right hon. Gentleman leeway to widen his arguments about schedule 1, and it is up to him whether he wishes to table those amendments on Report. He is perfectly in order to do so. The debate today is on schedule 1, and the points that the right hon. Gentleman has made in relation to potential amendments are a heads-up for the future or for the Minister to respond to at this point.

Margot James Portrait Margot James
- Hansard - -

The right hon. Member for Birmingham, Hodge Hill covered a lot of important ground. He mentioned the digital charter. We are bringing forward the digital charter and we do not intend for it to be set in stone. We recognise that this is a fast-changing environment and so it is deliberately something that will evolve over time. We both share the concerns that he expressed with regard to fake news and the rights and protections needed for children and young people who, as he says, make up a third of internet users. We will address many of the things he highlighted as part of our internet safety strategy, and I look forward to debating them further with him on Report.

To add to what we have already discussed under schedule 1, article 9 of the GDPR limits the processing of special categories of data. Those special categories are listed in article 9(1) and include personal data revealing racial or ethnic origin, health, political opinions and religious beliefs. Some of the circumstances in which article 9 says that special category data can be processed have direct effect, but others require the UK to make related provision.

Clause 10 introduces schedule 1 to the Bill, which sets out in detail how the Bill intends to use the derogations in article 9 and the derogation in article 10 relating to criminal convictions data to permit particular processing activities. To ensure that the Bill is future-proof, clause 10 includes a delegated power to update schedule 1 using secondary legislation. Many of the conditions substantively replicate existing processing conditions in the 1998 Act and hon. Members may wish to refer to annexe B to the explanatory notes for a more detailed analysis on that point.

Darren Jones Portrait Darren Jones
- Hansard - - - Excerpts

I want to make one point about schedule 1. Amendment 9, which was made this morning, allows democratic engagement to be a purpose under article 6(1)(e) of the GDPR—namely, that consent is not required for the processing of data for public interest or the exercising of official authority and the purposes of democratic engagement. I wonder whether the definitions of political parties and politicians under schedule 1 could be used to restrict that amendment, so that organisations other than political parties and politicians are not able to process data in the public interest for democratic engagement without consent. For example, if Leave.EU or Open Britain wanted to process our personal data, they ought to do so with consent, not using the same public interest for democratic engagement purposes as politicians or parties.

Margot James Portrait Margot James
- Hansard - -

I understand the hon. Gentleman’s concerns. The GDPR requires data controllers to have a legal basis laid down in law, which can take the form, for example, of a statutory power or duty, or a common-law power. Any organisation that does not have such a legal basis would have to rely on one of the other processing conditions in article 6. With regard to the amendment that was agreed to this morning, we think that further restricting clause 8 might risk excluding bodies with a lawful basis for processing. However, the hon. Gentleman is free to raise the issue again on Report.

Question put and agreed to.

Schedule 1, as amended, accordingly agreed to.

Clauses 11 to 13 ordered to stand part of the Bill.

Clause 14

Automated decision-making authorised by law: safeguards

Liam Byrne Portrait Liam Byrne
- Hansard - - - Excerpts

I beg to move amendment 153, in clause 14, page 7, line 30, at end insert—

“(1A) A decision that engages an individual’s rights under the Human Rights Act 1998 does not fall within Article 22(2)(b) of the GDPR (exception from prohibition on taking significant decisions based solely on automated processing for decisions that are authorised by law and subject to safeguards for the data subject’s rights, freedoms and legitimate interests).”

This amendment would clarify that the exemption from the prohibition on taking significant decisions based solely on automated processing does not apply to purely automated decisions that engage an individual’s human rights.

--- Later in debate ---
Brendan O'Hara Portrait Brendan O'Hara (Argyll and Bute) (SNP)
- Hansard - - - Excerpts

I will speak to amendments 130, 133 and 135, which appear in my name and that of my hon. Friend the Member for Cumbernauld, Kilsyth and Kirkintilloch East. Our amendments seek to provide protection for individuals who are subject to purely automated decision making, specifically where we believe that it could have an adverse impact on their fundamental rights. The amendments would require that where human rights are or possibly could be impacted by automated decisions, ultimately there are always human decision makers. The amendments would instil that vital protection of human rights with regard to the general processing of personal data.

The amendments seek to clarify the meaning of a decision that is based solely on automated processing, which is a decision that lacks meaningful human input. That reflects the intent of the GDPR, and provides clarification that purely administrative human approval of an automated decision does not make that decision a human one. It is simply not enough for human beings to process the information in a purely administrative fashion while having no oversight of, or accountability for, the decisions that they process. We strongly believe that automated decision making without human intervention should be subject to strict limitations to ensure fairness, transparency and accountability, and to safeguard against discrimination. As it stands, there are insufficient safeguards in the Bill.

As the right hon. Member for Birmingham, Hodge Hill said, we are not talking about every automated decision. We are not talking about a tech company or an online retailer that suggests alternatives that someone may like based on the last book they bought or the last song they downloaded. It is about decisions that can be made without human oversight that will, or may well, have long-term, serious consequences for an individual’s health, financial status, employment or legal status. All too often, I fear, automated decisions involve an opaque, unaccountable process that uses algorithms that are neither as benign nor as objective as we had hoped they would be, or indeed as we thought they were when we first encountered them.

We are particularly concerned about elements of the Bill that allow law enforcement agencies to make purely automated decisions. That is fraught with danger and at odds with the Data Protection Act 1998, as well as article 22 of the GDPR, which states:

“The data subject shall have the right not to be subject to a decision based solely on automated processing”.

Although there are provisions in the GDPR for EU member states to opt out of that, the opt-out does not apply if the data subject’s rights, freedoms or legitimate interests are undermined.

I urge the Government to look again at the parts of the Bill about automated decision making, to ensure that when it is carried out, a human being will have to decide whether it is reasonable and appropriate to continue on that course. That human intervention will provide transparency and accountability, and it will ensure that the state does not infringe on an individual’s freedoms—those fundamental rights of liberty and privacy—which are often subjective. Because they are subjective, they are beyond the scope of an algorithm.

There are serious human rights, accountability and transparency issues around fully automated decision making as the Bill stands. Amendment 130 says that any human involvement has to be “meaningful”. We define meaningful human oversight as being significant, of consequence and purposeful. As I have said, that is far beyond the scope of an algorithm. If an individual’s rights are to be scrutinised and possibly fundamentally affected, it is a matter of basic fairness that the decision is made, or at least overseen, by a sentient being. I hope the Government accept the amendments in the spirit in which they were tabled.

Margot James Portrait Margot James
- Hansard - -

The amendments relate to automated decision making under the GDPR and the Bill. It is a broad category, which includes everything from trivial things such as music playlists, as mentioned by the hon. Member for Argyll and Bute, and quotes for home insurance, to the potentially more serious issues outlined by the right hon. Member for Birmingham, Hodge Hill: recruitment, healthcare and policing cases in which existing prejudices could be reinforced. We are establishing a centre, the office for artificial intelligence and data ethics, and are mindful of these important issues. We certainly do not dismiss them.

Article 22 of the GDPR provides a right not to be subject to a decision based solely on automated processing of data that results in legal or similarly significant effects on the data subject. As is set out in article 22(2)(b), that right does not apply if the decision is authorised by law, so long as the data subject’s rights, freedoms and legitimate interests are safeguarded.

The right hon. Member for Birmingham, Hodge Hill mentioned those safeguards, but I attribute far greater meaning to them than he implied in his speech. The safeguards embed transparency and accountability, a right for the data subject to be notified should a decision be made solely through artificial intelligence, and a right to request that the decision be retaken.

Liam Byrne Portrait Liam Byrne
- Hansard - - - Excerpts

The Minister must realise that she is risking an explosion in the number of decisions that have to be taken to Government agencies or private sector companies for review. The justice system is already under tremendous pressure. The tribunal system is already at breaking point. The idea that we can simply overload it further is pretty optimistic. On facial recognition at public events, for example, it would be possible under the provisions that she is proposing for the police to use facial recognition technology automatically to process those decisions and, through a computer, to have spot interventions ordered for police on the ground. The only way to stop that would be to have an ex post facto review, but that would be an enormous task.

Margot James Portrait Margot James
- Hansard - -

The right hon. Gentleman should be aware that just because something is possible, it does not mean that it is automatically translated into use. His example of facial recognition and what the police could do with that technology would be subject to controls within the police and to scrutiny from outside.

Louise Haigh Portrait Louise Haigh
- Hansard - - - Excerpts

The case that my right hon. Friend raises is certainly not hypothetical. The Metropolitan police have been trialling facial recognition scanning at the Notting Hill carnival for the last three years with apparently no legal basis and very little oversight. We will move on to those issues later in the Bill. That is exactly why the amendments are crucial in holding law enforcement agencies to account.

Margot James Portrait Margot James
- Hansard - -

As the hon. Lady says, the police are trialling those things. I rest my case—they have not put them into widespread practice as yet.

Returning to the GDPR, we have translated the GDPR protections into law through the Bill. As I said, the data subject has the right to request that the decision be retaken with the involvement of a sentient individual. That will dovetail with other requirements. By contrast, the amendments are designed to prevent any automated decision-making from being undertaken under article 22(2)(b) if it engages the rights of the data subject under the Human Rights Act 1998.

Liam Byrne Portrait Liam Byrne
- Hansard - - - Excerpts

Will the Minister explain to the Committee how a decision to stop and search based on an automated decision can be retaken? Once the person has been stopped and searched, how can that activity be undone?

Margot James Portrait Margot James
- Hansard - -

I am not going to get into too much detail. The hon. Member for Sheffield, Heeley mentioned an area and I said that it was just a trial. She said that facial recognition was being piloted. I do not dispute that certain things cannot be undone. Similar amendments were tabled in the other place. As my noble Friend Lord Ashton said there, they would have meant that practically all automated decisions under the relevant sections would be prohibited, since it would be possible to argue that any decision based on automated decision making at the very least engaged the data subject’s right to have their private life respected under article 8 of the European convention on human rights, even if it was entirely lawful under the Act.

--- Later in debate ---
Brendan O'Hara Portrait Brendan O'Hara
- Hansard - - - Excerpts

In that case, I will not press the amendment now.

Margot James Portrait Margot James
- Hansard - -

I beg to move Government amendment 10, in clause 14, page 8, line 4, leave out “21 days” and insert “1 month”.

Clause 14(4)(b) provides that where a controller notifies a data subject under Clause 14(4)(a) that the controller has taken a “qualifying significant decision” in relation to the data subject based solely on automated processing, the data subject has 21 days to request the controller to reconsider or take a new decision not based solely on automated processing. This amendment extends that period to one month.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government amendments 11, 12, 23, 24, 27, 28, 41 and 42.

Margot James Portrait Margot James
- Hansard - -

Amendments 10, 11 and 12 relate to clause 14, which requires a data controller to notify a data subject of a decision based solely on automated processing as soon as is reasonably practicable. The data subject may then request that the data controller reconsider such a decision or take a new decision not based solely on automated processing.

The purpose of the amendments is to bring clause 14 into alignment with the directly applicable time limits in article 12 of the GDPR, thereby ensuring that both data subjects and data controllers have easily understandable rights and obligations. Those include giving the data subject longer to request that a decision be reconsidered, requiring that the controller action the request without undue delay and permitting an extension of up to two months where necessary.
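
As a rough illustration of the article 12(3) time limits referred to here (one month to respond, extendable by up to two further months where necessary), the following sketch assumes a simple calendar-month convention and uses hypothetical dates; it is an illustration only, not the Bill's drafting, and the precise counting rules follow the European approach discussed earlier.

from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    # Move forward by whole calendar months, clamping to the last day
    # of the target month (e.g. 31 January + 1 month -> 28/29 February).
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def response_deadline(received: date, extended: bool = False) -> date:
    # Baseline: one month from receipt of the request; where an
    # extension is justified, up to two further months may be added.
    return add_months(received, 3 if extended else 1)

print(response_deadline(date(2018, 5, 25)))                 # 2018-06-25
print(response_deadline(date(2018, 5, 25), extended=True))  # 2018-08-25

The month-end clamping rule is an assumption made for the sketch; nothing in the debate turns on it.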

Furthermore, to ensure that there is consistency across the different regimes in the Bill—not just between the Bill and the GDPR—amendments 23, 24, 41 and 42 extend the time limit provisions for making and responding to requests in the other regimes in the Bill. That is for the simple reason that it would not be right to have a data protection framework that applies one set of time limits to one request and a different set of time limits to another.

In a similar vein, amendments 27 and 28 amend part 3 of the Bill, concerning law enforcement processing, to ensure that controllers can charge for manifestly unfounded or excessive requests for retaking a decision, as is permitted under article 12 of the law enforcement directive. To prevent abuse, amendment 28 provides that it is for the controller to show that the request was manifestly unfounded or excessive.

Liam Byrne Portrait Liam Byrne
- Hansard - - - Excerpts

It would be useful if the Minister could say a little more about the safeguards around controllers charging reasonable fees for dealing with requests.

It is quite easy to envisage situations where algorithms take decisions. We have some ex post facto review; a citizen seeks to overturn the decision; the citizen thinks they are acting reasonably but the commercial interest of the company that has taken and automated the decision means that it wants to create disincentives for that rigmarole to unfold. That creates the risk of unequal access to justice in these decisions.

If the Minister is not prepared to countenance the sensible safeguards that we have proposed, she must say how she will guard against another threat to access to justice.

Margot James Portrait Margot James
- Hansard - -

The right hon. Gentleman asks a reasonable question. I did not mention that data subjects have the right of complaint to the Information Commissioner if the provisions are being abused. I also did not mention another important safeguard, which is that it is for the data controller to show that the request is manifestly unfounded or excessive. So the burden of proof is on the data controller, and the data subject has the right to involve the Information Commissioner if he or she contests the data controller’s judgment that a request is manifestly unfounded or excessive. I hope that satisfies the right hon. Gentleman.

Amendment 10 agreed to.

Amendments made: 11, in clause 14, page 8, leave out line 10 and insert “within the period described in Article 12(3) of the GDPR—”

This amendment removes provision from Clause 14(5) dealing with the time by which a controller has to respond to a data subject’s request under Clause 14(4)(b) and replaces it with a requirement for the controller to respond within the time periods set out in Article 12(3) of the GDPR, which is directly applicable.

Amendment 12, in clause 14, page 8, line 16, at end insert—

‘(5A) In connection with this section, a controller has the powers and obligations under Article 12 of the GDPR (transparency, procedure for extending time for acting on request, fees, manifestly unfounded or excessive requests etc) that apply in connection with Article 22 of the GDPR.” —(Margot James.)

This amendment inserts a signpost to Article 12 of the GDPR which is directly applicable and which confers powers and places obligations on controllers to whom Clause 14 applies.

Clause 14, as amended, ordered to stand part of the Bill.

Clause 15

Exemptions etc.

Margot James Portrait Margot James
- Hansard - -

I beg to move amendment 13, in clause 15, page 8, line 31, after “21” insert “and 34”

This amendment is consequential on Amendment 94.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government amendments 14, 93 to 106, 109, 110 and 112.

Margot James Portrait Margot James
- Hansard - -

Schedule 2 allows for particular rights or obligations contained in the GDPR to be disapplied in particular circumstances, where giving effect to that right or obligation would lead to a perverse outcome. To do that, it makes use of a number of derogations in the GDPR, including articles 6(3) and 23(1).

Amendments 93, 95 and 109 permit article 19 of the GDPR to be disapplied for the purposes in parts 1, 2 and 5 of schedule 2.

When a data controller corrects or deletes personal data following a request from a data subject, article 19 of the GDPR requires them to inform all persons to whom the personal data has been disclosed. Additionally, if requested, the data controller must inform the data subject about those persons to whom the data has been disclosed. Following the introduction of the Bill, we have had further representations from a range of stakeholders, including the banking industry, regulators and the media sector, about the problems that article 19 might create in very particular circumstances.

The amendments will ensure that, for example, where a bank may have shared personal data about one of its customers with the National Crime Agency because of a suspected fraud, it will not have to tell the data subject about that disclosure when the customer changes their address with the bank. That will ensure that the data subject is not tipped off about the suspected fraud investigation.

Several amendments in the group are designed to ensure that a valuable provision of the GDPR—article 34—does not have unintended consequences for controllers who do the right thing by seeking to prevent or detect crime, assist with the assessment or collection of tax or uncover abuses in our society. Article 34 requires data controllers to inform a data subject if there has been a data breach that is likely to result in a high risk to the rights and freedoms of an individual. In normal operation, this is an important article, which we hope will prompt a step change in the way organisations think about cyber-security.

However, article 23(1) enables member states to create laws to restrict the scope of the obligations and rights for which article 34 provides in the minority of cases where it conflicts with other important objectives of general public interest. The amendments seek to do that in the Bill. Amendment 94 responds to the concerns of the finance sector that compliance with article 34 may result in persons under investigation for financial crime being tipped off. Amendment 110 serves a similar purpose for media organisations.

Article 85(2) creates scope for member states to provide exemptions from chapter 4 of the GDPR, which includes article 34, if they are necessary to reconcile the right to the protection of personal data with the freedom of expression. The amendment intends to ensure that processing data for a special purpose that is in the public interest is not prejudiced—for example, by a controller having to notify the data subject of a breach in relation to pre-publication undercover footage. Importantly, data controllers will still be required, for the first time, to report a breach to the Information Commissioner under article 33 of the GDPR. That will ensure that she is well placed to take all the necessary steps to ensure data subjects’ rights are respected, including by monitoring compliance with these new exemptions.

On the more general question of who can make use of the exemptions in schedule 2 and when, amendment 96 broadens the exemption in paragraph 7 of the schedule, which relates to the protection of members of the public. As drafted, the exemption applies to personal data processed for the purposes of discharging a function that is designed to protect members of the public against dishonesty, malpractice or incompetence by persons who carry out activities that bring them into contact with members of the public. We have identified an issue with that wording: a number of public office holders, including police staff, do not carry out activities that necessarily bring them into contact with members of the public. Amendment 96 broadens the scope of the exemption to include processing in relation to individuals who work for those organisations in a behind-the-scenes capacity.

We have also had representations from several regulators on the need to make additional provisions to protect the integrity of their activities. Amendment 97 provides the UK’s Comptroller and Auditor General, and their counterpart in each of the devolved Administrations, with an exemption from certain GDPR provisions where these are likely to prejudice their statutory functions. That will prevent certain individuals who suspect they may be under scrutiny from trying to use their rights under the GDPR, such as article 15 (confirmation of processing) as a way of confirming that their data is being processed, or from using article 17 (right to erasure) and article 18 (restriction of processing) to undermine the effectiveness of an audit.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I have just had a request to remove jackets, because of the warm temperature in the room. I give my permission to do so. I call the Minister.

Margot James Portrait Margot James
- Hansard - -

Thank you, Mr Hanson. I agree with the tribute paid by the right hon. Member for Birmingham, Hodge Hill to the custodians of some of the most wonderful archives in the world. I will comment on his proposals with regard to such archives shortly, but I hope that recent debates have left no doubt in hon. Members’ minds that the Government are absolutely committed to preserving the freedom of the press, and maintaining the balance between privacy and freedom of expression in our existing law, which has served us well for so many years.

As set out in the Bill, media organisations can already process data for journalistic purposes, which includes media archiving. As such, we believe that amendment 170 is unnecessary and could be unhelpful. I agree with the right hon. Gentleman that it is crucial that the media can process data and maintain media archives. In the House of Lords, my noble Friend Lord Black of Brentwood explained very well the value of media archives. He said:

“Those records are not just the ‘first draft of history’; they often now comprise the only record of significant events, which will be essential to historians and others in future, and they must be protected.”—[Official Report, House of Lords, 10 October 2017; Vol. 785, c. 175.]

However, recital 153 indicates that processing for special purposes includes news archiving and press libraries. Paragraph 24 of schedule 2 sets out the range of derogations that apply to processing for journalistic purposes. That includes, for example, exemption from complying with requests for the right to be forgotten. That means that where the exemption applies, data subjects would not have grounds to request that data about them be deleted. It is irrelevant whether the data causes substantial damage or distress.

However, if media organisations are archiving data for other purposes—for example, in connection with subscriber data—it is only right that they are subject to the safeguards set out in article 89(1), and the Bill provides for that accordingly. For that reason, I hope that the right hon. Gentleman agrees to reconsider his approach and withdraw his amendment.

Liam Byrne Portrait Liam Byrne
- Hansard - - - Excerpts

I am happy to withdraw the amendment, although I would say to the Minister that the helpful words we have heard this afternoon will not go far enough to satisfy the objections that we heard from organisations. We reserve the right to come back to this matter on Report. We will obviously consult the organisations that helped us to draft the amendment, and I urge her to do the same. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Schedule 2, as amended, agreed to.

Schedule 3

Exemptions etc from the GDPR: health, social work, education and child abuse data

Amendments made: 111, in schedule 3, page 160, line 21, leave out

“with the day on which”

and insert “when”.

This amendment is consequential on Amendment 71.

Amendment 112, in schedule 3, page 162, line 3, leave out paragraph 16 and insert—

“16 (1) This paragraph applies to a record of information which—

(a) is processed by or on behalf of the Board of Governors, proprietor or trustees of, or a teacher at, a school in Northern Ireland specified in sub-paragraph (3),

(b) relates to an individual who is or has been a pupil at the school, and

(c) originated from, or was supplied by or on behalf of, any of the persons specified in sub-paragraph (4).

(2) But this paragraph does not apply to information which is processed by a teacher solely for the teacher’s own use.

(3) The schools referred to in sub-paragraph (1)(a) are—

(a) a grant-aided school;

(b) an independent school.

(4) The persons referred to in sub-paragraph (1)(c) are—

(a) a teacher at the school;

(b) an employee of the Education Authority, other than a teacher at the school;

(c) an employee of the Council for Catholic Maintained Schools, other than a teacher at the school;

(d) the pupil to whom the record relates;

(e) a parent, as defined by Article 2(2) of the Education and Libraries (Northern Ireland) Order 1986 (S.I. 1986/594 (N.I. 3)).

(5) In this paragraph, “grant-aided school”, “independent school”, “proprietor” and “trustees” have the same meaning as in the Education and Libraries (Northern Ireland) Order 1986 (S.I. 1986/594 (N.I. 3)).”

This amendment expands the types of records that are “educational records” for the purposes of Part 4 of Schedule 3.

Amendment 113, in schedule 3, page 164, line 7, leave out

“with the day on which”

and insert “when”.—(Margot James.)

This amendment is consequential on Amendment 71.

Schedule 3, as amended, agreed to.

Schedule 4 agreed to.

Clause 16

Power to make further exemptions etc by regulations

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Liam Byrne Portrait Liam Byrne
- Hansard - - - Excerpts

We agree that the clause offers Ministers a rather sweeping power to introduce new regulations. Over the course of what has been quite a short day in Committee we have heard many reasons to be alarmed about equipping Ministers with such sweeping powers. We proposed an amendment to remove the clause, which I think was not selected because we have this stand part debate. What we need to hear from the Minister are some pretty good arguments as to why Ministers should be given unfettered power to introduce such regulations without the effective scrutiny and oversight of right hon. and hon. Members in this House.

Margot James Portrait Margot James
- Hansard - -

I am glad that the right hon. Gentleman feels we have had a short day in Committee. In answer to his questions and those of the hon. Gentleman, the order making powers in clauses 16 and 113 allow the Secretary of State to keep the list of exemptions in schedules 2 to 4 and 11 up to date. As I mentioned when we discussed order making powers in relation to clause 10 and schedule 1, we carefully reviewed the use of such powers in the Bill following recommendations from the Delegated Powers and Regulatory Reform Committee. We think an appropriate balance has now been struck. It might be helpful if I explain the reasons for our thinking.

Clause 16 includes order making powers to ensure that the Secretary of State can update from time to time the particular circumstances in which data subjects’ rights can be disapplied. That might be necessary if, for example, the functions of a regulator are expanded and exemptions are required to ensure that those new functions cannot be prejudiced by a data subject exercising his or her right to object to the processing.

We believe it is very important that the power to update the schedules is retained. Several of the provisions in schedules 2 to 4 did not appear in the Data Protection Act 1998 and have been added to the Bill to address specific requirements that have arisen over the last 20 years.

For example, the regulatory landscape has changed dramatically since the 1998 Act. Organisations such as the Bank of England, the Financial Conduct Authority and the National Audit Office have taken on a far broader range of regulatory functions, and that is reflected in the various amendments we have tabled to paragraphs 7 to 9 of schedule 2, to provide for a broader range of exemptions. No doubt there will be further changes to the regulatory landscape in the years to come. Of course, other exemptions in schedule 2 have been carried over from the 1998 Act, or indeed from secondary legislation made under that Act, with little change. That does not mean, however, that they will never need to be amended in the future. Provisions made under the 1998 Act could be amended via secondary legislation, so it would seem remiss not to afford ourselves that same degree of flexibility now. If we have to wait for primary legislation to make any changes, it could result in a delay of months or possibly years to narrow or widen an exemption, even where a clear deficiency had been identified. We cannot predict the future, and it is important that we retain the power to update the schedules quickly when the need arises.

Importantly, any regulations made under either clause would be subject to the affirmative resolution procedure. There would be considerable parliamentary oversight before any changes could be made using these powers. Clause 179 requires the Secretary of State to consult with the Information Commissioner and other interested parties that he considers appropriate before any changes are made.

I hope that that reassures Members that we have considered the issue carefully. I commend clause 16 to the Committee.

Question put, That the clause stand part of the Bill.

The Committee proceeded to a Division.

--- Later in debate ---
Accreditation of certification providers
Margot James Portrait Margot James
- Hansard - -

I beg to move amendment 15, in clause 17, page 10, line 16, leave out “authority” and insert “body”.

This amendment corrects the reference in Clause 17(7) to the “national accreditation authority” by amending it to refer to the “national accreditation body”, which is defined in Clause 17(8).

Clause 17 relates to the certification of data controllers. This is a relatively new concept and will take time to bed in, but it could also be a significant step forward in ensuring that data subjects can have confidence in controllers and processors and, perhaps even more important, that controllers and processors can have confidence in each other. It is likely to be particularly relevant in the context of cloud computing and other business-to-business platforms where individual audits are often not feasible in practice.

Before they can audit controllers, certification bodies must be accredited, either by the Information Commissioner or by the national accreditation body, UKAS. Clause 17 and schedule 5 set out how the process will be managed. Unfortunately, there is a typographical error in clause 17. It refers erroneously to the “national accreditation authority” in subsection (7), when it should refer to the “national accreditation body”. Amendment 15 corrects that error.

Amendment 15 agreed to.

Clause 17, as amended, ordered to stand part of the Bill.

Schedule 5

Accreditation of certification providers: reviews and appeals

Amendment made: 114, in schedule 5, page 170, line 21, leave out “In this paragraph” and insert—

“Meaning of “working day”

7 In this Schedule”

This amendment applies the definition of “working day” for the purposes of the whole of Schedule 5. There are references to “working days” in paragraphs 5(2) and 6(3) of that Schedule.—(Margot James.)

Schedule 5, as amended, agreed to.

Clause 18 ordered to stand part of the Bill.

Clause 19

Processing for archiving, research and statistical purposes: safeguards

Amendment made: 16, in clause 19, page 12, line 2, leave out “(d)” and insert “(e)”.—(Margot James.)

This amendment amends the definition of “relevant NHS body” in this Clause by adding special health and social care agencies established under Article 3 of the Health and Personal Social Services (Special Agencies) (Northern Ireland) Order 1990 (which fall within paragraph (e) of section 1(5) of the Health and Social Care (Reform) Act (Northern Ireland) 2009).

Clause 19, as amended, ordered to stand part of the Bill.

Clauses 20 to 22 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned.—(Nigel Adams.)