Public Authorities (Fraud, Error and Recovery) Bill Debate
Lord Vaux of Harrowden (Crossbench, Excepted Hereditary)
Debate with the Department for Work and Pensions
Grand Committee

My Lords, I support my noble friend Lady Finn, particularly on Amendment 60A, because as we go through this process it feels as though the Government are trying to be judge and jury on whether an order should exist at all. I am conscious that it is important that the Government be allowed to get on and have this more straightforward way of collecting money that they are due, but it strikes me as pretty draconian that the question of whether a debt exists cannot be challenged; it cannot go for review. I appreciate that we are debating the amendment, but I note in passing, with reference to the Explanatory Notes for Clause 34 on the process for review, that the legislation does not state that the review is supposed to go to a higher-grade person; I am sure that this will be set out in guidance, which I hope will have statutory standing. It strikes me as odd that, having been unable even to challenge whether the order should exist, you cannot go to a tribunal about it either. Ministers will know that I wish that parts of the Bill went further in trying to get money back from people in a variety of ways, but in this area I do not agree with the Government’s approach and certainly agree with that of my noble friend.
My Lords, I was not going to speak on this group, but, as the noble Baroness, Lady Anderson, proved the other day, Amendment 60A is not necessary because Clause 12 sets out clearly that these orders can be used only where there has been a final determination of the amount owing by the court or where it has been agreed.
However, I support Amendment 61A. Frankly, it is becoming a weakness across a great many areas that the impact assessments that accompany legislation are regularly quite poor. It is incredibly important that, when we make regulations that will have impacts on people, we understand what those impacts are.
I have one other question that I probably should have dealt with by means of an amendment, but I have only just spotted something. Why are regulations made under Clauses 37(2)(c) to (f) subject to the negative procedure and not the affirmative procedure?
My Lords, the amendments tabled by the noble Baroness, Lady Finn, and the noble Viscount, Lord Younger of Leckie, raise important considerations about procedural fairness and transparency in the implementation of the Bill. Amendment 60A, which would allow applicants to request a review into the existence or value of the payable amount, would provide a valuable safeguard, ensuring that individuals have an accessible means to challenge decisions where there might be uncertainty or dispute. This aligns well with the principle of natural justice and could help prevent errors going uncorrected.
Amendments 61A and 61B focus on the mechanisms surrounding direct deduction orders, emphasising the need for accountability and parliamentary oversight. Requiring an impact assessment to accompany any changes to the processing of these orders, as proposed in Amendment 61A, would encourage transparency about the potential costs and effects on banks’ operational capacity. Similarly, Amendment 61B’s provision that consultation outcomes must be laid before Parliament prior to implementation would ensure democratic scrutiny. Together, these amendments would contribute to a more open and considered approach, balancing the efficient recovery of public funds with the need for oversight and due process, and I support them.
My Lords, all these amendments pertain to deduction from earnings orders, or DEOs, as I shall refer to them from here. DEOs are a mechanism by which the PSFA can instruct an employer to make deductions from the liable person’s salary in order to recover the money owed as a result of fraud or error. This power can be exercised only after the amount owed has been agreed by the liable person or determined by a court or tribunal, or once the penalty appeal period has lapsed or an appeal has been finally determined. People can avoid their employers being contacted if they simply engage with us and pay what they owe.
DEOs are an established mechanism used by the courts, the DWP, the Child Maintenance Service and some local authorities. We have sought to emulate best practice and established processes to make it straightforward for the employers that have to implement them. There are safeguards for the liable person, such as a protected earnings amount of 60% and the requirement for deductions to be affordable and fair, as set out in Clause 41.
Before an order is made, the liable person will have the opportunity to make representations on the proposed terms. Amendment 61C would create an obligation for the PSFA to provide the reasoning behind its decision to proceed with a DEO following these representations. Amendment 61D would create a similar obligation for the PSFA to demonstrate that it has taken the liable person’s wider circumstances into account when determining the level of affordable and fair deductions. Both these amendments are duplicative, as the PSFA would be doing this anyway as a matter of good public law. As I outlined previously, guidance will also be published detailing what information will be supplied to the liable person as part of the wider decision-making processes.
Amendment 61E would limit the regulation-making powers in Clause 41(7) to establishing affordability considerations. We have striven to put as much detail into the Bill as possible, but there are elements where it is valuable to have a degree of flexibility so that further conditions or restrictions can be added to the measures to reflect wider societal, economic and technological changes. This amendment would severely limit the Government’s ability to adapt to such changes and would impair the efficacy of this recovery method, thus potentially reducing the amount of money lost to fraud that could be recovered in future.
Amendment 61F would require the PSFA to consult employers on the level of administrative costs that they can charge the liable person for implementing a DEO. There is a standard charge of £1 per deduction period allowed by the courts and other organisations that use DEOs. It is not for the PSFA to set up a different regime single-handedly, as it will be following established processes already used across government. If it were felt that this charge should change, that would need to be done in conjunction with the other bodies.
Amendments 62A and 62B would prevent a suspended DEO from being restarted after 24 months. We discussed the same matter on Monday, in relation to direct deduction orders. I confirm that I am still reflecting on the points raised by the noble Baronesses, Lady Fox and Lady Finn, and the noble Lord, Lord Vaux, which also apply to DEOs, and I am having meetings with officials on them. It is important that the PSFA has discretion in how it can react to individual circumstances counterbalanced against its duty to recover money lost to fraud and error in the most appropriate way. There is a balance to be struck and I shall report back on my reflections in due course.
Finally, Amendment 62C would require that, when the PSFA revokes a DEO, it provides the reasoning to both the liable person and their employer. In practice, this would be shared with the liable person as a matter of good public law to safeguard the public law duty of fairness in decision-making for the individuals subject to the orders. However, there are serious privacy considerations that could be undermined by providing such information to the employer. Upon the establishment of a DEO, the employer is not told anything about the DEO other than what is to be deducted from the liable person’s salary. This is the only information of relevance to the employer. Any other information would be a breach of privacy.
Regarding some of the other points raised, particularly by the noble Baroness, Lady Finn, I think it would be helpful to your Lordships if I provided some more information on the safeguards in place for the use of DEOs, including those preventing hardship. The Public Sector Fraud Authority has committed to the following safeguards: vulnerability assessments, maximum deduction amounts, opportunities for representation, reviews and appeals, and the ability to notify a change of circumstances. The PSFA will continue to utilise best practice from across government.
On the question of who determines the amount of debt owed, the Public Sector Fraud Authority will calculate the debt owed to the Government as a result of fraud or error following an investigation into suspected fraud. The liable person will be notified of the recoverable amount. If they do not agree, a firm and final determination will be sought from a court or tribunal.
The noble Baroness, Lady Finn, asked what is meant by “among other things” in Clause 41. Clause 41(6) gives the Minister powers to
“make further provision about the calculation of amounts to be deducted”
in respect of DEOs. To be clear, making further provision would not allow the Minister to qualify or change the provision, only to add specific conditions or restrictions that can be taken into account when calculating the amount to be deducted. As the example in Clause 41(7) makes clear, the key consideration will be hardship and defining what constitutes hardship. It is important that the definition of hardship is not fixed, as what constitutes hardship today may look very different in, say, 10 years’ time.
The term “among other things” could also cover other items to be taken into account when calculating DEOs that are not so immediately obvious. For example, the regulations could be used to allow a different deduction rate around the Christmas period, when the liable person might have other outgoings that were not reasonably foreseeable when the order was first made.
I hope that goes some way to assuring noble Lords about our safeguards and that the noble Baroness will feel able to withdraw her amendment.
My Lords, I listened to the Minister, and I listened to her the other day on the same subjects regarding DDOs. A question occurs. In many cases, the amount owed is set by the court. Why, then, does the court not decide how that amount should be repaid? Why do we have to go through all these processes and decisions by the departments rather than the court?
The noble Lord makes a very interesting point, on which I will have to reflect and come back to him, if that is okay.
My Lords, we have been debating Part 1, which gives substantial powers to the Cabinet Office when the Minister has reasonable grounds to suspect fraud, and we are about to kick off on Part 2, which gives substantial powers to the DWP. Those include police-style powers to enter private premises, search them and seize property, as well as powers to demand information. Those are potentially very intrusive powers, so it is essential that they can be exercised only when it is genuinely appropriate to do so.
The two amendments in this group cover both Parts 1 and 2, and they provide essential clarification as to how the DWP and the PSFA should interpret the legal threshold for most of the investigative powers in the Bill: the requirement to have “reasonable grounds” to suspect fraud.
The amendments are intended to ensure that, when the DWP and PSFA are exercising their investigative powers under this Bill, reasonable grounds do not include generalisations or stereotypes of certain categories of people—for example, that members of a particular social group are more likely to be involved in fraudulent activity than others. Investment in data analytics and other emerging technologies, such as AI, for fraud risk detection is inevitably, and probably rightly, increasing. The Government have signalled their intention to turbocharge AI and to mainline AI into the veins of the nation, including the public sector.
The Government are, as we speak, trying to pass the Data (Use and Access) Bill, which would repeal the current ban on automated decision-making and profiling of individuals. The DWP has invested heavily in artificial intelligence, widening its scope last year to include use of a machine-learning tool to identify fraud in universal credit advances applications, and it intends to develop further models. This is despite a warning from the Auditor-General in 2023 of
“an inherent risk that the algorithms are biased towards selecting claims for review from certain vulnerable people or groups with protected characteristics”.
The DWP admitted that its
“ability to test for unfair impacts across protected characteristics is currently limited”.
There are real concerns about the inaccuracy of algorithms, particularly when such inaccuracy is discriminatory and mistakes disproportionately impact a certain group of people. It is well evidenced that machine-learning algorithms can learn to discriminate in a way that no democratic society would wish to incorporate into any reasonable decision-making process about individuals. An internal DWP fairness analysis of the universal credit payments algorithm, which was published only due to a freedom of information request, has revealed a “statistically significant outcome disparity” according to people’s age, disability, marital status and nationality.
This is not just a theoretical concern. Recent real-life experiences in both the Netherlands and Sweden should provide a real warning for us, and are clear evidence that we must have robust safeguards in place. Machine-learning algorithms used in the Netherlands’ child tax credit scandal learned to profile those with dual nationality and low income as being suspects for fraud. From 2015 to 2019, the authorities penalised families over suspicion of fraud based on the system’s risk indicators. Tens of thousands of families, often with lower incomes or belonging to ethnic minorities, were pushed into poverty. Some victims committed suicide. More than a thousand children were taken into foster care. The scandal ultimately led to the resignation of the then Prime Minister, Mark Rutte.
In Sweden in 2024, an investigation found that the machine-learning system used by the country’s social insurance agency is disproportionately flagging certain groups for further investigation over social benefits fraud, including women, individuals with foreign backgrounds, low-income earners and people without university degrees. Once cases are flagged, fraud investigators have the power to trawl through a person’s social media accounts, obtain data from institutions and even interview an individual’s neighbours as part of their investigations.
The two amendments that I have tabled are based on paragraph 2.2 of Code A to the Police and Criminal Evidence Act 1984, in relation to police stop and search powers, which states that:
“Reasonable suspicion cannot be based on generalisations or stereotypical images of certain groups or categories of people as more likely to be involved in criminal activity”.
These amendments would not reduce the ability of departments to go after fraud. Indeed, I argue that by ensuring that the reasonable suspicion is genuine, rather than based on stereotypes, they should improve the targeting of investigations and therefore make the investigations more effective, not less so.
The Bill extends substantial intrusive powers to the Cabinet Office, the PSFA and the DWP, and those powers must be subject to robust safeguards in the Bill. The use of “generalisations or stereotypes”, whether through automated systems or otherwise, should never be seen as grounds for reasonable suspicion. I hope the Minister will see the need for these safeguards in that context, just as they are needed and exist in relation to stop and search powers. I beg to move.
My Lords, it is a pleasure to follow the noble Lord, Lord Vaux of Harrowden, and to speak in favour of Amendments 75A and 79A, to which I have attached my name and which noble Lords will see have rather broad support in political terms—perhaps not the broadest I have ever seen but it is certainly up there. I must also pay tribute to JUSTICE, a cross-party law reform and human rights organisation that is the UK section of the International Commission of Jurists, which has been most determined in ensuring that these issues are raised in this Bill, in this context.
I have already addressed these issues in the Chamber in a number of amendments to the Employment Rights Bill that I tabled and spoke to. I am not going to repeat all that I said there, but I cross-reference those amendments. If noble Lords want to find out more about this issue, there is an excellent book by the researcher Shannon Vallor, The AI Mirror, whose title offers a useful metaphor for understanding the risks: we live in a biased society, and those biases risk being reflected back at us and magnified by the use of artificial intelligence and algorithms. That is very much what these two amendments seek to address.
The noble Lord has already given us two international examples of where using AI, algorithms, stereotypes and generalisations in investigations has gone horribly wrong. I have to add a third: the infamous Australian case of “Robodebt”, an automated debt assessment and recovery programme run by the rough equivalent of the DWP. There was controversy before and throughout its implementation, and it was an unmitigated disaster. I point the Minister and others to the fact that a Royal Commission in Australia concluded that the programme had been
“a costly failure of public administration in both human and economic terms”.
I note that the House of Representatives in Australia passed a public apology to the huge number of people who were affected.
In a way, I argue that these amendments are a protection for the Government: written into law, they provide a stop that says, “No, we cannot allow things to run out of control in the way we have seen in so many international examples”. I think these are truly important amendments. I hope we might hear positive things from the Minister but, if not, we are going to have to keep pursuing these issues right across the spectrum. I was very taken by one remark: Hansard will not record the tone of voice in which the noble Lord, Lord Vaux, said that the Government wish “to mainline AI”, but it is important to note the concerning approach being taken by the Government to the whole issue of artificial so-called intelligence.
The noble Baroness will be very aware that we now have several days of Committee before us on stage 2 of the Bill, and I look forward to discussing this and many issues with her as the Committee stage progresses.
My Lords, I thank all noble Lords who have taken part in this short but informative debate. I seem to be getting a bit of a track record. I thought my previous record was managing to get an amendment signed by both the noble Baroness, Lady Bennett, and the noble Baroness, Lady Noakes. I might even have surpassed that with this one. I am not sure quite what that says.
I am partially reassured by what the Minister has said, and obviously I am sure that she and her team will follow the safeguards that she has talked about. But those safeguards are not in statute. For example, she talked about decisions being taken only by humans, but in relation to putting out information requests that is not secured anywhere: the code of conduct refers only to decisions that will affect benefits, not to the information request side of things, and it is only a code of conduct, which can be changed at will. I am uncomfortable here.
We are talking, particularly with the eligibility verification process, about very large amounts of data, potentially on 9.9 million people. Who knows how many will flag up eligibility indicators? Without a shadow of doubt, however, the department will be using some form of algorithmic or AI tool to decide which of those it wants to concentrate on. That is where the bias can creep in. If bias creeps into the algorithm or the machine-learning tool and a flagged case comes up to a person, it is easy to say “computer said yes” or “computer said no” and not to question the data coming to you.
I am not totally comfortable that the safeguards really are there at the moment. We are going to come to human interaction at a later stage of the debate, so I will not go further into that. To be honest, I suspect that the Netherlands, Sweden and Australia probably had similar safeguards. They did not work. I cannot say for certain, but most departments believe that they are doing the right thing and that the safeguards are working. They did not work in those cases, and real problems were caused to vulnerable people.
I will withdraw the amendment but this is something that we will definitely come back to. Just in passing, I also welcome the noble Viscount, Lord Younger, to the right side of the fence with us. I beg leave to withdraw the amendment.