Labour Wants to Force Banks to Spy on Their Most Vulnerable Customers
‘A two-tier justice system.’
by Harriet Williamson
24 April 2025

A new law could force banks to spy on their poorest and most vulnerable customers, while letting the wealthy escape scot-free.
The Department for Work and Pensions could soon have the power to make secretive and invasive interventions into the lives and finances of 23 million people, as part of what the Labour party has described as the “biggest fraud crackdown in a generation”.
Labour’s public authorities (fraud, error and recovery) bill, currently making its way through the Commons at a lightning pace, quietly resurrects one of the most controversial parts of the Tories’ data protection and digital information (DPDI) bill, which was scrapped last year. If passed, the legislation could come into effect in a matter of months, allowing the DWP to instruct banks to algorithmically scan the bank accounts of all their customers, to identify those receiving benefits.
Banks will then compare the accounts of people in receipt of benefits against a list of DWP ‘eligibility criteria’. If AI flags an account as meeting these criteria, banks are compelled to pass on the name and account number to the government for investigation.
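In practice, the mechanism described above amounts to a bulk match: a bank’s customer list, cross-referenced against benefit recipients, filtered by the DWP’s criteria. The sketch below is purely illustrative; neither the DWP’s criteria nor any bank’s implementation is public. The £16,000 figure is the universal credit savings threshold the DWP itself cites later in this piece, and the field names and data structures are hypothetical.

```python
# Illustrative sketch of the kind of bulk account-matching the bill would
# compel. All names and structures here are hypothetical assumptions.

CAPITAL_LIMIT_GBP = 16_000  # universal credit savings threshold cited by the DWP


def flag_accounts(accounts, benefit_recipients):
    """Return (name, account_number) pairs for benefit recipients whose
    balance exceeds the threshold -- the sort of automated match banks
    would be required to run and report to the DWP for investigation."""
    flagged = []
    for acct in accounts:
        is_recipient = acct["account_number"] in benefit_recipients
        if is_recipient and acct["balance"] > CAPITAL_LIMIT_GBP:
            flagged.append((acct["name"], acct["account_number"]))
    return flagged
```

Even this toy version makes the critics’ point concrete: the filter runs over every customer to find the benefit recipients in the first place, and any error in the recipient list or the criteria is propagated automatically to the government.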
Under the direction of the DWP, banks will then deduct cash directly from the accounts of claimants who have been overpaid, adding administration costs, in a move criticised this week by an independent government watchdog.
The Regulatory Policy Committee said the department’s own impact assessment of the bill failed to “sufficiently take into consideration the potential impact on the poorest members of society of reclaiming overpayments due to error”.
Jasleen Chaggar, legal and policy officer at Big Brother Watch, told Novara Media: “[The new bill is] dressed up as having more safeguards than the previous [Tory] bill, but the powers are essentially the same. When you’re doing this kind of algorithmic scanning on scale, there will inevitably be errors and those errors are going to disproportionately impact disabled people, carers, the elderly and the poorest in society.
“We’re really concerned about this targeting of all welfare recipients who are having their privacy infringed just because they’ve received benefits.”
Coventry South MP Zarah Sultana accused the government of rushing the legislation through with little scrutiny, and said the bill risks “creating a two-tier justice system – one for the very wealthy, who will never face this kind of intrusion, and another for those on benefits, who will be subject to constant scrutiny”.
Former shadow chancellor John McDonnell also voiced concerns about the bill’s bank spying powers, suggesting it targets the wrong people. “If there was a group of people whose accounts we would want to monitor because there has been a history of fraud, and who have had to pay money back – some have gone to prison – it would be MPs.”
Banks aren’t particularly happy, either. After quietly lobbying against the plans for over a year, and despite ministers’ efforts to court the City, they went public with concerns in January. UK Finance, the trade body that represents British banks, said the plans could “undermine the banks’ own efforts to protect vulnerable account holders” and compromise regulatory and legal obligations banks already have to their customers.
The legislation is part of a raft of new initiatives to use AI to scrutinise benefits claimants. And the DWP is already using a machine learning tool for detecting universal credit fraud that the department’s own analyses have found to be biased, incorrectly targeting individuals from certain demographic groups more than others.
In December 2023, the DWP purchased three licences for an AI social media ‘listening’ tool from business process outsourcing company Capita. The Cosain software can analyse and flag millions of social media posts from individual citizens in a bid to clamp down on benefit fraud, at a cost of £46,800 for one year.
In addition to being invasive, Chaggar believes the new measures will have little impact on government funds. “By the DWP’s own assessment, these powers aren’t going to be very effective. We calculated, using their impact assessment, that the powers are estimated to recover 1.4% of annual fraud and error loss.”
Rick Burgess, from Greater Manchester Coalition of Disabled People, warns that sweeping new powers are unlikely to remain trained on benefits claimants. “At the moment, they’re saying that if you get a benefit award from the DWP, that you should have fewer rights to financial privacy than a citizen who doesn’t, and that is discrimination.
“But when banks start spying on behalf of the government, all that remains is for statutory instruments to change the targeting. At the moment, they might say, ‘oh, it’s only on universal credit claimants’. Then it might be, ‘it’s only universal credit claimants and Pip claimants’, then just universal credit, Pip and pensioners.
“Once the capacity, the technology and the fundamental legal change is in place, the mission creep is very, very easy for them to pull off, and potentially the whole country could find itself under surveillance.”
A DWP spokesperson said: “We do not recognise this characterisation of our fraud, error and recovery bill. The bill includes an eligibility verification measure which will require banks to share limited data on claimants who may wrongly be receiving benefits – such as those on universal credit with savings over £16,000. This does not involve access to benefit claimants’ bank accounts.”
Computer says no.
Labour’s fraud bill isn’t the only piece of legislation bringing unaccountable and potentially biased AI into contact with poor and vulnerable people.
The data (use and access) bill, which is approaching the final stage of its journey through parliament, is about to make it much easier for AI tools to make decisions without any human involvement.
Under the current UK GDPR, solely automated decision-making is permitted only in limited circumstances. The data (use and access) bill will reverse this presumption, allowing automated decision-making in almost all cases.
The exception will be where special category or ‘sensitive’ data is processed – but as revealed by the DWP’s own fairness analyses of their advance payments fraud tool, there are plenty of proxies that can be used in lieu of special category data that reveal the same information and can be used to discriminate in the same way.
Buried within the bill are provisions that will allow the secretary of state to change the definition of what constitutes meaningful human input into a decision. In practice, this effectively means that someone could merely ‘rubber-stamp’ a machine-made decision despite not having the requisite training or skills to interrogate or challenge it.
“The number of decisions that can be made through solely automated decision-making will expand dramatically if this bill is passed as it is,” Chaggar told Novara Media. However, it’s unlikely that everyone affected will understand the real-world consequences of Labour’s data bill.
“If this is being used by the DWP, there could just be a generic explanation on their website that they’re using this algorithm for this purpose, and this is how it works,” Chaggar said. “But if you’re an individual who then receives a negative decision outcome about your eligibility, you won’t really connect the dots and realise that it’s been made through solely automated decision-making.”
Another worrying part of the bill involves something called ‘recognised legitimate interest’. The UK GDPR currently recognises six categories for the lawful processing of our data. One is called ‘legitimate interest’, where a weighting exercise must take into account how the individual’s rights and freedoms will be affected by data processing.
Labour’s data bill introduces a seventh category – recognised legitimate interest. It’s very broad, including areas such as crime and national security. Under this category, no individual weighting exercise has to take place. This means that an AI can scrape, share and use sensitive personal data on wide-ranging – or spurious – grounds without having to assess the impact on people’s lives.
Many people see the data bill and the public authorities fraud bill as lynchpin legislation in a sweeping attack on sick, disabled, impoverished and otherwise vulnerable people. Chaggar says they will allow the government to use AI in a “permissive way” that targets people on benefits – and eventually, all of us.
“There’s an urgency in how they are being driven through parliament without very much pushback, and that’s why we need to get the public engaged in what these algorithms will mean for them.
“The increased surveillance welfare recipients will be subjected to if these two bills pass is all part of a digitised welfare state where people on benefits are subjected to a higher level of scrutiny and are treated as suspects by default, just by virtue of needing support.”
Harriet Williamson is a journalist and former editor at Pink News and the Independent.