The UK’s Department for Work and Pensions has insisted its use of machine learning is defensible and there is “nothing to worry about” after a committee of MPs criticised its effect on benefits claimants.
The Committee of Public Accounts published a scathing report at the end of January, saying the quality of service the department provided to customers varied wildly depending on which service they were using.
Meanwhile, benefits claimants received £4 billion less than they were entitled to, while at the same time £9.5 billion was overpaid.
The department’s modernisation programme was also criticised. It is due to run from 2022-23 to 2032-33, and MPs said its scale and complexity posed significant risk.
More specifically, MPs said the DWP’s use of machine learning to identify potential fraud posed a risk. They added: “The previous Public Accounts Committee repeatedly raised concerns about the impact of data analytics and machine learning on legitimate benefit claims being delayed or reduced, the number of people affected, and whether this is affecting specific groups of people.”
The MPs also referenced the risk of “machine learning taking on human biases when it is trained on historical data and the potential for wide-scale detrimental impacts on claimants if there is a system error.”
Machine learning will only be as good as the data it is trained on. And given the DWP’s creaking systems, and historic tech problems, it’s reasonable to question exactly how good its data is.
The MPs called on the department to share, “in confidence if necessary”, the results of its most recent fairness impact assessment “to provide reassurance that its use of machine learning is not resulting in claimants being treated unfairly.”
The DWP permanent secretary, Sir Peter Schofield, in a committee hearing just before the report was published, said the department only had one machine learning system in production, for universal credit advances.
But he confirmed it was working on systems covering “other areas of loss, such as undeclared living together, self-employment and capital.” He insisted these were only to guide decisions, but added that publishing too much information could help fraudsters.
Neil Couling, DWP director general for fraud, disability and health, promised the committee: “What we are going to do is set some standards independent of the teams running this. Then we are going to get our analytical teams—teams not running the system—to oversee the analyses.”
Publishable analyses will be included in the next set of annual accounts: “So that Parliament can see for itself that there is nothing to worry about here.”
A DWP spokesperson added, “It is right that we use machine learning models to identify potential fraud and error risks and prevent fraud in the highest areas of loss – and we do so within a robust governance and ethical framework and with constant human oversight.”
Notably, the DWP did not field its chief digital information officer at the most recent committee hearing. Richard Corbridge left the department in October, after just 18 months, to return to the private sector. He was replaced on an interim basis by Helen Wylie, previously CTO at the department.