The UK government’s ambitious AI plans are at risk from a long tail of legacy kit, underpaid employees, and the dominance of a small number of tech giants.
Whitehall’s efforts are also dogged by a lack of transparency and low public trust, a report by Parliament’s Public Accounts Committee said today.
The report will make uncomfortable reading for a government that is banking on AI to improve efficiency and delivery in the public sector, and to boost productivity in the country at large, both of which are essential to Keir Starmer’s efforts to dig the country out of a worsening economic hole.
The first problem the MPs zeroed in on was the long tail of creaking public tech choking AI efforts.
“Out of the 72 highest-risk legacy digital systems prioritised as part of the 2022–2025 digital and data roadmap, 21 still lack remediation funding, and data quality and data sharing barriers are persistent and long-standing,” the MPs said.
DSIT told the MPs that it had “emphasised prioritising” the “systems that have the most valuable data” and “the highest levels of security vulnerability”. At the same time, there was no “magic bullet” for ageing software and hardware; civil servants said it would “take hard work over a long time to fix” and acknowledged that the department needed to get a better grip on the issue.
This all matters because “Access to good-quality data was identified as a barrier to implementing AI by 62% of the 87 government bodies responding to [an] NAO survey.”
The MPs demanded action on this within six months, including ensuring funding for the highest-risk legacy tech and addressing the risks to AI adoption resulting from barriers to data-sharing and poor data quality.
Too few departments are reporting transparently on their algorithm-assisted decision making, and DSIT must do more to demonstrate that its assurance of high-risk AI is robust, the committee said. Just 33 records had been published on the Algorithmic Transparency Recording Standard website.
Again, the MPs asked for an update in six months on “Departmental compliance with the Algorithmic Transparency Recording Standard and further action it is taking to tackle gaps in transparency to strengthen public trust, including to address public concerns over data privacy and the sharing of sensitive data.”
They said “strengthened spend controls for high-risk AI use cases will support safe and ethical AI roll-out.”
Concerted effort and leadership from DSIT is needed, or government risks duplicating effort and cost from siloed pilot activity. Not least because, despite a plethora of pilots, “there is so far little evidence of successful adoption at scale. To grasp the opportunities of AI, government must learn from these pilots, identify the most promising examples, and where appropriate, help drive adoption at scale so the whole of the public sector can take advantage.”
None of this is helped by a shortage of digital and data skills in government. But the committee is “sceptical that the reforms DSIT is planning, including strengthening digital leadership and assessing the competitiveness of the overall package for digital and data professionals, will be sufficient to tackle the skills gap where previous attempts have failed.”
Of course, it’s not just skills as such, it’s “civil service pay levels that are uncompetitive with the private sector and the need for more technical roles within the profession”. The pay gap between private and public sector architects was pegged at around 35%, or £30,000 per year. And 5.4% of the civil service workforce is in the digital and data profession, which compares unfavourably to the private sector benchmark of 8–12%.
The civil service wants to make use of secondments from the private sector.
There’s one part of the private sector with plenty of “skills” to share: big tech. But that’s going to cost.
Unsurprisingly, the MPs reported that “Stakeholders are concerned that the AI market is dominated by a small number of large technology companies, and that government’s approach to procurement is not set up to get the best from all suppliers.”
The success of a planned AI sourcing and procurement framework and digital commercial centre of excellence “will be critical to ensuring a vibrant AI market in the UK and value for money in the procurement of AI in the public sector.”
Even so, the MPs noted that The State of Digital Government review reported that “government is not doing enough to ensure the public sector benefits from the scale of its buying power.”
In response, the Cabinet Office and DSIT assured the MPs they were tackling this, “giving as an example a recent agreement reached with Microsoft to offer discounted products to every part of the public sector”. Something the Cabinet Office apparently described as “quite groundbreaking”.
But tech giants might not be so amenable in future. The big cloud platforms have historically chafed at attempts by Europe (in the broadest sense) to regulate AI and digital services. Newly aligned with an aggressive Trump administration, they’re pushing back much harder against attempts by mere Europeans to crimp their power.
Starmer’s government has already ditched the AI safety approach of the Sunak era, in favour of a more US-style approach. But Washington – and tech operators – don’t seem to be in a particularly compromising mood these days.
Alex Case, industry principal at Pegasystems and a former senior official in the Cabinet Office and the Treasury, said it was critical that the government bring civil servants along with the changes.
“Civil servants want to know where and how AI can help them be more productive. From our own work with governments and enterprises globally, we know there are a variety of ways that both generative and process AI and machine learning can play a role in automating and streamlining important government back office work.”