UK.gov wants an almost-open license for public AI projects to prevent commercial hijacking

Benefits should be for taxpayers, not multinationals

AI specialists in the UK government are working with counterparts in other countries to develop an almost-open license that will prevent commercial organisations from hijacking their work and selling it back to them.

The aim is to create a model that allows other parties to examine government-produced code and projects but restricts free use to public-sector-adjacent or charitable organisations, with taxpayers ultimately benefiting from any commercial adoption.

Dr Laura Gilbert, head of AI for government at the Ellison Institute and former director of the Incubator for AI at 10 Downing St, told the State of Open Con conference in London that the biggest AI risk for government and policymakers was “not doing it.”

Given the strains on UK public services, particularly the National Health Service, she said: “If we don't invest in AI, if we don't start doing this really, really well, we really do a huge disservice.”

While at Number 10, Gilbert oversaw the creation of an incubator for AI solutions. A number of projects it developed were unveiled by science secretary Peter Kyle last month.

These included Red Box, a platform that embeds generative AI into policy creation, allowing policymakers to analyse mountains of documents and produce tailored briefings.

These projects had a commitment to “radical transparency”, Gilbert said: “We tell everybody what we're doing all the time, and we're discussing our code.”

But this approach had pros and cons, she said, not least the possibility of commercial companies taking the open-sourced code and productising it.

That, she said, was exactly what happened with Red Box. “One of the first things that happened with Red Box open sourcing is a certain company went and took that code and repackaged it and then sold it back into government, which is very much not what we hoped.”

More broadly, she said, “Other governments can use it, other countries can use it, people who haven't paid taxes…that is challenging, particularly if you don't have a good relationship with those countries or companies.”

This had spurred an effort to develop a licensing model that is “not fully open. You can look at our code. You can experiment with it.” It could be used in the public or charitable sectors, or to derive economic value in the UK. But, she said, “We don’t necessarily want other uses.”

Speaking to The Stack, Gilbert said the effort was at a “working group” stage, with the team collaborating with counterparts in other countries, including Canada’s Vector Institute for Artificial Intelligence, which faced similar problems.

The aim was to develop a license that encourages AI for public good and can be easily reused and deployed across the public sector. “But you don't miss the opportunity to generate financial value for the taxpayer from other use cases.”

“We need to be able to fund the NHS. If you're thinking companies are making billions of pounds off of this software, then we think the taxpayer should have some of that.”

Amanda Brock, head of OpenUK, said that while the organisation could support Gilbert's effort, once any restrictions were imposed on use, projects could not be considered open source. To be truly open source, she said, “anyone can use it for any purpose, and you enable your competitors.”

Or, in the government's case, multinationals with an eye on the public purse.
