AI is really a cloud-native problem, CNCF boss argues

(Irrational) exuberance is part of the process, says Priyanka Sharma

The Cloud Native Computing Foundation sought to place itself at the heart of AI as it kicked off KubeCon in Paris and shrugged off any suggestion that the entire tech industry is in the grip of “irrational exuberance”.

That term, coined in 1996 by then-Federal Reserve chairman Alan Greenspan, may chill anyone who remembers the dot-com crash of 2000. And CNCF executive director Priyanka Sharma said it was understandable that some would see the current situation in this way.

“It can be off-putting sometimes,” she conceded. “The wild valuations, the humongous investment rounds, every startup with AI in its name.”

But Sharma quoted a venture capitalist who lost a packet in the dot-com crash: “nothing important has ever been built without irrational exuberance”. [Which might further stoke the fears of anyone who also remembers that during the dot-com crash, and every subsequent bubble, it was supposedly “different this time”.]

AI may have eclipsed cloud hype – distracting or enticing CIOs and investors alike – but Sharma insisted that the cloud native community “are the people who are building the infrastructure that supports the future.”

See also: PDFs, RAG, and LlamaParse: Generative AI's "Swiss Army Knife" adds a welcome new toolkit

She cited the cloud native world’s innovations around observability, workflow, and CI/CD, amongst others – as well as Kubernetes’ role in orchestrating the vast systems that underpin today’s AI. All of this, she said, was particularly important when it comes to getting AI and machine learning projects out of research and into production.
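For the unfamiliar, the sort of orchestration Sharma is gesturing at can be made concrete. Below is a minimal sketch – an illustration, not something from her keynote – of scheduling a GPU-backed training pod with the official Kubernetes Python client. It assumes a cluster running the NVIDIA device plugin and a local kubeconfig; the pod name, image and training script are purely illustrative.

```python
# Minimal sketch: submit a training pod that requests one GPU.
# Assumes: kubeconfig access to a cluster, NVIDIA device plugin installed.
from kubernetes import client, config

config.load_kube_config()  # reads credentials from ~/.kube/config

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="train-job"),  # illustrative name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="pytorch/pytorch:latest",  # illustrative image
                command=["python", "train.py"],  # illustrative script
                resources=client.V1ResourceRequirements(
                    # The device plugin advertises GPUs as the extended
                    # resource "nvidia.com/gpu"; the scheduler will only
                    # place this pod on a node with a free GPU.
                    limits={"nvidia.com/gpu": "1"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```

The point is less the dozen lines of Python than what Kubernetes does with them: placing the pod on a node with spare accelerator capacity and applying the same declarative workflow that already runs stateless web services.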

“Prototyping generative AI is incredibly easy. But going to production at scale is how the AI dream is going to be realised,” she declared. And, she said, feedback from the user community was highlighting the difficulties here, not least when it comes to “proprietary cloud-based solutions.”

Moreover, she said, research by the Linux Foundation showed that “60% of large organisations are experiencing or expect to experience a rapid increase in cost because of AI/ML workloads.”

See also: PlayStation wants to get gameservers running on Kubernetes. Here's why.

The answer, she said, was open technology and standards, as the cloud native world had already demonstrated.

In a panel with Sharma, Jeffrey Morgan, founder of Ollama, argued that the models themselves needed to be open source, in line with the surrounding tooling and infrastructure, to ensure customer trust.

Customers will ask themselves, “If the model’s not open source, how are we supposed to extend it in a way that’s unique to our business? And how are we supposed to fulfil our dreams if we’re not able to actually open up the model, understand how it works and, ultimately, secure it?”

Timothée Lacroix, CTO at Mistral AI, added that the rapid development of AI made an open source approach to tooling necessary. “It gives us so much more opportunity to just swap things out without asking ourselves, am I spending this year optimising something for hardware that I have to throw out next year?”

Morgan added: “My big ask is, how do we take the basics within our monitoring, security, scanning, logging, and how do we bring that into this new world of AI-based applications?” The big challenge was to do this fast enough to match the rapid evolution of AI, he said.

The CNCF has formalised its thinking in a whitepaper from its AI Working Group.

The whitepaper said AI was emerging as a “dominant cloud workload”, and while existing cloud native technology supported certain aspects of this, there were “challenges and gaps” to address.

Overall, said report co-author Cathy Zhang, there were issues around cost efficiency and security. GPUs continued to be underutilised, and efforts to boost utilisation would contribute towards sustainability.
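To make “underutilised” concrete: GPU busy time can be sampled directly from the driver. The sketch below – again an illustration, not something from the whitepaper – uses NVIDIA’s NVML Python bindings (the nvidia-ml-py package) to average utilisation over ten seconds; long stretches of low percentages during a training pipeline are exactly the waste the report is pointing at.

```python
# Minimal sketch: sample GPU utilisation via NVML.
# Assumes: an NVIDIA GPU on the host and `pip install nvidia-ml-py`.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU on the host

samples = []
for _ in range(10):
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    samples.append(util.gpu)  # percent of time the GPU was busy
    time.sleep(1)

print(f"Mean GPU utilisation over 10s: {sum(samples) / len(samples):.0f}%")
pynvml.nvmlShutdown()
```

Measurement like this is the precursor to the remedies the cloud native world leans on: sharing devices between pods, right-sizing resource requests, and scheduling batch work into idle windows.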

Ultimately, the whitepaper predicted, a new engineering discipline would emerge, with individuals rejoicing in job titles such as MLDevOps or AI engineer “becoming the glue between Data Science, Infrastructure and Development in the next few months or years.”
