
NVIDIA datacentre chief to Europe: GPUs are coming, but have you got the power?

Schneider urges youngsters to think firmware and power electronics skills, not just LLMs and data science

NVIDIA’s director for accelerated data centre, Dion Harris, has reassured AI-coveting European firms that they will see increased access to the vendor’s GPU technology, amid concerns that a dearth of silicon is holding the region back.

But that depends on industry entrants grasping that power electronics and engineering are an essential part of the mix, Schneider Electric’s datacentre power chief added, speaking alongside the chipmaker at an event.

NVIDIA doesn’t break down its shipments by region, but there’s a consensus that you’re more likely to encounter its silicon in the US and East Asia than across Europe. The European Commission has become so concerned about the ability of European startups and researchers to access GPUs that it has decided to repurpose its NVIDIA-powered HPC infrastructure to fill the perceived gap.

It doesn’t help that the Commission is also trying to balance demand for AI infrastructure – i.e. power-hungry, GPU-packed datacentres – against their environmental impact, as well as ethical and data sovereignty concerns.

So it is, perhaps, not surprising that during a panel of datacentre operators at Schneider’s Innovation Summit earlier this month, one participant said they would “be surprised to have one single data centre for AI” in Europe.

(In the US there has been a flurry of AI-centric “cloud” providers like CoreWeave popping up, many of them backed by NVIDIA itself, possibly amid concern that the hyperscalers are working aggressively on their own AI silicon.)

Speaking in a joint interview with Schneider Electric’s secure power division EVP Pankaj Sharma and The Stack, NVIDIA’s Dion Harris said it was clear that CIOs faced multiple constraints as they looked to embrace AI. “There's definitely the infrastructure and being able to get access to GPUs,” he said.

One route to market for “a lot of the key enterprises we're working with”, he said, was the vendor’s DGX Cloud offering. “We're working with colo providers, and hyperscalers to some extent, to build out infrastructure to allow enterprises to train models, and then deploy them in their own inference-based infrastructure. So, they [CIOs] don't need to go and buy the big 20,000 GPU cluster.”

Really want your own kit? Rethink your DC...

The chiller plant of a data centre. Credit: Equinix.

But some customers will feel they must have their own kit – including the HGX H200 platform the company unveiled last year. NVIDIA and Schneider Electric have developed reference designs for powering and cooling AI-ready datacentres, to help guide enterprises or colocation providers building out or retrofitting their infrastructure.

“We're really trying to be more thoughtful and working with the ecosystem to really have partners like Schneider to help build this out and give a lot of forethought and planning,” Harris said. “So, the customers are ready when these systems are rolling out, because a lot of the systems that we announced will be available later this year.”

In the meantime, he said, customers could start planning how to build new datacentres or retrofit existing ones, “so they can really start to be prepared for what's required.”

That said, European organizations still have concerns over their ability to secure GPU capacity. At the same time, the EU is looking to lead the world in regulating AI – as well as the power-hungry datacentres underpinning it.

See also: New OVHcloud NVIDIA GPU instances start landing

“Essentially, you know, we comply with whatever the regulations are,” Harris said. But, he added, there was a “tug of war”: EU policymakers understand that AI will be both part of the economy and a driver for it, so they are trying to balance ethical and environmental issues against ensuring the region is not left behind.

Nevertheless, he said, Europe has “a number of big systems coming online this year”. The Swiss National Supercomputing Centre (CSCS) will have around 10,000 Grace Hopper superchips, he said, while the EuroHPC Joint Undertaking will also fire up a Grace Hopper-powered system at its Forschungszentrum Jülich facility in Germany, delivering 93 exaflops of AI performance.

“In just the last year, we've announced probably no less than six or seven world class AI supercomputers that will be used throughout Europe,” he said.

Ultimately, though, he said, a bigger challenge for companies looking to get on the AI train was how to gain access to skills.

“The good news is that one of the things NVIDIA has been very adamant about is trying to codify its own knowledge and make it available,” he claimed. Many developers might complain that NVIDIA’s platform operates as a walled garden, but Harris insisted: “We’re not approaching this knowledge gap as a source of power.” And, he added, AI itself was a tool for reducing the knowledge gap. “I don’t think it’s a gating challenge.”

But building out or retrofitting datacentres also presents some gnarly skills challenges beyond coding or LLMs – not least ensuring a supply of engineers who can work out how to build and power them sustainably.

“One of the biggest gaps for me is young talent with power electronics experience,” said Sharma.

Undergrads were fixated on getting into data science or software development, he said. “It's hard to convince people that no, you’ve got to do hardware, because that's the base. And then you’ve got to do firmware.”

Once youngsters grasped the power part of the equation, he said, things started to fall into place.

“When you speak to those kids and explain the entire sustainability story then they understand like, this makes sense. I want to save the planet. I'm 20 years old, you know, I'm gonna live for another 60 years. That makes sense.”

Arguably those power-focused engineering skills might serve youngsters better than [today’s] AI-specific skills. As Harris put it: “We think there's gonna be increased value in domain expertise. Knowing how to programme Python won't be nearly as valuable when you can just talk to a computer.”
