Nvidia launches AI offensive with Grace Hopper "superchip" platform

Nvidia has announced a new "superchip" design dubbed Grace Hopper, as well as a service called AI Workbench, as part of a renewed push into the artificial intelligence space.

Nvidia has unveiled a new chip and services as part of its latest push into the AI space.

Dubbed "Grace Hopper" after the pioneering female engineer and US Navy Rear Admiral, the processor is said by Nvidia to be specially equipped to handle complex generative AI algorithms in what it calls a "superchip."

Combining an Arm-based CPU (Grace) and a GPU (Hopper) on a single package, the processor also sports improved HBM3e memory and higher bandwidth. The chips will integrate with Nvidia's MGX server specification.

Nvidia boasts that, when installed in a server, the Grace Hopper platform will offer eight petaflops of AI performance and 282GB of HBM3e memory. This can be expanded to as much as 1.2TB when multiple chips are linked together.
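
As a rough illustration of what that capacity means for generative AI, the back-of-envelope sketch below checks whether a model's weights alone would fit in a single 282GB HBM3e pool or would need linked chips. The fp16 assumption and the model sizes are hypothetical examples for illustration, not figures from Nvidia.

```python
# Back-of-envelope only: do a model's weights alone fit in one 282GB HBM3e
# pool? The 282GB and 1.2TB figures come from the article; the fp16
# assumption and the model sizes below are hypothetical illustrations.
HBM3E_CAPACITY_GB = 282      # per Grace Hopper server, as stated above
LINKED_CAPACITY_GB = 1200    # roughly 1.2TB when chips are linked

def weight_footprint_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory needed for weights alone, assuming fp16/bf16 (2 bytes per parameter).

    Activations, KV caches and optimizer state are ignored, so real usage
    is higher; this is only a lower bound.
    """
    return params_billions * bytes_per_param

def placement(params_billions: float) -> str:
    needed = weight_footprint_gb(params_billions)
    if needed <= HBM3E_CAPACITY_GB:
        return "fits on a single superchip server"
    if needed <= LINKED_CAPACITY_GB:
        return "needs linked superchips"
    return "exceeds even the linked configuration"

for size in (7, 70, 175, 1000):
    print(f"{size}B params (fp16): ~{weight_footprint_gb(size):.0f}GB -> {placement(size)}")
```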

Nvidia said that it expects the new chips to be up and running in servers by Q2 of next year.

"To meet surging demand for generative AI, data centers require accelerated computing platforms with specialized needs,” said Nvidia CEO Jensen Huang,

"The new GH200 Grace Hopper Superchip platform delivers this with exceptional memory technology and bandwidth to improve throughput, the ability to connect GPUs to aggregate performance without compromise, and a server design that can be easily deployed across the entire data center."

In addition to the new chips, Nvidia said it will be launching a new service designed to streamline the development of AI-enabled applications.

To do this, Nvidia has released a tool called AI Workbench and enlisted the help of AI software specialist Hugging Face to deliver models and specifications for developers to use.

The chipmaker said that it designed the new service to scale from a single developer's notebook up to data center-scale applications.

The service will also integrate with NGC and GitHub.
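
To give a sense of the workflow such a service targets, the sketch below pulls a pretrained model from Hugging Face and generates text on a local GPU using the open-source transformers library. This is a generic illustration, not Nvidia's AI Workbench API, and the model name is only a placeholder.

```python
# Illustrative only: load a Hugging Face-hosted model and generate text on a
# local GPU with the transformers library. This is not Nvidia's AI Workbench
# API; "gpt2" is a placeholder for any causal language model on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to("cuda")

prompt = "Generative AI workloads need"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

# Generate a short continuation; on larger models this is the step that
# benefits from the memory capacity and bandwidth described above.
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```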

"This is going to be a brand new service to connect the world’s largest AI community to the world’s best training and infrastructure," Huang said.

Huang said that both the Grace Hopper chip and AI Workbench are part of a larger push by Nvidia to take control of the AI hardware space and leverage the chipmaker's extensive GPU business for AI applications.

"Graphics and artificial intelligence are inseparable," Huang said, "graphics needs AI, and AI needs graphics."
