
Nvidia’s first CPU is here and powering next-gen cloud gaming

During Computex 2022, Nvidia announced the upcoming release of its first system reference designs powered by the Nvidia Grace CPU. Upon launch, Nvidia’s first data center CPU will help usher in the next generation of high-performance computing (HPC), enabling tasks such as complex artificial intelligence, cloud gaming, and data analysis.

Nvidia Grace Hopper processors.

The upcoming Nvidia Grace CPU Superchip and the Nvidia Grace Hopper Superchip will find their way into server models from some of the most well-known manufacturers, such as Asus, Gigabyte, and QCT. Alongside x86 and other Arm-based servers, Nvidia’s chips will bring new levels of performance to data centers. Both the CPU and the GPU were initially revealed earlier this year, but now, new details have emerged alongside an approximate release date.

Although Nvidia is mostly known for making some of the best graphics cards, the Grace CPU Superchip has the potential to tackle all kinds of HPC tasks, ranging from complex AI to cloud-based gaming. Nvidia teased that the Grace Superchip will come with two processor chips connected through Nvidia’s NVLink-C2C interconnect technology.


Joined together, the chips will offer up to 144 high-performance Armv9 cores with Scalable Vector Extensions, as well as an impressive 1TB/s of memory bandwidth. According to Nvidia, its new design will double the memory bandwidth and energy efficiency of current-gen server processors. The use cases Nvidia lists for the new CPU include data analytics, cloud gaming, digital twins, and hyperscale computing applications.

Launching alongside the Nvidia Grace CPU Superchip is the Nvidia Grace Hopper Superchip, and although the names are strikingly similar, the “Hopper” gives it away: this is not just a CPU. Nvidia Grace Hopper pairs an Nvidia Hopper GPU with an Nvidia Grace CPU on a single module, once again utilizing the same NVLink-C2C interconnect technology.

Hopper H100 graphics card.

Combining the two on one module has a massive effect on data transfer speeds, with Nvidia claiming transfers between the CPU and GPU up to 15 times faster than on traditional CPU-based systems. Both chips are impressive on their own, but the Grace and Grace Hopper combo should be capable of handling just about any task, including giant-scale artificial intelligence applications.
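For a rough sense of scale behind that claim, here is a back-of-envelope sketch, not taken from Nvidia’s announcement: it assumes the publicly cited 900GB/s total bandwidth for NVLink-C2C and an approximate 64GB/s per direction for a conventional PCIe 5.0 x16 link, which lands in the same ballpark as the quoted figure.

```python
# Back-of-envelope sketch (assumed figures, not from Nvidia's announcement):
# compare NVLink-C2C's published ~900 GB/s total bandwidth against an
# approximate ~64 GB/s per-direction figure for a PCIe 5.0 x16 link to see
# how a "roughly 15x" CPU-GPU transfer claim could be reached.

NVLINK_C2C_GBPS = 900   # Nvidia's published total bandwidth for NVLink-C2C
PCIE5_X16_GBPS = 64     # approximate one-direction bandwidth of PCIe 5.0 x16

speedup = NVLINK_C2C_GBPS / PCIE5_X16_GBPS
print(f"Approximate CPU-GPU transfer speedup: {speedup:.1f}x")  # ~14x
```

The exact basis of Nvidia’s 15x figure wasn’t detailed, so treat this purely as an illustration of why a coherent chip-to-chip link outruns a standard PCIe connection by an order of magnitude.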

The new Nvidia server design portfolio offers single baseboard systems with up to four-way configurations available. These designs can be further customized based on individual needs to match specific workloads. To that end, Nvidia lists a few systems.

The Nvidia HGX Grace Hopper system for AI training, inference, and HPC comes with the Grace Hopper Superchip and Nvidia’s BlueField-3 data processing units (DPUs). There’s also a CPU-only alternative that combines the Grace CPU Superchip with BlueField-3.

Nvidia’s OVX systems are aimed at digital twin and collaboration workloads and come with the Grace CPU Superchip, BlueField-3, and Nvidia GPUs that are yet to be revealed. Lastly, the Nvidia CGX system is made for cloud gaming and graphics, pairing the Grace CPU Superchip with BlueField-3 and Nvidia’s A16 GPUs.

Nvidia’s new line of processors and HPC graphics cards is set to release in the first half of 2023. The company teased that dozens of new server models from its partners will be made available around that time.

Monica J. White