
Nvidia’s next GPUs will be designed partially by AI

During the GTC 2022 conference, Nvidia talked about using artificial intelligence and machine learning in order to make future graphics cards better than ever.

As the company chooses to prioritize AI and machine learning (ML), some of these advancements will already find their way into the upcoming next-gen Ada Lovelace GPUs.


Nvidia’s big plans for AI and ML in next-gen graphics cards were shared by Bill Dally, the company’s chief scientist and senior vice president of research. He talked about Nvidia’s research and development teams, how they utilize AI and machine learning (ML), and what this means for next-gen GPUs.

In short, using these technologies can only mean good things for Nvidia graphics cards. Dally discussed four major areas of GPU design, as well as the ways in which AI and ML can drastically speed up the design process.


The goal is an increase in both speed and efficiency: in one of Dally's examples, AI and ML cut a standard GPU design task from three hours down to just three seconds.


This is reportedly possible by optimizing four processes that normally take a lot of time and are highly detailed.

These are monitoring and mapping voltage drops, anticipating errors through parasitic prediction, automating standard-cell migration, and addressing various routing challenges. Using artificial intelligence and machine learning to optimize all of these processes results in major gains in the end product.

Mapping potential drops in voltage helps Nvidia track the power flow of next-gen graphics cards. According to Dally, switching from using standard tools to specialized AI tools can speed this task up drastically, seeing as the new tech can perform such tasks in mere seconds.

Dally said that using AI and ML for mapping voltage drops can achieve 94% accuracy while also tremendously increasing the speed at which these tasks are performed.
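The underlying idea is replacing a slow physical simulation with a fast learned predictor. As a minimal illustration (not Nvidia's actual tool, and with entirely made-up numbers), the voltage-drop task can be pictured as fitting a model that maps a region's power draw to its expected IR drop, after which each prediction costs only a couple of arithmetic operations instead of a full solver run:

```python
# Hypothetical sketch: predict per-region voltage (IR) drop from power draw
# with a simple learned model instead of a slow circuit-level solver.
# Data and units are invented for illustration.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b on 1-D data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Synthetic "training data": regional power draw (W) vs. measured IR drop (mV)
power = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
drop_mv = [5.1, 9.8, 15.2, 20.1, 24.9, 30.2]

a, b = fit_linear(power, drop_mv)

# Once fitted, a prediction is effectively instant compared with
# re-running a full power-grid simulation for each design change.
def predict(p):
    return a * p + b

print(predict(1.75))  # estimated IR drop for a 1.75 W region
```

Nvidia's production models are of course far more sophisticated, but the speed advantage comes from the same place: inference is orders of magnitude cheaper than simulation.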

Nvidia's slide on automated cell migration.
Nvidia

Data flow in new chips is an important factor in how well a new graphics card performs. Therefore, Nvidia uses graph neural networks (GNN) to identify possible issues in data flow and address them quickly.

Parasitic prediction through the use of AI is another area in which Nvidia sees improvements, noting increased accuracy, with simulation error rates dropping below 10 percent.
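A graph neural network fits this problem because a netlist already is a graph: cells are nodes and wires are edges. The toy sketch below (the circuit, features, and update rule are all invented for illustration, and real parasitic-prediction GNNs are far larger) shows the core "message passing" step, where each node refines its feature using its neighbors' features, building the local context the model predicts from:

```python
# Illustrative sketch of the GNN idea: treat a circuit netlist as a graph
# and run one round of message passing. Graph and features are made up.

# Adjacency list for a toy netlist: node -> connected nodes
graph = {
    "in":   ["buf"],
    "buf":  ["in", "nand"],
    "nand": ["buf", "out"],
    "out":  ["nand"],
}

# One scalar feature per node (e.g. a rough wire-length estimate)
features = {"in": 1.0, "buf": 2.0, "nand": 4.0, "out": 3.0}

def message_pass(graph, feats):
    """One round: each node's new feature is the mean of itself + neighbors."""
    new = {}
    for node, neighbors in graph.items():
        vals = [feats[node]] + [feats[n] for n in neighbors]
        new[node] = sum(vals) / len(vals)
    return new

updated = message_pass(graph, features)
# After passing, each node's feature reflects its local neighborhood --
# the kind of context used to estimate parasitics before layout exists.
```

Stacking several such rounds, with learned weights instead of a plain mean, is what lets a GNN estimate quantities like parasitic capacitance from the netlist alone.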

Nvidia has also managed to automate the process of migrating the chip’s standard cells, cutting out a lot of tedious manual work and speeding up the whole task. With that approach, 92% of the cell library was migrated by the tool with no errors.
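The "no errors" figure matters because migration is really a batch of conversions plus rule checks: cells that translate cleanly to the new process node pass through automatically, and only the remainder needs human attention. A toy sketch of that shape (cell names, sizes, and the design rule are invented, and real migration operates on full layouts rather than single widths):

```python
# Toy sketch of automated standard-cell migration: scale cells toward a
# smaller process node and flag any that would break a minimum-width rule.
# All names, sizes, and rules are invented for illustration.

OLD_NODE_NM = 7
NEW_NODE_NM = 5
MIN_WIDTH_NM = 10  # pretend design rule at the new node

cells = {"INV_X1": 15, "NAND2_X1": 21, "DFF_X1": 35, "BUF_X1": 12}  # widths, nm

def migrate(cells):
    migrated, flagged = {}, []
    scale = NEW_NODE_NM / OLD_NODE_NM
    for name, width in cells.items():
        new_width = width * scale
        if new_width < MIN_WIDTH_NM:
            flagged.append(name)        # rule violation: needs manual fixing
        else:
            migrated[name] = new_width  # migrated automatically, no errors
    return migrated, flagged

migrated, flagged = migrate(cells)
```

In this framing, Nvidia's reported 92% corresponds to the `migrated` bucket, with the flagged minority still handled by engineers.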

The company is planning to focus on AI and machine learning going forward, dedicating five of its laboratories to researching and designing new solutions in those segments. Dally hinted that we may see the first results of these new developments in Nvidia’s new 7nm and 5nm designs, which include the upcoming Ada Lovelace GPUs. This was first reported by Wccftech.

It’s no secret that the next generation of graphics cards, often referred to as RTX 4000, will be intensely powerful (with power requirements to match). Using AI and machine learning to further the development of these GPUs implies that we may soon have a real powerhouse on our hands.
