
Could Google’s Stadia bring back multi-GPU gaming?

Running multiple graphics cards in an SLI or CrossFire configuration was always the preserve of the ultra-enthusiast. It was expensive, often less stable than a single-card setup, and the performance gains were rarely close to linear, even in the games that supported it. In recent years, support for Nvidia's and AMD's multi-card technologies has waned even further, suggesting that the idea of having more than one graphics card in a gaming system was dying.

But in light of Google’s new Stadia game streaming service, we couldn’t help but wonder whether cloud gaming might make multi-GPU rendering viable again. It’s all speculation at this point, but there are good reasons to entertain the idea.

Doubling up the graphics

There’s a lot we don’t know about Stadia, but what Google has promised is that it will be able to deliver 4K HDR gaming to anyone who wants it. That means the technology Google uses to render those games at the server end will need to be powerful. We’ve already learned that Stadia servers will combine a custom x86 CPU with an AMD graphics core. The implied specifications suggest it’s some form of custom AMD Vega 56 GPU, but a Navi graphics card would make a lot of sense too.

Regardless of the GPU used, though, Google will need more power if it hopes to live up to some of the company’s stated ambitions. Google itself has discussed how Stadia could eventually support 8K resolution, and doing so in the near future would require more than even the most powerful graphics cards of today can deliver. Even pushing more costly visual features, like ray tracing or higher frame rates, might require more power than a single graphics card can afford.
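
For a sense of the scale involved, 8K has exactly four times the pixels of 4K, so even a naive estimate quadruples the required pixel throughput. The quick back-of-the-envelope program below (our own illustration, not anything from Google) makes the gap concrete:

```cpp
// Back-of-the-envelope pixel throughput: 4K vs. 8K at 60 fps.
#include <cstdio>

int main() {
    const long long px4k = 3840LL * 2160;  // ~8.3 million pixels per frame
    const long long px8k = 7680LL * 4320;  // ~33.2 million, exactly 4x 4K
    printf("4K @ 60 fps: %lld pixels/s\n", px4k * 60);
    printf("8K @ 60 fps: %lld pixels/s (%.0fx 4K)\n",
           px8k * 60, (double)px8k / px4k);
    return 0;
}
```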

That’s where multiple GPUs could come into play.

UL Benchmarks (formerly Futuremark) suggested in a recent press release that multiple GPUs could be a way to provide enough power for such effects. It even released a demonstration video showcasing how Google’s Stadia could call on multiple graphics cards as and when required, keeping frame rates steady even as the complexity of a scene increases.
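
As a rough sketch of how that kind of on-demand allocation might work, the toy loop below enlists an additional GPU whenever frame time exceeds a 60 fps budget. The cost model, the per-GPU sync overhead, and the four-GPU cap are all invented for illustration; none of this reflects UL’s or Google’s actual scheduler:

```cpp
// Toy frame loop (illustrative only): call in extra GPUs when the frame
// time blows past its budget. The cost model is a made-up stand-in for
// real rendering work.
#include <cstdio>

constexpr double kBudgetMs = 16.7;  // 60 fps frame budget
constexpr int kMaxGpus = 4;

// Work divides across GPUs, but each extra GPU adds sync overhead --
// one reason multi-GPU scaling is sublinear in practice.
double frameTimeMs(double sceneCostMs, int gpus) {
    return sceneCostMs / gpus + 1.5 * (gpus - 1);
}

int main() {
    int gpus = 1;
    // Scene complexity ramps up, as in UL's demo where effects grow heavier.
    for (double cost = 10.0; cost <= 60.0; cost += 10.0) {
        double ms = frameTimeMs(cost, gpus);
        while (ms > kBudgetMs && gpus < kMaxGpus) {
            ++gpus;  // scene got heavier: enlist another GPU
            ms = frameTimeMs(cost, gpus);
        }
        printf("scene cost %.0f ms -> %d GPU(s), frame time %.1f ms\n",
               cost, gpus, ms);
    }
    return 0;
}
```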

Google Stadia tech demo: cloud-based multi-GPU rendering

In the demo video, the tweaked 3DMark Fire Strike scene uses a single graphics card for “most of the traditional geometry rendering,” as UL Benchmarks explains, while “additional GPUs are called in as needed to enhance the scene with dynamic fluid simulations and complex particle effects.”
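
To make that division of labor concrete, here’s a hypothetical CUDA sketch in the same spirit: device 0 is assumed to own the main rendering workload, while a secondary GPU steps a simple particle simulation. The kernel, buffer sizes, and device roles are our own assumptions, not code from the demo:

```cuda
// Hypothetical sketch: offload particle simulation to a secondary GPU
// while the primary GPU (device 0) handles the main rendering work.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void stepParticles(float4* pos, float4* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    vel[i].y -= 9.8f * dt;          // toy gravity
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    if (deviceCount < 2) {
        printf("This sketch assumes at least two GPUs.\n");
        return 0;
    }

    const int n = 1 << 20;          // one million particles
    float4 *pos, *vel;

    cudaSetDevice(1);               // secondary GPU: effects workload
    cudaMalloc(&pos, n * sizeof(float4));
    cudaMalloc(&vel, n * sizeof(float4));
    cudaMemset(pos, 0, n * sizeof(float4));
    cudaMemset(vel, 0, n * sizeof(float4));

    // While device 0 would be busy with geometry and shading, device 1
    // advances the particle state for the next frame.
    stepParticles<<<(n + 255) / 256, 256>>>(pos, vel, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    cudaFree(pos);
    cudaFree(vel);
    printf("Particle step completed on the secondary GPU.\n");
    return 0;
}
```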

This isn’t a demo that UL has just put together by itself. It was at GDC this week specifically to showcase cloud-based, multi-GPU rendering over Stadia which it’s been working on with Google for “months.” Google’s new streaming platform can, and already does, use multiple GPUs, so expect it to be used to deliver the performance the platform needs when it launches later this year.


But that doesn’t really matter if there aren’t games to support it. Very few modern games support multiple graphics cards, and of those that do, the experience is far from smooth. Performance gains from a second GPU tend to fall in the 20 to 40 percent range, a modest return on the expense of an entire extra graphics card. Multiple GPUs can also introduce stutters, frame-syncing problems, and other odd stability issues.

But Stadia could change that. Developers don’t put much effort into supporting multiple graphics cards because the install base is so small. With Stadia, that could quickly expand. If Google were to equip its Stadia servers with multiple graphics cards, improving multi-GPU support would finally be of real benefit to developers. Give it a little time and the chicken-and-egg problem starts to resolve itself: as developers improve multi-GPU support, non-Stadia gamers could consider the additional cost worth it for the added performance. Combining multiple midrange graphics cards could offer a cheaper, more staggered upgrade path than buying a brand-new high-end alternative.

A lot would have to happen for such a world to become a reality. It would depend heavily on Stadia’s multi-GPU implementation being comparable to the way multiple GPUs operate in a single home gaming PC. But UL Benchmarks’ demo shows that, in certain settings, there is a real benefit to multiple graphics cards in 2019. If those settings were to become more common, who’s to say we couldn’t all see a benefit from adding a second GPU to our gaming PCs in the years to come? Here’s hoping.
