
Could Google’s Stadia bring back multi-GPU gaming?



Running multiple graphics cards in an SLI or CrossFire configuration was always the preserve of the ultra-enthusiast. It was expensive, often less stable than a single-card setup, and the performance gains were rarely close to linear, even in the games that supported it at all. In recent years, support for Nvidia's and AMD's multi-card technologies has waned even further, suggesting that the idea of having more than one graphics card in a gaming system was dying.

But in light of Google's new Stadia game streaming service, we couldn't help but wonder whether the platform might make multi-GPU gaming viable again. It's all speculation at this point, but the pieces are worth examining.

Doubling up the graphics

There’s a lot we don’t know about Stadia, but what Google has promised is that it will be able to deliver 4K HDR gaming to anyone who wants it. That means the technology Google uses to render those games at the server end will need to be powerful. We’ve already learned that Stadia servers will combine a custom x86 CPU with an AMD graphics core. The implied specifications suggest some form of custom AMD Vega 56 GPU, though a Navi graphics card would make a lot of sense too.

Regardless of the GPU used, though, Google will need more power if it hopes to live up to some of the company’s stated ambitions. Google itself has discussed how Stadia could eventually support 8K resolution, and doing so in the near future would require more than even the most powerful graphics cards of today can deliver. Even pushing costlier visual features like ray tracing, or higher framerates, might demand more power than a single graphics card can provide.
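To put those resolution ambitions in perspective, the raw pixel counts alone show why 8K is such a leap. A quick back-of-the-envelope calculation (our own illustration, not from Google or UL; rendering cost scales at least linearly with pixel count, so these ratios are a lower bound):

```python
# Pixel-count arithmetic behind the "8K needs more power" claim.
# Standard 16:9 resolutions; each step up quadruples the pixel count.
resolutions = {
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    ratio = count / pixels["1080p"]
    print(f"{name}: {count:,} pixels ({ratio:.0f}x 1080p)")
# 8K pushes 16x the pixels of 1080p and 4x the pixels of 4K.
```

In other words, a server that comfortably renders a game at 4K would need roughly four times the raw throughput to hit 8K at the same framerate, before accounting for extras like ray tracing.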

That’s where multiple GPUs could come into play.

UL Benchmarks (formerly Futuremark) suggested in a recent release that multiple GPUs could be one way to provide enough power for such effects. It even released a demonstration video showing how Google’s Stadia could call on additional graphics cards as and when required, keeping framerates steady even as scene complexity increases.

Google Stadia tech demo: cloud-based multi-GPU rendering

In this demo video, a tweaked 3DMark Fire Strike scene uses a single graphics card for “most of the traditional geometry rendering,” as UL Benchmarks explains, while “additional GPUs are called in as needed to enhance the scene with dynamic fluid simulations and complex particle effects.”
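UL hasn’t published code for how that work-splitting happens, but the idea it describes — one primary GPU handling geometry, with helpers picking up effects work only when the frame would run over budget — can be sketched roughly like this (all names and costs are hypothetical illustrations, not Stadia’s actual scheduler):

```python
# Hypothetical sketch of per-frame work splitting across GPUs, loosely
# modeled on UL's description: GPU 0 (the primary) renders geometry, and
# secondary GPUs are pulled in only when adding an effects task (particles,
# fluids) would push the primary past the frame-time budget. Costs in ms.

FRAME_BUDGET_MS = 16.7  # ~60 fps target

def schedule_frame(geometry_ms, effect_tasks, num_secondary_gpus):
    """Return a mapping of GPU index -> list of task costs (GPU 0 is primary).

    Effects stay on the primary GPU while it fits under budget; overflow
    tasks are placed greedily on the least-loaded secondary GPU.
    """
    assignments = {gpu: [] for gpu in range(1 + num_secondary_gpus)}
    assignments[0].append(geometry_ms)
    for cost in sorted(effect_tasks, reverse=True):
        loads = {gpu: sum(tasks) for gpu, tasks in assignments.items()}
        if loads[0] + cost <= FRAME_BUDGET_MS or num_secondary_gpus == 0:
            target = 0  # primary still has headroom (or is all we have)
        else:
            # Spill to the currently least-loaded secondary GPU.
            target = min((g for g in assignments if g != 0),
                         key=lambda g: loads[g])
        assignments[target].append(cost)
    return assignments

frame = schedule_frame(geometry_ms=10.0,
                       effect_tasks=[6.0, 4.0, 3.0],
                       num_secondary_gpus=2)
print(frame)  # {0: [10.0, 6.0], 1: [4.0], 2: [3.0]}
```

The appeal for a cloud platform is that `num_secondary_gpus` isn’t fixed at purchase time the way it is in a home PC; the server can, in principle, attach helpers per scene.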

This isn’t a demo that UL has just put together by itself. It was at GDC this week specifically to showcase cloud-based, multi-GPU rendering over Stadia which it’s been working on with Google for “months.” Google’s new streaming platform can, and already does, use multiple GPUs, so expect it to be used to deliver the performance the platform needs when it launches later this year.


But that doesn’t really matter if there aren’t games to support it. Very few modern games support multiple graphics cards, and of those that do, the experience is far from smooth. Performance gains from using multiple GPUs tend to be in the 20-40 percent range, a small step up for the added expense of an entire second graphics card. Multiple GPUs can also introduce stutter, frame-syncing problems, and other odd stability issues.
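That 20-40 percent figure implies that most of a frame's work simply doesn't parallelize across GPUs. A quick Amdahl's-law estimate (our own arithmetic, assuming two GPUs and ignoring synchronization overhead) shows how small the parallel fraction must be:

```python
# Amdahl's-law estimate: what fraction of a frame's work must scale across
# two GPUs to explain a given speedup?
#   speedup = 1 / ((1 - p) + p / n)   for parallel fraction p, n GPUs
# Solving for p gives: p = (1 - 1/speedup) / (1 - 1/n)

def parallel_fraction(speedup, n=2):
    """Invert Amdahl's law to find the parallelizable fraction p."""
    return (1 - 1 / speedup) / (1 - 1 / n)

for gain in (0.20, 0.30, 0.40):
    p = parallel_fraction(1 + gain)
    print(f"{gain:.0%} gain implies only ~{p:.0%} of the frame parallelizes")
# A 30% gain from a whole second card means only about 46% of the
# frame's work is actually being shared between the two GPUs.
```

That math is the economic problem in a nutshell: you pay 100 percent more for hardware that, in typical titles, accelerates less than half of the work.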

But Stadia could change that. Developers don’t put much effort into supporting multiple graphics cards because the install base is so small. With Stadia, that install base could expand quickly. If Google were to equip its Stadia servers with multiple graphics cards, improving multi-GPU support would finally be of real benefit to developers. Give it a little time and the chicken-and-egg problem could start to resolve itself: as developers improve multi-GPU support, non-Stadia gamers might consider the additional cost worth it for the added performance. Combining multiple midrange graphics cards could offer a cheaper, more gradual upgrade path than buying a brand-new high-end alternative.

A lot would have to happen for such a world to become a reality. It would depend heavily on Stadia’s multi-GPU implementation being comparable to the way multiple GPUs operate in a single home gaming PC. But UL Benchmarks’ demo shows that, in certain settings, there is a real benefit to multiple graphics cards in 2019. If those settings become more common, who’s to say we couldn’t all see a benefit from adding a second GPU to our gaming PCs in the years to come? Here’s hoping.

Jon Martindale
Jon Martindale is the Evergreen Coordinator for Computing, overseeing a team of writers addressing all the latest how to…