Running multiple graphics cards in an SLI or Crossfire configuration was always the preserve of the ultra-enthusiast. It was expensive, often less stable than a single-card solution, and the performance gains were rarely even close to linear — even if you could find games that supported it at all. In recent years, support for Nvidia and AMD’s multi-card technologies has waned even further, suggesting that the idea of having more than one graphics card in a gaming system was dying.
But in light of Google’s new Stadia game streaming service, we couldn’t help but wonder if it might make multi-GPU gaming viable again. It’s all speculation at this point, but the possibility is worth exploring.
Doubling up the graphics
There’s a lot we don’t know about Stadia, but what Google has promised is that it will be able to deliver 4K HDR gaming to anyone who wants it. That means the technology Google uses to render those games at the server end will need to be powerful. We’ve already learned that Stadia servers will combine a custom x86 CPU with an AMD graphics core. The implied specifications suggest it’s some form of custom AMD Vega 56 GPU, but a Navi graphics card would make a lot of sense too.
Regardless of the GPU used, though, Google will need more power if it hopes to live up to some of the company’s stated ambitions. Google itself has discussed the possibility of Stadia one day supporting 8K resolution, and doing so any time soon would certainly require more than even the most powerful graphics cards of today can deliver. Even just pushing more costly visual features like ray tracing, or higher framerates, might demand more power than a single graphics card can provide.
That’s where multiple GPUs could come into play.
UL Benchmarks (formerly Futuremark) suggested in a recent release that multiple GPUs could be a way to provide enough power for such effects. It even released a demonstration video showcasing how Google’s Stadia could leverage multiple graphics cards as and when required, keeping framerates consistent even as the complexity of a scene increases.
In this demo video, the tweaked 3DMark Firestrike scene uses a single graphics card for “most of the traditional geometry rendering,” as UL Benchmarks explains, while “additional GPUs are called in as needed to enhance the scene with dynamic fluid simulations and complex particle effects.”
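To make the idea concrete, here is a minimal sketch of that kind of scheduling in Python. Everything here is hypothetical — the `Gpu` class, the 16.7 ms frame budget, and the per-effect cost estimate are our own illustrative assumptions, not anything from Google’s or UL’s actual implementation. The point is only the shape of the logic: the base render pass always stays on the primary GPU, and optional effect passes spill onto helper GPUs once the estimated frame time would blow the budget.

```python
# Hypothetical sketch, NOT Stadia's real API: one primary GPU keeps the
# geometry pass, and costly optional effects spill onto helper GPUs only
# when the frame would otherwise miss its time budget.
from dataclasses import dataclass, field

@dataclass
class Gpu:
    name: str
    queue: list = field(default_factory=list)

def schedule_frame(gpus, effects, base_cost_ms, budget_ms=16.7, effect_cost_ms=4.0):
    """Assign the base render pass to the primary GPU; offload effect
    passes round-robin to helper GPUs once the estimate exceeds budget."""
    primary, *helpers = gpus
    primary.queue.append("geometry+lighting")
    est_ms = base_cost_ms  # assumed cost of the base pass this frame
    for i, effect in enumerate(effects):
        if est_ms + effect_cost_ms > budget_ms and helpers:
            # Scene too complex for one card: call in a helper GPU.
            helpers[i % len(helpers)].queue.append(effect)
        else:
            primary.queue.append(effect)
            est_ms += effect_cost_ms

gpus = [Gpu("gpu0"), Gpu("gpu1")]
schedule_frame(gpus, ["fluid-sim", "particles"], base_cost_ms=14.0)
print(gpus[0].queue)  # geometry stays on the primary GPU
print(gpus[1].queue)  # both heavy effects spill to the helper
```

In a simple frame (say, `base_cost_ms=6.0`) the helper GPU’s queue stays empty and everything runs on one card, which mirrors the “as and when required” behavior UL describes.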
This isn’t a demo that UL has just put together by itself. It was at GDC this week specifically to showcase cloud-based, multi-GPU rendering over Stadia, which it has been working on with Google for “months.” Google’s new streaming platform can, and already does, use multiple GPUs, so expect them to be used to deliver the performance the platform needs when it launches later this year.
But that doesn’t really matter if there aren’t games to support it. Very few modern games support multiple graphics cards, and of those that do, the experience is far from smooth. Performance gains from using multiple GPUs tend to be in the 20-40 percent range, which is a small step up for the added expense of a whole secondary graphics card. Multiple GPUs can also introduce stutters, frame syncing problems, and other odd stability issues.
But Stadia could change that. Developers don’t put much effort into supporting multiple graphics cards because the install base is so small. With Stadia, that could quickly expand. If Google were to equip its Stadia servers with multiple GPUs, every Stadia player would become part of the multi-GPU install base overnight, giving developers a real incentive to support the technology.
A lot would have to happen for such a world to become a reality. It would depend heavily on multi-GPU technology on Stadia working in a way comparable to how it operates on a single home gaming PC. But UL Benchmarks’ demo shows that in certain settings there is a real benefit to multiple graphics cards in 2019. If those settings were to become more common, who’s to say we couldn’t all see a benefit from adding a second GPU to our gaming PCs in the years to come? Here’s hoping.