
Stop worrying so much about benchmarks when buying a new GPU

GPU prices are finally normal, and you might have found yourself browsing graphics card reviews in recent weeks to see which ones top the charts. After all, the best graphics cards live and die based on their performance in gaming benchmarks, right?

But those benchmarks are far from a definitive answer, and in most cases, they skew the conversation away from the games you actually play and the experiences they offer.

I’m not saying we need to throw the baby out with the bathwater. GPU benchmarks offer a lot of value, and I don’t think anything needs to change about how we (or others) conduct GPU reviews. But now that it’s actually possible to upgrade your graphics card, it’s important to take all of the performance numbers in context.


Games, not benchmarks

The most popular Steam game of 2022 so far? Lost Ark, which only calls for a GTX 1050.

DT’s computing evergreen coordinator Jon Martindale made a joke about GPU prices the other day: “I need a new GPU so I can get 9,000 frames in Vampire Survivors.” Silly, but there’s a salient point there. When looking at performance, it’s important to recognize that at any given time, around four times as many people are playing Terraria or Stardew Valley as are playing Forza Horizon 5 or Cyberpunk 2077.

The best games for benchmarking your PC are not the most popular games people actually play. Of the top 25 most-played games on Steam, only two show up regularly in benchmarks: Grand Theft Auto V and Rainbow Six Siege. Virtually no “live” games make it into benchmark suites because of network variation, even though those games dominate the player-count charts, while recent, GPU-limited games are usually overrepresented.

The games that we and others have chosen as benchmarks aren’t the problem — they offer a way to push a GPU to its extreme in order to compare it to the competition and previous generations. The problem is that benchmark suites frame performance around the clearest margins. And those margins can imply performance that doesn’t hold up outside of a graphics card review.

Benchmarks are often misleading


Especially when it comes to the most recent graphics cards, benchmarks can be downright misleading. Every benchmark needs at least an average frame rate, which is a problematic number in and of itself. Brief dips in frame rate barely register in an average, which is why reviews also include 1% lows and 0.1% lows: the average of the slowest 1% and 0.1% of frames, respectively. But those numbers still don’t say much about how often those frame rate dips occur, only how severe they are.
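
To make those terms concrete, here is a minimal sketch of how an average frame rate and 1% / 0.1% lows are commonly derived from a capture of per-frame render times. The exact methodology varies between reviewers and tools, and the frame times below are made up purely for illustration.

```python
# Sketch: average fps and 1% / 0.1% lows from a list of per-frame render times.
# Reviewers' tools differ in the details; this is one common approach.

def fps_metrics(frame_times_ms):
    # Average fps over the whole run: total frames divided by total time.
    total_seconds = sum(frame_times_ms) / 1000
    average_fps = len(frame_times_ms) / total_seconds

    # Sort slowest-first, then average the worst 1% and 0.1% of frames.
    slowest_first = sorted(frame_times_ms, reverse=True)

    def low_metric(fraction):
        count = max(1, int(len(slowest_first) * fraction))
        worst = slowest_first[:count]
        return 1000 / (sum(worst) / count)  # convert mean frame time back to fps

    return average_fps, low_metric(0.01), low_metric(0.001)

# Hypothetical run: mostly ~12 ms frames (~83 fps) with two big hitches.
frame_times_ms = [12.0] * 998 + [40.0, 45.0]
avg, low_1, low_01 = fps_metrics(frame_times_ms)
print(f"avg {avg:.0f} fps, 1% low {low_1:.0f} fps, 0.1% low {low_01:.0f} fps")
```

In this made-up run, two hitches barely move the average but drag the 0.1% low down hard, which is exactly the severity-versus-frequency gap described above.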

A frame time chart can show how often frame rate dips happen, but even that only represents the section of the game the benchmark focused on. I hope you see the trend here: The buck has to stop somewhere, even as more data points try to paint a picture of real-world performance. Benchmarks show relative performance, but they don’t say much about the experience of playing a game.
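
By contrast, a frame time chart answers the frequency question. As a rough, hypothetical sketch, counting how many frames blow past a stutter threshold shows how often a run hitches, something no single averaged figure captures; the 25 ms cutoff and the sample runs here are arbitrary choices for illustration.

```python
# Sketch of the "how often" view a frame time chart provides: count frames
# slower than a threshold instead of averaging only the worst ones.

def count_stutters(frame_times_ms, threshold_ms=25.0):
    # 25 ms is roughly a momentary drop to 40 fps; the cutoff is arbitrary.
    return sum(1 for t in frame_times_ms if t > threshold_ms)

run_a = [12.0] * 995 + [40.0] * 5      # a handful of severe hitches
run_b = [14.0] * 900 + [26.0] * 100    # frequent but mild stutter

print(count_stutters(run_a))  # 5
print(count_stutters(run_b))  # 100
```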

The RTX 3090 Ti is 8.5% faster than the RTX 3090 in Red Dead Redemption 2, for example. That’s true, and it’s important to keep in mind. But the difference between the cards when playing is all of seven frames per second. I’d be hard-pressed to tell a difference in gameplay between 77 fps and 84 fps without a frame rate counter, so while the RTX 3090 Ti is technically faster, it doesn’t impact the experience of playing Red Dead Redemption 2 in any meaningful way.
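
As a quick bit of arithmetic (mine, not from the review), converting both figures to frame times shows why that gap is so hard to feel: the two cards are only about a millisecond apart per frame.

```python
# Express the 77 fps vs. 84 fps gap as per-frame render time.
for fps in (77, 84):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 77 fps -> 12.99 ms per frame
# 84 fps -> 11.90 ms per frame
```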

Performance benchmarks for the RTX 3090 and RTX 3090 Ti in Red Dead Redemption 2.

The recent F1 2022 is another example. The game shows huge disparities in performance between resolutions with all of the settings cranked up (as you’d usually find them in a GPU review). But bump down a few GPU-intensive graphics options, and the game is so CPU limited that it offers almost identical performance between 1080p and 4K. No need for a GPU upgrade there.
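
If you want to run that kind of check on your own system, one rough heuristic is to compare your frame rate at a much lower resolution against your native one with identical settings. The sketch below is only illustrative; the 10% tolerance and the sample numbers are assumptions, not measurements from this article.

```python
# Rough bottleneck check: if dropping the resolution barely raises the frame
# rate, the CPU (not the GPU) is the limit, and a faster GPU won't help much.

def likely_cpu_limited(fps_low_res, fps_native, tolerance=0.10):
    # Illustrative 10% tolerance; both fps values are hypothetical readings.
    return fps_low_res <= fps_native * (1 + tolerance)

print(likely_cpu_limited(fps_low_res=141, fps_native=138))  # True  -> CPU-bound
print(likely_cpu_limited(fps_low_res=230, fps_native=120))  # False -> GPU-bound
```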

No one is lying or intentionally misleading with benchmarks, but the strict GPU hierarchy they establish is an abstraction, a step removed from actually using your graphics card for what you bought it for in the first place. Benchmarks are important for showing differences, but they don’t say whether those differences actually matter.

How to make an informed GPU upgrade


You should absolutely look at benchmarks before upgrading your GPU, as many of them as you can find. But don’t put your money down until you answer these questions:

  • What games do I want to play?
  • What resolution do I want to play at?
  • Are there other components that I need to upgrade?
  • What’s my budget?

Relative performance is extremely important for understanding what you’re getting for your money, but a better benchmark result isn’t strictly a better buy in the world of PC components. Depending on the games you’re playing, the resolution you’re playing at, and potential bottlenecks in your system, you could buy a more expensive GPU and get the exact same performance as a cheaper one.

That doesn’t mean you shouldn’t splurge. There’s a lot to be said for buying something nice just because it’s nice, even if it doesn’t offer a huge advantage. If you have the means, there’s novelty in owning something super powerful like an RTX 3090, even if you just use it to play Vampire Survivors. Just don’t expect to notice a difference when you’re actually playing.

This article is part of ReSpec – an ongoing biweekly column that includes discussions, advice, and in-depth reporting on the tech behind PC gaming.

Jacob Roach
Lead Reporter, PC Hardware