
Intel may already be conceding its fight against Nvidia


Nvidia continues to own the top-of-the-line GPU space, and the competition just hasn’t been able to, well, compete. The announcement of the impressive-sounding RTX 40 Super cards cements the lead even further.

As a result, AMD is said to be giving up on the high-end graphics card market with its next-gen GPUs. And now, a new rumor suggests that Intel might be doing the same with Arc Battlemage, its much-anticipated graphics cards that are supposed to launch later this year. While this is bad news, it’s not surprising at all.

Arc Battlemage leaks

First, let’s talk about what’s new. Intel kept quiet about Arc Battlemage during CES 2024, but Tom Petersen, an Intel fellow, later revealed in an interview that it’s alive and well. The cards might even come out this year, although given Intel’s track record of missing GPU deadlines, 2025 seems like a safer bet. But what kind of performance can we expect out of these new graphics cards? This is where YouTuber RedGamingTech weighs in.


RedGamingTech posted a big update on the Intel Arc Battlemage specs in his latest video, and it doesn’t sound particularly good for high-end gaming enthusiasts. According to the YouTuber, the flagship chip’s specifications may differ significantly from his previous predictions. Worse, it might never even be released.

Initially, RedGamingTech suggested that the top Battlemage GPU would feature 56 Xe cores and a frequency of up to 3GHz. That’s still the case, but rumor has it that there’s been a big shake-up in the memory bus and cache configuration. Instead of a 256-bit bus and 116MB of L2 cache, the YouTuber now says we can expect a 192-bit bus, 8MB of L2 cache, and a whopping 512MB of Adamantine cache.
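To put that bus-width change in perspective, here’s a quick back-of-the-envelope bandwidth calculation in Python. The 20Gbps GDDR6 data rate below is purely our assumption for the sake of illustration; no Battlemage memory speeds have been confirmed.

```python
# Peak memory bandwidth = bus width (pins) * per-pin data rate / 8 bits per byte.
# The 20Gbps GDDR6 speed is an assumption for illustration, not a leaked spec.

def memory_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits * data_rate_gbps / 8

print(memory_bandwidth(256, 20.0))  # 640.0 GB/s -- the earlier 256-bit rumor
print(memory_bandwidth(192, 20.0))  # 480.0 GB/s -- the revised 192-bit rumor
```

All else being equal, the narrower bus gives up a quarter of the raw bandwidth, which is presumably what that huge Adamantine cache is meant to paper over.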

Adamantine cache is still largely a mystery at this stage, although an Intel patent that PC Gamer shared details on tells us more about it. It’s essentially a Level 4 cache that’s comparable to AMD’s Infinity Cache and appears to work in a similar way.
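For a rough intuition of why a large last-level cache can offset a narrow memory bus, here’s a toy average-memory-access-time model. Every hit rate and latency in it is invented for illustration, as none of Adamantine’s real figures are public.

```python
# Toy average-memory-access-time (AMAT) model. All numbers are made up for
# illustration -- nothing here reflects confirmed Adamantine behavior.

def amat(levels: list[tuple[float, float]], dram_ns: float) -> float:
    """levels: (hit_rate, latency_ns) per cache level, checked in order."""
    total, reach_prob = 0.0, 1.0
    for hit_rate, latency_ns in levels:
        total += reach_prob * latency_ns  # pay this level's latency when reached
        reach_prob *= 1 - hit_rate        # only misses continue to the next level
    return total + reach_prob * dram_ns   # whatever is left goes to DRAM

print(amat([(0.5, 10)], dram_ns=200))             # 110.0 ns: small L2 only
print(amat([(0.5, 10), (0.8, 40)], dram_ns=200))  # 60.0 ns: L2 plus a big L4
```

In this made-up scenario, an L4 that catches 80% of L2 misses nearly halves the average trip to memory, which is the kind of win a 512MB cache would be chasing.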

That sounds pretty good, right? With 56 Xe cores, the card would be a huge upgrade over the Arc A770, which comes with 32 cores. However, even with that massive L4 cache, these specs hint at a less-than-high-end flagship for Intel. With a 192-bit bus, Intel would probably stop at around 12GB of VRAM, unless it ends up feeling adventurous like AMD did with the RX 7600 XT or Nvidia with the RTX 4060 Ti. (Let’s hope it doesn’t.)
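The capacity math behind that 12GB figure is simple enough to sketch. This assumes standard 2GB (16Gb) GDDR6 chips, one per 32-bit channel; the “clamshell” option, which doubles up the chips per channel, is how AMD and Nvidia squeezed 16GB onto 128-bit buses.

```python
# VRAM capacity from bus width: one 2GB GDDR6 chip per 32-bit channel.
# "Clamshell" mode mounts two chips per channel, doubling the total.

def vram_gb(bus_width_bits: int, chip_gb: int = 2, clamshell: bool = False) -> int:
    channels = bus_width_bits // 32  # one memory channel per chip
    return channels * chip_gb * (2 if clamshell else 1)

print(vram_gb(192))                  # 12 -- the likely configuration
print(vram_gb(192, clamshell=True))  # 24 -- the "adventurous" option
print(vram_gb(128, clamshell=True))  # 16 -- the RX 7600 XT / RTX 4060 Ti 16GB route
```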

Regardless of whether this GPU is even real, RedGamingTech suspects that Intel may choose not to release it at all due to unsatisfactory profit margins. Instead, Intel might focus on a GPU with 40 Xe cores, a 192-bit memory bus, 18MB of L2 cache, and no Adamantine cache.

Is it time for Nvidia to celebrate?


AMD is reportedly bowing out of the high-end GPU race in this next generation. Now, Intel is said to be doing the same. Where does that leave Nvidia? Right at the very top, with complete control of the enthusiast GPU market and nothing to worry about in that regard.

It’s a dream for Nvidia, but it’s not so great for us, the end users. Nvidia’s freedom to drive up prices as much as it wishes brought us the RTX 40-series, where the prices and the performance often just don’t add up. With zero competition at the high end, the RTX 5090 might turn out to be a terrifying monstrosity with an eye-watering price tag. After all, why wouldn’t it be? It’s not like AMD or Intel is doing anything to stop it.

On the other hand, even if Intel chooses to focus on the mainstream segment, things won’t change too much. AMD is Nvidia’s main competitor, and even now, with a couple of horses still in this race, it can’t match Nvidia’s flagship RTX 4090, or even the surprisingly impressive new RTX 40 Super cards. Intel, now one generation behind (and soon to be two), wouldn’t be able to beat Nvidia’s future flagship either.

For the mainstream market, meaning the vast majority of GPUs sold, it’s actually good news if AMD and Intel stick around and give Nvidia some heat. Those prices might end up less inflated as a result. Meanwhile, high-end gaming will be pricier than ever, but unfortunately, Intel wouldn’t have been able to stop Nvidia there anyway, with or without the card it may never release.

Monica J. White