
Intel Arc graphics use AV1 to improve Twitch streams

Intel has just announced that it will support AV1 video coding technology in the new Intel Arc GPUs.

The tech will offer hardware-accelerated encoding that may have a huge impact on video streaming quality, making it potentially attractive to streamers and viewers alike.

Intel's Arc AV1 demo featured two Elden Ring streams for comparison purposes.
Intel

AV1 stands for AOMedia Video 1 and is a royalty-free video coding format. It was first designed to support and improve the quality of video streams over the internet. Today, Intel announced that it will be adopting this format on its Arc GPUs, potentially giving a huge boost in video quality to streamed content.


With the release of the Intel Arc Alchemist discrete graphics cards, AV1 will become Intel's video encoding standard and will change how content looks when streamed live. Since Intel will be the first GPU maker to offer this kind of hardware support for the technology, the feature could make its cards considerably more appealing to streamers than they would have been otherwise. Of course, that depends on whether the technology is as good as it looks in Intel's preview.

Intel promises up to 8K quality for both AV1 decoding and encoding. Decoding maxes out at 8K and 60 frames per second (fps) with 12-bit HDR, while encoding goes up to 8K resolution at 10-bit HDR. Intel calls this the industry's first full AV1 hardware acceleration and claims the technology will prove up to 50 times faster than software encoding.
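
For a rough sense of what tapping that hardware encoder looks like in practice, here is a minimal sketch that drives FFmpeg's Quick Sync Video path (the av1_qsv encoder) from Python. FFmpeg, the Python wrapper, the file names, and the 5Mbps target are assumptions for illustration rather than anything Intel demonstrated, and the encoder requires an FFmpeg build with Intel Quick Sync support plus a compatible GPU and driver.

    import subprocess

    # Minimal sketch: offload AV1 encoding to an Intel GPU through FFmpeg's
    # Quick Sync Video (QSV) path. Assumes an FFmpeg build with the av1_qsv
    # encoder enabled and a supported Intel GPU and driver; the file names
    # and 5Mbps target are placeholders, not values from Intel's demo.
    def encode_av1_qsv(src: str, dst: str, bitrate: str = "5M") -> None:
        subprocess.run(
            [
                "ffmpeg", "-y",
                "-i", src,            # source clip, e.g. captured gameplay
                "-c:v", "av1_qsv",    # hardware AV1 encoder on Intel GPUs
                "-b:v", bitrate,      # hold the stream to a fixed bitrate
                "-c:a", "copy",       # pass any audio through untouched
                dst,
            ],
            check=True,
        )

    if __name__ == "__main__":
        encode_av1_qsv("gameplay_1080p.mp4", "gameplay_av1.mkv")

A software encode of the same clip (for example with libaom-av1) is the obvious baseline if you want to sanity-check the "50 times faster" claim on your own hardware.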


Intel showed off a video of two separate streams of Elden Ring to demonstrate the power of AV1. Game footage was captured via XSplit Gamecaster in 1080p at 5Mbps. The first video used the H.264 Advanced Video Coding (AVC) standard, while the second relied on Intel's AV1.

Although the difference in image quality may seem rather small at first glance, pausing reveals just how much more detailed the stream is when AV1 is being used. Environmental details, such as rocks, grass, and ground clutter, all have their own shape and texture. The stream on the left, while showing nearly the same scene from the game, is nowhere near as detailed and comes off as blurry in comparison.

The video goes on to display both background and foreground improvements, showing crisp graphics in the stream encoded in AV1 in every frame. Even individual blades of grass look much more pronounced in AV1, despite the fact that both streams are consuming the same bandwidth and are running at 1080p. The difference is definitely there, indicating that the technology shows a lot of potential when paired with Intel’s discrete GPU.
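
If you want to go beyond eyeballing paused frames, a comparison like this can also be quantified. The sketch below, which again leans on FFmpeg as an assumption rather than Intel's actual test setup, encodes the same clip with H.264 (via libx264) and AV1 (via the software libaom-av1 encoder) at the same 5Mbps budget, then scores each result against the source with the libvmaf quality metric.

    import subprocess

    # Illustrative sketch: encode one source clip with H.264 and AV1 at the
    # same bitrate, then score each encode against the original using VMAF.
    # Requires FFmpeg built with libx264, libaom-av1, and libvmaf; the file
    # names are placeholders, and Intel's demo used its hardware encoder
    # rather than these software encoders.
    def encode(src: str, dst: str, codec: str, bitrate: str = "5M") -> None:
        subprocess.run(
            ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", bitrate, "-an", dst],
            check=True,
        )

    def vmaf_score(encoded: str, reference: str) -> None:
        # The VMAF score is printed in FFmpeg's log output.
        subprocess.run(
            ["ffmpeg", "-i", encoded, "-i", reference,
             "-lavfi", "libvmaf", "-f", "null", "-"],
            check=True,
        )

    if __name__ == "__main__":
        source = "gameplay_1080p.mp4"                  # placeholder capture
        encode(source, "clip_avc.mp4", "libx264")      # H.264/AVC baseline
        encode(source, "clip_av1.mkv", "libaom-av1")   # AV1 comparison
        vmaf_score("clip_avc.mp4", source)
        vmaf_score("clip_av1.mkv", source)

At a constrained bitrate like 5Mbps, the AV1 encode should come out ahead, which is essentially the effect Intel's side-by-side video is showing.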

Intel Arc graphics cards are a huge milestone for Intel, marking the company’s entrance into the discrete GPU market. First found in laptops, they will be available in a desktop version later this year.

Monica J. White
Intel’s new Arc driver can boost your performance by up to 119%
Intel Arc A770 GPU installed in a test bench.

Intel has announced the rollout of a new driver update for its Arc graphics cards that promises a huge performance boost in various gaming titles. The new Game On driver, version 31.0.101.4885, primarily targets optimal performance in Assassin's Creed Mirage and Forza Motorsport. The company also claims significant gains elsewhere, with Deus Ex: Human Revolution seeing up to a 119% uplift on GPUs like the Arc A770 and A750.

The new driver update also brings improvements of up to 27% in Resident Evil 4 at 1080p with High ray tracing settings, and 12% in The Last of Us Part 1 at 1080p with Ultra settings. These gains make Arc GPUs even more competitive with Nvidia's latest RTX 4060 and RTX 4060 Ti.

Read more
Intel’s next GPU just leaked, and it looks like a sub-$200 card worth buying
The Intel logo on the Arc A770 graphics card.

A long-forgotten Intel Arc GPU just made another reappearance, and this time around, it might actually be for real. Intel's Arc A580 popped up on Geizhals, an Austrian price comparison site, and it's even available in two different models. While the Arc A580 doesn't have what it takes to compete with some of the best graphics cards, it could turn out to be a solid budget-friendly option if priced appropriately.

The Intel Arc A580 was announced what feels like forever ago. Intel mentioned it several times prior to the launch of its A770 and A750, and it was included in the marketing materials for the Arc A770, Arc A750, and Arc A380. It always seemed like a sensible middle ground, bridging the gap between the top of the lineup and the entry-level A380. But it never materialized, and Intel hasn't said a word about it since.

Read more
Intel’s new integrated graphics could rival discrete GPUs
The Intel Meteor Lake chip.

Intel has just announced an interesting update to its upcoming Meteor Lake chips: The integrated graphics are about to receive an unprecedented boost, allowing them to rival low-end discrete GPUs. Now equipped with hardware-supported ray tracing, these chips have a good chance of becoming the best processors for systems without a discrete graphics card. The performance gains are huge, and it's not just gamers who stand to benefit.

The information comes from a Graphics Deep Dive presentation hosted by Intel Fellow Tom Petersen, well known in the GPU market for his work on Intel Arc graphics cards. This time, instead of discrete graphics, Petersen focused on the integrated GPU (iGPU) inside the upcoming Meteor Lake chip. Petersen explained the improvements at an architectural level, introducing the new graphics as Intel Xe-LPG.

Read more