Intel Arc B580 review: the fastest mainstream GPU - and 12GB of VRAM is the cherry on top
Arc de Triomphe?
The review embargo lifts today for Intel's Arc B580 graphics card - built on the firm's second-generation GPU architecture, with full hardware acceleration for machine learning and ray tracing. Intel is aiming squarely at the budget gamer with the $250 Arc B580, promising 12GB of VRAM and performance that is, according to its own benchmarks, around 10 percent faster on average than the market leader: Nvidia's more expensive RTX 4060 8GB. A B570 follows in January, with a mild haircut to shaders, bandwidth and VRAM (10GB), at a mooted $220 price-point.
We were sent the Arc B580 last week and it's an impressive piece of kit. The limited edition reference card is well-built, totally quiet in operation and requires just one PCI-e eight-pin power input, with general operation seeing power draw around 170-180W at most. Display outputs are the standard trio of DisplayPorts, backed by an HDMI 2.1 output. In common with prior Arcs, HDMI seems to have some issues with capture cards - frustrating for our workflow, but surmountable with a DisplayPort to HDMI cable.
So why is there no full-blooded, multi-page Eurogamer review? Owing to some fundamental changes to our benchmarking set-up, we can't bring you our usual array of figures - a tech upgrade is needed behind the scenes on the Eurogamer CMS. However, our video workflow is functional, so I do encourage you to watch the video embedded below. You'll get some idea of why we are updating our benchmarking system in the first place (more data, more games, more holistic testing) but more importantly, you'll get to see Arc B580 in action - and it's good!
| | Arc B580 | Arc B570 |
| --- | --- | --- |
| Xe Cores | 20 | 18 |
| Render Slices | 5 | 5 |
| RT Units | 20 | 18 |
| XMX AI Engines | 160 | 144 |
| Graphics Clock | 2670MHz | 2500MHz |
| Memory | 12GB | 10GB |
| Memory Interface | 192-bit | 160-bit |
| Memory Bandwidth | 456GB/s | 380GB/s |
| Peak TOPs | 233 | 203 |
| Total Board Power | 190W | 150W |
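As a side note, the bandwidth figures in the table fall straight out of the bus widths: peak bandwidth is simply the interface width in bytes multiplied by the per-pin data rate. The sketch below assumes 19Gbps GDDR6 on both cards - an inference from the published numbers rather than a confirmed spec:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# The 19Gbps GDDR6 figure is inferred from Intel's published bandwidth
# numbers, not an officially confirmed memory speed.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 19))  # Arc B580: 456.0 GB/s
print(bandwidth_gbs(160, 19))  # Arc B570: 380.0 GB/s
```

Both results match the table, which is a decent hint that the B570's cut from 456GB/s to 380GB/s comes entirely from the narrower 160-bit bus rather than slower memory.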
Ray tracing performance is the high point in the grand battle against RTX 4060. In Alan Wake 2 at high settings with low RT (the only setting that doesn't use path tracing), you're getting reflections and transparency reflections that exceed the quality on the PS5 Pro version. At native 1080p, B580 beats RTX 4060 by 29 percent and it's 51 points clear of the RX 7600, though it does have some stutter that Nvidia does not. In Dying Light 2 at the same resolution, the B580 is 14 percent ahead of Nvidia and 51 points clear of AMD, while Metro Exodus on extreme settings blitzes the competition: 16 percent ahead of RTX 4060 and a full 77 percent ahead of the RX 7600. It's not all plain sailing for Intel - the performance gap drops in Avatar - but generally, RT is the highlight.
In rasterisation, RTX 4060 is more competitive, with Alan Wake 2 on high settings looking very similar to B580. RX 7600 pushes ahead, though both AMD and Intel once again exhibit stutter not seen on Nvidia cards. Meanwhile, Black Myth: Wukong actually sees Nvidia push ahead by around nine percent. Fascinatingly, the RTX 4060 is off the pace in Cyberpunk 2077 with RT disabled, so it's no surprise to see B580 push ahead by 23 percent. Forza Horizon 5? It's virtually like-for-like between Intel and Nvidia, but both cards deliver benchmark performance at 1440p in excess of 60fps on the extreme setting. Do bear in mind with all of these results that the Arc B580 has that 12GB of VRAM and costs $50 less.
There were some issues during testing. Intel reckons its driver issues are behind it, but I found that performance in my Marvel's Spider-Man: Miles Morales benchmark (essentially the opening cutscene) was stuttering badly both with and without ray tracing. Cyberpunk 2077 with ultra or psycho RT also didn't work properly, hard-locking my PC. Intel says it's an issue the game has running with a Ryzen 7 9800X3D, but the problem didn't occur with any other GPU I tested. Here's the thing: I'm only testing a relatively small range of modern titles and if there are issues with two of them, I can't help but wonder how many more titles have issues.
Legacy gaming was always a weak spot with the first generation Arc releases and while my old favourite, Assassin's Creed Unity, was patched back into the realms of good performance, Call of Duty: Infinite Warfare always had issues, crashing on the first campaign level. With Arc B580, the same shader compilation issues remain (over 20 minutes to compile on Arc!), only this time I could not even start the campaign without the game crashing.
All told though, Arc B580 is a dream product in many ways - this is as close to disruption as we're going to see in a mature market like PC graphics. We've been complaining for a while now about how stagnant the sub-$300 GPU market is. We've been moaning about 8GB of VRAM not being enough for a new graphics card looking to deliver console quality experiences. We've not exactly been happy about the meagre gen-on-gen performance increases offered up by RTX 4060 and RX 7600, which were adequate but in no way exciting.
B580 is priced aggressively, the price vs performance ratio is best in class, ray tracing is excellent and XeSS is very, very good. I'm just concerned that support for Intel's upscaling isn't as prevalent as FSR and DLSS - for example, why isn't it supported in Alan Wake 2? XeSS 2 frame generation? In the battle to get our new benchmarking workflow online, I didn't have time to test it, but I'm looking forward to giving it a go - Intel is promising a full AI pipeline there.
In the meantime though, Intel Arc B580 is a highly compelling proposition and a shot in the arm for one of the most stagnant market segments. Our prior advice for this class of product was to get an RTX 4060 if you have to, or to pick up a bargain RTX 3060 for its 12GB of framebuffer memory. Arc B580 makes the decision-making process a lot more challenging. Bearing in mind where it's come from, I don't think Intel could have put out a better product. This would be a must-buy if it weren't for how prevalent and how crucial DLSS has become. Even so, it's highly worthy of consideration.