Does Nvidia ultra performance DLSS make 8K gaming viable?
A look at image quality and frame-rates on RTX 3090.
Is 8K gaming actually viable with Nvidia's latest Ampere-based GPUs? While a small selection of triple-A titles stand a good chance of hitting decent frame-rates at native 8K, Nvidia has released a new ultra performance mode for its DLSS AI upscaling technology - and its ambition is extraordinary. It aims to boost detail with a 9x multiplier, meaning that a core 1440p image is reconstructed to full 7680x4320 resolution: that's a final render output of 33 million pixels, reconstructed from just 3.7m. There's a whole discussion about whether 8K gaming is actually needed in the here and now - a question that can only really be answered by playing on a consumer 8K screen, something we don't yet have access to. Even so, the tools and techniques are certainly available to get a close-up look at image quality and performance and to draw some initial conclusions.
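The pixel arithmetic here is easy to sanity-check - a minimal Python sketch using the resolutions quoted above (the helper function is purely illustrative):

```python
# Ultra performance DLSS reconstructs with a 9x pixel multiplier:
# the internal render resolution is one third of the output on each axis.
def pixel_count(width, height):
    return width * height

internal = pixel_count(2560, 1440)   # core 1440p image
output = pixel_count(7680, 4320)     # full 8K output

print(internal)           # 3686400 - the "3.7m" quoted above
print(output)             # 33177600 - the "33 million" quoted above
print(output / internal)  # 9.0 - the 9x multiplier
```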
Before we go on, our contention remains that chasing resolution probably isn't the best use of GPU resources - even at 4K, let alone 8K. Ultra HD rendering is demanding enough, and pushing to 8K increases core resolution by a further 4x - an almost insane workload. We'd rather see higher quality pixels as opposed to more of them, especially as technologies like hardware-accelerated ray tracing are starting to produce some brilliant results. And it's certainly the case that native 8K rendering with RT is a bit of a disaster, even with the RTX 3090 - where Control renders a relatively simple ray traced scene at just nine frames per second. However, engage DLSS's new ultra performance mode and that leaps up to 50fps - a 455 per cent increase. The real question is whether image quality is retained as well - and from my perspective, there are promising beginnings here, but the technology isn't quite there yet.
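That Control result checks out as a percentage gain - a quick worked example, using the 9fps and 50fps figures from our testing above:

```python
# Going from 9fps (native 8K with RT) to 50fps (ultra performance DLSS).
def percentage_increase(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps * 100

print(percentage_increase(9, 50))  # 455.55... - the "455 per cent" figure
```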
There is still some impressive stuff on display though: 8K ultra performance DLSS renders at a native 1440p. Compare the image quality to native 1440p blown up to 8K and the difference is stark: Nvidia's AI upscaler is definitely resolving more detail here, though the performance impact of 14 per cent is not insignificant. However, while initially promising, clarity is lost with the new technique when the game is in motion. It doesn't quite hold up, and certainly up against a native 8K image, it's simply not as good. With ray tracing in particular, native 8K traces nine times as many rays, and that sheer detail isn't replicated. Put simply, in Control at least, the new DLSS can give a 'better than 4K' level of image quality - but it's not a match for the look of native rendering at 'full fat' 8K.
At 8K, ultra performance DLSS also has a profound impact on VRAM - which is presumably why Nvidia has only marketed the new DLSS technology as 8K capable on the 24GB RTX 3090. You're still getting a profound memory saving compared to native 8K rendering, however, where Control demands up to 20GB of VRAM to produce its ray traced visuals. One thing I do think is important to stress is that the 9x multiplier effect with ultra performance DLSS can be used on other RTX GPUs - it's not an RTX 3090 exclusive - it's just that your target resolution would ideally top out at 4K.
If the results weren't quite as good as we were hoping in Control, Death Stranding is more promising: it delivers a much closer approximation to native rendering, but it's still not without issues. The major problem I noticed is that texture mip selection is tied to the internal render resolution, not the DLSS output resolution, so you actually end up with some texture art that is decidedly less detailed than running natively at 8K. Elements of the environment that are in motion also have some issues in retaining full quality, while I suspect that the 9x resolution scaling effect may not be fully compatible with Death Stranding's post-process pipeline, producing some odd-looking shimmer. On balance, I think the experiment here is more successful than Control, but it's still problematic in some areas. 8K60 is viable with the standard performance mode, which reconstructs from native 4K, but there's a very strange rhythmic frame-rate drop effect which suggests something isn't working quite right there.
So how does this new variant of DLSS work out on less capable RTX graphics cards running at lower resolutions? I went from the fastest GPU money can buy to the decidedly less capable RTX 2060 and found the conclusions to be fairly predictable. The more pixels you have for the base image, the better the DLSS output. At 4K, ultra performance mode renders natively at 720p, and the effect doesn't quite work - in fact, I'd say that DLSS 8K from a 1440p source looks closer to native, even though the ratio between native and DLSS pixels is the same. Again, I'd prefer to use the standard performance mode here. The increase to frame-rate isn't as pronounced, but it just looks much better - and of course, with a small overclock, RTX 2060 can render Death Stranding pretty much locked at 4K60.
Just to satisfy my curiosity, I did check out what ultra performance mode looks like at 1080p - which would see the technology scaling up from a mere 640x360 native, a lower resolution than most original Xbox games. Perhaps shockingly, it actually looks OK, all things considered. Static elements look close to native 1080p quality, but anything moving almost looks as if motion blur is applied. It's a neat technological showcase, but not really recommended - especially when Death Stranding already runs brilliantly on RTX 2060.
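All of the internal resolutions mentioned across these tests follow from the same per-axis scale factors: performance mode halves each axis (a 4x pixel multiplier), while ultra performance divides each axis by three (9x). A hypothetical helper to illustrate - the mode names and function here are mine for the sake of the sketch, not Nvidia's API:

```python
# Per-axis divisors implied by the figures above:
# performance = 1/2 per axis (4x pixels), ultra performance = 1/3 (9x).
MODE_DIVISOR = {"performance": 2, "ultra_performance": 3}

def internal_resolution(out_w, out_h, mode):
    d = MODE_DIVISOR[mode]
    return out_w // d, out_h // d

print(internal_resolution(7680, 4320, "ultra_performance"))  # (2560, 1440)
print(internal_resolution(3840, 2160, "ultra_performance"))  # (1280, 720)
print(internal_resolution(1920, 1080, "ultra_performance"))  # (640, 360)
print(internal_resolution(7680, 4320, "performance"))        # (3840, 2160)
```

This is why 8K ultra performance (a 1440p base) holds up better than 4K ultra performance (a 720p base) despite the identical 9x ratio: the network simply has more source pixels to work with.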
In summary, looking at all of the results I amassed, I'd consider this an emergent technology with promise, but some way off the finished article. DLSS ultra performance mode at 8K looks better than native 4K in many respects, but its loss of clarity in motion in Control and its post-process aliasing in Death Stranding make it less compelling. In its current iteration, a full-on 9x multiplier may just be too much of a leap for the current DLSS algorithm, and may not properly align with internal pixel grid assumptions, producing some strange effects. With that said, I like the ambition on display here and if 8K is some years away from being any kind of mainstream consideration, it's nice to know that Nvidia is at least starting the work of making its GPUs viable on a 33m pixel display canvas.
Below 8K, I think it is novel and interesting to see how extreme reconstruction fares from an image quality perspective, but there are limitations and no real application worth considering right now - DLSS ultra performance titles such as Control and Death Stranding can be tweaked to run well at 1080p even on an RTX 2060, and you don't really need to touch the new mode at all. However, there may well be some interesting applications for RTX 2080 Ti, RTX 3070 and RTX 3080 when looking to run titles on a high refresh rate 4K display, something we may well look into at some point.
For 8K gaming, selected titles should be able to run natively with RTX 3090 while DLSS performance mode should still bring home the bacon in terms of image quality: the 9x multiplier might be asking a bit too much of the existing technology, but the 4x multiplier is tried and tested at this point. Performance may well be less consistent when aiming to deliver 33m pixels per frame, but with variable refresh rate display technology, a 50-60fps experience should still look and feel pretty good. The proof of the pudding will be a complete battery of tests on an actual 8K screen and we hope to take a look at that soon.