Anthem's PC performance is improved by up to 65 per cent with Nvidia DLSS
Is there a catch?
When we first took a look at the PC version of Anthem, one thing was abundantly clear - this game is highly demanding on hardware. Average frame-rates are fine overall, but once the title's signature pyrotechnics kick off in full force, performance can drop alarmingly. Running at 4K resolution on max settings, not even Nvidia's top-tier RTX 2080 Ti graphics hardware can consistently run this game at 60 frames per second. However, the arrival of a new Anthem patch supporting Nvidia's deep learning super-sampling could help.
DLSS is a fascinating technology that's still in its early days but has some remarkable properties. The idea looks simple on paper, and sounds almost too good to be true. The game renders natively at a lower resolution (4K DLSS tends to have a native 1440p base pixel-count) and then Nvidia's deep learning algorithm steps in to extrapolate the detail level up to 4K. In essence, new pixel detail is algorithmically generated to enrich the image.
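To put that base pixel-count in perspective, here's a quick back-of-the-envelope calculation (a sketch using the typical 2560x1440 base quoted above - Nvidia doesn't expose per-game internals, so treat the figures as illustrative):

```python
# Rough pixel budget for 4K DLSS with a 1440p base resolution.
native_4k = 3840 * 2160   # 8,294,400 pixels shaded at native 4K
dlss_base = 2560 * 1440   # 3,686,400 pixels shaded before reconstruction

print(f"DLSS shades roughly {dlss_base / native_4k:.1%} of native 4K's pixels")
# -> DLSS shades roughly 44.4% of native 4K's pixels
```

In other words, well over half of the final image's pixel detail is being generated by the algorithm rather than rendered directly.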
DLSS is designed to replace temporal anti-aliasing (TAA) within a game's post-process pipeline, and it is fair to say that results thus far have been mixed. Early demos based around Final Fantasy 15 and Epic's Infiltrator showed the promise of the technology, while implementations like Battlefield 5's have not been so well received. Metro Exodus is a fascinating case: DLSS support at launch was extremely blurry, but a later patch improved the quality tremendously. And that's good, as DLSS opens the door to running higher resolutions at much higher frame-rates when paired with Nvidia's DXR-powered ray tracing.
Anthem doesn't benefit from DXR, but there is the question of its massive variation in frame-rates. Typically, DLSS adds around 35 to 40 per cent more performance, meaning a potentially significant reduction in the game's notorious frame-rate bottlenecks. Due to Anthem's unique frame-rate issues and its heavy CPU requirement, the reality is that DLSS can add anything from around 20 per cent to 65 per cent more performance, depending on context. The good news is that the top-end boost kicks in where you really need it most.
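As a hypothetical worked example (the 45fps baseline here is purely illustrative, not one of our measurements), the quoted uplift range translates into frame-rates like this:

```python
# Illustrative frame-rate maths for the uplift range quoted above.
baseline_fps = 45.0  # hypothetical heavy-combat dip at native 4K, max settings

for uplift in (0.20, 0.35, 0.40, 0.65):
    print(f"+{uplift:.0%} -> {baseline_fps * (1 + uplift):.1f}fps")
# +20% -> 54.0fps | +35% -> 60.8fps | +40% -> 63.0fps | +65% -> 74.2fps
```

That top-end figure is what turns a heavy dip into something close to a locked 60fps - which is exactly where the extra headroom matters.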
DLSS is a controversial technology because, in common with other reconstruction techniques, 4K DLSS can look quite different to native 4K with TAA, even though there is a clear increase in detail over the base image. Nvidia's deep learning technique actually has its own distinct look - it no longer has that per-pixel precision you may be used to with native rendering, or even some examples of TAA-based reconstruction techniques. I think it's still attractive and a massive, massive boost in quality over its native 1440p rendering, but the resolve of shapes and detail is different and that must be noted.
In the case of Anthem, the differences between DLSS and native 4K are intriguing. For example, vegetation presents rather differently. The per-pixel detail on the native presentation looks better in still shots, but in motion it shimmers with aliasing owing to the high-frequency highlights in the texture. Here, DLSS resolves less overall detail but looks much more coherent in motion, with less aliasing. It does seem to be the case that DLSS can reduce in-texture detail overall, with a presentation that sits at a kind of mid-point between 1440p and 4K. DLSS most closely resembles 1800p - which can look good on a 4K display, especially with the temporal consistency the technique provides - but with clear performance benefits.
Transparent elements in the presentation can cause issues with temporal anti-aliasing, producing some obvious ghosting effects. Anthem does better than most titles here and the ghost trail following a transparent element is faint, but still there. This is one area where DLSS has the advantage, with no real visible ghosting. However, some transparent elements like waterfalls actually seem to render at a much lower resolution - almost as if these aspects receive no reconstruction from the base 1440p image. Bloom effects also exhibit some pop and flicker that is accentuated by the lower base pixel-count.
A good way to claw back performance in Frostbite games is to use the internal resolution slider, where at 4K, anything between 80 to 100 per cent of the display's native pixel count looks pretty good. 1800p sits at around 83 per cent on that scale. There's a fundamental issue in Anthem, though, in that BioWare has completely removed this functionality. However, setting the game to output at 1800p manually reveals that 4K DLSS looks pretty close visually while delivering a further 10 to 12 per cent performance uplift, plus the temporal consistency advantages I talked about earlier. The advantages up against native 4K in terms of frame-rate are far higher, of course.
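For reference, here's how those percentages line up, assuming the slider scales each axis linearly as it does in other Frostbite titles (a sketch only - Anthem itself no longer exposes the control):

```python
# Mapping output heights onto a linear, per-axis resolution scale at 4K.
target_height = 2160

for height in (1440, 1800, 2160):
    axis_scale = height / target_height   # what the slider would read
    pixel_scale = axis_scale ** 2         # share of 4K's total pixels
    print(f"{height}p: {axis_scale:.0%} scale, {pixel_scale:.0%} of 4K's pixels")
# 1440p: 67% scale, 44% of 4K's pixels
# 1800p: 83% scale, 69% of 4K's pixels
# 2160p: 100% scale, 100% of 4K's pixels
```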
The fact that DLSS is derived from a lower resolution also helps to mitigate Anthem's real issues with heavy effects work - and I think it's fairly clear that bandwidth issues are the cause of these big performance drops. The bandwidth requirement drops massively the lower down the resolution chain you go, meaning big, big improvements to performance owing to DLSS's 1440p base pixel-count. It's most likely this that explains the top-end measurement of an extra 65 per cent of performance at 4K with DLSS enabled. Is it enough to run Anthem at 4K on max settings locked to 60fps? Well, the heaviest scenes can still dip to the mid and low 50s, so the chances are that further settings tweaks will be required - DLSS raises the performance plateau but can't provide enough of a lift to combat all of the game's most challenging scenarios.
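To sanity-check that bandwidth theory with a crude model (an idealised assumption on my part, ignoring DLSS's own reconstruction cost and any CPU limits): if a scene were purely pixel-bound, frame-rate would scale with the inverse of the pixel count.

```python
# Idealised speedup if a scene scaled perfectly with pixel count.
pixels_4k = 3840 * 2160
pixels_base = 2560 * 1440

print(f"Theoretical ceiling: {pixels_4k / pixels_base:.2f}x")
# -> 2.25x, comfortably above the observed 1.65x worst-case gain,
# which is consistent with bandwidth-limited scenes benefiting most.
```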
DLSS right now is a curious technology. Its reconstruction is definitely working, producing results that are generally pleasing next to native 4K and its closest competitor, a scaled 1800p. And interestingly, it looks to be more stable in motion than either of these on opaque objects while performing better than both. At the same time, it offers up less visible detail on surfaces, while transparencies seem to be unaffected by reconstruction at the moment - or at least handled differently, in a manner which makes the lower base resolution more obvious.
I do think it is a good alternative to resolution scaling in this title, especially if you prefer a more stable image in general over one whose detail could translate to noise in motion. But that greater stability on opaque edges makes me wonder how DLSS would work as a straight anti-aliasing alternative to the standard TAA we see in games. DLSS 2x has yet to appear in any game - this would take the DLSS principle and apply it to anti-aliasing at native resolution. I would really like to see Nvidia offer that option in a few games, as its results even below native resolution are very impressive.
But more than that, DLSS demonstrates that reconstructing towards higher resolutions isn't just a way of allowing lower power consoles to punch above their weight. The same techniques can pay real dividends on PC - and it doesn't need Nvidia's AI tensor cores to deliver good visual quality and tangible performance boosts. I'd recommend taking a look at The Division 2 on PC. Running at 75 per cent resolution scale at 4K and using Ubisoft Massive's temporal reconstruction effect produces some highly impressive results. Yes, it's 'faux K' but when it looks as good as this and as close as it does to the 'real thing' - who cares?