Red Dead Redemption 2's HDR support seems to serve no real purpose
Tests suggest it's an upscaled standard dynamic range image.
One of the most anticipated games of the year, Red Dead Redemption 2 launched last Friday - and it made quite the impact. As Digital Foundry posted recently, the final game represents a phenomenal achievement with few technical blemishes - but something doesn't seem quite right with its HDR implementation.
Actually describing what's going on - and even communicating the basics of how high dynamic range works in any particular title - isn't easy. You're almost certainly reading this on an SDR display, so HDR screenshots aren't going to cut it.
But it is possible to visually demonstrate some of the benefits of HDR using a method I've developed that maps luminance, or brightness, into colour - a technique I've been using for some time for my HDR analysis posts on the ResetEra forums. Images throughout this article have been graded so that anything falling within 100 nits of luminance is grey, effectively demonstrating the bounds of standard dynamic range imagery. Once we bust out of this standard range, we see colour - though with Red Dead Redemption 2, not everything is as it seems.
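For the technically curious, the grading process amounts to something like the minimal sketch below. It assumes you already have a decoded frame as an array of per-pixel luminance values in nits; the function name, parameters and choice of colour map are purely illustrative rather than my actual tooling.

```python
import numpy as np
import matplotlib.pyplot as plt

def luminance_heatmap(nits, sdr_cutoff=100.0, peak=1000.0):
    """Map per-pixel luminance (in nits) to a false-colour image.

    Pixels at or below the SDR cutoff are rendered grey; anything brighter
    is pushed through a colour map so HDR highlights stand out.
    """
    h, w = nits.shape
    out = np.zeros((h, w, 3))

    # Grey for the SDR range: scale 0-100 nits to 0-1 across all three channels.
    sdr_mask = nits <= sdr_cutoff
    grey = np.clip(nits / sdr_cutoff, 0.0, 1.0)
    out[sdr_mask] = grey[sdr_mask][:, None]

    # Colour for everything above the cutoff, normalised to the chosen peak.
    hdr_mask = ~sdr_mask
    t = np.clip((nits - sdr_cutoff) / (peak - sdr_cutoff), 0.0, 1.0)
    out[hdr_mask] = plt.cm.inferno(t[hdr_mask])[:, :3]
    return out
```

If a frame never exceeds the 100 nit cutoff, the output of a function like this is entirely monochrome - which is exactly what we see with Red Dead Redemption 2 at its recommended settings.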
Based on the video output of Xbox One X, the signal is clearly derived from the SDR image and is not actually rendered with any kind of high dynamic range. Yes, we can hit a maximum of 500 nits, but only with the HDR calibration ramped up to the max, artificially boosting peak brightness from what is likely a 100 nit source image. The game's recommended setting of 100 for LCD screens would see every comparison shot on this page presented in completely monochrome form, signifying an image that is, effectively, SDR.
This is not the first game to present us with what you might call 'fake' HDR - earlier this year the Xbox One port of Nier Automata pulled a similar trick, effectively transplanting the standard dynamic range image into an 'HDR container'. Once active, your screen registers an HDR output from the console, but it just contains the SDR image within it. All of our tests suggest the same thing is happening in Red Dead 2 on both Xbox and PlayStation systems.
The evidence is easy to spot if you are accustomed to a native HDR image - not just as you move between SDR and HDR modes, but within the HDR mode itself. Typically, a game works with an image of increased dynamic range, often up to 4000 nits of peak luminance, and that data is then tone-mapped and compressed downwards to a lower value that better matches the display's capabilities.
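As a rough illustration of that downwards mapping - and to be clear, this is a generic extended-Reinhard-style curve with a 1,000 nit display assumed purely for the example, not Rockstar's or any specific engine's actual operator - the idea looks something like this:

```python
def tonemap_down(nits, source_peak=4000.0, display_peak=1000.0):
    """Compress scene luminance (in nits) so that source_peak lands on display_peak.

    An extended Reinhard-style rolloff: low and mid values are left largely
    intact, while the brightest highlights are squeezed smoothly into the
    headroom the display actually has, rather than clipping.
    """
    l = nits / display_peak               # luminance relative to the display's peak
    l_white = source_peak / display_peak  # the source value that should land exactly on 1.0
    mapped = l * (1.0 + l / (l_white ** 2)) / (1.0 + l)
    return mapped * display_peak          # back to nits

print(tonemap_down(4000.0))   # 1000.0 - the 4000 nit highlight fits the display's peak
print(tonemap_down(200.0))    # ~168 - mid-range values are only gently compressed
```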
In Red Dead Redemption 2's case, Rockstar appears to have done this back to front: the 8-bit SDR image is expanded to fill a 10-bit container. Nothing is gained from the process - the underlying image remains the same - so in that sense it's analogous to a resolution upscale.
The upshot is that a pixel which is pure white in SDR now becomes the brightest possible value in HDR. This is problematic for a few reasons, one being that most HUD elements are pure white in SDR, so you frequently get what feels like disproportionately bright text on-screen - a concern for anyone wary of screen burn. A bigger issue is that HDR displays treat HDR and SDR imagery entirely differently: engaging HDR in Red Dead Redemption 2 tells your display exactly how bright each and every pixel should be... except it's using SDR data designed for a 100 nit display.
This produces what many will perceive as a dim image, and while it's technically correct for a light-controlled environment, it's often impractical. That much is normal for HDR content, but real HDR then goes on to deliver luminance above and beyond the limitations of SDR's 100 nit intent - something RDR2 does not do. The in-game settings seem well aware of this, offering an HDR brightness slider with numbers that appear to correspond to a maximum HDR nit output. Initially this confused us: the game recommends 100 for LCD displays and 300 for OLED (with the full scale running from 80 to 500).
Why would the OLED setting suggest a higher peak brightness than the LCD one, when LCDs are almost universally brighter? As it turns out, this setting does not control peak brightness at all - it merely raises the SDR white point to the chosen value and expands the SDR image up to it. The crucial point is that while the HDR calibration screen may suggest you're expanding the range of the image, it's simply remapping the original SDR image. Realistically, you're gaining nothing here.
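Putting the pieces together, what the game appears to be doing is closer to the sketch below. This is a loose reconstruction based on our measurements, assuming a simple gamma decode of the finished SDR frame - the real pipeline may well differ in detail, but the end result is the same: no new highlight information.

```python
def expand_sdr_to_hdr(sdr_value_8bit, slider_nits=500.0, sdr_gamma=2.2):
    """What RDR2's HDR output appears to amount to: take the finished 8-bit SDR
    frame, decode it to linear light, then stretch it so that SDR white (255)
    lands on whatever the in-game HDR slider is set to.

    No new highlight information is created - a pixel that clipped to white
    in SDR simply becomes the brightest value in the HDR container.
    """
    linear = (sdr_value_8bit / 255.0) ** sdr_gamma  # gamma decode to linear light
    return linear * slider_nits                     # SDR white point -> slider value

# A white HUD element and a clipped-out sun are indistinguishable: both hit the peak.
print(expand_sdr_to_hdr(255))   # 500.0 nits
print(expand_sdr_to_hdr(128))   # ~110 nits
```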
On top of this, some users will experience a different set of dimness problems. Because neither the Sony nor Microsoft platforms output any HDR metadata in the video signal, many OLED displays from the last few years hold back on fully illuminating their pixels, waiting for brighter pixel values which, in this situation, never arrive.
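For context, the static metadata an HDR10 source can signal alongside the picture boils down to a handful of values describing how bright the content and its mastering display get. The field names below are illustrative rather than any console SDK's actual structures:

```python
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    """Illustrative summary of HDR10 static metadata: SMPTE ST 2086 mastering
    display luminance plus the CTA-861.3 content light levels. Without these
    hints, a display has to guess how bright the incoming content will be."""
    max_cll: int            # Maximum Content Light Level, in nits
    max_fall: int           # Maximum Frame-Average Light Level, in nits
    mastering_peak: int     # mastering display peak luminance, in nits
    mastering_min: float    # mastering display minimum luminance, in nits
```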
Playing the game, it's clear that it has been very successfully tone-mapped from the HDR internals of the RAGE engine down to an 8-bit SDR output. For the most part, highlight detail is retained while shadow detail remains visible - the result of what can be a heavy tone map, a little reminiscent of the less egregious attempts at "HDR" photography you may have seen.
Another problem is one of luminance separation and contrast. A white coffee mug and a fluorescent light are both white, but they do not produce or reflect the same amount of light. While RDR2 has neither coffee mugs nor fluorescent lights, it does have fabric, snow and sources of light, all of which might share the same colour while differing significantly in brightness. An image made for SDR simply doesn't carry enough data to convey both the colour of a pixel and how bright it should be relative to another - certainly not across a larger scale. This lack of separation will be immediately noticeable to those accustomed to HDR content, with details and objects that should be reflecting or producing significant amounts of light essentially doing neither.
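This is exactly the separation a proper HDR signal is built to carry. The PQ curve used by HDR10 (SMPTE ST 2084) encodes absolute luminance, so two 'white' objects at very different brightness levels land on clearly different code values - the 100 and 1,000 nit figures below are purely illustrative:

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal in [0, 1]."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((c1 + c2 * y ** m1) / (1.0 + c3 * y ** m1)) ** m2

# In 8-bit SDR, a sunlit snowfield and a light source can both clip to the same white,
# but a PQ signal keeps them well apart in code values.
print(round(pq_encode(100), 3), round(pq_encode(1000), 3))   # ~0.508 vs ~0.752
```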
In the image above - which, like all the comparison images, has been taken with the in-game brightness slider maxed out to 500 - we can see the moonlight at 500 nits of luminance, but there are no other HDR-level highlights in the image, not even in the glimmers in the snow. Rise of the Tomb Raider, Forza Horizon 4 and Battlefield 5 all present these snow glimmers with increased intensity in their true HDR outputs. This lack of contrast within brightness values continues in daylight too. Here you can see reflected snow at almost 400-500 nits beneath a sun that is also 500 nits. Sitting on a floor of excessively bright light is a lit campfire, which again is very close to 500 nits. None of this would present like this in a properly executed HDR image.
One of the main benefits of pairing HDR content with an HDR display is that it removes the need for tricks to create perceived brightness. While the game does often map high-contrast scenes without clipping, there are numerous examples where particularly intense light is clearly intended. Because the output is derived from an SDR image, a lot of that highlight information has been thrown away - simply clipped out. On this occasion and several others, I also witnessed a hard, stepped change in exposure when moving from a dark area into a lighter one. Still, with the in-game HDR control pushed to 500, the quality of the SDR tone map is evident in other areas: there is actually plenty of highlight detail, as you can see in the image here, with very little clipping despite the perceived brightness.
With HDR turned on, we essentially have the choice between a dim image, an image that looks the same as it would have done without HDR enabled, or an image that can be painfully bright.
You've got nothing to lose by turning on HDR in Red Dead Redemption 2, but nothing to gain either - which raises the question: what's the point in it being there? It comes as something of a disappointment in a game that is otherwise a technical tour de force, compounded by the fact that various other titles in recent weeks have delivered phenomenal results from their HDR implementations.
And what's really disappointing is that the potential for a stunning HDR implementation is there - with its remarkable lighting and physically-based materials, Red Dead Redemption 2 should be up there with the best. It's a shame it has fallen short, and as the highest-profile title yet to ship an underwhelming or effectively non-existent HDR implementation, it's something of a mystery.
We contacted Rockstar to ask for comment and will update should any response arrive, but quite why Red Dead Redemption 2 presents like this remains unclear. Is it an aesthetic choice in pursuit of that movie-like look? I sincerely hope not - games are delivering the HDR goods where Hollywood is not, and it would be a great shame to see developers hold back on a technology because it doesn't fit the visual language of another medium, one established long ago around sub-optimal standards. After all, there's a reason we don't have many games that run at 24fps.
Given the phenomenal scale of technology delivered in the game, it seems implausible that a technical limitation is to blame, and hopefully more light will be shed on this in time. In the meantime, there is nothing to stop us enjoying one of the most groundbreaking technological showcases of this generation - it's just unfortunate that this doesn't extend to robust support for high dynamic range.