Why Ratchet and Clank: Rift Apart's 40fps support is a potential game-changer
120Hz display technology gives developers more options.
A few weeks back, Insomniac patched Ratchet and Clank: Rift Apart on PlayStation 5 to introduce a revised version of its 4K30 fidelity mode. Tapping into the capabilities of 120Hz displays, what the team delivered is a potential game-changer for console titles - a 40fps mode that looks just as good as the older 30fps offering, but runs considerably more smoothly and feels better to play. On the face of it, a bonus 10fps doesn't sound like a huge bump, but in actuality, it's a very big deal.
To explain why, we need to focus on why console games typically target 30fps or 60fps (though 120fps support is gaining traction). The reason is simple: consistency. At 60fps with v-sync engaged - as seen in Ratchet's performance modes - the game sends a new image to your display, synchronised with its refresh. That's why 60fps looks so smooth and consistent: the game is matched to the refresh rate of the display, with a new frame delivered every 16.7ms. If that's not possible to hit, 30fps is the better bet. By synchronising the game update with every other screen refresh, you retain that consistency and the sense of fluidity - each new frame arrives with a consistent 33.3ms update. A 40fps mode on a 60Hz screen would not look great: new frames would arrive at an uneven mix of 16.7ms and 33.3ms intervals. It would look jerky and inconsistent.
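To make that concrete, here's a minimal Python sketch - purely illustrative, not anything from the game's code - that simulates v-sync by holding each finished frame until the next display refresh, then prints the intervals at which new frames actually reach the screen.

```python
import math

def presented_intervals(game_fps: float, display_hz: float, frames: int = 8):
    """Simulate v-sync: each finished frame waits for the next display refresh.

    Returns the intervals (in ms) between frames actually shown on screen.
    """
    refresh_ms = 1000.0 / display_hz            # time between display refreshes
    render_ms = 1000.0 / game_fps               # time the game takes per frame
    shown = []
    finish = 0.0
    for _ in range(frames):
        finish += render_ms                     # the frame is ready at this point...
        # ...but v-sync holds it until the next refresh boundary
        shown.append(math.ceil(finish / refresh_ms - 1e-9) * refresh_ms)
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

print(presented_intervals(30, 60))    # [33.3, 33.3, 33.3, ...] - consistent
print(presented_intervals(40, 60))    # alternates 16.7 and 33.3 - juddery
print(presented_intervals(40, 120))   # [25.0, 25.0, 25.0, ...] - consistent
```

Run it with those three scenarios and the uneven 40fps-on-60Hz cadence falls straight out, while 40fps on a 120Hz output lands on a steady 25ms.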
That's why 60fps or 30fps are the typical performance targets - so what's so special about Ratchet's new 40fps fidelity mode? Well, moving to a 120Hz display, the rules change. 40fps is every third refresh on a 120Hz panel. Rather than delivering an uneven 40fps at alternating 16.7ms and 33.3ms intervals, every new frame is delivered consistently at 25ms instead. And here's the thing: while 40fps may sound much closer to 30fps than to 60fps, in frame-time terms it is the exact mid-point: 25ms sits precisely between 16.7ms and 33.3ms. In effect, it delivers what you might imagine 45fps should. A look at the performance shot lower down on this page, showing both frame-rate and frame-time, should highlight this.
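The arithmetic behind that point is worth spelling out, because frame-rate and frame-time are reciprocals and intuition tends to follow the former. A quick Python sketch, used purely as a calculator on the article's own numbers:

```python
# Frame-time in milliseconds is simply 1000 divided by the frame-rate.
for fps in (30, 40, 45, 60):
    print(f"{fps}fps -> {1000 / fps:.1f}ms")
# 30fps -> 33.3ms
# 40fps -> 25.0ms
# 45fps -> 22.2ms
# 60fps -> 16.7ms

# 25ms is the exact mid-point of the 30fps and 60fps frame-times:
print((1000 / 30 + 1000 / 60) / 2)    # 25.0
```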
And that's why Ratchet and Clank's 40fps mode impresses. Based on what we're seeing from the game, the standard 4K30 fidelity mode runs at maximum resolution with the full suite of visual effects. The 60fps alternative - performance RT mode - cuts down resolution dramatically, reduces the density of the more packed environments and also tweaks Ratchet's fur simulation. But from what we've seen, it looks like there's a fair amount of GPU and CPU overhead left over at 4K30 - enough to actually run the game at 4K40, with only very slight, occasional dips in dynamic resolution that are essentially unnoticeable. In effect, performance sits at a mid-point between the fidelity and performance RT modes, but you don't lose any of the game's graphical features or much of the resolution if you opt for the 40fps fidelity mode instead. The cost of buy-in is the 120Hz display necessary to ensure consistent delivery of new frames every 25ms.
40fps is significantly faster and smoother, and there are input lag benefits too. I tested this by pointing a 240fps camera at both the screen and the DualSense controller. I measured input lag by jumping 10 times and counting the number of camera frames between pressing the button and the character beginning the jump animation. I averaged the results and converted the frame counts to milliseconds. The end-to-end latency measurements include display lag (I tested on a 48-inch LG CX in 4K HDR in Game Mode) and the results are surprising. The game code runs faster, cutting down latency, while at the same time the screen is refreshing more quickly at 120Hz vs 60Hz - two vectors that drive down overall end-to-end lag.
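For transparency, the conversion step is trivial - here's a small Python sketch of it. The frame counts below are hypothetical placeholders, not the actual measurements; the method is simply camera frames multiplied by the duration of each captured frame, then averaged.

```python
CAMERA_FPS = 240                       # each captured camera frame spans ~4.17ms

def average_latency_ms(frame_counts):
    """Convert per-jump camera-frame counts into an average latency in milliseconds."""
    frame_ms = 1000 / CAMERA_FPS
    samples = [count * frame_ms for count in frame_counts]
    return sum(samples) / len(samples)

# Hypothetical counts for ten jumps - placeholders, not the real measurements.
example_counts = [27, 28, 29, 28, 27, 29, 28, 28, 29, 27]
print(f"{average_latency_ms(example_counts):.1f}ms")
```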
| Game Mode | Performance Target | Display Mode | End-To-End Input Lag |
| --- | --- | --- | --- |
| Fidelity Mode | 30fps/33.3ms | 60Hz | 117.5ms |
| Fidelity Mode | 40fps/25ms | 120Hz | 80.8ms |
| Performance RT Mode | 60fps/16.7ms | 60Hz | 75.0ms |
| Performance RT Mode | 60fps/16.7ms | 120Hz | 60.8ms |
There are a couple of takeaways from this testing. The highlight is that on average, the 40fps mode is only around 6ms slower than the 60fps performance RT mode running on a 60Hz display. Meanwhile, if you opt to stick with the performance RT mode but with the game set to 120Hz output, input lag drops by around 14ms on average. Put simply, the game feels more responsive and better to play. Running the 30fps fidelity mode on a 60Hz screen, meanwhile, is clearly the least responsive way to play the game overall, with a 117.5ms lag measurement. Insomniac mentions in its patch notes that the new 40fps fidelity mode is about reduced latency, but I was somewhat surprised at just how much of a win it is on my screen.
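For clarity, both deltas fall straight out of the table above - a quick check in Python, with the figures copied from the table and nothing new added:

```python
# Input lag figures copied from the table above (milliseconds).
lag_ms = {
    ("fidelity", 60): 117.5,
    ("fidelity", 120): 80.8,          # the new 40fps mode
    ("performance_rt", 60): 75.0,
    ("performance_rt", 120): 60.8,
}

# 40fps fidelity at 120Hz vs 60fps performance RT on a 60Hz display
print(round(lag_ms[("fidelity", 120)] - lag_ms[("performance_rt", 60)], 1))        # 5.8 -> "only ~6ms slower"

# Performance RT with 120Hz output vs 60Hz output
print(round(lag_ms[("performance_rt", 60)] - lag_ms[("performance_rt", 120)], 1))  # 14.2 -> "~14ms drop"
```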
Drawbacks are minimal. To access the 120Hz mode, simply dip into the settings menu in-game, select 120Hz, then choose which of the quality modes you want. You'll need to have 120Hz output set to 'automatic' on the PS5's front-end, but otherwise you are good to go. The only downer is that those with HDMI 2.0 displays that support 120Hz will be locked to a downscaled 1080p output - a shame, because there is a range of TVs out there that lack the full-fat HDMI 2.1 bandwidth required for 4K120 output from the console, but can still deliver 1440p120. This isn't Insomniac's fault - it's something Sony seriously needs to consider adding at the system level, along with VRR (variable refresh rate) support.
Discussion of VRR opens up another question, of course. Is a locked 40fps better or worse than unlocking the game completely and letting VRR smooth out the experience? From my perspective, there are pros and cons. First of all, the option of VRR support should be there on PlayStation 5 - there's no question about it. However, settling on 40fps gives a set render time for games to target (or for dynamic resolution scalers to calibrate against), and frame-rates in that region often fall outside the VRR window supported by many screens anyway. It's a good alternative, for sure, and as a bonus it'll work great on 120Hz screens that do not have VRR functionality.
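To illustrate why a fixed 40fps cap can be the safer bet, here's a tiny Python sketch. The 48-120Hz window is an assumption standing in for a typical HDMI VRR range - real displays vary, and it is not a figure from this article.

```python
# Illustrative only: VRR windows vary by display. The 48-120Hz range below is an
# assumed, typical HDMI VRR window - it is not a number from this article.
VRR_MIN_HZ = 48
VRR_MAX_HZ = 120

def vrr_covers(fps: float) -> bool:
    """Return True if a display with the assumed VRR window can track this frame-rate."""
    return VRR_MIN_HZ <= fps <= VRR_MAX_HZ

for fps in (30, 40, 45, 60):
    status = "inside the VRR window" if vrr_covers(fps) else "below the VRR window"
    print(f"{fps}fps: {status}")
# 40fps lands below the assumed window - a fixed 40fps cap on a 120Hz output sidesteps this.
```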
Further applications are intriguing. What's clear is that many games do have CPU and GPU overhead left over when the 30fps cap is in place. A while back, we talked about Remedy's Control and how its photo mode removes the 30fps frame-rate cap in quality mode. I did some more tests on this title and found that PS5 only rarely seems to drop beneath 40fps - it's titles like this that appear to have the overhead to benefit from embracing the 120Hz/40fps format as an option, in addition to the standard 60Hz/30fps presentation typically used in today's quality modes. In fact, even where games may not quite possess the necessary GPU overhead, there is still dynamic resolution scaling that could be used to trade a little resolution for a big reduction in input lag and smoother gameplay - something that Ratchet demonstrates beautifully.
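To sketch how that trade could work in practice, here's a minimal, hypothetical dynamic resolution controller in Python. It isn't how any particular engine - Insomniac's included - implements DRS; it simply shows the idea of nudging render scale so GPU frame time stays inside a 25ms (40fps) budget rather than a 33.3ms (30fps) one.

```python
# A minimal, illustrative dynamic resolution controller - not any engine's actual code.
# It nudges a render-scale factor so GPU frame time stays inside the chosen budget.

TARGET_BUDGET_MS = 25.0      # 40fps on a 120Hz display; 33.3 would be the 30fps budget
MIN_SCALE, MAX_SCALE = 0.7, 1.0

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    """Return the render scale for the next frame, given the last frame's GPU time."""
    headroom = TARGET_BUDGET_MS / gpu_frame_ms          # >1.0 means the frame finished early
    # Adjust gently to avoid visible resolution 'pumping'
    new_scale = scale * (headroom ** 0.25)
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

# Example inputs only: a 30fps-capped game averaging ~28ms of GPU time has room to
# trade a little resolution for a 25ms target.
scale = 1.0
for gpu_ms in (28.0, 27.5, 26.0, 25.5, 24.8):
    scale = update_render_scale(scale, gpu_ms)
    print(round(scale, 3))
```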
It's certainly food for thought. The new generation of console gaming isn't just about the advances happening within the consoles, but also in the surrounding technological landscape. The 30/60fps divide is essentially a relic of the established 60Hz standard - but we've moved on. Variable refresh rate has established itself as a game-changer for HDMI 2.1 displays, but doubling the refresh rate from 60Hz to 120Hz opens up a range of new opportunities, which dovetail nicely with trends in game development. In the here and now, with 60Hz still dominating, I wouldn't expect to see 40fps suddenly become a new standard amongst developers - but for the game-makers out there wedded to the 30fps quality mode/60fps performance mode set-up, I'd highly recommend trying out Ratchet and Clank: Rift Apart's 40fps mode and considering its charms - it looks stunning, it plays really smoothly, and the opportunity it presents for the future of console games is compelling.