In Theory: is AMD's Ryzen CPU the game-changer for next-gen consoles?
Improved graphics are a given, but what about gameplay?
With PlayStation 4 Pro on the market and Xbox One X to follow, Microsoft and Sony's R&D focus is inevitably going to shift towards the next wave of machines. Questions surround the kind of generational leap that's possible in the next couple of years, and how much these new machines will cost. But there's one aspect of their technological make-up we can take as read: AMD's Ryzen CPU technology will take centre-stage - and the shift to a radically improved processor architecture could have more pronounced implications for the games we'll play.
It all brings us back to the core concept of hardware balance - the specific combination of CPU, GPU and RAM in a console design. The current-gen machines handed in true generational leaps in terms of graphics and memory (the latter is remarkable in offering a 16x increase over the last generation) but there's compelling evidence that CPU power was the casualty. AMD's Jaguar x86 cores - originally designed with mobile applications in mind - were effectively the only choice available for integration into console SoCs of the era. Two Jaguar clusters were paired up with desktop-level graphics, resulting in a profound impact on the way games were shaped. AMD remains the best partner for the new wave of consoles we expect to arrive in 2019/2020 and with this in mind, perhaps Ryzen is the game-changer. Indeed, its integration may well have more of an impact on game design than the amount of GPU teraflops we end up with in the new wave of machines.
In terms of evolving the games we play - and not just the quality of the visuals - we need to see a shift in different areas of console design, a hardware boost that benefits what we'll refer to as simulation. Consider Grand Theft Auto 4 on Xbox 360 and PlayStation 3, and stack up its in-game world against those of its predecessors: GTA3, Vice City and San Andreas. Visuals clearly operate on another level, but this is married with a true generational leap in terms of simulation. Liberty City came to life as a thriving, bustling metropolis - packed with various NPC types. But it's not just the volume of them that was important. They were equipped with an impressive range of context-sensitive behaviour: they went to work in the morning, drank coffee, went jogging, took out their umbrellas to protect themselves from the elements - and yes, interacted to a limited degree with the player. The quality of the simulation took on a whole new level compared to its prior-gen counterparts.
The last-gen era also birthed other franchises built on the ability to create a level of simulation never seen before - the in-game world generated in the original Assassin's Creed was like nothing we'd seen before. The question is where similar generational leaps in simulation can be found in the current generation. Storage and memory increased, which obviously had an impact on the size and shape of in-game worlds, but to what extent did simulation move on? The Assassin's Creed series is a fascinating case in point in that 2014's AC Unity looks and plays like a game designed for a very different kind of console than the machines that were actually delivered.
Ubisoft doubled down on its world-building, packing revolutionary France with NPCs, ramping up detail on the city itself dramatically. And as we soon discovered, the consoles simply couldn't keep up - even after multiple patches, only PS4 Pro's boost mode could brute-force the game to anything like its intended 30fps target. Not surprisingly, Ubisoft pivoted away from world simulation as a focus in Assassin's Creed Syndicate - and while we've only had limited exposure to Origins, the sense is of a title emphasising graphics over world complexity, a much better fit for the current-gen machines.
Other attempts to 'big up' existing game concepts have aimed high but fallen short. Probably the most obvious example is Avalanche's Just Cause 3 - the game that not even the PS4 Pro's boost mode could whip into shape. Emphasising and expanding on the series' signature physics and destruction was the right move conceptually, but perhaps the wrong one bearing in mind the hardware constraints of the current-gen consoles.
We've measured minimum frame-rates of 18fps on base PlayStation 4, rising to just 24fps with boost mode enabled. Meanwhile, DICE has attempted to bring the epic scale of 64-player Battlefield multiplayer to consoles - and again, CPU limits have obstructed the optimal performance level enjoyed by PC gamers. It's the same explanation offered up by Bungie for its 30fps-locked Destiny 2.
So what happens now? Microsoft has customised Jaguar for Xbox One X, so we get a 31 per cent bump in frequency with associated benefits, plus some interesting tweaks designed to maximise performance from L2 cache, but it's still a CPU of the same generation with the same fundamental limits in place. It's an open secret that AMD's Ryzen architecture is the way forward, and the smart money is on a single Zen CCX module - four cores running eight threads - making up the CPU component of the next-gen APUs. In the PC space, implementation will vary - perhaps dramatically so on a console - but the current Ryzen processors are based on two CCXs in a single package, opening the door to four, six and eight core processors, depending on which bits AMD chooses to disable. All decent AM4 motherboards allow the user the choice of disabling CCXs, so we took a Ryzen 7 1700, and ran it as a quad-core part with one CCX.
What speed would we run it at? We went with 3.0GHz here, based on nothing more than a hunch - Microsoft and Sony upped CPU clocks by 31 per cent in the move from 28nm to 16nm, and assuming the same thing happens again on the next process, Xbox One X's 2.3GHz CPU speed increases to 3.0GHz. This is just a guess though - confident predictions on frequencies need to be tempered after Microsoft blew the ceiling off GPU clocks in a small form factor with its latest console design. We may also be under-estimating performance here not just in terms of frequency, but also in IPC (instructions per clock). We tested a first-gen Ryzen here - AMD has two further iterations planned before 2020, variations of which are likely to form the basis of next-gen's CPU technology.
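The hunch above is simple to sanity-check. Here's a minimal sketch of the arithmetic, assuming - as we do - that the next process shrink delivers the same proportional clock uplift Jaguar received going from 28nm to 16nm; the shipped console clocks are real, everything else is projection:

```python
# Shipped Jaguar CPU clocks (GHz): Xbox One at launch on 28nm,
# and Xbox One X on 16nm after Microsoft's frequency bump.
XBOX_ONE = 1.75
XBOX_ONE_X = 2.3

# The uplift Microsoft achieved with the process shrink - roughly 31 per cent.
uplift = XBOX_ONE_X / XBOX_ONE

# Assumption: the same proportional gain happens again on the next node.
projected = XBOX_ONE_X * uplift

print(f"uplift: {uplift:.0%}")              # roughly 31%
print(f"projected clock: {projected:.1f}GHz")  # roughly 3.0GHz
```

That's how we land on 3.0GHz for the test - a projection from one data point, not a prediction of final silicon.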
We didn't go in expecting to deliver anything like firm benchmarks, more an indication of what Ryzen is capable of on gaming workloads that have vexed the current-gen machines. Indeed, there are many reasons why our results with our Ryzen candidate may be lower than expected - there's none of the to-the-silicon optimisation console developers like to pursue, while AMD itself has stated that legacy PC games require updates to get the most out of the new architecture. On top of that, the PC versions don't have access to the stripped-back APIs used on consoles, meaning additional overhead. Regardless, the results on their own terms impress.
Take The Witcher 3, for example. It's a title very much based on current-gen constraints, targeting 30Hz on consoles - a frame-rate objective it generally managed to hit after several patches. In general gameplay in the open world, our Ryzen candidate hits 100-120fps once GPU limits are effectively removed. In our CPU-busting Novigrad City benchmark, we're still at 80 to 90fps or thereabouts.
Just Cause 3? The same kind of physics work that bludgeons the console Jaguar cores into submission occupies an area between 55 to 80fps. It's a transformative experience, and we see much the same range of frame-rates from Assassin's Creed Unity in its busiest, most NPC-heavy scenes - and this title is important, in our view. It was a first-gen try-out of a new level of world simulation: flawed and limited in some respects but a genuine attempt to kick off a generational leap in world fidelity. There's a strong argument that owing to console hardware constraints, we never got to see where further iterations of that technology would have led us.
A move to Ryzen for the next wave of consoles could liberate game-makers to scale up their ambitions in terms of simulation. However, that's not to say the current-gen systems haven't hosted some advanced games - developers have pivoted to make the most of the technology available. Horizon Zero Dawn is an open world technical showcase with some phenomenal simulation - key aspects are skewed towards procedural elements, hived off to the GPU. To give you some idea of the scale of Guerrilla's work here, cliffs and mountains have an 'erosion' variable that generates rocks smoothed off by billions of years of exposure to the elements - it's amazing. Even cloud simulation is accurately mapped - and that's just the tip of the iceberg. Meanwhile, we've yet to see a true current-gen showcase from the masters of world simulation, Rockstar. Red Dead Redemption 2 should put that to rights and we're fascinated to see what they come up with.
But looking back to GTA4 and its massive generational leap over prior series entries, what more could Rockstar produce with Ryzen given the remarkable results the firm achieved with the heavier CPU bias in last-gen designs? Given the ability to ramp up simulation, maybe the deluge of Ubisoft open worlds would be significantly better, as well as bigger and prettier, than their last-gen counterparts. From our perspective, while the teraflop race is likely to define the expected power level of the true next-gen consoles, it's actually the CPU that may prove to have the most meaningful impact on the scope of the gameplay experiences we get from the new hardware.
Test results with a four core, eight thread Ryzen are promising - but the sobering reality is that PC benchmarking actually sees it fall some way short of a mainstream Core i5 overall, performance zigzagging between i3 and i5 levels at any given point, depending on the content. Even the Pentium G4560 puts in a stern challenge here - not quite what we expected when we were hoping for something more akin to a mini i7. On a superficial level, it illustrates how wide the gulf is between PC and console CPU power, but on the other hand, bearing in mind Jaguar's low performance level, developers have extracted much more from it than is possible in the PC space, and we'd expect the same from Ryzen. And we can't rule out the console manufacturers spinning their own custom variant, of course.
This year's release of the Xbox One X almost certainly represents the last hurrah for AMD's Jaguar technology - and exactly what replaces it is a decision that Sony and Microsoft can't take lightly. The end result will have profound implications on the sophistication of the games we'll be playing well into the next decade. The good news at least is that the core technology exists right now to dramatically increase game simulation quality in line with the fidelity of the visuals - and we'll be fascinated to investigate the power of AMD's desktop Ryzen APUs too, due later this year.