
We were arch-sceptics for years - but cloud gaming can work

Nvidia's GeForce Now and infrastructure upgrades deliver a startling proof of concept.

This week's Digital Foundry Direct Weekly delivers yet another content coverage 'war story' as we tried to do our best to cover Star Wars Jedi: Survivor, despite no PC code arriving until the day before release and no console review code at all until launch day. Attempts to source it earlier via retail stores helped, but from a creator point of view, I can't help but feel a touch disappointed that we didn't have more time with the console versions of the game. However, it's the second topic on the docket that I'm going to focus on in this blog - the CMA's blocking of the Microsoft/Activision deal owing to a perceived unfair advantage it would give Microsoft in the cloud space.

There's been plenty of analysis on whether the CMA's reservations are valid or not, but one piece of information in the documentation intrigued me: 'a rival' - likely Sony - contacted the CMA to say that the latency challenges facing cloud gaming have been overcome. The CMA was rather lacking in detail here, saying only that 'more powerful graphical processing units (GPUs)' can solve the problem. The immediate reaction online was to dismiss the idea outright: you cannot overcome the laws of physics, the argument goes, so latency will always be a problem that cannot be conquered.

I speak as a journalist who has covered 'the power of the cloud' from the very beginning, when OnLive claimed to be delivering an 'as good as local gaming' experience way back in 2011. It couldn't possibly work, I said. It did work, of course, in that a functional gaming experience was possible, but the lag was terrible and the image quality even worse. By the standards set by OnLive, it would never replace local gaming - and that's still the perception even today.

The thing is, assuming the CMA has paraphrased Microsoft's 'rival' correctly, I think this opinion has some weight.

Here's DF Direct Weekly #109, with Rich Leadbetter, Alex Battaglia and Oliver Mackenzie discussing the latest gaming and technology news.
  • 00:00:00 Introduction
  • 00:00:45 News 01: Star Wars Jedi: Survivor impressions
  • 00:23:57 News 02: Microsoft Activision purchase hits roadblock
  • 00:34:30 News 03: Microsoft pushes for Xbox energy sustainability
  • 00:48:21 News 04: ASUS ROG Ally chips benchmarked
  • 00:55:50 Supporter Q1: If RT becomes a big rendering focus, could future GPUs regress in rasterization performance?
  • 01:00:36 Supporter Q2: Are 8GB GPUs DOOMED to muddy textures on PC?
  • 01:07:33 Supporter Q3:How did Don feel about his first video?
  • 01:11:15 Supporter Q4: Will you make a GTA5 retro time capsule video when the title hits its 10th anniversary later this year?

A lot has changed since 2011 and of all the cloud gaming start-ups, it's only really Nvidia that has delivered a system that can be said to have addressed the issues. And yes, a lot of that is down to 'more powerful GPUs', alongside an incredibly robust and dogged approach to reducing and mitigating latency along the end-to-end pipeline. Check out Cyberpunk 2077 RT Overdrive on the GeForce Now RTX 4080 tier and not only are you seeing an experience that may well represent the next generation of consoles, you're also getting perfectly reasonable latency. With a mouse and keyboard, you can feel the difference in response - but it's subtle. Move to a controller - an inherently laggier form of input - and you'd be hard-pressed to tell.

So how was this achieved? The advances are plentiful. Firstly, systems from OnLive through to first-gen xCloud and Stadia all employed external encoders, meaning the GPU would finish its work before sending the frame out to a proxy 'display' (ie the encoder). GeForce Now and xCloud now integrate the encoder into the main processor, cutting down lag. Secondly, where possible, Nvidia aims to reduce latency in-game, either via its driver-level functions or via its bespoke latency-saving technology, Reflex. Latency can be further reduced simply by running the game with frame-rate unlocked - and there's no v-sync off 'tearing', as full frames are plucked from the framebuffer for video encoding.
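The encoder change can be illustrated with a simple latency-budget model. This is a minimal sketch with entirely hypothetical round-number stage timings (not measured values, and not how any of these services actually account for latency) - the point is just that moving the encode step onto the GPU removes a whole copy-and-transfer stage from the pipeline:

```python
# Illustrative end-to-end latency budget for a cloud streaming pipeline.
# All per-stage timings below are hypothetical round numbers for the sketch.

PIPELINE_EXTERNAL_ENCODER = {
    "input_transit_to_server": 7.0,      # ms, client -> server network hop
    "game_simulation_and_render": 16.7,  # one frame at 60Hz
    "copy_to_external_encoder": 8.0,     # GPU readback + hop to proxy 'display'
    "video_encode": 5.0,
    "stream_transit_to_client": 7.0,
    "client_decode_and_display": 8.0,
}

# Integrating the encoder into the GPU collapses the readback/transfer stage.
PIPELINE_INTEGRATED_ENCODER = dict(PIPELINE_EXTERNAL_ENCODER)
PIPELINE_INTEGRATED_ENCODER["copy_to_external_encoder"] = 1.0

def total_latency(stages):
    """Sum the per-stage costs into an end-to-end figure in milliseconds."""
    return sum(stages.values())

if __name__ == "__main__":
    for name, stages in [("external encoder", PIPELINE_EXTERNAL_ENCODER),
                         ("integrated encoder", PIPELINE_INTEGRATED_ENCODER)]:
        print(f"{name}: {total_latency(stages):.1f} ms")
```

Whatever the real numbers are, the structural saving is the same: one fewer synchronisation point between render and encode.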

Finally, GeForce Now also supports 120Hz and even 240Hz streaming. The faster the stream update, the lower the latency. I'm likely underplaying many of the challenges that have been overcome here, but the point is that GeForce Now is delivering 4K HDR 120Hz streaming, and it's a class apart. I'm still trying to find the time to complete my GeForce Now RTX 4080 review, but here are the latency numbers, put together by my colleague Tom Morgan using Nvidia's LDAT system - a fiddly but nigh-on foolproof way of measuring end-to-end lag to the millisecond, from button press to response on-screen.

Destiny 2 average latency (ms):

         Native PC   Native Xbox Series X   GeForce Now 3080 (PC App)   GeForce Now 4080 (PC App)
60Hz     49.0        85.0                   81.7                        82.2
120Hz    31.8        40.6                   59.5                        60.6

Fortnite average latency (ms):

         PlayStation 5   Native PC (V-Sync Off)   GeForce Now 4080 (V-Sync Off)
60Hz     96.8            35.4                     78.3
120Hz    48.0            27.4                     53.3
240Hz    -               -                        33.1
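The 'faster stream update, lower latency' point is just frame-interval arithmetic. A minimal sketch - this only computes the raw interval between frames at each rate, ignoring encode, network and display costs, so it explains part of the measured gaps above rather than all of them:

```python
def frame_interval_ms(hz):
    """Time between successive frames at a given stream rate, in ms."""
    return 1000.0 / hz

# Going from 60Hz to 120Hz halves the interval a frame can spend waiting
# to be captured and encoded (~16.7ms -> ~8.3ms); 240Hz halves it again.
for hz in (60, 120, 240):
    print(f"{hz}Hz -> {frame_interval_ms(hz):.2f} ms per frame")
```

Shaving roughly 8ms here and 4ms there sounds small, but those savings compound with every other stage of the pipeline that runs per-frame.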

There's another factor at play too: infrastructure upgrades. As you can see above, the truth is that Nvidia had cracked the latency challenge by the time it launched the GeForce Now RTX 3080 tier. Improvements since have come from higher frequency streaming - 4K120 and 1080p240 are new additions. However, the usability of cloud gaming - especially in the ADSL era - was limited by the fact that contention on the same home connection would kill the experience. If I was playing xCloud on an ADSL line and my wife loaded up Netflix, my gaming would be severely compromised. It would be worse still if something like a bandwidth-sapping Steam download kicked in.

The arrival of FTTP (fibre to the premises) has brought with it the bandwidth to run cloud gaming concurrently with other standard internet activity - though Steam downloads running without a bandwidth cap may still prove problematic. The point is that for the most part, modern infrastructure allows GeForce Now to co-exist with standard internet use within the home. Client to server latency? For me, it's typically 5-7ms. Not everyone has FTTP, of course, but within ten years? Absolutely.
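The contention problem is, at heart, a bandwidth budget. Here's a minimal sketch under assumed figures - the per-tier stream requirements and the 20 percent safety margin are illustrative numbers I've picked for the example, not published service requirements:

```python
# Hypothetical bandwidth-budget check: does a home connection keep enough
# headroom for a cloud gaming stream once other household traffic is running?

STREAM_REQUIREMENTS_MBPS = {
    "1080p60": 25,      # assumed figure for this sketch
    "4K120 HDR": 75,    # assumed figure for this sketch
}

def has_headroom(line_mbps, stream, other_traffic_mbps, margin=0.2):
    """True if the stream fits, with a safety margin, after other traffic."""
    need = STREAM_REQUIREMENTS_MBPS[stream] * (1 + margin)
    return line_mbps - other_traffic_mbps >= need

# An ADSL line collapses under contention (Netflix eating ~8Mbps)...
print(has_headroom(20, "1080p60", other_traffic_mbps=8))       # False
# ...while an FTTP line barely notices a 50Mbps Steam download.
print(has_headroom(500, "4K120 HDR", other_traffic_mbps=50))   # True
```

This is also why an uncapped Steam download remains the failure case even on fibre: unlike Netflix, it will happily expand to consume whatever the line offers.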

GeForce Now isn't a total replacement for a local console, not right now. One of the ways latency is addressed is to pipe frames to the user as quickly as possible, meaning that 'frame pacing' as such can look off. However, it stands as a practical example of how good cloud gaming can be and it's perhaps the only cloud service that has aggressively made good on the other great cloud promise: that the servers would be regularly upgraded with the latest hardware. That's how you can play Cyberpunk 2077 RT Overdrive right now at 4K max at around 50-75fps - the servers were upgraded with the latest Ada Lovelace GPUs.

As well as covering Cyberpunk 2077 RT Overdrive on less capable GPUs, we also look at the game running without a local dGPU at all, via GeForce Now RTX 4080.

I'm not going to say that the CMA's blocking of the Microsoft deal has anything to do with the viability of cloud gaming, or that 'the power of the cloud' gives Xbox an unassailable edge - xCloud itself is OK, but clearly a couple of generations behind Nvidia's technology. Where Microsoft is well positioned is that it can provide the level of infrastructure required to make cloud gaming work - it owns Azure, after all. On top of that, the route to profitability with cloud gaming is likely easier for Microsoft than anyone else now that Google is out of the market. There's no reason why xCloud server blades can't earn their keep as standard Windows servers when they're not being called upon for gaming.

Right now though, xCloud is far from the killer app it needs to be. Image quality is still problematic: there's not enough resolution or bandwidth to replace a local experience. Latency is far more noticeable too, though 60fps games claw back enough response to play out quite reasonably. Upgrades here, particularly in terms of lag reduction, are likely only viable if Microsoft decides to pursue the kind of strategies Nvidia has - and that's not easy, as so much is inherently based on what you might call the PC approach to gaming.

However, Microsoft's 'rival' mentioned in the CMA document does suggest that the challenges of cloud gaming are being actively investigated and that there may have been progress. And if you really want your mind blown, dig into id Software's patents for Bethesda's in-limbo Orion streaming system, which describes ideas like a latency-masking system using user input to influence video stream motion vectors. It sounds a bit like the 'negative latency' claims from Stadia that so many mocked. I think the point is that engineers in the gaming space love impossible challenges - and making cloud gaming work is among them.

This expands quite a bit from my thoughts in this week's DF Direct Weekly, but there's a lot of other great stuff in there too - discussion on Xbox's remarkable sustainability drive (that I hope to write about in more depth soon), discussion on the ROG Ally benchmark reveal, plus our thoughts on the struggles facing owners of 8GB graphics cards. As always, do consider joining the DF Supporter Program to get involved with the show, to talk tech with our team, and to get early access to the Direct and a bunch of other cool stuff too. In the meantime though, if you can sample GeForce Now's RTX 4080 tier, I do wonder what you make of it - and whether cloud gaming has indeed been solved in the key ways I think it has been.
