Digital Foundry vs. OnLive UK

So then: can it possibly work?

Latency

Whether you're talking about the PC, the home consoles or OnLive, every modern video game has lag. The process of taking input from the player, processing it within the console and displaying the result on-screen takes a surprisingly long time, and this is commonly described as input lag or controller lag. On top of this we need to add display latency - the time taken for a flat panel display to accept the signal and bring it up on-screen - and you'd be surprised at just how long this can take too.

OnLive has additional challenges. Input needs to be processed from the player, beamed across to the company's servers in Luxembourg, and only then can the game processing of the player's input begin. Once the frame is rendered, it needs to be compressed, beamed back from Luxembourg, decoded and displayed on-screen. That being the case, it's safe to say that OnLive's achievement in making its system playable at all is nothing short of remarkable. So how does OnLive itself explain its achievement?

"Video games today, when they're built for Xbox 360, PS3 or even PC, they have pre-render queues," OnLive's Steve Perlman explains.

"In order to get as much realism as they can with the processing hardware they have, they introduce multi-frame lag in games. There is some period of delay before the result hits the screen. We're able to compensate for that because we have state-of-the-art servers with very high performance GPUs. A 2005 class Xbox or PS3 game, when you put it on a 2011 class server, we don't have to have that pre-render queue. Instead, we use that time for the network delay. The algorithm keeps getting better and better."

There's merit to Perlman's argument. As regular Digital Foundry readers will know, generally speaking, 60 frames-per-second games have lower input lag than 30FPS games. As a ballpark measurement, with the latter you can expect latencies of between 100ms and 133ms, while with the former that figure drops to between 66ms and 83ms.
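To make that arithmetic concrete, here's a minimal sketch, assuming the ballpark ranges above correspond to whole frames of pipeline delay (the frame counts are our illustrative assumption, not a measured breakdown):

```python
def pipeline_latency_ms(frames_in_pipeline: int, fps: float) -> float:
    """Time for a player's input to traverse a pipeline of whole frames."""
    return frames_in_pipeline * 1000.0 / fps

# Roughly 3-4 frames of pipeline at 30FPS, or 4-5 frames at 60FPS,
# reproduces the ballpark ranges quoted above.
print(pipeline_latency_ms(3, 30), pipeline_latency_ms(4, 30))  # ~100ms, ~133ms
print(pipeline_latency_ms(4, 60), pipeline_latency_ms(5, 60))  # ~67ms, ~83ms
```

The point is simply that each frame in the pipeline costs half as much time at 60FPS, and that saved time is what OnLive spends on the network and the video codec.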

OnLive's recipe for playable gameplay is fairly straightforward then: run as close to 60FPS as possible and use the time saved for the encoding, transmission and decoding of video. How successful is it? Let's take a look at some latency measurements.

Unreal Tournament 3 and Borderlands demonstrate that OnLive can achieve latencies that aren't a million miles away from the home console experience, but in our testing session DiRT 3 disappointed.

Here we use a Ben Heck latency controller monitor board, which lights up LEDs when buttons are pressed. A 60FPS camera records both the board and the display, and we simply count the frames between the LED lighting up and the action playing out on-screen.

Measurements of gun muzzle flashes, brake lights on cars and easily identifiable motions are good for getting metrics, and it's in this way that developers like BioWare and Infinity Ward measure how fast their games respond. It's also the way in which we determined that Criterion's Need for Speed: Hot Pursuit is the most responsive 30FPS video game we've ever tested on console, weighing in at an impressive 83ms - the same as a great many 60Hz titles out there.
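Converting a frame count from the 60FPS camera into a latency figure is simple arithmetic. Here's a minimal sketch of that conversion (the sample frame count is just an illustration):

```python
CAMERA_FPS = 60  # each captured frame spans 1000/60, roughly 16.67ms of real time

def frames_to_latency_ms(frame_count: int) -> float:
    """Convert a count of 60FPS camera frames into milliseconds of lag."""
    return frame_count * 1000.0 / CAMERA_FPS

# Five frames between the LED lighting up and the on-screen response is the
# 83ms figure quoted for Need for Speed: Hot Pursuit above.
print(round(frames_to_latency_ms(5)))  # 83
```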

So what do we see with OnLive? Here's where we had some problems. In our testing sessions with titles like LEGO Batman, Just Cause 2 and Deus Ex: Human Revolution we were unable to get any kind of consistency in our metrics - if we perform the same motion in exactly the same place with absolutely nothing else occurring on-screen, we should reasonably expect consistent latency, but we didn't get it. We can only assume that network conditions were challenging, bearing in mind that all of the measurements fluctuated at 200ms and above.

However, our initial tests with Unreal Tournament 3 and Borderlands earlier in the day did produce results consistent enough to give us the confidence to go ahead with an intriguing comparison. Here, we compare the exact same conditions across the PC, Xbox 360 and OnLive versions of the same game. Remember that the OnLive and PC code is essentially identical, so these figures give us an idea of the overall encoding/transmission/decoding overhead, while the console latency gives us a ballpark target for OnLive to match.

Game                   OnLive   PC       Xbox 360
Unreal Tournament 3    150ms    66ms     116ms
Borderlands            166ms    50ms     133ms
DiRT 3                 216ms    100ms    116ms

Unreal Tournament suggests that in optimal conditions, OnLive adds a five-frame lag, equivalent to 83ms. This may sound quite alarming, but some flat panel displays out there add that much locally. Borderlands' local lag was tremendously low on PC at 50ms (it's difficult to imagine how much faster it could actually be), but here the gap to OnLive grows to 116ms - the equivalent of seven frames - and in some cases elsewhere we even recorded 183ms latencies. One explanation is that the OnLive version clearly wasn't running at 60 frames per second when we played it, adding to the delay. Finally, with DiRT 3 we see a relatively high local latency on the PC version, and again the same seven-frame delay on OnLive. Regardless, it all demonstrates that OnLive really needs titles with low local latency in order to provide a more satisfying level of response.
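As a sanity check on those frame counts, the OnLive overhead can be read straight off the table by subtracting the local PC figure and dividing by the 16.67ms frame time of a 60Hz display. A minimal sketch using the measurements above:

```python
FRAME_MS = 1000.0 / 60  # one frame at 60Hz, roughly 16.67ms

# Latency figures from the table above (milliseconds).
results = {
    "Unreal Tournament 3": {"onlive": 150, "pc": 66, "x360": 116},
    "Borderlands":         {"onlive": 166, "pc": 50, "x360": 133},
    "DiRT 3":              {"onlive": 216, "pc": 100, "x360": 116},
}

for game, ms in results.items():
    overhead = ms["onlive"] - ms["pc"]  # encoding/transmission/decoding cost
    frames = overhead / FRAME_MS
    print(f"{game}: +{overhead}ms over local PC (~{frames:.0f} frames at 60Hz)")

# Unreal Tournament 3: +84ms over local PC (~5 frames at 60Hz)
# Borderlands: +116ms over local PC (~7 frames at 60Hz)
# DiRT 3: +116ms over local PC (~7 frames at 60Hz)
```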

In all cases, OnLive doesn't really compare with the local option, but the fact that it is playable at all is a colossal achievement, and the time differential we see between PC and console versions gives some indication of the time window OnLive has in which to do its encoding/transmission/decoding magic. It's curious that while Steve Perlman accepts the general principle that the gulf in response between PC and console is what makes OnLive playable, he has some rather odd arguments about how latency can be "tuned".

"We don't tune the system by some sort of scientific measurement on latency," he reckons. "We tune the game system from a human perceptual point of view to try to make it so the game plays as good as possible."

The fact is that game developers who actually do "tune" with latency in mind - the Criterions and Infinity Wards of the world - absolutely do use scientific measurement for latency and you can bet your bottom dollar that behind the scenes, OnLive does too. The problem with human perception is simple: everyone sees the world in a different way. Science is the means by which we quantify and provide a basis for comparison and improvement.

Bearing in mind that the encoding/transmission/decoding always has to happen, we have to call into question just how OnLive can "tune" latency in any way. It's far more likely that developers are briefed to bring their input lag down as low as possible server-side. Certainly OnLive's notion that the algorithms have improved doesn't seem to compute: the measurements this year are very similar indeed to those we took 14 months ago during the USA launch period.

As with a lot of OnLive PR, there's a sense that the company is talking about the future of cloud gaming as though it is a reality in the present day, and that the technical limitations were all overcome years ago. One day we'll wake up in a world where we all have fibre-optic connections with almost limitless bandwidth, with OnLive servers well within range, and the whole thing will just work by default. But in the meantime, what we get is a combination of techno-babble and concepts that don't really make sense and certainly don't tally with the reality of the system.