GDC: Why OnLive Can't Possibly Work
Cloud computing or cloud cuckoo land?
I love industry-shaking announcements. I love new, game-changing hardware, and I'm absolutely, almost literally exploding with excitement about the new OnLive gaming concept. I love that front-end, and I love the way OnLive uses video because video is what my company, Digital Foundry, specialises in, and what I spend a lot of my time experimenting with. I want this to be brilliant so much that it's almost painful.
The concept is remarkably simple. The actual hardware generating the visuals and running the gameplay isn't owned by you. Instead it's held somewhere else in the world. That hardware then encodes its visual output and beams it to you over the internet. The player sitting at home simply uses an existing PC or Mac (or 'micro-console') to take the video stream over IP, beaming back control inputs to the server. The advantages are very straightforward - you don't need to upgrade your hardware, the people running the servers do. And that hardware can be state-of-the-art PC kit way in advance of what Xbox 360 or PS3 are capable of, and of course it's upgradable. You'll never need to buy a game again; you'll just rent time on the ones you want to play. You'll doubtless save money and the publishers will make more of it. Piracy will be impossible.
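To picture the round trip, here's a minimal sketch of what the client side boils down to. To be clear, OnLive hasn't published any protocol details, so the address, packet format and decode step below are placeholder assumptions of mine, purely for illustration.

```python
import socket

# Placeholder address for illustration only - OnLive has published no protocol details.
SERVER = ("192.0.2.1", 9000)       # 192.0.2.x is reserved for documentation examples
FRAME_TIME = 1.0 / 60              # 60fps target: one frame every ~16.7ms

def read_controller() -> bytes:
    """Stand-in for polling the local pad/keyboard state."""
    return bytes(8)                # eight bytes of dummy input state

def decode_and_display(frame: bytes) -> None:
    """Stand-in for the client's video decoder and display output."""
    pass

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(FRAME_TIME)        # a late frame is effectively a missed frame

while True:
    sock.sendto(read_controller(), SERVER)    # controller inputs go up to the server...
    try:
        frame, _ = sock.recvfrom(65536)        # ...a compressed video frame comes back down
    except socket.timeout:
        continue                               # dropped or late: nothing to display this tick
    decode_and_display(frame)
```

Everything interesting - the game itself, the rendering, the video encode - happens at the other end of that socket.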
There's only one slight problem. Realistically, there is no way it can work to the extent suggested, and no way it can provide a gaming experience as good as the one you already have without inherent compromises. It's a great idea, and an intriguing demo that is remarkable simply because it works at all. But away from the concept and the tech demos running in controlled conditions, OnLive raises so many technical questions, and depends on overcoming so many seemingly impossible challenges, that it can't possibly work.
In essence, we're looking at several very specific challenges for OnLive to overcome - challenges that are either massive in scope, or technologically beyond the very best minds of their respective fields. For this to work, we're talking about a generational leap in not one, but several fields of technology.
The Hardware Question
To deliver the kind of performance OnLive is promising (720p at 60 frames per second), its datacenters are realistically going to require the processing equivalent of a high-end dual-core PC paired with a very fast GPU - a 9800GT at minimum, and maybe something a bit meatier depending on whether the 60fps gameplay claim works out and which games are actually running. And that's for every single connection OnLive is going to be handling.
So, let's say that Grand Theft Auto V is released via OnLive, and (conservatively) one million people want to play it at the same time. We can talk about Tesla GPUs, server clusters, the whole nine yards, but the bottom line is that the computing and rendering power we're talking about is mammoth to a degree never seen before in the games business, perhaps anywhere. There may be a way this can be handled (more on that later), but even having capacity for 'just' 5,000 clients running at the same time is a monumental effort and expense. It would be the equivalent of us running a single Eurogamer server for every reader who connects to the site at the same time. The expense involved is staggering (not to mention the heat all this hardware would generate - think of the children!).
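To put some very rough numbers on that, here's a back-of-envelope sum. The per-seat hardware cost and power draw are my own assumptions, not anything OnLive has stated - the point is the scale, not the precise figures.

```python
# Back-of-envelope: what one dedicated "seat" of server hardware implies at scale.
# All unit figures below are assumptions for illustration, not OnLive's numbers.
cost_per_seat_usd = 800        # assumed: high-end dual-core CPU plus 9800GT-class GPU
power_per_seat_w = 300         # assumed: typical draw for that class of machine under load

for concurrent_players in (5_000, 1_000_000):
    hardware = concurrent_players * cost_per_seat_usd
    megawatts = concurrent_players * power_per_seat_w / 1_000_000
    print(f"{concurrent_players:>9,} players: ~${hardware:,.0f} in hardware, ~{megawatts:,.1f}MW of load")

# Roughly $4 million and 1.5MW for 5,000 concurrent players;
# roughly $800 million and 300MW for one million.
```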
The Video Encoding Conundrum
Not only will these datacenters be handling the gameplay, they will also be encoding the video output of the machines in real time and piping it down over IP to you at 1.5Mbps (for SD) and 5Mbps (for HD). OnLive says you will be getting 60fps gameplay. First of all, bear in mind that YouTube's encoding farms take a long, long time to produce their current, offline 2Mbps 30fps HD video. OnLive is going to be doing it all in real-time via a PC plug-in card, at 5Mbps, and with surround sound too.
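To put that 5Mbps figure in context, the per-frame arithmetic is straightforward. The YouTube comparison below simply uses the 2Mbps/30fps numbers quoted above, assuming both streams are 1280x720.

```python
# How much data does each frame actually get at OnLive's quoted HD bitrate?
bitrate_bps = 5_000_000              # the 5Mbps HD stream quoted above
fps = 60
width, height = 1280, 720            # 720p

bits_per_frame = bitrate_bps / fps
bits_per_pixel = bits_per_frame / (width * height)
print(f"~{bits_per_frame / 8 / 1024:.1f}KB per frame, ~{bits_per_pixel:.3f} bits per pixel")
# -> ~10.2KB per frame, ~0.090 bits per pixel

# The offline YouTube HD encode mentioned above: 2Mbps at 30fps, also 1280x720.
youtube_bpp = (2_000_000 / 30) / (width * height)
print(f"YouTube HD: ~{youtube_bpp:.3f} bits per pixel")
# -> ~0.072 bits per pixel - and with vastly more time available to spend on each frame
```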
It sounds brilliant, but there's one rather annoying fact to consider: the nature of video compression is such that the longer the encoder has to work on the video, the better the job it will do. Conversely, the lower the latency required, the less efficient the encode can be.
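To make that trade-off concrete, here's the sort of configuration a real-time, low-latency H.264 encode forces on you - a sketch assuming ffmpeg's libx264 encoder, with settings and file names that are illustrative assumptions of mine rather than anything OnLive (or the specialist quoted below) actually uses.

```python
import subprocess

# Illustrative only: the kind of knobs a real-time, low-latency H.264 encode makes you
# turn. These are my assumptions about the trade-off, not OnLive's pipeline and not the
# settings recommended by the specialist quoted below. File names are hypothetical.
cmd = [
    "ffmpeg", "-i", "capture_720p60.avi",   # hypothetical lossless source capture
    "-c:v", "libx264",
    "-preset", "ultrafast",                 # cheapest, least efficient analysis
    "-tune", "zerolatency",                 # no B-frames, no frame lookahead
    "-b:v", "5M", "-maxrate", "5M",
    "-bufsize", "83k",                      # roughly one frame of rate-control buffer
    "-r", "60",
    "low_latency_5mbps.mp4",
]
subprocess.run(cmd, check=True)
```

Every one of those switches trades picture quality for speed; an offline encoder gets to turn them all the other way.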
More than that, OnLive overlord Steve Perlman has said that the latency introduced by the encoder is 1ms. Think about that: he's saying that the OnLive encoder runs at 1000fps. It's one of the most astonishing claims I've ever heard. It's like Ford saying that the new Fiesta's cruising speed is in excess of the speed of sound. To give some idea of the kind of leap OnLive reckons it is delivering, I consulted one of the world's leading specialists in high-end video encoding, and his response to OnLive's claims included such gems as "Bulls***" and "Hahahahaha!" along with a more measured, "I have the feeling that somebody is not telling the entire story here." This is a man whose know-how has helped YouTube make the jump to HD, and whose software is used in video compression applications around the world.
He recommended a series of settings and tweaks that would allow for H.264 processing at the kind of latencies OnLive has to work with, so here's a comparison video: source on the left, 5Mbps 60fps encode on the right. As is usual with my videos, the action is slowed down to eliminate macro-blocking on playback as much as possible. The chosen game is Burnout Paradise, which features heavily in OnLive's front-end demo and is also a good test for fast, arcade-style action.
It's not particularly pretty, but with the constraints OnLive has to live with, this is the sort of performance the current market leader in compression has to offer. The bottom line here is that OnLive's 'interactive video compression algorithm' must be so utterly amazing, and orders of magnitude better than anything ever made, that you wonder why the company is bothering with videogames at all when the potential applications are so much more staggering and immense.
The Insurmountable Challenge: Latency
OnLive says that it has conducted years of 'psychophysical' research to lessen the effects of internet latency. That's the key issue here, and I can't see how OnLive can fudge its way around this one. In reality, it's going to need sub-150 millisecond latency from its servers at the very least, plus a hell of a QoS (quality of service) guarantee, for this to in any way approximate the experience you currently have at home. The network latency will probably need to be somewhat lower still, to leave room for video encoding on the server and decoding on the client, both of which add measurable delay by any standard available today.
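As a rough illustration of where that budget goes, here's a sketch of a per-frame latency breakdown. Every figure is an assumption on my part - OnLive has published none of these numbers - but it shows how quickly 150ms evaporates even if you take the 1ms encode claim at face value.

```python
# A rough end-to-end latency budget for one frame of cloud-streamed gameplay.
# Every figure here is an assumption for illustration - none comes from OnLive.
budget_ms = {
    "controller input sampling":    5,
    "client -> server network":    30,   # a fairly generous one-way broadband trip
    "game simulation + rendering": 17,   # one frame at 60fps
    "video encode (claimed: 1ms)":  1,
    "server -> client network":    30,
    "client decode":                8,
    "display buffering / vsync":   16,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<32} {ms:>3}ms")
print(f"{'total':<32} {total:>3}ms")   # ~107ms before anything goes wrong on the network
```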