In Theory: Is this how OnLive works?
Digital Foundry mulls over shots and expert opinion.
First impressions are promising. There are obvious compression artefacts in the background foliage and the colour looks muted (streaming video runs in a lower-precision pixel format so this is to be expected). The fact that the leaker has chosen to take screenshots of almost completely static scenes means that the encoder is not really being stressed at all here. We're not seeing OnLive under challenging conditions, but in what amounts to a best-case scenario - beta testers have spoken of a blur that accompanies footage in motion. As an aside though, these shots do allow us to confirm that the image is native 720p.
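To illustrate why streamed video tends to look muted and soft, here is a minimal sketch of the arithmetic, assuming the "lower-precision pixel format" in question is the 4:2:0 chroma subsampling typical of streaming codecs (an assumption on our part; OnLive has not confirmed its pixel format):

```python
# Bytes per 720p frame at full 4:4:4 chroma versus 4:2:0 subsampling,
# assuming 8 bits per sample. Figures are illustrative, not OnLive-confirmed.
W, H = 1280, 720

full_chroma = W * H * 3                        # Y, Cb, Cr all at full resolution
subsampled = W * H + 2 * (W // 2) * (H // 2)   # Y full, chroma at quarter resolution

# 4:2:0 halves the raw data before the encoder even starts, at the cost
# of colour detail - one reason streamed colour can look washed out.
print(full_chroma, subsampled, subsampled / full_chroma)
```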
On the flipside, claims that you'll be able to run Crysis at max settings on OnLive don't quite compute: having spent quite some time with the game recently, the lack of texture definition and advanced lighting suggests settings closer to the medium level. To be honest, that is probably for the best. More detail means more strain on the encoder, and if fine detail ends up being lost in the encoding process anyway, there is a strong argument that rendering it in the first place isn't such a great idea.
All of which suggests that OnLive has a place as a provider of a low to mid-range gameplay experience, but not the complete PC/console replacement that it has been marketed as thus far. But of course it all depends on what is defined as "low to mid-range". For us high-end gamers accustomed to pin-sharp HD gameplay, the inevitable variance in picture quality and the latency inherent in the concept and technology instantly put OnLive at a disadvantage, but what of a more typical living room environment? Sitting 10 feet away from the TV, does compression artefacting matter so much? Will a more casual gamer really care if the controls are laggier?
Perlman's presentation demonstrates some intriguing new possibilities - specifically the notion of broadcasting gameplay to potentially hundreds of thousands of spectators over IP. To achieve this he talks about two levels of encoding: the first being the direct link to the gamer (Jason Garrett-Glaser's explanation makes a lot of sense based on what has been seen so far, and it can be put to the test when high-quality direct-feed video becomes available), the second being a stream dedicated to media and other broadcasting features. Presumably this is a high-bandwidth output from the host machine which can then be re-encoded again to be beamed out to anyone, whether they're using HD, SD or even mobile feeds.
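The two-tier design described above can be sketched in a few lines. This is purely our reading of Perlman's description, not confirmed architecture: the profile names and bitrates below are invented for illustration. The key property is that the game host encodes once, and a separate broadcast tier fans that feed out per audience, so spectators add no load to the machine actually running the game:

```python
from dataclasses import dataclass

@dataclass
class StreamProfile:
    name: str
    width: int
    height: int
    bitrate_kbps: int

# Hypothetical figures - OnLive has published none of these.
MEZZANINE = StreamProfile("mezzanine", 1280, 720, 15000)  # high-bandwidth host output

TARGETS = [
    StreamProfile("hd", 1280, 720, 5000),
    StreamProfile("sd", 640, 360, 1500),
    StreamProfile("mobile", 320, 180, 400),
]

def fan_out(source, targets):
    """Re-encode the single mezzanine feed once per spectator profile.

    The broadcast tier only offers feeds it can actually derive, i.e.
    those at or below the bitrate of the source it receives."""
    return {t.name: t for t in targets if t.bitrate_kbps <= source.bitrate_kbps}

feeds = fan_out(MEZZANINE, TARGETS)
print(sorted(feeds))  # ['hd', 'mobile', 'sd']
```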
There's nothing implausible in this concept and the potential here is indeed quite extraordinary. Game developers with software in beta can see exactly how the game is being played and what causes issues. They can even identify parts of the game design that don't work as well as they should through the process of watching gamers playing, and use that as the basis for improving their work. Viewers can watch expert players and see how to progress in tricky areas of the game. Players themselves can save off 15-second clips to share with friends, the so-called "Brag Clips" that are also derived from this media stream.
Having the ability to reach out like this with gameplay video and deliver it onto any home PC, Mac or even selected mobile phones demonstrates how the concept brings something new to gaming that we should fully expect to see cloned within PSN and Xbox Live at some point in the future. Were OnLive to integrate features like this outside of its own ecosystem (for example, with forum or Facebook embedding), that reach would become pretty astonishing.
Our initial response to OnLive after last year's GDC 09 presentation was to challenge the claims and the technobabble surrounding the debut. In this respect, some things don't change. Watching the latest video from Columbia University still feels like an exercise in filtering out the stuff that simply doesn't make sense (at one point, when questioned on the compression algorithm, Perlman literally calls it a "bunch of mathematics") in an attempt to get to the heart of the tech and what it truly represents to gamers.
Perlman's example of 80ms latency from button-press to on-screen action still doesn't sit right in a world where the ultra-crisp Call of Duty 4 updates at 66ms locally when running in optimum conditions. Fitting transmission, reception, encoding and decoding into 14ms on top of the controller latency borders on the unbelievable, and it's still not clear if OnLive is targeting 60FPS or not - the figure seemed absent in Perlman's latest presentation.
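The arithmetic behind that scepticism is worth laying out explicitly, using only the figures above:

```python
# Rough latency budget implied by Perlman's 80ms claim.
# All values in milliseconds, taken from the figures discussed above.
TOTAL_CLAIMED = 80    # button-press to on-screen action (Perlman's example)
LOCAL_PIPELINE = 66   # Call of Duty 4 running locally in optimum conditions

# Whatever is left over must cover everything OnLive adds on top of a
# local game: frame capture, encode, uplink, network transit, downlink,
# decode and display.
remaining = TOTAL_CLAIMED - LOCAL_PIPELINE
print(remaining)  # 14
```

A 14ms ceiling for the entire capture-encode-transmit-decode chain is what makes the claim so hard to credit, given that broadcast-industry encoders alone typically take far longer than that.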
Ironically, OnLive's credibility got far more of a boost from the straight-talking Gaikai team: no technobabble about compression schemes, bandwidth consumption figures right there on-screen, no talk about 1ms encoders that outperform the best the broadcast industry has to offer, and no suggestions of a 720p60 console-replacement experience. Put Perry and Perlman's presentations side by side, factor in the extra bandwidth OnLive uses and factor out the marketing-speak; suddenly OnLive becomes a much less fantastical proposition.
Off the record, beta testers we've heard from still raise questions about the latency, and whether the picture quality is good enough, and obviously there remains a huge amount of uncertainty over server availability and how the service scales when the load increases exponentially as more users join. But in the here and now, the nuts and bolts of the system are working outside of OnLive's controlled-condition demos, and despite the NDAs, details and experiences are emerging. However, whether the performance matches OnLive's many claims still remains to be seen...