LucasArts' 60FPS Force Unleashed II tech demo
A frame-rate upscaler that really works? Digital Foundry investigates.
At the recent SIGGRAPH 2010, LucasArts coder Dmitry Andreev showed off a quite remarkable tech demo based on work he carried out during the development of Star Wars: The Force Unleashed II. In a video demonstration running on Xbox 360, he showed the game operating at its default 30FPS, then, seemingly by magic, running at 60FPS - with no apparent graphical compromises aside from the removal of motion blur.
It's a demo that you can download and see for yourself right now, either in HD or in a more bandwidth-friendly standard-def encode, with Andreev's original presentation also available to view. Two AVI videos (requiring an h.264 decoder) are in the package: an original prototype, along with a further, more refined proof-of-concept running in the Star Wars: The Force Unleashed II game engine.
It's safe to say that it's impressive work that has generated a lot of discussion within the games industry.
On a general level, there's a reason why most console games operate at 30FPS as opposed to the preferable 60: the more time a developer has to render a frame, the more complex and rich it can be. But what if a mixture of clever tech and exploitation of human perception could be used to create a 30FPS engine that looks like it's running at twice the speed?
Andreev and his colleagues have devised a system that gives an uncanny illusion of true 60FPS, and uses fewer system resources than the game's existing motion blur code. Swap out the blur for the frame-rate upscaler and you effectively have all the visual advantages of 60FPS for "free", as there's very little need to run full-on multi-sample motion blur if your game is already running at 60FPS.
Andreev first got the idea for the technique by studying 120Hz TVs that interpolate two frames to produce an intermediate image, resulting in a smoother picture. Software filters on some media players (for example, Philips' Trimension as seen in the WinDVD player) were also considered. If this approach could be replicated within the game engine, an effect far more pleasing than most motion blur algorithms could be produced. Discussions after SIGGRAPH 2008 soon led to prototyping.
"So as soon as I got back home, I started to play with it and soon after that realised that there are a lot of issues," Andreev reveals.
"Mostly the artifacts of a different kind, that appear in more or less complex scenes, as well as performance issues (it is really slow when done properly). And to better understand the problem, I made a very quick and simple prototype to play with."
The prototype used the same techniques as the available solutions, searching the image for "motion vectors" that showed how elements of the image would move from one frame to the next. The problem was that it added obvious artifacting, because not enough information was available to rebuild the interpolated intermediate image. Additionally, hunting out the motion vectors is incredibly CPU-intensive (which is a big part of why decent video encoding takes so long).
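To get a sense of the cost, consider what a brute-force estimator has to do. The sketch below is purely illustrative - the names, the 8x8 block size and the 16-pixel search window are our assumptions, not Andreev's code - but the shape of the work is representative: over a thousand candidate offsets scored per block, repeated across the whole frame, with nothing to stop two similar-looking regions producing a wrong vector and a visible artifact.

```cpp
#include <cstdint>
#include <cstdlib>
#include <climits>

// Illustrative brute-force motion estimation: for one 8x8 block of the
// current frame, search a +/-16 pixel window in the previous frame for
// the offset with the lowest sum of absolute differences (SAD).
struct MotionVector { int dx, dy; };

static int sad8x8(const uint8_t* prev, const uint8_t* cur, int stride,
                  int px, int py, int cx, int cy)
{
    int sum = 0;
    for (int y = 0; y < 8; ++y)
        for (int x = 0; x < 8; ++x)
            sum += std::abs(int(prev[(py + y) * stride + (px + x)]) -
                            int(cur [(cy + y) * stride + (cx + x)]));
    return sum;
}

// Assumes the caller keeps the search window inside the image bounds.
MotionVector estimateBlock(const uint8_t* prev, const uint8_t* cur,
                           int stride, int bx, int by)
{
    MotionVector best = {0, 0};
    int bestCost = INT_MAX;
    for (int dy = -16; dy <= 16; ++dy)        // 33 x 33 = 1089 candidates...
        for (int dx = -16; dx <= 16; ++dx) {  // ...each costing 64 pixel reads
            int cost = sad8x8(prev, cur, stride, bx + dx, by + dy, bx, by);
            if (cost < bestCost) { bestCost = cost; best = {dx, dy}; }
        }
    return best;
}
```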
Andreev soon realised that the building blocks of the rendering process itself could be re-purposed and used instead.
"We already know how things are moving as we have full control over them. This way we don't need to do any kind of estimation," Andreev says.
"Moreover, when we do the interpolation, we can handle different things differently, depending on the kind of quality we are happy with. On top of that, we can use different interpolation techniques for different parts of the image, such as layers of transparency, shadows, reflections and even entire characters."
But why interpolate at all? After all, there are some pretty impressive 60FPS titles on the market already. The reason is that dropping down to a capped 30FPS brings a whole range of new rendering technologies within reach for developers. Deferred lighting on the scale seen in games like Killzone 2, Blur and the forthcoming Need for Speed: Hot Pursuit can only really work on console with that extra rendering time available. It also makes life a hell of a lot easier for the basic process of building a game.
"It is not impossible to make a 60FPS game, obviously, but it requires a lot more strict production process for art, design and engineering," Andreev shares.
"It is fair to say that in a lot of cases, during pre-production, studios try to see what it would take to make a 60FPS game. Then, they get something that doesn't look very pretty when running at 60, realising that all the art has to be produced very carefully as well as level and game design."
One solution for making a solid 30FPS title smoother is to use motion blur, and there have been some pretty decent implementations that make the image seem much more realistic and fluid. Motion blur requires the generation of a so-called velocity buffer, which records how each pixel moves from one frame to the next. However, rather than using it to create the blur, Andreev's technique repurposes the buffer to produce an interpolated, intermediate image that is displayed at the mid-point in time between two rendered frames.
"Render the velocity buffer as we would do for motion blur. Build the middle frame. And present it at the moment in time it was intended for," Andreev says.
"Note that in case of 30 to 60FPS conversion, the inner frame has to be presented at the middle of the frame. This is all it is, no more, no less. The rest is implementation itself, which is rather tricky."
The key is to re-use as much of the available processing as possible. In the case of Andreev's demo, the depth buffer and velocity map for the next full frame are generated first; midway through processing, this data is combined with elements from the last frame to interpolate the intermediate image, before work on the next real frame continues.
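As a rough timeline - every call below is a placeholder stub of ours, standing in for real engine work - one iteration of the loop looks something like this:

```cpp
#include <cstdio>

// Placeholder stubs: each represents a stage of the real renderer.
static void renderDepthAndVelocity() { /* cheap passes for the NEXT frame */ }
static void buildInterpolatedFrame() { /* old colour + new velocity map   */ }
static void renderShadingAndPost()   { /* the expensive part of the frame */ }
static void presentAt(double ms)     { std::printf("present at %.1fms\n", ms); }

// One 33.3ms iteration of a 30FPS loop that shows two images per cycle.
void frame()
{
    renderDepthAndVelocity();  // depth buffer + velocity map for frame N+1
    buildInterpolatedFrame();  // interpolate: frame N colour, frame N+1 motion
    presentAt(16.6);           // the in-between image, shown at the mid-point
    renderShadingAndPost();    // then finish frame N+1 as normal
    presentAt(33.3);           // the real 30FPS frame
}
```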
You'd think that this technique would cause lag, but as the interpolated image is generated using elements from the next "real" frame, it actually reduces latency. Andreev's technique is single-frame based rather than dual-frame: the latter approach would require buffering two complete images, bringing a significant memory and latency overhead, while Andreev's effectively interpolates on the fly using past and future rendering elements.
"The most simple and efficient solution is to do the interpolation in-place, during the current frame, while the previous one is on screen. This way the previous front buffer can be mapped as a texture (on Xbox 360) and used for interpolation directly," he explains.
"In terms of latency there is something interesting going on. I said that there is no extra latency, which is true. But if you think about it, latency is actually reduced because we get the new visual result 16.6 ms earlier. You see the result of your actions earlier."