Real-time or not?

I never know if we should be flattered by how many times this comes up when we release stuff… so to save it having to be answered in a dozen different forums, I’ll explain here how we made various things you’ve seen recently… that fair?

Let’s start with the latest piccy, released today. Sony keep our PR on a pretty tight leash (cos they have plans etc.), but because we have some nice fans on the Ninja Theory forums, Chris and Tam asked if we could give them something pretty. So Sony allowed us one shot… and they decided that something that would make a nice desktop backdrop would be good.

As I understand it, yesterday two guys from our character art team (Stu and Ben) and Tameem booted up a PS3 and captured a shot from the engine. We have two ways of capturing: one via an expensive real-time capture device (a ‘BlackMagic’ video capture card), and the other by dumping frames direct from the engine.
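For the curious, the “dump frames” route is conceptually just reading back the frame buffer and writing it to disk. Here’s a minimal sketch of that idea in C++; the function name, the TGA output format and the BGRA buffer layout are my own assumptions for illustration, not our actual engine code.

```cpp
// Minimal sketch of dumping a rendered frame to disk as an uncompressed TGA.
// Hypothetical names and layout; stands in for an engine's real capture path.
#include <cstdio>
#include <cstdint>
#include <vector>

// Write a 32-bit BGRA frame buffer out as a TGA file.
bool dumpFrameTGA(const char* path, const uint8_t* bgra,
                  uint16_t width, uint16_t height)
{
    FILE* f = std::fopen(path, "wb");
    if (!f) return false;

    uint8_t header[18] = {0};
    header[2]  = 2;              // uncompressed true-colour image
    header[12] = width & 0xFF;   // width, little-endian
    header[13] = width >> 8;
    header[14] = height & 0xFF;  // height, little-endian
    header[15] = height >> 8;
    header[16] = 32;             // 32 bits per pixel (BGRA)
    header[17] = 0x28;           // top-left origin, 8 alpha bits

    std::fwrite(header, 1, sizeof(header), f);
    std::fwrite(bgra, 4, size_t(width) * height, f);
    std::fclose(f);
    return true;
}

int main()
{
    // Dummy 1280x720 buffer standing in for a back-buffer readback.
    std::vector<uint8_t> frame(1280 * 720 * 4, 0);
    dumpFrameTGA("frame_0001.tga", frame.data(), 1280, 720);
}
```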

I expect they used the second, which would produce a 1280×720 image (rendered with 4xAA, of course). Because this was meant as a backdrop, they then resized it and touched it up a little. The key touch-ups are extra specular highlights on the eyes, plus depth of field and blur to make it more dreamy and to reflect things they’re hoping to achieve before the game’s release (eye specular has been an area of research for quite some time, and it makes such a difference).
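As an aside, the kind of highlight they painted on in Photoshop is essentially what a standard Blinn-Phong specular term gives you in a shader. Here’s a small sketch of that textbook term in C++; our actual eye shader is its own research area and certainly more involved, and the names below are just for illustration.

```cpp
// Textbook Blinn-Phong specular term; illustrative only, not our eye shader.
#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// n: surface normal, l: direction to light, v: direction to camera,
// shininess: tightness of the highlight. Directions assumed normalized.
float blinnPhongSpecular(Vec3 n, Vec3 l, Vec3 v, float shininess)
{
    Vec3 h = normalize({ l.x + v.x, l.y + v.y, l.z + v.z }); // half vector
    float nDotH = std::max(dot(n, h), 0.0f);
    return std::pow(nDotH, shininess);  // higher shininess = smaller, sharper glint
}
```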

That’s what you see: it’s rendered in-engine on PS3 using our cut-scene version of Nariko, but with a little Photoshop magic on top. And yes, it runs in real-time etc.

Next up, the video footage seen in Heroes and the Production video earlier this week. It’s in-game, untouched, direct captures done about 3-4 months ago. This is clear because in at least one frame you can see a ‘twisty’ bloke, which happens when an animation hasn’t been re-exported. Given the thousands of animations we have, we sometimes miss one, and in this case it even made it into the video, doh!