Framestore takes FLITE with Unreal


PROJECT INSIGHT

Trevor Hogg learns from FLITE director Tim Webber about merging two pipelines to make the ambitious short film

An image of a girl on a hoverboard, teetering on the window of a luxury high-rise in the semi-submerged London of 2053 and deciding whether to break away, together with the concept of a Memory Investigator, gave Framestore VFX supervisor and creative director Tim Webber the chance to create and direct a short film using FUSE [Framestore Unreal Shot Engine].

FLITE was shot over five days and assembled into a 14-minute sci-fi drama. It revolves around a stranger who, by recounting what he saw to local law enforcement, tries to help a young champion hoverboarder escape her oppressive manager.

Providing a guideline for the production methodology was the blockbuster feature film that earned Webber an Oscar and a BAFTA Award alongside Chris Lawrence, David Shirk and Neil Corbould. “We did a lot of things on Gravity that opened my eyes to the possibility of working in that way and using current technology to make it easier, better, quicker, and less expensive,” states Webber.

“One of the key things from Gravity was doing previs for the whole movie and going beyond normal previs, so you have the ability to make a more informed judgement of how well it works, even before getting to shooting. Expanding that to the whole process enabled us to change how you develop the creative aspects of the story.

“Because we completed everything right to the final pixel in Unreal Engine, it meant that we could work up the previs with high-quality assets, and do further work in animation that you might not normally have done like designing the final lighting. You’re seeing the whole movie in a much more complete form.

“As you shoot, performances can quite easily be dropped in. As you animate, everything is live thanks to it being in Unreal Engine. You have a lot more context than you would normally have to make judgements about how the whole thing works.”

Virtual camera and location scouting were conducted in real-time through an iPad or VR headset. “I put the actors in the VR headsets so they could understand and explore the environment, whether it be the apartment or the city,” remarks Webber. “Because everything is more interactive and immediate, it enables lots of people working on various aspects to be looking at stuff in more context, and you’re all looking at the same thing together. Someone is tweaking the lighting while another is modifying the animation.

“Very quickly you see it all together and you’re looking at something that might not be final quality, but gives you a good judge of how the end product will look and is more responsive. You can collaborate much better under those circumstances.”
