Animate a character with mocap


RADICAL LIVE | UNREAL ENGINE

Paul Hatton shows you how to transfer motion-capture data onto a digital character model using RADiCAL Live

HUGE BENEFITS Tools such as RADiCAL Live are now benefiting a range of projects

Live link motion capture is becoming an increasingly popular technology within virtual production workflows. The requirement for high-end, expensive hardware and software has almost disappeared, with free or inexpensive options taking centre stage.

The applications for this tech are widespread, including film, TV, archviz, advertising and educational video. Some cases will still need higher-end solutions, but for much of the industry these emerging tools will increasingly become the norm. Projects that could have benefited from some form of custom animation but didn’t have the budget will now find their options wide open.

One example of this tech is Wonder Studio, an AI tool that transfers motion-capture data onto a CG character for use in a live-action scene. This does all the hard work of generating motion-tracking data, roto masks and clean plates, and finally replacing a live actor with a CG character. Such developments are helping to revolutionise the way directors and creators are generating their content. Tools such as these are drastically reducing film and TV budgets, and making it possible to get live feedback on set.

Another example, this time with a game engine, is Live Link Face for Unreal Engine, which enables animators to capture facial movements to be used with MetaHuman Animator. This can all be achieved in real-time, but is unfortunately limited to use on iPhones and iPads.

The raw video and depth data captured can be applied to any MetaHuman character and offers a quick workflow for creating virtual content from real human performances. It’s also possible, via Apple’s ARKit, to stream data live to Unreal Engine, removing unnecessary steps between the actor and the final product.

RADiCAL makes use of similar technology, but applies it to a whole character rather than just the face. Nor is it limited to MetaHuman characters; it works with any character that’s properly rigged and set up. The motion-tracking data can be linked into Unity, Unreal Engine and a number of other applications. If a character model includes blendShapes or Face Morphs and is connected to RADiCAL’s Quick Rig retargeter, you can use face tracking in your scene without the need for any extra equipment.
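To make the idea of retargeting face-tracking data a little more concrete, here is a minimal, hypothetical Unreal Engine C++ sketch. It is not RADiCAL’s actual code: it simply shows how a frame of named blendshape weights could be pushed onto a character’s morph targets using Unreal’s SetMorphTarget call, with the function and data names being our own assumptions.

```cpp
// Hypothetical sketch: applying one frame of streamed face-tracking values
// to a skeletal mesh's morph targets in Unreal Engine. Illustrative only;
// a real retargeter such as RADiCAL's Quick Rig handles the name mapping
// and streaming for you.

#include "Components/SkeletalMeshComponent.h"
#include "Containers/Map.h"

// Apply a map of blendshape names to weights (0..1) onto a skeletal mesh.
void ApplyFaceTrackingFrame(USkeletalMeshComponent* Mesh,
                            const TMap<FName, float>& BlendshapeWeights)
{
    if (!Mesh)
    {
        return;
    }

    for (const TPair<FName, float>& Pair : BlendshapeWeights)
    {
        // SetMorphTarget expects the morph target name authored on the mesh;
        // a retargeter maps incoming tracker names onto those names.
        Mesh->SetMorphTarget(Pair.Key, Pair.Value);
    }
}
```

In practice you would call something like this every frame from the component receiving the live data, which is why keeping the mapping between tracker names and morph target names outside the per-frame loop matters for performance.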

RADiCAL Live includes a free starter option, plus a professional alternative with live integration and FBX export. In this tutorial, we’ll explain how to get RADiCAL up and running in conjunction with Unreal Engine 5.2.

DOWNLOAD YOUR RESOURCES For all the assets you need, go to https://bit.ly/3dworld-totalwar

01 PREPARE THE PLUGIN FILES

Download the Windows or Mac plugin from the RADiCAL