Sharing Facial Animations in Unreal Engine

In Unreal Engine, skeletal animations can be shared between different skeletons through retargeting assets. Morph Target-based facial animations can also be reused, but the right approach depends on the target model.

This article focuses on retargeting ARKit facial animations, but the methods described apply to any Morph Target animation.

Prerequisites

First, we record a facial animation using Dollars MONO and import it into Unreal Engine. For notes on importing Dollars FBX files into Unreal, see this guide.

After importing, open the animation asset to view the ARKit-standard Curve data it contains.

ARKit animation Curve data
The imported animation asset contains ARKit standard Curve data

How you apply this animation to a new model depends on its Morph Target setup. We will cover three scenarios below.

1. Matching Morph Targets

If the target model already has ARKit Morph Targets with matching names, setup is straightforward. This Ready Player Me model, for instance, comes with the full set using Apple's naming convention.

Ready Player Me model ARKit Morph Targets
The Ready Player Me model contains a complete set of ARKit Morph Targets

This is the simplest scenario. Just open the target model's Skeleton asset and select Asset Details from the Window menu.

Opening the Skeleton Asset Details panel
Open the Asset Details panel via the Window menu in the Skeleton editor

In the panel that appears, find the Compatible Skeletons section and add the skeleton used by the Dollars facial animation.

Adding a compatible skeleton
Add the skeleton used by the Dollars animation to Compatible Skeletons

Once done, create an Animation Blueprint for the target model. The Dollars exported animation should now appear in the Asset Browser.

Dollars animation in the Animation Blueprint
The Dollars exported animation sequence appears in the Animation Blueprint's Asset Browser

After connecting this animation to the output node, you may notice the model suddenly changes size or orientation.

Model distortion issue
Playing the animation directly causes model distortion with increased size and incorrect orientation

This happens because the Dollars facial animation's Root bone carries rotation data and a non-unit Scale. When the animation plays, these values are applied to the final bone pose, causing the model to deform.

Root bone rotation and scale
The Dollars animation's Root bone has a 90° rotation and non-standard Scale values

The solution is to use a Layered Blend per Bone node in the Animation Blueprint. Set the facial animation as the Base Pose input and leave Blend Poses 0 empty. In the node's Layer Setup under Branch Filters, add the target model's root bone. For this Ready Player Me model, the root bone is named Hips.

Layered Blend per Bone node setup
Use the Layered Blend per Bone node and specify the root bone Hips in Branch Filters

For Hips and its child bones, this node uses the Blend Poses 0 input. Since that is left empty, the model maintains its original bone pose. For Curves and other channels, it uses the Base Pose input, which is the facial animation data. This way, bone position and scale remain unaffected while the facial Curve data is properly passed through.

2. No Matching Morph Targets, but with Pose Asset

Some models don't have ARKit named Morph Targets, yet ship with a Pose Asset that bridges the gap. MetaHuman is a typical example. Its Pose Asset maps ARKit Curves to facial Poses.

MetaHuman Pose Asset
MetaHuman's Pose Asset defines the mapping between ARKit Curve names and facial Poses

As before, add the Dollars skeleton as a compatible skeleton in MetaHuman's Skeleton, then play the Dollars facial animation in MetaHuman's Animation Blueprint.

The difference is that you need to add an Evaluate Pose node after the animation node. In its Pose Asset property, select MetaHuman's built-in ARKit mapping asset PA_MetaHuman_ARKit_Mapping. This converts the input Morph Target curves into MetaHuman's facial Poses.

Evaluate Pose node in MetaHuman Animation Blueprint
Use the Evaluate Pose node in the Animation Blueprint to map ARKit Curves to MetaHuman facial expressions

3. No Matching Morph Targets, No Pose Asset

When the target model has neither matching names nor a Pose Asset, you need to remap the curves yourself. This Daz model, for example, uses Jaw_Open instead of jawOpen, and Eye_Blink_Left instead of eyeBlinkLeft.

Daz model Morph Target list
The Daz model uses a different Morph Target naming convention from ARKit

We have prepared an open-source tool for this scenario.

https://github.com/SunnyViewTech/UE-MorphTarget-Retargeting-Tools

This is a Python script that renames the Morph Target curves in the source animation according to a mapping file, producing a new animation sequence that can be directly used on the target model.
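The core renaming step can be sketched in plain Python. This is a simplified illustration of the idea only, not the tool's actual code: real Unreal assets are manipulated through the editor's Python API, while here curves are modeled as a plain dictionary of keyframe values.

```python
# Illustrative sketch of Morph Target curve renaming (not the tool's
# actual implementation). Curves are modeled as
# {curve_name: [keyframe values]} for demonstration purposes.

def retarget_curves(source_curves, mapping):
    """Return a new curve dict with names remapped per the mapping.

    mapping: {source_name: target_name} (hypothetical structure).
    Curves without a mapping entry are dropped, since the target
    model has no corresponding Morph Target to receive them.
    """
    retargeted = {}
    for name, keys in source_curves.items():
        target_name = mapping.get(name)
        if target_name is not None:
            retargeted[target_name] = list(keys)  # copy the keyframes
    return retargeted

# Example: ARKit curve names remapped to a Daz-style convention.
source = {"jawOpen": [0.0, 0.4, 0.8], "eyeBlinkLeft": [0.0, 1.0, 0.0]}
mapping = {"jawOpen": "Jaw_Open", "eyeBlinkLeft": "Eye_Blink_Left"}
print(retarget_curves(source, mapping))
```

The keyframe values themselves are untouched; only the curve names change, which is why the result plays back directly on the target model.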

After opening the script, configure the following key parameters.

SOURCE_ANIM_PATH  = "/Game/Capture.Capture"
MAPPING_FILE_PATH = "C:/arkit_mapping.txt"
TARGET_MESH_PATH  = "/Game/MyHead.MyHead"

SOURCE_ANIM_PATH is the asset path of the source animation sequence. You can get it by right-clicking the animation sequence in the Content Browser and selecting Copy Object Path.

Copying the animation sequence Object Path
Right-click the animation sequence and select Copy Object Path to get the asset path

MAPPING_FILE_PATH is the local path to the Morph Target mapping file. The mapping file is plain text, with two columns per line. The left column is the ARKit name from the source animation, and the right column is the corresponding Morph Target name in the target model.

Morph Target mapping file example
The mapping file maps ARKit standard names to Daz model naming. Left column is source, right column is target
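Parsing a file in this two-column format takes only a few lines of Python. The sketch below assumes whitespace-separated columns, consistent with the plain-text layout described above; the tool's own parser may differ in details.

```python
def parse_mapping(text):
    """Parse a two-column mapping file: source name, target name.

    Assumes columns are whitespace-separated and skips blank lines,
    matching the plain-text format described in the article.
    """
    mapping = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 2:
            source, target = parts[0], parts[1]
            mapping[source] = target
    return mapping

sample = """\
jawOpen       Jaw_Open
eyeBlinkLeft  Eye_Blink_Left
"""
print(parse_mapping(sample))
```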

TARGET_MESH_PATH is the asset path of the target model, also obtained via right-click Copy Object Path.

Once all parameters are set, go to the Tools menu in the Unreal Editor and select Execute Python Script, then choose the script to run.

Executing the Python script
Run the retargeting script via Tools > Execute Python Script

After execution, a new animation sequence asset is generated in the specified directory. Opening it reveals that the Curve names have been converted to the target model's naming convention according to the mapping file, ready for direct playback on the target model.

Generated retargeted animation sequence
The generated animation sequence has its Curves remapped to the target model's naming convention

One to Many Mapping

Sometimes a single source Morph Target needs to drive multiple targets. For example, jawOpen in the source might need to drive both jawOpen and tongue_jawOpen on the target model.

Multiple jaw-related Morph Targets in the target model
jawOpen and tongue_jawOpen are two separate Morph Targets in the target model

Simply add multiple lines for the same source name in the mapping file. The tool automatically detects duplicates and copies the same Curve data to all corresponding target channels.

One-to-many mapping file example
Add multiple lines for jawOpen in the mapping file to achieve one-to-many mapping
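Supporting duplicate source names amounts to collecting a list of targets per source instead of a single name. Again, this is an illustrative sketch of the behavior described above, not the tool's actual code.

```python
def parse_mapping_one_to_many(text):
    """Parse the mapping file, allowing repeated source names.

    Returns {source_name: [target_name, ...]} so a single source
    curve can drive several target Morph Targets.
    """
    mapping = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 2:
            mapping.setdefault(parts[0], []).append(parts[1])
    return mapping

sample = """\
jawOpen  jawOpen
jawOpen  tongue_jawOpen
"""
print(parse_mapping_one_to_many(sample))
# Each listed target channel then receives a copy of the jawOpen curve data.
```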

If your model doesn't fit any of the scenarios above, or you run into issues along the way, feel free to reach out. You can also open an issue on the GitHub repository or visit the Dollars MoCap documentation for more guides.