Tutorial – Using the Live Link Face app for iPhone 10+ to record facial animation assets for MetaHumans in Unreal Engine

This tutorial will walk you through the steps of bringing your MetaHuman to life with facial mocap straight from your iPhone. You will be guided through setting up a new project ready for animation, importing your MetaHuman and connecting it to Live Link, and finally recording your animation and saving it as a separate asset that you can reuse on any other MetaHuman later.

The Live Link Face app only works on an iPhone X or later, so you will need access to one of these before beginning this tutorial. The screenshots in this tutorial are from UE 4.26.2, but the same steps also work in UE5, although the UI will look different. The main difference is that in UE5, Quixel Bridge is incorporated directly into the engine, whereas on Unreal Engine 4 you will need to download Quixel Bridge from here: www.quixel.com/bridge

1 – Starting a new project

2 – Downloading and importing the MetaHuman into Unreal Engine

3 – Enabling necessary plugins

Click “Enable Missing” on any plugin and Project Settings warnings that appear after importing the MetaHuman
Enable the Live Link, Apple ARKit and Apple ARKit Face Support plugins (Edit > Plugins), then restart the editor when prompted
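If you prefer to enable the plugins outside the editor, they can also be added to the project's .uproject file, which is plain JSON. Below is a minimal sketch of that approach using plain Python; the project path is a hypothetical placeholder, and the plugin names are the engine's internal names for the three plugins listed above. Run it with the editor closed, then reopen the project.

```python
# Minimal sketch: enable the required plugins by adding them to the .uproject
# file (the same effect as ticking them in the Plugins window).
import json

# Hypothetical path -- point this at your own project file.
uproject_path = r"C:/UnrealProjects/MetaHumanMocap/MetaHumanMocap.uproject"

with open(uproject_path, "r") as f:
    project = json.load(f)

required = ["LiveLink", "AppleARKit", "AppleARKitFaceSupport"]
plugins = project.setdefault("Plugins", [])
already_listed = {p["Name"] for p in plugins}

for name in required:
    if name not in already_listed:
        plugins.append({"Name": name, "Enabled": True})

with open(uproject_path, "w") as f:
    json.dump(project, f, indent="\t")
```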

4 – Placing your MetaHuman into the scene

Drag the MetaHuman Blueprint from the Content Browser into your scene
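Dragging the Blueprint in from the Content Browser is all that is needed, but if you are scripting your scene setup, the same thing can be done with the editor's Python scripting. This is a minimal sketch using UE 4.26's EditorLevelLibrary, assuming the Python Editor Script Plugin is enabled; the asset path is a hypothetical example, so replace it with the Blueprint of the MetaHuman you actually imported.

```python
# Minimal sketch: spawn a MetaHuman Blueprint into the open level, the
# scripted equivalent of dragging it in from the Content Browser.
import unreal

bp_path = "/Game/MetaHumans/Ada/BP_Ada"  # hypothetical MetaHuman Blueprint path
metahuman_bp = unreal.EditorAssetLibrary.load_asset(bp_path)

actor = unreal.EditorLevelLibrary.spawn_actor_from_object(
    metahuman_bp,
    unreal.Vector(0.0, 0.0, 0.0),   # spawn location in the level
    unreal.Rotator(0.0, 0.0, 0.0),  # facing direction
)
print(f"Spawned {actor.get_name()} into the level")
```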

5 – Connect your MetaHuman to the Live Link Face app on your iPhone X (or later)
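The Live Link Face app streams over your local network: in the app's Live Link settings you add a target using the IP address of the machine running Unreal Engine. If you are not sure which address to type in, the short standard-library Python sketch below prints the local IPv4 address your machine uses for outbound traffic; ipconfig on Windows or ifconfig on macOS gives you the same information.

```python
# Minimal sketch: print the local IPv4 address to enter as a target
# in the Live Link Face app. Uses only the Python standard library.
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
    # No traffic is actually sent; connecting a UDP socket just selects
    # the outbound interface so we can read its address.
    s.connect(("8.8.8.8", 80))
    print("Enter this IP in Live Link Face:", s.getsockname()[0])
finally:
    s.close()
```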

6 – Recording a performance with the Take Recorder

7 – Saving the take as an animation for later use in Sequencer

8 – Reusing your animation asset in Sequencer

Posted by Connor Shine

4 responses to “Tutorial – Using the Live Link Face app for iPhone 10+ to record facial animation assets for MetaHumans in Unreal Engine”

  1. Heya, how would one go about combining a facial animation from Live Link with an existing body animation in Sequencer without losing the head rotation? I’ve tried so many things and I’m still stuck on the issue after almost 2 weeks (using UE5).

    1. Hi Lena, you might look into using Control Rig, or separating out the head rotation specifically with IK animation nodes.

      We wrote a bit about this in a blog post here:

      Building a responsive cinematic animation system in Unreal Engine 4

      1. Thank you, I will have a look at the post!

  2. Hello, everything works fine in my editor, but in a packaged build the application does not see my phone (iPhone and iPad). Can you help me?
