Recent models of the Apple iPhone and iPad offer sophisticated facial recognition and motion tracking capabilities that can distinguish the position, topology, and movements of over 50 specific muscles in a user's face. If your iOS device has a depth camera and ARKit capabilities, you can use the free Live Link Face app from Epic Games to drive complex facial animations on 3D characters inside Unreal Engine, recording them live on your phone and in the engine.

This page explains how to use the Live Link Face app to apply live performances to the face of a 3D character, and how to make the resulting facial capture system work in the context of a full-scale production shoot. The material on this page refers to several different tools and functional areas of Unreal Engine, and you'll have the best results if you're already familiar with them. You'll need an iOS device that supports ARKit and the depth API.

Follow the instructions in this section to set up your Unreal Engine project, connect the Live Link Face app, and apply the data being recorded by the app to a 3D character. Start by enabling the required plugins for your project.

You need a character set up with a set of blend shapes that match the facial blend shapes produced by ARKit's facial recognition. You'll typically need to do this in a third-party rigging and animation tool, such as Autodesk Maya, then import the character into Unreal Engine. For a list of the blend shapes your character will need to support, see the Apple ARKit documentation.

If you need to broadcast your animations to multiple Unreal Editor instances, you can enter multiple IP addresses in the app's settings. See also the Working with Multiple Users section below. For details on all the other settings available for the Live Link Face app, see the sections below.

In the Unreal Editor, open the Live Link panel by selecting Window > Live Link from the main menu. You should now see your device listed as a subject.

In your character's animation graph, find the Live Link Pose node and set its Subject to the one that represents your device. Compile and save the Animation Blueprint. In the Details panel, ensure that the Update Animation in Editor setting in the Skeletal Mesh category is enabled.

Back in Live Link Face, point your phone's camera at your face until the app recognizes it and begins tracking your facial movements. At this point, you should see the character in the Unreal Editor begin to move its face to match yours in real time.

When you're ready to record a performance, tap the red Record button in the Live Link Face app. This begins recording the performance on the device, and also launches Take Recorder in the Unreal Editor to begin recording the animation data on the character in the engine. Tap the Record button again to stop the take.

To apply head rotation to the Actor using data from the Live Link Face app, you first need to set up Blueprints in the Event Graph and Anim Graph to drive the joints in the head.

This Blueprint is placed in the Event Graph of your character's Animation Blueprint. It takes in the yaw, roll, and pitch data from your Live Link performance and applies it to the appropriate bones in your character's rig.

Right-click in the Blueprint and add an Event Blueprint Update Animation node. This ensures the head rotation updates on every frame in which there is movement.

Next, in the My Blueprint panel, create three float variables named HeadRoll, HeadYaw, and HeadPitch. Drag each into the Blueprint and choose the Set option from the menu.

Drag off from the Event Blueprint Update Animation node and create an Evaluate Live Link Frame node. This provides the data from the Live Link source that will be stored in your float variables. Using the Subject dropdown menu, choose the subject that represents your device. Open the Role dropdown menu and choose LiveLinkBasicRole.

Right-click near the Evaluate Live Link Frame node and create three Get Property Value nodes. These are used to get the yaw, roll, and pitch values from the Live Link app. Using the Property Name box on each node, set one to headYaw, another to headRoll, and the last to headPitch.

Connect the Valid Frame output of the Evaluate Live Link Frame node to the Set variable nodes. Finally, connect the Value output of each Get Property Value node to the float input on its corresponding Set variable node.

With the head and neck rotation data coming in from the app, it can now be applied to the character's rig in the Anim Graph. This section is added to the end of the Anim Graph, right before the Output Pose node.

Starting from Output Pose, drag off from the Result pin and create a Component To Local node. Next, you will need a Transform (Modify) Bone node for each head and neck bone in your character's skeleton that will be influenced by the rotation data.
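The blend-shape requirement described above amounts to a simple check: every coefficient name that ARKit streams must have a matching morph target on the imported character. Here is a minimal plain-Python sketch of that check. The function and the validation step are hypothetical (this is not an Unreal or ARKit API), and only a small sample of the real ARKit blend-shape names is shown; see Apple's ARKit documentation for the full list.

```python
# Hypothetical sketch: verify that a character's imported morph targets cover
# the blend shapes the Live Link Face app streams. The names below are a
# small sample of ARKit's blend-shape coefficients, not the full set.
ARKIT_BLEND_SHAPES = {"jawOpen", "eyeBlinkLeft", "eyeBlinkRight", "browInnerUp"}

def missing_blend_shapes(character_morph_targets):
    """Return the ARKit blend shapes the character does not support."""
    return ARKIT_BLEND_SHAPES - set(character_morph_targets)

# A character missing two of the sampled shapes:
missing = missing_blend_shapes({"jawOpen", "eyeBlinkLeft"})
# `missing` contains "eyeBlinkRight" and "browInnerUp"
```

Running a check like this after import makes it obvious which shapes still need to be authored in the rigging tool before the app's data can drive the face.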
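The Event Graph section above boils down to per-tick logic: evaluate a Live Link frame, and only when the frame is valid, copy the headYaw, headRoll, and headPitch properties into the three float variables. The following is a plain-Python stand-in for that logic, not the Unreal Blueprint or C++ API; the `frame` dictionary is a hypothetical stand-in for an evaluated Live Link frame.

```python
# Minimal sketch (plain Python, not the Unreal API) of what the Event Graph
# does each tick: if Evaluate Live Link Frame yields a valid frame, copy the
# headYaw/headRoll/headPitch properties into float variables.
class HeadRotationState:
    def __init__(self):
        self.head_yaw = self.head_roll = self.head_pitch = 0.0

    def update(self, frame):
        """`frame` stands in for an evaluated Live Link frame; None means invalid."""
        if frame is None:  # Valid Frame pin did not fire: keep the last values
            return
        self.head_yaw = frame["headYaw"]
        self.head_roll = frame["headRoll"]
        self.head_pitch = frame["headPitch"]

state = HeadRotationState()
state.update({"headYaw": 10.0, "headRoll": -2.5, "headPitch": 4.0})
state.update(None)  # invalid frame: previous values are retained
```

Gating the copy on frame validity mirrors why the Valid Frame output drives the Set nodes: stale or missing data should never overwrite the last good head pose.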
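To make concrete what "applying yaw, roll, and pitch to a bone" means mathematically, the three angles compose into a single rotation, which is the kind of transform each Transform (Modify) Bone node applies. The sketch below uses one common convention (yaw about Z, pitch about Y, roll about X, composed as Rz·Ry·Rx); this convention is an assumption for illustration, not necessarily the one Unreal uses internally.

```python
import math

# Illustrative only: compose yaw/pitch/roll (in degrees) into a 3x3 rotation
# matrix. Axis conventions (yaw about Z, pitch about Y, roll about X) are an
# assumption for this sketch.
def rotation_matrix(yaw, pitch, roll):
    cy, sy = math.cos(math.radians(yaw)), math.sin(math.radians(yaw))
    cp, sp = math.cos(math.radians(pitch)), math.sin(math.radians(pitch))
    cr, sr = math.cos(math.radians(roll)), math.sin(math.radians(roll))
    # R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

m = rotation_matrix(90, 0, 0)  # a pure 90-degree yaw
```

With a pure 90-degree yaw, the bone's forward (X) axis rotates onto the Y axis, which is the behavior you would expect to see on the head bone when turning the head fully to one side.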