After connecting LIVE FACE and getting familiar with it, you can link your CrazyTalk Animator character to a real actor and use the actor's face to drive the character's expressions.
By using the Facial Mocap Controls in the mask pane, you can mask out unwanted facial features on the dummy and extract only the facial motions you need.
By default, all facial features are selected for full-face capture.
Real human expression. | All face features receive the movement data. |
You can use the multi-layer technique to record facial expressions for each facial feature separately instead of capturing the entire face at once. This is especially useful when a character has a starting pose and you wish to gradually layer expressions onto individual facial features.
Real human expression. | Only the eyeballs receive the movement data. |
Real human expression. | Only the mouth and jaw receive the movement data. |
Real human expression. | Only the head receives the movement data. |
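Conceptually, the mask pane works like a per-feature filter: movement data flows through only for the features you leave enabled. The sketch below illustrates this idea with hypothetical feature names and frame values; it is not CrazyTalk Animator's actual API.

```python
# Hypothetical illustration of facial-feature masking for mocap data.
# Feature names and the frame layout are assumptions for this example,
# not part of CrazyTalk Animator's real interface.

FEATURES = ("head", "eyeballs", "eyebrows", "mouth_jaw")

def apply_mask(frame, enabled):
    """Keep movement data only for enabled features; zero out the rest."""
    return {f: (v if f in enabled else 0.0) for f, v in frame.items()}

# One captured frame of (made-up) movement intensities from the actor.
frame = {"head": 0.4, "eyeballs": 0.9, "eyebrows": 0.2, "mouth_jaw": 0.7}

# Full-face capture (the default): every feature receives data.
full = apply_mask(frame, set(FEATURES))

# Masked capture: only the mouth and jaw receive data.
mouth_only = apply_mask(frame, {"mouth_jaw"})
print(mouth_only)
# {'head': 0.0, 'eyeballs': 0.0, 'eyebrows': 0.0, 'mouth_jaw': 0.7}
```

Recording several passes with different masks, then stacking the resulting clips, is the essence of the multi-layer technique described above.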
Open the Timeline panel and click the Face button; the recorded facial mocap data is stored as a Facial Mocap Clip in the Facial Clip track.
You may collect and export the facial mocap clips, then apply them to other characters. (View video)
To further adjust the captured facial expression clips, refer to the Five Approaches to Generating Facial Expressions section for more information.