
Animations

Animations in our Unity SDK are triggered via GestureEvents and/or EmotionEvents received from the server. This package contains several animations, but they do not cover all of the GestureEvents and EmotionEvents. We recommend importing your own animation clips and setting them as states of our Animator.

Demo

Before you go through this page, we highly recommend that you check out the Emotion Sample Demo Scene (SampleEmo).

Architecture

Our animator is called InworldAnimCtrl. It contains three layers: the Idle Layer, the Gesture Layer, and the Face Layer. Please note the following:

  1. All three layers can pass IK data.
  2. The Gesture Layer and the Face Layer are additive to the Idle Layer.
  3. The Face Layer masks the face only.
  4. The Gesture Layer masks every part of the character except the face.

(Image: AnimLayer)
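
If you need to verify these layers at runtime, Unity's standard Animator API can look them up by name. The following is a minimal sketch, assuming the layer names match the ones listed above and that InworldAnimCtrl is assigned to the character's Animator; it is an illustration, not part of the SDK.

```csharp
using UnityEngine;

// Sketch: inspect the three layers of InworldAnimCtrl at runtime.
// Layer names are assumed to match the documentation; adjust them
// if your controller uses different names.
public class AnimLayerInspector : MonoBehaviour
{
    [SerializeField] Animator animator; // the Animator using InworldAnimCtrl

    void Start()
    {
        foreach (string layerName in new[] { "Idle Layer", "Gesture Layer", "Face Layer" })
        {
            int index = animator.GetLayerIndex(layerName);
            if (index < 0)
            {
                Debug.LogWarning($"Layer not found: {layerName}");
                continue;
            }
            // The Gesture and Face Layers are blended additively on top of the
            // Idle Layer; a weight of 0 effectively disables them.
            Debug.Log($"{layerName}: weight = {animator.GetLayerWeight(index)}");
        }
    }
}
```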

1. Idle Layer

The Idle Layer is triggered by the following game logic:

  1. Once the client and the server have established a connection, the character you are facing switches to a Hello state.
  2. The character that was previously communicating switches to a Goodbye state.
  3. When the current character receives audio, it switches to a Talking state.
  4. Based on breaks in the audio, the character switches between the Talking and Neutral states.

(Image: IdleLayer)
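
The audio-driven part of this logic can be pictured with a small sketch. The `IsTalking` parameter name and the polling approach below are assumptions for illustration only; the SDK's actual transitions are handled by its own animation code (see the handler methods listed under Interaction with the Server).

```csharp
using UnityEngine;

// Illustration only: switch between the Talking and Neutral states based on
// whether audio is currently playing. "IsTalking" is a hypothetical Animator
// parameter name, not necessarily the one used by InworldAnimCtrl.
[RequireComponent(typeof(AudioSource))]
public class TalkingStateDriver : MonoBehaviour
{
    [SerializeField] Animator animator;
    AudioSource _audio;

    void Awake() => _audio = GetComponent<AudioSource>();

    void Update()
    {
        // Breaks in the audio push the character back toward the Neutral state.
        animator.SetBool("IsTalking", _audio.isPlaying);
    }
}
```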

Within the Idle and Talking states, we provide various types of idle and talking behavior based on the character's emotion. These are triggered by EmotionEvents from the server.

(Image: IdleLayer)

2. Gesture Layer

The Gesture Layer is triggered via EmotionEvents or GestureEvents sent by the server.

(Image: Gestures)

3. Face Layer

The Face Layer contains only one state, NeutralIdle. We implement facial expressions by modifying Blend Shapes directly in this state, driven by EmotionEvents or GestureEvents sent by the server.
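
Because the Face Layer is driven by Blend Shapes rather than clips, you can prototype expressions directly on the avatar's SkinnedMeshRenderer with Unity's standard API. This is a generic sketch; the blend shape name used below ("mouthSmile") is only an example and depends on your avatar's mesh.

```csharp
using UnityEngine;

// Sketch: set a Blend Shape weight by name on the avatar's face mesh.
// Blend shape names vary per avatar; "mouthSmile" is only an example.
public class BlendShapeExample : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer faceRenderer;

    public void SetShape(string shapeName, float weight) // weight range: 0 to 100
    {
        int index = faceRenderer.sharedMesh.GetBlendShapeIndex(shapeName);
        if (index < 0)
        {
            Debug.LogWarning($"Blend shape not found: {shapeName}");
            return;
        }
        faceRenderer.SetBlendShapeWeight(index, weight);
    }

    void Start() => SetShape("mouthSmile", 60f);
}
```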

Interaction with the Server

In live sessions with a character, the server will occasionally send GestureEvents or EmotionEvents based on the current dialog context. Our code catches these events and plays the appropriate animations. However, our animations and the server events are not in one-to-one correspondence.

(Image: AllType)

If you are interested in the mapping, please check BodyAnimation::HandleMainStatus(), BodyAnimation::HandleEmotion(), and BodyAnimation::HandleGesture() for more details.
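
As a rough picture of what such a mapping looks like, the sketch below routes incoming emotion and gesture names to Animator triggers. The event signatures, emotion names, and trigger names here are placeholders; the SDK's actual mapping is implemented in the BodyAnimation methods listed above.

```csharp
using UnityEngine;

// Illustration only: map server event names to Animator triggers.
// The real mapping lives in BodyAnimation.HandleEmotion() /
// BodyAnimation.HandleGesture(); all names below are placeholders.
public class ServerEventRouter : MonoBehaviour
{
    [SerializeField] Animator animator;

    public void OnEmotionEvent(string emotion)
    {
        switch (emotion)
        {
            case "JOY":     animator.SetTrigger("Happy");   break;
            case "SADNESS": animator.SetTrigger("Sad");     break;
            default:        animator.SetTrigger("Neutral"); break;
        }
    }

    public void OnGestureEvent(string gesture)
    {
        // Assumes a trigger with the same name exists in the controller;
        // gestures without a matching state will not play anything.
        animator.SetTrigger(gesture);
    }
}
```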

Using your custom body animations

You may have noticed that legacy characters offered through Ready Player Me Avatar are not T-posed when you generate them from the studio website, which can make animation re-targeting difficult. However, Ready Player Me avatars do support Mixamo animations. To work around this, you can upload your character to Mixamo, apply an animation, and download the resulting FBX to replace any of our existing animations.

1. Importing the Ready Player Me SDK

Please visit this site to download the latest Ready Player Me SDK. Note the following:

  1. The Ready Player Me SDK, as well as Inworld AI, uses GLTFUtility as the loader for .glb format avatars.
  2. Newtonsoft Json has been officially included in Unity since 2018.4. When importing the package, please exclude these two plugins to avoid possible conflicts.
  3. There is a small API formatting bug involving GLTFUtility. It can be resolved as shown below:

(Image: RPMImport)

2. Downloading Mixamo animations

⚠️ Note: To use your own animations, please select Upload Character on the Mixamo website.

When downloading the FBX from Mixamo, select FBX for Unity as the format and Without Skin for the skin option, then drag the downloaded file into Unity's Assets folder. These steps are shown below:

(Image: MixamoImport)

3. Replacing animations

⚠️ Note: The default animation type of the FBX you downloaded from Mixamo is Generic, which you CANNOT re-target.

To solve this:

  1. Select the FBX you imported and open the Rig section in the Inspector.
  2. Switch the Animation Type to Humanoid and click Apply.
  3. Right-click the FBX and choose Extract Animations. This function is provided by the Ready Player Me Unity SDK and will generate an animation clip.
  4. Select any state in InworldAnimCtrl and replace its animation clip with your new clip, as shown below:

(Image: MixamoFinal)
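
Replacing the clip in the Animator window (as shown above) is usually enough, but if you have many clips to swap you can also script it in the editor with Unity's AnimatorController API. This is a sketch under assumed paths and state names; the controller path, clip path, and state name below are examples, not actual SDK paths.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.Animations;
using UnityEngine;

// Editor-only sketch: replace the motion of a named state in InworldAnimCtrl.
// All asset paths and the state name are examples; adjust them to your project.
public static class ReplaceAnimationClip
{
    [MenuItem("Tools/Replace Talking Clip (Example)")]
    static void Replace()
    {
        var controller = AssetDatabase.LoadAssetAtPath<AnimatorController>(
            "Assets/Inworld/Animations/InworldAnimCtrl.controller"); // example path
        var newClip = AssetDatabase.LoadAssetAtPath<AnimationClip>(
            "Assets/MyAnimations/MyTalking.anim"); // clip extracted from your Mixamo FBX

        if (controller == null || newClip == null)
        {
            Debug.LogError("Controller or clip not found at the example paths.");
            return;
        }

        foreach (var layer in controller.layers)
        {
            // Sub-state machines are not traversed in this sketch.
            foreach (var child in layer.stateMachine.states)
            {
                if (child.state.name == "Talking") // example state name
                    child.state.motion = newClip;
            }
        }

        EditorUtility.SetDirty(controller);
        AssetDatabase.SaveAssets();
    }
}
#endif
```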

Configuring your custom facial animations

Instead of using Animation Clips, we directly modify the Blend Shapes on the avatar's SkinnedMeshRenderer. You can check each state under Resources > Animations > Inworld Face Animations > Emotions.

(Image: FaceAnim)

If you want to change the behavior, drag each Blend Shape index until you find the facial expression you like, then store the resulting data.

(Image: FaceGif)

⚠️ Note: We recommend that you duplicate this Inworld Face Animations asset, update the duplicate, and assign your generated asset to your InworldCharacter's Head Animation.

(Image: FaceAnim2)