How to Add Custom Actions and Animations to AI MetaHuman Avatars in Unreal Engine with Convai

By
Convai Team
August 20, 2024

Key Takeaways

  • Custom animations in Unreal Engine are connected to Convai's Action System via named action labels registered in Project Settings.
  • Animation Montages control how custom animations play, blend, and transition, making them the right format for Convai-triggered actions.
  • Prompt-to-Action means players can trigger animations conversationally: saying 'show me a dance' causes the MetaHuman to physically perform it, not just describe it.
  • The Animation Blueprint needs conditional logic to handle Convai action triggers alongside normal locomotion and conversation states.
  • Animations can be sourced from Reallusion ActorCore, FAB Marketplace, or Mixamo and imported with standard Unreal Engine retargeting tools.

Convai-powered MetaHuman avatars in Unreal Engine do not just respond to player commands with dialogue. With custom actions connected through Convai's Action System, they physically perform the action in the scene. A player says 'show me the procedure.' The MetaHuman walks to the workstation, picks up the tool, and demonstrates the steps.

Watch this in-depth, step-by-step guide in action in our YouTube tutorial.

In this blog post, we will explore how to add custom actions to MetaHuman avatars in Unreal Engine, focusing specifically on Convai-powered AI characters (download the Convai SDK for Unreal Engine). This guide is designed to help game developers, game designers, and virtual world builders enhance their projects by integrating custom animations and making their AI MetaHumans more lifelike and engaging. To learn more about adding AI to your MetaHuman avatars, read our in-depth blog or watch our tutorial on integrating MetaHuman avatars with Convai in Unreal Engine.

Why Custom Animations Matter for AI MetaHumans

Custom animations are critical in creating a more immersive and realistic experience for players. When AI MetaHumans perform unique actions—whether it's a simple dance move or a complex scene interaction—they become more than just background characters. They add a layer of depth and realism that can significantly enhance the user experience.

Use Cases for Custom Actions:

  • Enhanced Player Interaction: AI MetaHumans that respond to player actions with specific animations can make gameplay more engaging.
  • Storytelling: Custom animations allow for more expressive storytelling, where AI characters can physically react to events in the game world.
  • Cinematic Scenes: Adding custom animations can help in creating cinematic sequences that are crucial for narrative-driven games.
  • NPC Behavior: By customizing the behavior of AI NPCs (Non-Player Characters), developers can create more complex and intelligent interactions within the game environment.

Now, without further ado, let's look at the key steps for setting up custom actions for your AI MetaHumans!

Step-by-Step Guide to Adding Custom Animations to AI MetaHumans

1. Setting Up Your MetaHuman Avatar

To begin, ensure your MetaHuman avatar is ready in Unreal Engine. Convai makes this process straightforward by providing seamless integration between the AI characters you craft and the MetaHumans in Unreal Engine. Read our Unreal Engine documentation page to learn more.

2. Downloading an Animation

For this tutorial, we will use a dance animation. Animations can be sourced from various platforms, such as Reallusion ActorCore, the FAB Marketplace, or Mixamo. After downloading, import the animation into your Unreal Engine project and, if it was not authored for the MetaHuman skeleton, retarget it with Unreal Engine's standard retargeting tools.

3. Creating an Animation Montage

An Animation Montage allows you to control how your MetaHuman transitions in and out of the animation. Here's how to set one up:

  • Right-click on the downloaded animation and select "Create Animation Montage."
  • Navigate to the asset details and modify the blend-in and blend-out times. This controls the smoothness of the transition in and out of the animation.
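As a mental model of what those blend times control, here is a small plain-C++ sketch. This is not Unreal Engine source code (Unreal defaults to smoother, non-linear blend curves); it is a simplified linear model showing how the montage's influence ramps up over the blend-in time and back down over the blend-out time.

```cpp
#include <algorithm>

// Illustrative linear model: the montage's weight over the final pose ramps
// from 0 to 1 across BlendIn seconds, holds at 1, then ramps back to 0 across
// the last BlendOut seconds. Shorter times mean snappier, more abrupt
// transitions; longer times mean smoother ones.
float MontageBlendWeight(float Elapsed, float Duration, float BlendIn, float BlendOut)
{
    if (Elapsed <= 0.0f || Elapsed >= Duration) return 0.0f;
    float InAlpha  = BlendIn  > 0.0f ? Elapsed / BlendIn : 1.0f;
    float OutAlpha = BlendOut > 0.0f ? (Duration - Elapsed) / BlendOut : 1.0f;
    return std::clamp(std::min(InAlpha, OutAlpha), 0.0f, 1.0f);
}
```

Halfway through a 0.25-second blend-in, for example, the montage contributes half of the final pose, which is why very short blend times make a dance start look like a pop rather than a transition.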

4. Integrating the Animation with the Character Blueprint

The next step is to integrate this animation into your MetaHuman's character Blueprint:

  • Open the character Blueprint and drag off an execution pin where the animation should trigger.
  • Search for the "Play Montage" node and connect your MetaHuman's skeletal mesh component to its In Skeletal Mesh Component pin.
  • Set Montage to Play to the dance Animation Montage you created.
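The steps above can be sketched in plain C++ with hypothetical names (this is not Unreal's API). The point it illustrates is that Play Montage exposes delegate pins, and it matters which one fires: On Completed when the montage finishes normally, On Interrupted when another montage cuts it off mid-play.

```cpp
#include <functional>
#include <string>

// Simplified stand-in for the Play Montage node's outcome (hypothetical type,
// not Unreal's): which montage ran and whether it finished uninterrupted.
struct MontageResult { std::string Montage; bool Completed; };

// Models the node's delegate pins: exactly one of the two callbacks fires.
MontageResult PlayMontage(const std::string& MontageName,
                          bool bInterruptedByAnotherMontage,
                          const std::function<void(const std::string&)>& OnCompleted,
                          const std::function<void(const std::string&)>& OnInterrupted)
{
    if (bInterruptedByAnotherMontage) {
        OnInterrupted(MontageName);
        return {MontageName, false};
    }
    OnCompleted(MontageName);
    return {MontageName, true};
}
```

In practice, wiring follow-up logic (returning to idle, chaining the next action) off On Completed rather than off the node's main execution pin is what keeps your character from resuming other behavior while the dance is still playing.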

5. Testing the Animation

Once everything is set up, hit Play in Unreal Engine to test the animation. Your MetaHuman should now perform the dance move seamlessly. For our test project, we have set everything up in the Zen Garden sample scene provided by Unreal Engine.

6. Integrating the Animation with Convai

Convai's Action System allows you to trigger these custom animations in response to various in-game events or commands:

  • Go to Edit > Project Settings in Unreal Engine, scroll down to Convai, and enable the Action System.
  • Select your character, go to the Details panel, and add a new action under the Actions section. For instance, name it "Dance."
  • Now, whenever you call this action within the Blueprint, your MetaHuman will perform the dance animation.
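Conceptually, the Action System behaves like a registry keyed by the exact action label you add in the Details panel. The sketch below uses hypothetical names (it is not Convai's SDK code): the LLM emits a label such as "Dance," and the character's logic looks up which montage that label maps to.

```cpp
#include <map>
#include <optional>
#include <string>

// Illustrative registry of action label -> montage name. The label must match
// what was registered; an unregistered label resolves to nothing, so the
// character simply keeps talking instead of playing an animation.
class ActionRegistry {
public:
    void RegisterAction(const std::string& Label, const std::string& MontageName) {
        Actions[Label] = MontageName;
    }
    std::optional<std::string> Resolve(const std::string& Label) const {
        auto It = Actions.find(Label);
        if (It == Actions.end()) return std::nullopt;
        return It->second;
    }
private:
    std::map<std::string, std::string> Actions;
};
```

This is why the action name you register matters: it is the vocabulary the LLM maps player intent onto, so a clear, descriptive label like "Dance" is easier to trigger reliably than an ambiguous one.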

7. Customizing the Animation Blueprint

Beyond simply adding animations, Convai allows you to customize the Animation Blueprint to add more nuanced behaviors:

  • Modify the Animation Blueprint to include conditional checks or blend animations based on specific triggers.
  • This is particularly useful for complex AI behaviors, such as interacting with other characters or objects within the scene.
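A minimal sketch of such a conditional check, with hypothetical state names (this is plain C++, not an AnimGraph): the key point is priority. An active Convai action must win over the talking and locomotion poses; otherwise the state machine suppresses the montage and the character keeps idling while the action silently fails.

```cpp
// Hypothetical pose sources an Animation Blueprint might arbitrate between.
enum class PoseSource { Action, Talking, Locomotion, Idle };

// Priority-ordered selection: a Convai-triggered action montage overrides
// conversation gestures, which override locomotion, which overrides idle.
PoseSource SelectPoseSource(bool bActionMontageActive, bool bIsTalking, bool bIsMoving)
{
    if (bActionMontageActive) return PoseSource::Action;   // Convai action wins
    if (bIsTalking)           return PoseSource::Talking;  // lip-sync / gesture layer
    if (bIsMoving)            return PoseSource::Locomotion;
    return PoseSource::Idle;
}
```

In a real Animation Blueprint you would express the same priority with a montage slot placed above the locomotion state machine in the AnimGraph, but the ordering logic is the same.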

How Does Prompt-to-Action Work With Custom MetaHuman Animations?

Once an animation is registered in the Convai Action System, it becomes part of the character's Prompt-to-Action (P2A) vocabulary. A player or trainee can trigger it through natural speech, not just through a UI button.

The flow is:

  • The player speaks a command.
  • The Dynamic Context API assembles the current scene context, character state, and game variables.
  • The LLM maps the player's intent to the registered action label.
  • The Unreal Engine Blueprint receives the action trigger and plays the corresponding Animation Montage.

There is also action-to-action chaining: a MetaHuman can execute a greeting animation, then transition into a demonstration sequence, then return to idle, all without player input, triggered by a single conversational exchange.
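Action-to-action chaining can be modeled as a queue that each montage completion advances. This plain-C++ sketch is illustrative (hypothetical names, not Convai's implementation): one conversational exchange enqueues the whole sequence, and the character plays greet, then demonstrate, then returns to idle with no further player input.

```cpp
#include <deque>
#include <string>
#include <vector>

// Illustrative chaining model: each "montage finished" event pops the next
// action off the front of the chain; when the chain is empty the character
// falls back to its idle state.
std::vector<std::string> RunActionChain(std::deque<std::string> Chain)
{
    std::vector<std::string> Played;
    while (!Chain.empty()) {
        Played.push_back(Chain.front()); // completion callback advances the chain
        Chain.pop_front();
    }
    Played.push_back("Idle"); // chain exhausted: return to idle
    return Played;
}
```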

For training simulations specifically, this is where Convai's approach becomes especially valuable. A pharmaceutical sales trainer MetaHuman does not just talk through the procedure. It physically demonstrates each step, guided by a Knowledge Bank grounded in the actual protocol document.

Use Mindview to debug action selection. If the MetaHuman picks the wrong action, Mindview shows you whether the context variable describing the current scenario was actually present in the LLM call.

Importance of Configuring the Animation Blueprint

The Animation Blueprint is a powerful tool within Unreal Engine that allows developers to dictate how and when animations play. Proper configuration of the Animation Blueprint is crucial for achieving smooth and realistic animations. It allows developers to:

  • Control Animation Transitions: Ensure that animations blend seamlessly from one action to another.
  • Implement Complex Behaviors: Combine multiple animations and use conditional logic to create complex AI behaviors.
  • Optimize Performance: Properly configured Animation Blueprints can help in optimizing game performance by reducing unnecessary animation calls.

Tips for Success

  • Test Animations in Different Scenarios: Make sure to test your animations in various game scenarios to ensure they behave as expected.
  • Optimize Asset Sizes: Large animation files can slow down your game. Optimize your assets for better performance.
  • Use Root Motion When Necessary: For certain actions like jumping or dancing, root motion can provide more accurate movement by moving the entire character based on the animation data.

Also read: Integrating Dynamic NPC Actions for Game Development with Convai

Conclusion

Adding custom animations to AI MetaHumans in Unreal Engine is a game-changer for developers looking to create more interactive and engaging virtual worlds. By following this tutorial, you can enhance the realism and depth of your AI characters, making them not just background NPCs but integral parts of your game's narrative and player experience.

Custom actions not only improve gameplay but also contribute to the overall aesthetic and functional quality of your game. Whether you're a seasoned developer or just starting out, the ability to add and customize animations in Unreal Engine is an invaluable skill that will set your projects apart.

For more detailed tutorials and support, check out our YouTube channel, or reach out to us at support@convai.com with any questions. You can also join our amazing developer community on Discord to learn more about integrating AI characters into different types of projects.

Frequently Asked Questions

How do you add custom animations to AI MetaHuman avatars in Unreal Engine with Convai?

Download your animation, create an Animation Montage, connect it to your MetaHuman character blueprint, then register the action name in Convai's Action System under Project Settings. When a player triggers the action via voice or command, the LLM maps the intent to the action label and the blueprint executes the montage.

What is an Animation Montage and why do you need one for Convai actions?

An Animation Montage is an Unreal Engine asset that controls how an animation plays, loops, and blends in and out. Convai's Action System triggers animations via montage playback, so custom actions require a montage to control transitions cleanly.

Can Convai MetaHuman avatars perform actions based on what a player says?

Yes. This is Convai's Prompt-to-Action capability. When a player says something implying an action, the Dynamic Context API maps the intent to the registered action label and the engine executes the corresponding animation.

Where do you source animations for Convai MetaHuman characters?

Animations can be sourced from Reallusion ActorCore, the FAB Marketplace, or Mixamo. After downloading, import into your Unreal project, retarget to the MetaHuman skeleton if needed, and create an Animation Montage.

Does the Animation Blueprint need to be modified to support Convai actions?

Yes. The Animation Blueprint needs conditional logic or state machine transitions to handle Convai action triggers. Configure blend-in and blend-out times in the Animation Montage for smooth transitions.