Quick Setup Guide: Add Conversational AI to Any Unreal Engine Project with Convai

By Convai Team
March 4, 2026

What if you could drop a fully conversational, intelligent AI character into your Unreal Engine project in a matter of minutes?

Whether you are building a complex MetaHuman, a stylized custom character, or even a disembodied "Jarvis-style" AI guide, Convai’s updated Unreal Engine plugin makes it incredibly simple. Now officially available on Epic Games' FAB Marketplace, this plugin equips your characters with real-time lip-sync, dynamic emotional expressions, and ultra-low latency conversational AI.

In this quick setup guide, we will walk you through exactly how to install the plugin and get your first AI character talking.

Watch the full video tutorial here: https://www.youtube.com/watch?v=n-UG3nmMeZQ

Why It Matters

For game developers, XR creators, and simulation designers, integrating Conversational AI has traditionally meant juggling complex API calls, managing audio latency, and painstakingly animating lips to match generated text-to-speech.

The new Convai Unreal Engine plugin democratizes this process. By acting as a universal bridge, it works across the full spectrum of avatar setups, from highly rigged Reallusion assets to characters with obscured faces, and even invisible, omnipresent AI assistants. You no longer need to be a senior network engineer or a master animator to create living, breathing AI NPCs.

What the Upgrade Brings

Built on top of Convai’s Live Character API, the updated plugin brings several major enhancements to your Unreal Engine 5 projects:

  • FAB Marketplace Integration: Download and update the plugin directly through the Epic Games launcher or the FAB Store with a single click.
  • WebRTC Protocol for Low Latency: Interactions are now powered by WebRTC, drastically reducing response times to ensure conversations flow as naturally as human-to-human interaction.
  • Universal NeuroSync: Our in-house AI model, NeuroSync, generates realistic real-time lip-sync and facial animations that match the AI's responses, regardless of the avatar framework.
  • Dynamic Emotional Intelligence: Characters can now be assigned initial emotional states (like "Happy" or "Sad") that dynamically shift based on the context of the live conversation.
  • Enhanced Environmental Awareness: Characters can utilize streaming vision to understand and comment on their surroundings in real-time.

Also Watch: Give Your AI Characters Eyes & Ears in Unreal Engine: Streaming Vision + Hands-Free AI Voice Interaction

Step-by-Step Guide: Adding Conversational AI to Your Project

Let's dive into the editor. This tutorial assumes you already have a basic scene with an avatar (MetaHuman, Reallusion, or custom) ready to go.

Step 1: Install the Plugin from FAB

  1. Open the Epic Games Launcher and navigate to the FAB Store (or visit the plugin page directly).
  2. Search for Convai and click Install Plugin. Select your specific Unreal Engine project version.
  3. Open your Unreal Engine project.
  4. Go to Edit > Plugins, search for Convai, check the box to enable it, and restart your project.
  5. Upon reopening, sign in to your Convai account via the authentication popup.

Step 2: Add Components to Your Avatar

We need to give your avatar a brain and a face.

  1. Select your avatar in the Outliner and click Edit Blueprint.
  2. Click Add Component and search for BP_ConvaiChatbot. This component handles the AI logic, memory, and LLM connection.
  3. Click Add Component again and search for BP_ConvaiFaceSync. This component utilizes NeuroSync to drive the facial animations and lip-sync.
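If you are working in a C++ project, the same two components can also be attached at runtime instead of through the Blueprint editor. Below is a minimal sketch, assuming the plugin ships these Blueprints under a /Convai content path; the exact asset paths are an assumption, so verify them in the plugin's Content folder:

```cpp
// Sketch: attach the Convai Blueprint components from C++ at BeginPlay.
// AMyAvatar is your own ACharacter/AActor subclass; the /Convai asset
// paths below are assumptions -- check the plugin's Content folder.
void AMyAvatar::BeginPlay()
{
    Super::BeginPlay();

    // Load the Blueprint component classes shipped with the plugin.
    UClass* ChatbotClass = LoadClass<UActorComponent>(
        nullptr, TEXT("/Convai/BP_ConvaiChatbot.BP_ConvaiChatbot_C"));
    UClass* FaceSyncClass = LoadClass<UActorComponent>(
        nullptr, TEXT("/Convai/BP_ConvaiFaceSync.BP_ConvaiFaceSync_C"));

    if (ChatbotClass)
    {
        // The "brain": AI logic, memory, and LLM connection.
        AddComponentByClass(ChatbotClass, false, FTransform::Identity, false);
    }
    if (FaceSyncClass)
    {
        // The "face": NeuroSync-driven facial animation and lip-sync.
        AddComponentByClass(FaceSyncClass, false, FTransform::Identity, false);
    }
}
```

For most projects the editor workflow above is simpler; the C++ route is mainly useful when avatars are spawned procedurally.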

Step 3: Connect Your Character ID

  1. Head over to the Convai Dashboard.
  2. Create a new character or select an existing one (you can tweak their backstory, language, voice, and multimodal knowledge here).
  3. Copy the Character ID.
  4. Go back to your Unreal Engine Character Blueprint, select the Convai Chatbot Component, and paste the ID into the designated field.
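If you would rather set the ID from C++ (for example, to swap characters at runtime), Unreal's property reflection can write the value onto the Blueprint component. This is only a sketch: the property name "CharacterID" is an assumption, so check the component's actual variable name in its Details panel:

```cpp
// Sketch: write the Convai Character ID onto the chatbot component via
// Unreal's reflection system. Call from your avatar class after the
// component exists. The property name "CharacterID" is a guess.
void AMyAvatar::SetConvaiCharacterID(const FString& InCharacterID)
{
    for (UActorComponent* Comp : GetComponents())
    {
        // Match the Blueprint component by class name.
        if (Comp && Comp->GetClass()->GetName().Contains(TEXT("ConvaiChatbot")))
        {
            if (FStrProperty* IdProp =
                    FindFProperty<FStrProperty>(Comp->GetClass(), TEXT("CharacterID")))
            {
                IdProp->SetPropertyValue_InContainer(Comp, InCharacterID);
            }
        }
    }
}
```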

Step 4: Configure the Player Settings & UI

Your character is ready to talk, but your player needs a way to communicate!

  1. Open your World Settings. Under Game Mode, expand Selected Game Mode to find your Default Pawn Class.
  2. Open your Player Blueprint.
  3. Add the BP_ConvaiPlayer component to the hierarchy.
  4. In the Details panel of the Player component, you can tweak the UI of the chat widget (try changing the interface selection to Style 2 or 3).
  5. Pro-Tip: Disable "Push-to-Talk" for a completely hands-free voice AI experience!
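For C++ projects, the Default Pawn Class wiring from step 1 can also be done in a custom GameMode. A sketch, assuming a hypothetical BP_ConvaiPlayerPawn Blueprint that carries the BP_ConvaiPlayer component (substitute your own Blueprint's path):

```cpp
// Sketch: a GameMode that makes your Convai-enabled pawn the default.
// "/Game/Blueprints/BP_ConvaiPlayerPawn" is a placeholder path for your
// own player Blueprint -- the one with the BP_ConvaiPlayer component.
AMyConvaiGameMode::AMyConvaiGameMode()
{
    static ConstructorHelpers::FClassFinder<APawn> PlayerPawn(
        TEXT("/Game/Blueprints/BP_ConvaiPlayerPawn"));
    if (PlayerPawn.Succeeded())
    {
        DefaultPawnClass = PlayerPawn.Class;
    }
}
```

Setting this in the GameMode constructor keeps the World Settings step optional: any map using this GameMode will spawn the Convai-enabled pawn automatically.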

Step 5: Tweak Avatar Emotions

Want your character to greet players with a smile?

  1. Open your Character Blueprint and select the Chatbot Component.
  2. Expand the Default options and locate Initial Emotion.
  3. Set it to something like Happy. Because Convai's emotions are dynamic, the character will start the conversation smiling but will naturally adjust their expression based on the tone of the user's questions.

Hit Play! Walk up to your character and start chatting seamlessly.

Example Scenarios to Build Today

Because this plugin is avatar-agnostic, you can build wildly different experiences in minutes on the Convai playground. Here are two ideas to get you started:

1. The Omnipresent AI Guide ("Jarvis")

  • The Setup: Don't use a 3D avatar at all. Simply add the BP_ConvaiChatbot and BP_ConvaiPlayer components to an invisible actor in your scene or your main player controller.
  • The Interaction: Create a futuristic sci-fi game where an onboard ship AI guides the player through the level. Because of WebRTC, the AI can act as a real-time, low-latency companion that answers lore questions or gives hints without breaking immersion.

2. The Emotional Virtual Artist

  • The Setup: Use a high-fidelity MetaHuman or Reallusion character placed in a recording studio environment. Set their Initial Emotion to "Thoughtful."
  • The Interaction: Write a Convai backstory for a musician who just released an emotional album. As the player asks about the inspiration behind specific songs, NeuroSync will drive the character's face to show vulnerability and passion, turning a simple Q&A into a deeply cinematic Embodied AI moment.

Read Also: Learn How To Build Low-Latency Conversational AI Characters with High-Fidelity Lipsync & Facial Animation for Reallusion Avatars

Frequently Asked Questions (FAQs)

Q: Do I need a fully rigged face to use Convai? A: Not necessarily! While NeuroSync shines with fully rigged MetaHumans and Reallusion avatars, the plugin also works with custom or partially rigged avatars, and even completely disembodied AI systems where no facial animation is required.

Q: What is WebRTC, and why does it matter for my game? A: WebRTC (Web Real-Time Communication) is an open-source project that provides real-time communication capabilities via simple APIs. For Convai, it drastically lowers the latency between a user speaking and the AI responding, making conversations feel highly natural.

Q: Where can I download the Convai Unreal Engine plugin? A: The official plugin is available directly on the Epic Games FAB Marketplace.

Q: Does Convai support environmental awareness? A: Yes! By utilizing Convai's streaming vision capabilities, your AI characters can actually "see" the Unreal Engine environment around them and comment on objects, actions, and the player's behavior in real-time.

Join the Convai Community

Ready to start building your own intelligent, fully interactive AI agents?

Don't forget to subscribe to our YouTube channel for more deep dives into Convai UE Integrations!