Whether you are building a complex MetaHuman, a stylized custom character, or even a disembodied "Jarvis-style" AI guide, Convai’s updated Unreal Engine plugin makes it incredibly simple. Now officially available on Epic Games' FAB Marketplace, this plugin equips your characters with real-time lip-sync, dynamic emotional expressions, and ultra-low latency conversational AI.
In this quick setup guide, we will walk you through exactly how to install the plugin and get your first AI character talking.
For game developers, XR creators, and simulation designers, integrating Conversational AI has traditionally meant juggling complex API calls, managing audio latency, and painstakingly animating lips to match generated text-to-speech.
The new Convai Unreal Engine plugin democratizes this process. By acting as a universal bridge, it works across the full spectrum of avatar setups, from highly rigged Reallusion assets to characters with obscured faces, and even invisible, omnipresent AI assistants. You no longer need to be a senior network engineer or a master animator to create living, breathing AI NPCs.
What the Upgrade Brings
Built on top of Convai’s Live Character API, the updated plugin brings several major enhancements to your Unreal Engine 5 projects:
FAB Marketplace Integration: Download and update the plugin directly through the Epic Games launcher or the FAB Store with a single click.
WebRTC Protocol for Low Latency: Interactions are now powered by WebRTC, drastically reducing response times to ensure conversations flow as naturally as human-to-human interaction.
Universal NeuroSync: Our in-house AI model, NeuroSync, generates realistic lip-sync and facial animations in real time that match the AI's responses, regardless of the avatar framework.
Dynamic Emotional Intelligence: Characters can now be assigned initial emotional states (like "Happy" or "Sad") that dynamically shift based on the context of the live conversation.
Enhanced Environmental Awareness: Characters can utilize streaming vision to understand and comment on their surroundings in real-time.
Step 3: Create & Link Your Convai Character
Create a new character or select an existing one (you can tweak their backstory, language, voice, and multimodal knowledge here).
Copy the Character ID.
Go back to your Unreal Engine Character Blueprint, select the Convai Chatbot Component, and paste the ID into the designated field.
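If you'd rather wire this up in C++ than through the Details panel, the same linkage can be sketched as below. The class name `UConvaiChatbotComponent`, its header path, and the `CharacterID` property are assumptions based on the plugin's naming conventions; verify them against your installed plugin version.

```cpp
// Hypothetical sketch: attaching the Convai chatbot component and assigning
// the Character ID in a C++ character class instead of a Blueprint.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "ConvaiChatbotComponent.h" // assumed header name from the Convai plugin
#include "MyConvaiCharacter.generated.h"

UCLASS()
class AMyConvaiCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    AMyConvaiCharacter()
    {
        // Attach the chatbot component, mirroring the Blueprint step above.
        ConvaiChatbot = CreateDefaultSubobject<UConvaiChatbotComponent>(TEXT("ConvaiChatbot"));

        // Paste the Character ID copied from the Convai dashboard.
        ConvaiChatbot->CharacterID = TEXT("your-character-id-here");
    }

    UPROPERTY(VisibleAnywhere, Category = "Convai")
    UConvaiChatbotComponent* ConvaiChatbot;
};
```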
Step 4: Configure the Player Settings & UI
Your character is ready to talk, but your player needs a way to communicate!
Open your World Settings. Under Game Mode, expand Selected Game Mode to find your Default Pawn Class (this is the Player Blueprint you'll edit next).
Open your Player Blueprint.
Add the BP_ConvaiPlayer component to the hierarchy.
In the Details panel of the Player component, you can tweak the UI of the chat widget (try changing the interface selection to Style 2 or 3).
Pro-Tip: Disable "Push-to-Talk" to enable a completely hands-free Voice AI experience!
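For projects that already use a C++ Game Mode, the World Settings step above can equivalently be done in code. This is a minimal sketch using Unreal's standard `ConstructorHelpers::FClassFinder` idiom; the Blueprint path below is a placeholder, so point it at your own Player Blueprint.

```cpp
// Sketch: setting the Convai-enabled pawn as the Default Pawn Class in C++,
// mirroring the World Settings step described above.
#include "CoreMinimal.h"
#include "GameFramework/GameModeBase.h"
#include "UObject/ConstructorHelpers.h"
#include "MyConvaiGameMode.generated.h"

UCLASS()
class AMyConvaiGameMode : public AGameModeBase
{
    GENERATED_BODY()

public:
    AMyConvaiGameMode()
    {
        // Find the Player Blueprint that carries the BP_ConvaiPlayer component.
        // The asset path is a placeholder for your own project layout.
        static ConstructorHelpers::FClassFinder<APawn> PlayerPawnBP(
            TEXT("/Game/Blueprints/BP_MyConvaiPlayer"));
        if (PlayerPawnBP.Succeeded())
        {
            DefaultPawnClass = PlayerPawnBP.Class;
        }
    }
};
```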
Step 5: Tweak Avatar Emotions
Want your character to greet players with a smile?
Open your Character Blueprint and select the Chatbot Component.
Expand the Default options and locate Initial Emotion.
Set it to something like Happy. Because Convai's emotions are dynamic, the character will start the conversation smiling but will naturally adjust their expression based on the tone of the user's questions.
Hit Play! Walk up to your character and start chatting seamlessly.
Example Scenarios to Build Today
Because this plugin is avatar-agnostic, you can pair characters designed on the Convai playground with wildly different in-engine experiences in minutes. Here are two ideas to get you started:
1. The Omnipresent AI Guide ("Jarvis")
The Setup: Don't use a 3D avatar at all. Simply add the BP_ConvaiChatbot and BP_ConvaiPlayer components to an invisible actor in your scene or your main player controller.
The Interaction: Create a futuristic sci-fi game where an onboard ship AI guides the player through the level. Because of WebRTC, the AI can act as a real-time, low-latency companion that answers lore questions or gives hints without breaking immersion.
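The "Jarvis" setup above can be sketched in C++ as an actor with no mesh at all, existing only to host the AI. As before, `UConvaiChatbotComponent`, its header path, and the `CharacterID` property are assumed from the plugin's naming; confirm them against your installed version.

```cpp
// Sketch of a disembodied ship AI: an invisible actor carrying only the
// Convai chatbot component, with no mesh or visible presence in the scene.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "ConvaiChatbotComponent.h" // assumed header name from the Convai plugin
#include "ShipAIActor.generated.h"

UCLASS()
class AShipAIActor : public AActor
{
    GENERATED_BODY()

public:
    AShipAIActor()
    {
        // No mesh component is created: the actor exists only to host the AI.
        ConvaiChatbot = CreateDefaultSubobject<UConvaiChatbotComponent>(TEXT("ShipAI"));

        // Character ID placeholder: replace with the ID of your ship-AI
        // character from the Convai dashboard.
        ConvaiChatbot->CharacterID = TEXT("your-ship-ai-character-id");
    }

    UPROPERTY(VisibleAnywhere, Category = "Convai")
    UConvaiChatbotComponent* ConvaiChatbot;
};
```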
2. The Emotional Virtual Artist
The Setup: Use a high-fidelity MetaHuman or Reallusion character placed in a recording studio environment. Set their Initial Emotion to "Thoughtful."
The Interaction: Write a Convai backstory for a musician who just released an emotional album. As the player asks about the inspiration behind specific songs, NeuroSync will drive the character's face to show vulnerability and passion, turning a simple Q&A into a deeply cinematic Embodied AI moment.
Frequently Asked Questions
Q: Do I need a fully rigged face to use Convai? A: Not necessarily! While Convai's NeuroSync is perfect for fully rigged MetaHumans and Reallusion avatars, the plugin also works with custom or partially rigged avatars, and even completely disembodied AI systems where no facial animation is required.
Q: What is WebRTC, and why does it matter for my game? A: WebRTC (Web Real-Time Communication) is an open-source project that provides real-time communication capabilities via simple APIs. For Convai, it drastically lowers the latency between a user speaking and the AI responding, making conversations feel highly natural.
Q: Where can I download the Convai Unreal Engine plugin? A: The official plugin is available directly on the Epic Games FAB Marketplace.
Q: Does Convai support environmental awareness? A: Yes! By utilizing Convai's streaming vision capabilities, your AI characters can actually "see" the Unreal Engine environment around them and comment on objects, actions, and the player's behavior in real-time.
Join the Convai Community
Ready to start building your own intelligent, fully interactive AI agents?