Meet Maya: A 3D AI Avatar You Can Talk to in Mixed Reality on Meta Quest 3

By
Convai Team
May 5, 2026

Most AI avatars live behind a screen. Maya lives in your room, ready to answer your questions or simply keep you company as a friendly AI companion.

Convai's XR demo for Meta Quest 3 brings a photorealistic conversational AI character into your physical space using passthrough mixed reality. You put on the headset, and Maya is standing there. You talk to her. She talks back. She moves around the furniture you actually own.

Watch the demo above to see it in action.

What Makes This Different

The demo showcases three things working together that rarely do:

Conversational depth. Maya is not running a script. When asked about herself, she gives a genuinely nuanced answer: acknowledging contradictions in her own personality, asking questions back, and responding naturally to whatever the user says. This is Convai's conversational AI running live on the Quest 3, with low response latency.

Spatial awareness. The avatar uses a dynamic navigation system that reads the room layout captured during the Quest 3's room setup. Maya can walk around your actual furniture, not a pre-built virtual environment. She navigates obstacles in real time.

Object interaction. Beyond navigation, Maya can recognize and interact with objects in the space — sitting on a sofa, picking up items from a table, placing objects down. The interactions are driven by the same conversational AI layer, so they happen in response to natural commands, not button presses.

The Technical Side

The avatar has been optimized for Android on Meta Quest 3, balancing the highest visual quality achievable on the platform with the performance headroom needed for real-time AI processing. Running a photorealistic avatar, a navigation system, object recognition, and a live LLM pipeline simultaneously on a standalone headset is a non-trivial engineering challenge. The demo shows it is possible.

The APK is available for download. Install it as an unknown source app on your Quest 3 and try it in your own space.

Why This Matters

The question of what AI avatars are actually for gets answered most clearly in spatial computing. A talking head on a screen is useful. An AI character that exists in your physical space, navigates around your furniture, and holds a real conversation is something categorically different.

Training simulations, guided onboarding, interactive companions, spatial customer service — all of these use cases become more compelling the moment the avatar can occupy real space rather than a flat screen.

Maya is an early demonstration of where this is going.

Also watch: How to Build Mixed Reality AI Characters in Unity with Convai on Meta Quest (2026)

Also read: How to Build Mixed Reality AI Characters in Unity with Convai

Join the Convai Community

Liked the XR AI demo with Maya? It's time to build your own!

Don't forget to subscribe to our YouTube channel for more deep dives into conversational AI and digital human technology!

Frequently Asked Questions (FAQs)

What hardware do I need to run this?

The demo is specifically optimized for the Meta Quest 3. While it may run on the Quest Pro or Quest 2, the high-quality visuals and complex spatial mapping are designed to take full advantage of the Quest 3's upgraded chipset and high-resolution passthrough sensors.

How does Maya "see" my room?

Maya utilizes the Scene Mesh and Room Setup data from your Quest 3. During your initial headset setup, you map your walls and furniture; Convai's navigation system reads this spatial data in real time, allowing Maya to treat your actual sofa as a "seat" and your coffee table as an "obstacle."
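To build intuition for how marked obstacles drive navigation, here is a toy sketch: a 2-D grid where furniture cells block movement and a breadth-first search finds a walkable route. This is purely illustrative; Convai's actual system consumes the Quest 3 scene mesh, not a hand-coded grid.

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over a grid of strings; '#' cells are obstacles."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no walkable route

room = [
    "....",
    ".##.",  # a sofa blocking the direct route
    "....",
]
path = find_path(room, (0, 0), (2, 3))
print(path)  # a detour around the sofa
```

The real system works in continuous 3-D space with a navigation mesh, but the principle is the same: room-setup data labels regions as passable or blocked, and the avatar plans routes around the blocked ones.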

Is the AI conversation pre-scripted?

No. Maya uses Convai’s generative AI pipeline. Every response is generated in real-time based on her unique personality profile and your specific input. She can be both organized and overwhelmed, just like a real person.

How do I install the demo?

Since this is an experimental XR demo, it is not currently on the official Meta Store. You must sideload the APK:

  1. Enable Developer Mode on your Quest via the Meta Horizon mobile app.
  2. Connect your headset to a PC/Mac.
  3. Use a tool like SideQuest to install the provided APK file.
  4. Locate the app in your headset under Library > Unknown Sources.
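If you prefer the command line over SideQuest, the same steps can be done with plain adb (Android Debug Bridge). This assumes Developer Mode is already enabled and the headset is connected over USB; the APK filename below is a placeholder for whatever file you downloaded.

```shell
# Confirm the Quest 3 is visible and authorized (should list as "device")
adb devices

# Sideload the demo; replace the filename with the actual downloaded APK
adb install maya-demo.apk
```

After installation, the app appears in your headset under Library > Unknown Sources, exactly as in step 4 above.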

What is the latency like during a conversation?

The demo is optimized for a low response time. This is achieved by balancing on-device processing with efficient cloud-based LLM streaming, ensuring that Maya responds almost instantly without the awkward "processing" pauses common in older AI voice systems.
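The key to avoiding "processing" pauses is streaming: instead of waiting for the complete LLM reply, speech synthesis can begin as soon as the first chunk arrives. The sketch below is illustrative only; the chunks are hard-coded stand-ins for tokens arriving over the network, and the `spoken` list stands in for audio handed to a TTS engine.

```python
def llm_stream():
    """Simulated streaming LLM reply, yielded chunk by chunk."""
    for chunk in ["Hi, ", "I'm Maya. ", "How can I help?"]:
        yield chunk

spoken = []  # stands in for audio buffers handed to a TTS engine

for chunk in llm_stream():
    spoken.append(chunk)   # playback of each chunk can start immediately
    print(f"TTS: {chunk}")  # the user hears speech before the reply is done
```

Because the first chunk is voiced while later chunks are still being generated, perceived latency is governed by time-to-first-token rather than total generation time.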

Can I build my own character like Maya?

Yes! Convai provides a Unity SDK and an Unreal Engine plugin that let developers integrate these same conversational and spatial capabilities into their own projects, pairing them with high-quality 3D avatars such as MetaHuman, Reallusion CC5 actors, and more. You can define your character's "brain," voice, and animations through the Convai Dashboard. Alternatively, you can craft these AI characters programmatically using Convai's character crafting APIs.
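As a rough sketch of what programmatic character crafting looks like, the snippet below assembles a character-creation request body. The endpoint URL and field names here are illustrative assumptions, not Convai's authoritative schema; consult Convai's API reference for the exact endpoints, headers, and fields.

```python
import json

# Hypothetical endpoint; check Convai's API docs for the real one.
CONVAI_CREATE_URL = "https://api.convai.com/character/create"

def build_character_payload(name: str, backstory: str, voice: str) -> dict:
    """Assemble an illustrative JSON body for a character-creation request.
    Field names are assumptions for the sake of the example."""
    return {
        "charName": name,
        "backstory": backstory,
        "voiceType": voice,
    }

payload = build_character_payload(
    "Maya",
    "A friendly mixed-reality companion who helps with everyday questions.",
    "female-en-US",
)
print(json.dumps(payload, indent=2))
```

In a real integration you would POST this body to the character-creation endpoint with your API key in the request headers, then reference the returned character ID from the Unity SDK or Unreal plugin.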