Add a crosshair / reticle UI so there’s a clear target in the scene
In the Convai Unity package, look for a crosshair/canvas prefab (often named something like Convai Crosshair Canvas) and drop it into your scene. The reticle gives the user a visible aim point, and it tells the system which object they're currently focused on.
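Under the hood, this kind of focus targeting usually amounts to a ray cast from the center of the camera's view. If you want a mental model of what the prefab is doing, here is an illustrative sketch (this is the general technique, not Convai's actual implementation; the class and field names are made up):

```csharp
using UnityEngine;

// Illustrative crosshair-style focus detection: cast a ray from the center
// of the camera's view each frame and remember whatever it hits.
// This is a sketch of the idea, not the Convai prefab's actual code.
public class FocusRaycaster : MonoBehaviour
{
    [SerializeField] private Camera playerCamera;     // the camera the crosshair is drawn over
    [SerializeField] private float maxDistance = 10f; // how far "this" can be

    public GameObject CurrentFocus { get; private set; }

    private void Update()
    {
        // Viewport (0.5, 0.5) is the screen center, i.e. where the reticle sits.
        Ray ray = playerCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));

        CurrentFocus = Physics.Raycast(ray, out RaycastHit hit, maxDistance)
            ? hit.collider.gameObject
            : null;
    }
}
```

Whatever the ray hits is what "this" resolves to when the user asks "What is this?"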
Now try:
Aim at a control or object in the scene
Ask: “What is this?” or “What does this part do?”
The character should answer contextually, using both what it sees and what you’ve put in Knowledge Bank.
This is how you get exchanges like:
“What’s this gauge for?”
“You’re pointing at the material removal gauge. It shows how much material is removed in a single pass.”
or
“Is this the right way to feed the board in?”
“You’re holding the board flat against the table—that’s exactly right. Always keep it flat against the bed to avoid kickback.”
Step 5: Optional – Run it on Meta Quest
If you want this experience inside a Quest headset:
Switch build target
Go to File → Build Settings / Build Profiles.
Add your active scene (remove the sample scene if it’s still there).
Select Android / Meta Quest as the build platform and click Switch Platform.
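If you'd rather script these two steps, Unity's editor API covers both. A minimal sketch (the scene path and menu name are placeholders for your project):

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Editor helper mirroring the manual steps above: set the scene list and
// switch the active build target to Android.
public static class QuestBuildTarget
{
    [MenuItem("Tools/Switch To Quest Target")]
    public static void Switch()
    {
        // Replaces the whole scene list, so any lingering sample scene is dropped.
        EditorBuildSettings.scenes = new[]
        {
            new EditorBuildSettingsScene("Assets/Scenes/Main.unity", true)
        };

        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);
    }
}
#endif
```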
Apply XR settings
Use the Meta XR Project Setup Tool or the provided setup helper (again, usually a Fix All button).
Let it configure OpenXR, Android player settings, etc.
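If you want to see what "fixing" typically means, here's a sketch of the usual Android player settings for Quest. These are common defaults, not Convai-specific requirements; treat the Fix All button as the source of truth:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Typical Android player settings for Quest builds (common defaults only).
public static class QuestPlayerSettings
{
    [MenuItem("Tools/Apply Quest Player Settings")]
    public static void Apply()
    {
        // Quest builds generally need IL2CPP + ARM64.
        PlayerSettings.SetScriptingBackend(
            BuildTargetGroup.Android, ScriptingImplementation.IL2CPP);
        PlayerSettings.Android.targetArchitectures = AndroidArchitecture.ARM64;

        // Minimum API level; check what your Meta XR SDK version requires.
        PlayerSettings.Android.minSdkVersion = (AndroidSdkVersions)29;
    }
}
#endif
```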
Build & run
Select your headset under Run Device (if supported).
Click Build & Run.
Put on your headset, look at your scene, and start talking.
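For anyone automating this step too, the scripted equivalent of Build & Run looks roughly like this (output path and scene path are examples):

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Scripted equivalent of Build & Run: build an APK and let Unity install
// and launch it on the connected headset.
public static class QuestBuildRunner
{
    [MenuItem("Tools/Build And Run On Quest")]
    public static void BuildAndRun()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" },
            locationPathName = "Builds/Jarvis.apk",
            target = BuildTarget.Android,
            options = BuildOptions.AutoRunPlayer // deploys to the run device after building
        };

        BuildPipeline.BuildPlayer(options);
    }
}
#endif
```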
Now you’ve basically turned your headset into a vision-powered AI assistant for your environment.
Design tips: making your Jarvis genuinely helpful
A few tweaks go a long way:
Keep answers short by default
Users are often standing, moving, or holding tools—no one wants a monologue.
Use follow-up questions for deeper detail.
Lead with safety
For tools, machines, labs: always mention PPE and safe posture early.
Example: “Before you turn that on, make sure your safety glasses are on.”
Encourage “show, don’t tell” questions
“Hold the part up to the camera and ask me what it’s for.”
“Point at the control you’re unsure about.”
Chunk the knowledge
One doc for controls, one for onboarding, one for troubleshooting.
Easier to maintain and easier for the model to use effectively.
Test with real phrasing
Don’t just test “What is the material removal gauge?”
Also test “What’s this thing?” / “What does this gauge do?” / “Am I using this right?”
Troubleshooting
If something feels off:
Character not responding
Double-check your API key and Character ID.
Make sure the scene has the Convai scripts/components enabled.
Can’t build for Quest
Re-run the setup tool and make sure the right scene is added to Build Settings.
Check Android / XR settings are applied.
Character gives vague answers
Tighten the Character Description (role, speaking style).
Add or refine Knowledge Bank documents.
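For instance, a tightened description might read something like this (illustrative wording only; adapt the role, style, and priorities to your own character):

```
Role: Shop-floor assistant stationed at the wood planer.
Speaking style: Friendly and concise; two sentences max unless asked for more.
Priorities: Safety first (glasses, posture), then operation, then troubleshooting.
When the user points at something, name it first, then explain what it does.
```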
Where you can use this
Once the pipeline is working, you can reuse the same pattern for: