Introduction to Mixed Reality on Meta Quest
Updated: May 29, 2025
Mixed Reality (MR) fuses virtual content with your physical environment—letting digital objects bounce off walls, hide under tables, or interact with real furniture. On Meta Quest, MR builds on VR immersion and AR overlays to deliver next-level experiences.
Choosing Your Immersive Mode
Virtual Reality (VR):
Completely immerses you in a digital world.
Ideal for full-scale simulations and games where you don’t need to see your real surroundings.
Augmented Reality (AR):
Overlays digital elements onto your real-world view.
Great for heads-up displays, navigation aids, or lightweight info pop-ups.
Mixed Reality (MR):
Blends virtual objects with your environment—letting digital content collide with walls, sit on tables, or hide behind furniture.
Perfect for spatially aware experiences that truly integrate with your room.
Static MR: Seated or fixed-location experiences.
Example: A strategy game on your coffee table, or a virtual screen on your wall with friends’ avatars around you.
Dynamic MR: Room-scale or multi-room adventures.
Example: A haunted-house scenario where ghosts emerge from corners, or a fitness game tracking movement across rooms.
2D & Classic MR: Flat UIs or 2D games placed in space.
Example: Floating puzzle pieces, card games on your desk, or watching a TV show on a virtual screen.
Popular Mixed Reality Use Cases
Education:
Shared spatial anchors place historical artifacts in a classroom. Students can handle virtual fossils, dissect 3D models, and take notes in real notebooks.
Entertainment:
Transform your living room into a concert hall or sports arena. Watch live games on a massive virtual screen or join esports events with surround sound.
Fitness & Wellness:
A virtual trainer appears in your space, demonstrating exercises and counting reps. Passthrough keeps your real weights and tracked gear visible, so you can work out safely at home.
Productivity:
Virtual offices like Horizon Workrooms with keyboard tracking and hand gestures. Collaborate on 3D whiteboards, see colleagues’ avatars, and type on your physical keyboard seamlessly.
Input & Interaction
The Meta XR SDK’s Input & Interaction suite lets users engage naturally with virtual content. From low-level device setup in the Core SDK to advanced hand-tracking and gesture support, plus voice recognition, keyboard integration, and precise haptic feedback, everything works together so your MR experiences feel intuitive and responsive.

Core SDK
Initialize device, handle tracking lifecycle, and configure MR permissions.
Interaction SDK
Grab, throw, paint, or sculpt virtual objects using controllers or hand gestures.
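As a rough illustration of the idea behind gesture recognition (not the Interaction SDK's actual API), a pinch can be detected by checking whether the thumb and index fingertips come within a small threshold distance of each other. The function names and the 2 cm threshold below are hypothetical:

```python
import math

PINCH_THRESHOLD_M = 0.02  # hypothetical 2 cm pinch distance

def distance(a, b):
    """Euclidean distance between two 3D points (metres)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """A pinch registers when thumb and index fingertips nearly touch."""
    return distance(thumb_tip, index_tip) < threshold

# Fingertip positions as they might arrive from hand tracking (illustrative values):
print(is_pinching((0.10, 1.20, 0.30), (0.11, 1.20, 0.30)))  # 1 cm apart -> True
print(is_pinching((0.10, 1.20, 0.30), (0.18, 1.20, 0.30)))  # 8 cm apart -> False
```

Real hand tracking adds smoothing and hysteresis on top of a raw threshold like this, so a pinch does not flicker on and off at the boundary.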
Movement SDK
Body, face & eye tracking for avatars, fitness tracking, and expressive gestures.
Keyboard Input
Track real keyboards in MR so users can type naturally in virtual workspaces.
Voice SDK
Integrate speech recognition and conversational agents for voice commands and NPC dialog.
Haptics SDK
Deliver precise vibration patterns on controllers to enhance tactile feedback.
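Haptic effects are often authored as amplitude envelopes sampled over time. The sketch below builds a simple attack-hold-decay envelope in the 0..1 range; it illustrates the general shape of such a pattern, not the Haptics SDK's actual data format:

```python
def sample_haptic_envelope(duration_s, sample_rate_hz, attack_s, decay_s):
    """Build an amplitude envelope (0..1) for a vibration burst:
    linear ramp up over attack_s, hold at full strength, then
    linear ramp down over decay_s at the end of the burst."""
    n = int(duration_s * sample_rate_hz)
    samples = []
    for i in range(n):
        t = i / sample_rate_hz
        if t < attack_s:
            amp = t / attack_s                 # ramping up
        elif t > duration_s - decay_s:
            amp = (duration_s - t) / decay_s   # ramping down
        else:
            amp = 1.0                          # holding
        samples.append(max(0.0, min(1.0, amp)))
    return samples

# A 100 ms buzz sampled at 200 Hz, with 20 ms attack and 30 ms decay:
env = sample_haptic_envelope(duration_s=0.1, sample_rate_hz=200, attack_s=0.02, decay_s=0.03)
```

Shaping the envelope this way avoids the harsh "click" a controller motor produces when driven straight to full amplitude.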
Immersion & Spatial Audio
Our Immersion & Audio tools ensure that virtual elements not only look real but also sound real and blend seamlessly into the world around the user. Passthrough and Depth APIs keep users grounded in their surroundings, while the Audio SDK provides fully spatialized sound, bringing depth, direction, and realism to every MR scene.

Passthrough API
Render live camera feed of the room and overlay virtual content without losing real-world context.
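Conceptually, passthrough compositing alpha-blends rendered virtual pixels over the live camera feed. The sketch below shows only the per-pixel math; it is not Passthrough API code, and `blend_pixel` is a hypothetical helper:

```python
def blend_pixel(passthrough, virtual, alpha):
    """Alpha-blend one RGB pixel: out = alpha*virtual + (1-alpha)*passthrough.
    alpha=1.0 gives a fully opaque virtual pixel; alpha=0.0 keeps pure passthrough."""
    return tuple(
        round(alpha * v + (1 - alpha) * p)
        for v, p in zip(virtual, passthrough)
    )

# A semi-transparent red virtual object over a grey patch of room:
print(blend_pixel((100, 100, 100), (200, 0, 0), 0.5))  # (150, 50, 50)
```

On device this blend happens in the compositor, which is why virtual content keeps its real-world context without the app ever touching raw camera pixels.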
Depth API
Query environment depth for realistic occlusion—virtual objects appear behind real obstacles.
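The occlusion test behind depth-based rendering can be sketched as a per-ray depth comparison: a virtual fragment is hidden whenever a real surface along the same ray is closer to the camera. This is a conceptual sketch of the principle, not Depth API code:

```python
def is_occluded(virtual_depth_m, environment_depth_m):
    """A virtual fragment is hidden when a real surface is closer to the camera."""
    return environment_depth_m < virtual_depth_m

def occlusion_mask(virtual_depth, env_depth):
    """Per-pixel mask over two depth maps (rows of metres):
    True where the real environment should cover the virtual content."""
    return [
        [e < v for v, e in zip(v_row, e_row)]
        for v_row, e_row in zip(virtual_depth, env_depth)
    ]

# A virtual ball 3 m away with a real couch 1.5 m away along the same ray:
print(is_occluded(3.0, 1.5))  # True: the couch hides the ball
print(is_occluded(1.0, 1.5))  # False: the ball is in front of the couch
```

The same comparison, run per pixel against the environment depth map, is what makes virtual objects convincingly disappear behind real furniture.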
Audio SDK
Spatialize sounds so virtual elements emit audio from their exact position in your room.
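To give a feel for what spatialization involves (the Audio SDK handles this, and much more, internally), here is a toy model combining inverse-distance attenuation with a constant-power stereo pan. All names, and the 1 m reference distance, are illustrative assumptions:

```python
import math

def spatial_gains(listener_pos, source_pos, listener_right):
    """Toy spatializer: inverse-distance attenuation plus a constant-power
    left/right pan derived from the source's direction relative to the
    listener's right vector. Returns (left_gain, right_gain)."""
    delta = [s - l for s, l in zip(source_pos, listener_pos)]
    dist = math.sqrt(sum(d * d for d in delta)) or 1e-6
    attenuation = 1.0 / max(dist, 1.0)  # no gain boost inside 1 m
    # Project the direction onto the listener's right axis: -1 (left) .. +1 (right)
    side = sum(d * r for d, r in zip(delta, listener_right)) / dist
    pan = (side + 1.0) / 2.0            # 0 = full left, 1 = full right
    left = attenuation * math.cos(pan * math.pi / 2)
    right = attenuation * math.sin(pan * math.pi / 2)
    return left, right

# A source 2 m directly to the listener's right:
left, right = spatial_gains((0, 0, 0), (2, 0, 0), listener_right=(1, 0, 0))
```

A production spatializer adds head-related transfer functions, room acoustics, and occlusion, which is exactly what the SDK provides beyond this toy model.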
Scene Understanding
Scene Understanding comprises the MR Utility Kit (MRUK), Scene API, Spatial Anchor API, and Passthrough Camera API. Together, these tools let you map, query, and anchor content to the real world: detect floors, walls, and furniture; persist virtual objects across sessions; and synchronize shared anchors for multi-user alignment.

MR Utility Kit (MRUK)
High-level toolkit for spatial queries, content placement, manipulation, and sharing.
Scene API
Capture and model room geometry—walls, floors, furniture—for smart content placement.
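A typical scene-aware workflow queries detected surfaces before placing content. The sketch below shows one such check, testing whether an object's footprint fits inside a detected table top's bounding rectangle; it is a simplified stand-in for a placement query, not Scene API code:

```python
def fits_on_surface(surface_min, surface_max, center, half_extents):
    """Check that an object's 2D footprint (centre + half extents, metres)
    lies fully within a detected horizontal surface's bounding rectangle.
    Coordinates are (x, z) pairs on the surface plane."""
    return all(
        smin <= c - h and c + h <= smax
        for smin, smax, c, h in zip(surface_min, surface_max, center, half_extents)
    )

# Table top spanning x 0..1.2 m and z 0..0.8 m; the game board is 0.6 x 0.6 m:
print(fits_on_surface((0.0, 0.0), (1.2, 0.8), (0.6, 0.4), (0.3, 0.3)))  # True
print(fits_on_surface((0.0, 0.0), (1.2, 0.8), (1.1, 0.4), (0.3, 0.3)))  # False: overhangs the edge
```

Checks like this, run against the room geometry the user has captured, are what let a strategy game land squarely on the coffee table instead of floating in space.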
Spatial Anchor API
Anchor virtual content to precise real-world points for persistent experiences.
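The persistence idea behind spatial anchors can be sketched as a small store mapping anchor UUIDs to poses, so content can be restored at the same real-world spot in a later session. This toy `AnchorStore` only illustrates the bookkeeping and is entirely hypothetical, not the Spatial Anchor API:

```python
import json
import uuid

class AnchorStore:
    """Toy persistence layer: maps anchor UUIDs to poses so virtual
    content can be re-placed at the same physical spot next session."""

    def __init__(self):
        self._anchors = {}

    def save(self, position, rotation):
        """Record a pose (position xyz, rotation quaternion) and return its UUID."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = {"position": position, "rotation": rotation}
        return anchor_id

    def load(self, anchor_id):
        """Return the stored pose, or None if the anchor is unknown."""
        return self._anchors.get(anchor_id)

    def serialize(self):
        """Dump all anchors as JSON, e.g. for writing to disk."""
        return json.dumps(self._anchors)

store = AnchorStore()
aid = store.save(position=[0.5, 0.0, -1.2], rotation=[0, 0, 0, 1])
print(store.load(aid)["position"])  # [0.5, 0.0, -1.2]
```

On device, the hard part that this sketch omits is relocalization: recognising the room again so the stored pose maps back onto the same physical point.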
Passthrough Camera API
Low-level access to passthrough camera frames and metadata for custom compositing.
Multiplayer & Social
Turn solo MR into a shared adventure with our Multiplayer & Social platform. The Platform SDK connects players, manages sessions, and supports leaderboards and in-app purchases. The Avatars SDK brings realistic digital selves to life, and Colocation Discovery ensures everyone in the same room sees and interacts with the same virtual objects in real time.

Platform SDK
Invite friends, join sessions, cross-app travel, leaderboards, and monetization.
Avatars SDK
Create realistic, customizable avatars with body, face, and eye animation synced across sessions.
Colocation Discovery
Discover nearby headsets and automatically join colocated MR sessions.
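Colocated alignment rests on a shared anchor: each device expresses poses relative to an anchor both can see, so content lands in the same physical spot for everyone. The simplified 2D sketch below (translation plus yaw only, with hypothetical helper names) shows the frame conversion in principle; real colocation works in full 3D and is handled by the platform:

```python
import math

def to_anchor_frame(point, anchor_pos, anchor_yaw):
    """Express a 2D point (in a device's local frame) relative to the shared
    anchor, given the anchor's pose as seen by that device."""
    dx = point[0] - anchor_pos[0]
    dz = point[1] - anchor_pos[1]
    c, s = math.cos(-anchor_yaw), math.sin(-anchor_yaw)
    return (c * dx - s * dz, s * dx + c * dz)

def from_anchor_frame(point, anchor_pos, anchor_yaw):
    """Inverse: map an anchor-relative point into a device's local frame."""
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    x = c * point[0] - s * point[1]
    z = s * point[0] + c * point[1]
    return (x + anchor_pos[0], z + anchor_pos[1])

# Device A sees the anchor at (1, 0) with no rotation; device B sees the same
# anchor at (0, 2) rotated 90 degrees. Route a point through the anchor frame:
shared = to_anchor_frame((2.0, 0.0), (1.0, 0.0), 0.0)
in_b = from_anchor_frame(shared, (0.0, 2.0), math.pi / 2)
```

Because both devices agree on the anchor, any pose routed through its frame refers to the same physical location, which is the property colocated sessions depend on.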