Template project walkthrough
Updated: Oct 7, 2025
This page walks through Meta Android Studio Plugin’s project template for Spatial SDK. It details its structure, architecture, and how to start building spatial experiences for Meta Horizon OS.
This template demonstrates the fundamental building blocks for spatial development:
- 3D scene composition: Unlike mobile apps that work on flat screens, spatial apps exist in three-dimensional space around users. You need to understand how to create environments that feel natural and comfortable.
- Asset management: 3D assets are more complex than images - they have geometry, materials, animations, and physics. Understanding how to efficiently load and manage these assets is crucial for app performance.
- Panel integration: Users still need familiar UI elements (buttons, text, videos), but now these can float in 3D space or attach to surfaces. This is your bridge between traditional app development and spatial experiences.
- VR features: Your app’s virtual content can understand and respond to the real world. Imagine apps that can place virtual furniture in real rooms or overlay information on real objects.
To start development, enable Developer Mode and connect your headset:
1. On your mobile device, open the Meta Horizon app.
2. In the app, tap the headset icon in the toolbar.
3. Your paired headset should appear at the top of the screen. Tap the headset item, which displays the model and status of your paired headset.
4. Tap Headset Settings beneath the image of your headset.
5. Tap Developer Mode.
6. Toggle Developer Mode to the on position.
7. Use a USB-C data cable to connect the headset to your computer.
8. Put on the headset.
9. In the headset, open the Quick Control menu.
10. Select Open Settings, displayed as a gear icon. Then open the Developer tab and toggle MTP Notification on.
11. When asked to allow USB debugging, select Always allow from this computer.
Important
Developer Mode is intended for development tasks such as running, debugging, and testing applications. Engaging in other activities may result in account limitations, suspension, or termination. For more information, see Content Guidelines.
Understanding the following concepts will help.
Mental model for spatial development
Think of spatial app development in categories:
- The environment: Your 3D environment (like a stage or room)
- Objects: 3D models, interactive elements (like props on the stage)
- UIs: Traditional interfaces that float in 3D space (like holographic screens)
- The user: Hand tracking, gaze, controller input (how users interact)
Unlike mobile apps where everything happens on a flat screen, spatial apps need to consider:
- Depth: Objects can be near or far from the user
- Scale: A button might be small up close or too large far away
- Comfort: Users move their heads, so your app needs to respond appropriately
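As an illustration of the scale consideration, keeping a panel at a constant *apparent* (angular) size means scaling its world-space width in proportion to its distance from the viewer. This is a generic geometric sketch, not a Spatial SDK API; the function name and parameters are invented for illustration:

```kotlin
import kotlin.math.tan

// Illustrative only: to keep a panel at a constant apparent (angular) size,
// its world width must grow in proportion to its distance from the viewer.
// width = 2 * distance * tan(angularSize / 2)
fun panelWidthFor(distanceMeters: Double, desiredAngularSizeDeg: Double): Double {
    val halfAngleRad = Math.toRadians(desiredAngularSizeDeg / 2.0)
    return 2.0 * distanceMeters * tan(halfAngleRad)
}
```

A panel two meters away needs twice the width of the same panel at one meter to look the same size to the user.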
Traditional mobile apps work with screens. Spatial apps work with 2D panels, 3D assets, and immersive environments. A well-organized structure helps manage this complexity. It enables powerful development workflows like hot reload and visual scene editing with Meta Spatial Editor.
```
Project/
├── app/
│   ├── build.gradle.kts            # Build configuration with Spatial SDK
│   ├── scenes/                     # Meta Spatial Editor project files
│   │   ├── Main.metaspatial        # Main scene project
│   │   ├── Composition/            # Exported scene composition
│   │   └── environment/            # 3D assets (models, textures, materials)
│   └── src/main/
│       ├── AndroidManifest.xml     # VR app permissions & configuration
│       ├── assets/                 # Runtime assets (IBL, licenses)
│       ├── java/.../               # Kotlin source code
│       │   ├── ImmersiveActivity.kt  # Main VR activity
│       │   └── PanelActivity.kt      # Compose-based panel
│       └── res/                    # Android resources (layouts, drawables)
├── gradle/
│   └── libs.versions.toml          # Dependency versions
└── build.gradle.kts                # Project-level build configuration
```
The scenes/ directory
- Contains your spatial scenes.
- Use Meta Spatial Editor for drag-and-drop 3D scene composition, real-time preview, and automatic optimization.
- .metaspatial files are the project files for your spatial scenes.
- Composition/ contains the exported, optimized data your app actually loads.
- environment/ holds all your 3D assets, organized by type.
The Kotlin source directory
- Contains your app’s Kotlin source code.
- In Spatial SDK, you manage immersive 3D environments and spatial interactions instead of activities and fragments.
- ImmersiveActivity.kt: Your app’s main entry point for VR. It sets up the 3D environment, manages user positioning, and handles spatial experiences.
- PanelActivity.kt: Familiar Android UI development, but these UIs now float in 3D space as panels.
The assets/ directory
- Contains your app’s runtime assets, such as shaders and textures.
- 3D environments need lighting, textures, and environmental effects that don’t exist in 2D mobile apps.
- environment.env: An image-based lighting (IBL) file that makes your 3D content look realistic by simulating how light bounces in real environments.
AndroidManifest.xml
- Contains your app’s permissions and capabilities.
- VR apps need access to cameras (for Passthrough), hand tracking, spatial positioning, and other sensors that regular mobile apps don’t use.
- This is where you enable features like hand tracking, mixed reality, and spatial audio.
- Immersive activities must declare android:configChanges to prevent activity restarts during device configuration changes. For details, see Handling configuration changes.
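A minimal sketch of that manifest declaration is shown below. The exact set of configChanges values your app needs may differ; this list is illustrative:

```xml
<!-- Illustrative only: declare configChanges on the immersive activity so
     configuration changes don't restart it mid-session. -->
<activity
    android:name=".ImmersiveActivity"
    android:configChanges="orientation|screenSize|screenLayout|keyboard|keyboardHidden|navigation|uiMode|density">
</activity>
```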
The Spatial SDK is organized into modular packages that you include based on your app’s needs. Rather than having one massive library, you pick specific packages to keep your app lightweight and focused. The template demonstrates this modular approach through its build.gradle.kts dependency declarations.
These core packages are already in the template:
- meta-spatial-sdk: The foundation package required by all spatial apps.
- meta-spatial-sdk-vr: Adds VR-specific features essential for immersive experiences.
- meta-spatial-sdk-toolkit: Pre-built components and systems for common spatial app needs such as lighting, audio, and user interactions.
This package system lets you start simple with the template’s three core packages, then add specialized functionality as your app grows. For a complete list of available packages, see Spatial SDK Packages.
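As a sketch, the corresponding dependency declarations in app/build.gradle.kts look roughly like the following. The artifact coordinates follow the SDK's published naming, but the version shown is an assumption; in the template the actual versions are managed through libs.versions.toml:

```kotlin
// Illustrative config fragment - the real template resolves versions
// through the gradle/libs.versions.toml version catalog.
dependencies {
    implementation("com.meta.spatial:meta-spatial-sdk:0.5.5")         // foundation (version assumed)
    implementation("com.meta.spatial:meta-spatial-sdk-vr:0.5.5")      // VR-specific features
    implementation("com.meta.spatial:meta-spatial-sdk-toolkit:0.5.5") // prebuilt components and systems
}
```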
Understanding the architecture
Mobile apps manage screens and user touches. Spatial apps manage immersive environments, user positioning, controller input, and environmental awareness. The template demonstrates key architectural patterns that help manage this complexity.
Template-specific architecture
The template demonstrates a dual-activity pattern:
```kotlin
// Main VR activity - manages the immersive 3D environment
class ImmersiveActivity : AppSystemActivity() {
    // Registers VR features and manages the 3D scene
    override fun registerFeatures(): List<SpatialFeature>
    override fun onSceneReady()
    override fun registerPanels(): List<PanelRegistration>
}

// Panel activity - manages panels and UIs within the immersive world
class PanelActivity : ComponentActivity() {
    // Uses Jetpack Compose for rich 2D UI
    override fun onCreate(savedInstanceState: Bundle?)
}
```
This demonstrates:
- Separation of concerns: 3D environment logic stays separate from UI logic.
- Familiar tools: You can use Jetpack Compose for UI while learning spatial concepts.
- Gradual adoption: Existing Android developers can focus on spatial-specific concepts first.
Spatial Editor integration
The template includes Meta Spatial Editor integration out of the box. Spatial Editor lets you visually compose 3D scenes and automatically export them for your app to load at runtime.
The template demonstrates this integration through:
- A pre-configured project structure with a scenes/ directory.
- A build pipeline that exports your Spatial Editor project to optimized runtime assets.
- Example code showing how to load these exported scenes.
To get started with Spatial Editor:
1. Open app/scenes/Main.metaspatial in Meta Spatial Editor.
2. Follow the Connecting Spatial Editor to your project guide for complete setup and export configuration.
3. Set up hot reload to see changes instantly in your headset.
Loading compositions at runtime
Static scenes would make your app huge and inflexible. Dynamic loading lets you have multiple scenes, update content without app updates, and manage memory efficiently.
```kotlin
private fun loadGLXF(): Job {
    gltfxEntity = Entity.create()
    return activityScope.launch {
        glXFManager.inflateGLXF(
            Uri.parse("apk:///scenes/Composition.glxf"), // Your exported scene
            rootEntity = gltfxEntity!!, // Where to attach it
            keyName = "example_key_name" // Reference name for later
        )
    }
}
```
Parameters:
- URI: Points to your exported scene file (automatically created by the build pipeline).
- rootEntity: The “parent” object that contains your entire scene.
- keyName: A name you can use later to find specific objects in the scene.
You could load different room layouts based on user preferences, or load additional content as users progress through your app.
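For example, a small helper could map a user preference to the composition URI passed into inflateGLXF. This helper and its scene file names are hypothetical, invented for illustration; only Composition.glxf exists in the template:

```kotlin
// Hypothetical helper: choose which exported composition to load based on a
// user preference. The "cozy" and "studio" scene files are illustrative only.
fun compositionUriFor(preference: String): String =
    when (preference) {
        "cozy" -> "apk:///scenes/CozyRoom.glxf"
        "studio" -> "apk:///scenes/Studio.glxf"
        else -> "apk:///scenes/Composition.glxf" // template default
    }
```

The returned string would be wrapped with Uri.parse(...) at the call site in loadGLXF.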
Entity-Component System (ECS) concepts
The template demonstrates Spatial SDK’s Entity-Component System (ECS) architecture, which is fundamental to how spatial apps work. For complete ECS fundamentals, see the Entity-Component-System (ECS) explainer.
The template shows a basic ECS concept by creating a skybox entity with multiple components (Mesh, Material, Transform).
Skybox creation (in ImmersiveActivity.kt):
```kotlin
// Create a skybox entity - demonstrates entity creation with multiple components
val skyboxEntity = Entity.create(
    listOf(
        Mesh(Uri.parse("mesh://skybox")), // Geometry component
        Material().apply { // Material component
            baseTextureAndroidResourceId = R.drawable.skydome
            unlit = true // Template shows shader overrides
        },
        Transform(Pose(Vector3(0f, 0f, 0f))) // Position component
    )
)
```
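The underlying idea, an entity is just an identifier that aggregates independent, typed components, can be illustrated with plain Kotlin. This is a minimal sketch of the pattern, not the Spatial SDK's actual API; all class names below are invented:

```kotlin
// Minimal ECS-style sketch: an entity is a container of independent
// components, looked up by type. NOT the Spatial SDK API - illustration only.
interface Component
data class MeshComp(val uri: String) : Component
data class TransformComp(val x: Float, val y: Float, val z: Float) : Component

class SimpleEntity {
    private val components = mutableMapOf<Class<out Component>, Component>()
    fun add(c: Component): SimpleEntity {
        components[c::class.java] = c // one component per type
        return this
    }
    fun <T : Component> get(type: Class<T>): T? = type.cast(components[type])
}

fun main() {
    val skybox = SimpleEntity()
        .add(MeshComp("mesh://skybox"))
        .add(TransformComp(0f, 0f, 0f))
    println(skybox.get(MeshComp::class.java)?.uri) // prints mesh://skybox
}
```

Systems then operate on whichever entities carry the components they care about, which is why the skybox above can mix geometry, material, and position freely.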
Development workflow tips
Recommended development order:
- Start with the template and get it running.
- Modify the 3D scene in Meta Spatial Editor.
- Add your UI panels using familiar Compose patterns.
- Implement interactions and business logic.
- Optimize performance and comfort.
Essential debugging tools:
Each sample builds on the concepts in this template. Based on your app goals, explore: