Use Meta XR SDKs to access and handle user input

Updated: Jan 23, 2025
This page provides an overview of the SDKs that enable you to access and handle user input in a Horizon OS app built with Unity.
Apps developed with Meta XR SDKs can access and handle input from a user’s head, hands, face, and voice using Meta Quest headsets and Touch controllers.

Core SDK

Meta XR Core SDK includes the fundamental tools and assets needed to start building XR apps for Meta Quest headsets, including the Project Setup Tool and the Meta XR Rig. It also exposes a number of low-level input features through Core SDK APIs, such as integrating virtual keyboards and reading user data directly from input sources.
For more information about Meta XR Core SDK, see the Meta XR Core SDK download page. For a detailed API reference, see the Unity Core SDK Reference.
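As a minimal sketch of reading input directly from an input source, the Core SDK's `OVRInput` class exposes controller buttons, axes, and poses. This assumes a scene that already contains an `OVRManager` (which drives `OVRInput.Update()` each frame); the class and method names here are hypothetical placeholders except for the `OVRInput` calls themselves.

```csharp
using UnityEngine;

// Hypothetical example component: polls Touch controller state each frame.
public class ControllerInputSample : MonoBehaviour
{
    void Update()
    {
        // True on the frame the A button (right controller) is first pressed.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            Debug.Log("A button pressed");
        }

        // Analog trigger value in [0, 1] for the right Touch controller.
        float trigger = OVRInput.Get(
            OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

        // Controller pose in tracking space; combine with the rig's
        // transform to place objects in world space.
        Vector3 pos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
        Quaternion rot = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);
    }
}
```

For higher-level interactions such as grabbing and poking, prefer the prefabricated components in Interaction SDK over raw polling.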

Interaction SDK

Meta XR Interaction SDK includes prefabricated, customizable interaction components that you can use to build interactive experiences for your end users.
For more information about Meta XR Interaction SDK, see Interaction SDK Overview.

Movement SDK

Meta XR Movement SDK enables you to incorporate body and face tracking into your app.
For more information about Meta XR Movement SDK, see Movement SDK Overview.

Voice SDK

Meta XR Voice SDK enables you to bring voice interactions to your app experiences.
For more information about Meta XR Voice SDK, see Voice SDK Overview.

Haptics SDK

Meta XR Haptics SDK enables you to incorporate haptic feedback into your app to create immersive and engaging experiences.
For more information about Meta XR Haptics SDK, see Haptics SDK Overview.
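For comparison, simple controller vibration is also available without the Haptics SDK through the Core SDK's `OVRInput.SetControllerVibration` call; the Haptics SDK adds clip-based, designed haptic patterns on top of this. A minimal sketch of the Core SDK approach (the component name is a hypothetical placeholder):

```csharp
using UnityEngine;

// Hypothetical example: a short vibration burst on the right Touch controller,
// using the Core SDK's basic vibration API (not the Haptics SDK clip API).
public class SimpleRumble : MonoBehaviour
{
    public void Pulse()
    {
        // frequency and amplitude are in [0, 1]; vibration runs until reset.
        OVRInput.SetControllerVibration(0.5f, 0.8f, OVRInput.Controller.RTouch);
        Invoke(nameof(Stop), 0.2f); // stop after 200 ms
    }

    void Stop()
    {
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```

For designed, clip-based haptic effects that stay in sync across controller generations, use the Haptics SDK instead of this low-level call.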