
Hands Technology

Updated: Feb 19, 2026
This page provides insight into hand tracking (HT) technology: how it works, with a special focus on calibration and accuracy, its inherent technological limitations, and the measures you can take to mitigate them.

How it works

GIF showing computer vision of hands being tracked

The Hand Tracking algorithm detects and tracks hands using the cameras on the Meta Quest headset.

Hand tracking technology utilizes the sensors and cameras on the Meta Quest headset (HMD) to capture hand and finger positions and movements. This data is processed by software using algorithms, often involving machine learning and computer vision, to recognize various hand positions and movements. The system then interprets these movements as specific commands or actions, allowing user interaction with the virtual or blended environment. Various factors can influence accuracy, including hand location relative to cameras, lighting conditions, occlusions, and more. For further details, refer to the page on Hands Best Practices.
Understanding these factors is crucial for designing an immersive experience.
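The last step of the pipeline described above is interpreting tracked joint positions as commands. As a minimal illustration of that step only (the `Vec3` type, joint names, and 2 cm threshold are assumptions for this sketch, not any real SDK API), a pinch gesture can be inferred from the distance between two fingertip joints:

```kotlin
import kotlin.math.sqrt

// Illustrative only: real joint data comes from your engine's hand
// tracking API; Vec3 and the 2 cm threshold are assumptions here.
data class Vec3(val x: Float, val y: Float, val z: Float) {
    fun distanceTo(other: Vec3): Float {
        val dx = x - other.x
        val dy = y - other.y
        val dz = z - other.z
        return sqrt(dx * dx + dy * dy + dz * dz)
    }
}

// Interpret tracked joint positions as a command: report a "pinch" when
// the thumb tip and index fingertip come within a small distance (meters).
fun isPinching(thumbTip: Vec3, indexTip: Vec3, thresholdM: Float = 0.02f): Boolean =
    thumbTip.distanceTo(indexTip) < thresholdM
```

Production gesture recognizers are more robust than a single distance check (they account for tracking confidence and hysteresis), but the shape of the problem, mapping joint geometry to a discrete action, is the same.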

Hand tracking modes

Horizon OS provides developers with several distinct hand tracking modes, each tailored to different use cases and interaction requirements:
  • Default mode: Offers hand tracking at 30 Hz, striking a balance between tracking accuracy and speed that is ideal for common interactions such as user interface navigation, object grabbing, and manipulation.
  • Fast Motion Mode (FMM): For applications that require rapid hand movements, such as fitness or rhythm apps, FMM raises the tracking rate to 60 Hz, improving responsiveness for fast actions. For engine-specific implementation details, see Use Fast Motion Mode.
    • Tradeoff: The increased tracking rate can introduce jitter, which can make the user experience less enjoyable. Test your application in the default mode first and enable FMM only if you observe significant tracking loss during quick hand motions.
  • Wide Motion Mode (WMM): Designed to maintain plausible hand poses even when users’ hands move outside the headset’s field of view. This is achieved through Inside Out Body Tracking (IOBT), which estimates hand positions when direct tracking is lost. WMM is particularly beneficial for social applications, such as waving or gesturing, and for fitness or gaming experiences that involve wide, sweeping motions beyond the default tracking range. For engine-specific implementation details, see Wide Motion Mode.
  • Multimodal Mode: Enables simultaneous tracking of both hands and controllers, and can indicate whether a controller is currently held. This mode is especially useful in games where one controller emulates a fixed object (like a golf club or ping pong paddle) while the other hand is free for UI interactions, locomotion, or enhanced social presence. For engine-specific implementation details, see Multimodal Mode.

By offering these flexible tracking modes, Horizon OS lets developers optimize hand tracking for a wide variety of immersive experiences, balancing accuracy and user comfort.
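The decision among these modes can be captured in a small sketch. This enum and chooser are illustrative only (the mode names mirror this page, but this is not an SDK API, and rates are listed only where this page states them): start from Default and opt into a specialized mode only when your app's interaction profile demands it.

```kotlin
// Hypothetical sketch, not a real Horizon OS API. Rates are null where
// this page does not specify one.
enum class HandTrackingMode(val rateHz: Int?) {
    DEFAULT(30),       // balanced accuracy and speed for UI and grabbing
    FAST_MOTION(60),   // rapid movements (fitness, rhythm apps); may jitter
    WIDE_MOTION(null), // IOBT-backed pose estimates outside the field of view
    MULTIMODAL(null),  // hands and controllers tracked simultaneously
}

data class AppNeeds(
    val fastHandMotion: Boolean = false,
    val outOfViewGestures: Boolean = false,
    val usesControllers: Boolean = false,
)

// Prefer Default; enable a specialized mode only when needed,
// per the tradeoffs described in the list above.
fun chooseMode(needs: AppNeeds): HandTrackingMode = when {
    needs.usesControllers -> HandTrackingMode.MULTIMODAL
    needs.fastHandMotion -> HandTrackingMode.FAST_MOTION
    needs.outOfViewGestures -> HandTrackingMode.WIDE_MOTION
    else -> HandTrackingMode.DEFAULT
}
```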

Calibration and accuracy

Image of a woman looking at her hands showing the headset mapping her fingers
Hand input accuracy depends on precisely tracking and recognizing hand movements, with a focus on the hand's pose (position and orientation) across frames. Abrupt or shaky movements are smoothed to produce lifelike motion.
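A minimal sketch of the kind of smoothing described above is an exponential moving average that damps abrupt frame-to-frame jumps in a tracked value. Real tracking pipelines use more sophisticated filtering (per-joint, with orientation handled separately), so treat this purely as an illustration; the `alpha` parameter and `PositionSmoother` name are assumptions.

```kotlin
// Exponential moving average over a tracked position. alpha near 0 means
// heavy smoothing (laggy but stable); alpha near 1 follows raw input closely.
class PositionSmoother(private val alpha: Float = 0.3f) {
    private var smoothed: FloatArray? = null

    // Blend the new raw sample toward the running estimate.
    fun update(raw: FloatArray): FloatArray {
        val prev = smoothed
        val next = if (prev == null) raw.copyOf()
                   else FloatArray(raw.size) { i -> prev[i] + alpha * (raw[i] - prev[i]) }
        smoothed = next
        return next
    }
}
```

The tradeoff is latency versus stability: stronger smoothing removes jitter but makes the rendered hand trail the real one, which is exactly the tension the tracking modes above manage at different update rates.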
Calibration involves the device recognizing the hand and measuring the distance between specific points to set the hand’s scale, which is essential for accurately sizing the hand mesh. To enhance hand tracking quality in various conditions, the model is trained on a diverse dataset that encompasses different lighting environments, hand shapes, skin tones, distances, poses, and orientations.
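The scale-from-landmarks idea above can be sketched as follows: compare a measured distance between two hand points (say, wrist to middle fingertip) against a reference hand model to derive a uniform scale for the hand mesh. The 0.18 m reference length, the landmark choice, and the function names here are made-up placeholders, not values or APIs from the actual calibration system.

```kotlin
import kotlin.math.sqrt

// Placeholder reference length for the canonical hand mesh (assumption).
const val REFERENCE_HAND_LENGTH_M = 0.18f

// Euclidean distance between two points given as coordinate arrays.
fun distance(a: FloatArray, b: FloatArray): Float {
    var sum = 0f
    for (i in a.indices) {
        val d = a[i] - b[i]
        sum += d * d
    }
    return sqrt(sum)
}

// Uniform scale factor to apply to the reference mesh so it matches
// the measured size of the user's hand.
fun handMeshScale(wrist: FloatArray, middleTip: FloatArray): Float =
    distance(wrist, middleTip) / REFERENCE_HAND_LENGTH_M
```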

Next steps

More design resources on hands

Designing experiences

Explore more design guidelines and learn how to design great experiences for your app:

Developing experiences

For technical information, start from these development guidelines:

Meta Spatial SDK

Unity

Unreal
