
Ray Interactions

Updated: Nov 4, 2025

Design Guidelines: Providing a comfortable hand tracking experience is essential for creating immersive and enjoyable apps. Refer to the Design Guidelines at the bottom of the page to learn best practices and minimize the risk of user discomfort.

Ray interactions use a ray emitted from a point you define on the hand or controller to select objects. To trigger a selection while the ray is hovering over an object, the ray interaction uses a selection mechanism specified in the RayInteractor. The selection mechanism can be anything, such as a button, a gesture, or a voice command, as long as it implements ISelector, which broadcasts the select and release events for the interaction.
Note: If you are just getting started with this Meta XR feature, we recommend that you use Building Blocks, a Unity extension for Meta XR SDKs, to quickly add features to your project.
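As a concrete illustration of the selection mechanism described above, here is a minimal sketch of a custom ISelector driven by a keyboard key. It assumes the SDK's ISelector interface (in the Oculus.Interaction namespace) exposes WhenSelected and WhenUnselected events, which the RayInteractor listens to in order to trigger select and release; verify the interface signature against the version of the SDK in your project.

```csharp
using System;
using UnityEngine;
using Oculus.Interaction; // Meta XR Interaction SDK namespace (assumed)

// Sketch of a custom selection mechanism: fires select on key press
// and release on key up. Any component implementing ISelector can be
// assigned as the Selector of a RayInteractor.
public class KeySelector : MonoBehaviour, ISelector
{
    [SerializeField] private KeyCode _key = KeyCode.Space;

    public event Action WhenSelected = delegate { };
    public event Action WhenUnselected = delegate { };

    private void Update()
    {
        if (Input.GetKeyDown(_key)) WhenSelected();
        if (Input.GetKeyUp(_key)) WhenUnselected();
    }
}
```

In practice you would rarely need this for hands or controllers, since the SDK ships selectors for pinch gestures and controller buttons; a custom ISelector is useful for mechanisms such as voice commands or editor testing.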

RayInteractor

A RayInteractor defines the origin and direction of raycasts for a ray interaction, as well as the maximum distance of the interaction. It does not provide a selection mechanism itself, so it must be paired with a Selector.
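To observe a RayInteractor at runtime, you can subscribe to its state transitions (Normal, Hover, Select). The sketch below assumes the SDK's WhenStateChanged event and InteractorStateChangeArgs type, as exposed on interactor views; check these names against your installed SDK version.

```csharp
using UnityEngine;
using Oculus.Interaction; // Meta XR Interaction SDK namespace (assumed)

// Logs each state transition of a RayInteractor, e.g. Normal -> Hover
// when the ray starts pointing at a RayInteractable, and Hover -> Select
// when the paired Selector fires its select event.
public class RayStateLogger : MonoBehaviour
{
    [SerializeField] private RayInteractor _rayInteractor;

    private void OnEnable() => _rayInteractor.WhenStateChanged += OnStateChanged;
    private void OnDisable() => _rayInteractor.WhenStateChanged -= OnStateChanged;

    private void OnStateChanged(InteractorStateChangeArgs args)
    {
        Debug.Log($"RayInteractor: {args.PreviousState} -> {args.NewState}");
    }
}
```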

RayInteractable

A RayInteractable defines the surface of the object being raycast against. Optionally, a secondary surface can be provided to raycast against during selection. One use case for this is dragging beyond the edge of a canvas, where the drag starts on the canvas and continues off of it.

Ray Interaction with Unity Canvas

RayInteractable can be combined with a PointableCanvas to enable ray interactions with Unity UI. To learn how to do this, see Create a Curved or Flat UI.

Ray Interactions with Hands

For hands, we recommend using the HandPointerPose component for specifying the origin and direction of the RayInteractor. This component uses the system-defined pointer pose.
Note: The pointer pose origin lies close to the wrist root and is not the same as the visual position. The visual position is used for the purely visual pincher mesh affordance, which is the teardrop-shaped mesh that appears between the index and thumb when performing ray interactions.

Debugging Ray Interactions

A RayInteractorDebugGizmos component can be used to visualize the ray and the interactor state of a provided RayInteractor.

Learn more

Design guidelines
