Creating Actions, Action Sets, and Suggested Bindings
This topic discusses details of actions, action sets, and suggested bindings in OpenXR Native development. Familiarize yourself with the concepts outlined in the Input API topic before proceeding.
In OpenXR, an interaction profile represents a specific class of hardware, typically a physical input device. Each profile contains a top-level path that identifies the hardware plus a list of paths for the input components that apps can bind actions against. The OpenXR specification defines each profile as a collection of paths used to set up action bindings.
The app can suggest action bindings against any profile. The profiles it lists represent the devices the app was built for and tested with. If the app runs on a system whose input device has no suggested bindings, the runtime can run the app in a compatibility mode by remapping from any interaction profile that does have suggested bindings.
A notable exception is /interaction_profiles/khr/simple_controller, which does not refer to concrete hardware. Tracked hand controllers widely support it, and it is used in simple apps with few input buttons or as a fallback in complex apps.
For example, the Khronos Group’s hello_xr sample app:
Lists that it supports a simple controller based on the core (KHR) interaction profile, for example, for input that amounts to a simple button click.
Specifies different input for different controllers, for example, when the user of a Meta Quest 2 device squeezes the grip (returning a float value rather than a boolean).
Continues suggesting bindings against profiles for other devices.
When the runtime uses input values from suggested bindings, it tries to bind those values to the action so that bindings behave the same, or as close as possible, across devices. If this isn't feasible, the input values do not update the action state. If the app suggests bindings more than once for the same interaction profile, the runtime keeps only the last successful suggestion for that profile, as indicated by an XR_SUCCESS return value.
Details on how this mechanism works in the hello_xr sample app follow.
Using Action Paths
The string form of each action path to bind against must be converted to an XrPath. Calling the xrStringToPath function returns the corresponding XrPath handle.
For all path definitions that relate to the Meta Quest Touch Controller Profile, see the relevant section below.
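As a sketch, converting the string form of a component path into an XrPath handle might look like the following (the squeezeValuePath variable name is illustrative; m_instance follows the hello_xr naming):

```cpp
// Convert the string form of a component path into an XrPath handle.
// squeezeValuePath is an illustrative name, not part of the sample.
XrPath squeezeValuePath = XR_NULL_PATH;
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/left/input/squeeze/value", &squeezeValuePath));
```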
Suggest Bindings
The app suggests action bindings against the simple (KHR) controller profile by calling xrSuggestInteractionProfileBindings. This default profile is expected to work on many devices that support OpenXR, yet it is still only a suggestion, or hint, to the runtime. For example, here are suggested bindings for some actions.
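A hedged sketch of such a call, loosely following the hello_xr pattern (the selectClickPath variables are illustrative; binding a float action such as grabAction to a boolean select/click input is permitted, and the runtime converts the value):

```cpp
// Look up the interaction profile path and the component paths to bind.
XrPath simpleProfilePath = XR_NULL_PATH;
CHECK_XRCMD(xrStringToPath(m_instance, "/interaction_profiles/khr/simple_controller", &simpleProfilePath));
XrPath selectClickPathLeft = XR_NULL_PATH, selectClickPathRight = XR_NULL_PATH;
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/left/input/select/click", &selectClickPathLeft));
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/right/input/select/click", &selectClickPathRight));

// Suggest bindings for the khr/simple_controller interaction profile.
std::vector<XrActionSuggestedBinding> bindings{{m_input.grabAction, selectClickPathLeft},
                                               {m_input.grabAction, selectClickPathRight}};
XrInteractionProfileSuggestedBinding suggestedBindings{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
suggestedBindings.interactionProfile = simpleProfilePath;
suggestedBindings.suggestedBindings = bindings.data();
suggestedBindings.countSuggestedBindings = (uint32_t)bindings.size();
CHECK_XRCMD(xrSuggestInteractionProfileBindings(m_instance, &suggestedBindings));
```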
This app is built for and tested with devices from many vendors, so further suggested bindings follow in sequence.
The runtime can use any of the interaction profiles with suggested bindings. In practice, if the app suggests bindings for an interaction profile that refers to the exact hardware of the user, then that interaction profile will almost always be picked.
Action Sets
The following code first defines an action set (actionSet) and an action (grabAction), setting both to XR_NULL_HANDLE, a value that is expected to change later if everything runs smoothly.
XrActionSetCreateInfo is a struct containing information such as the name and localized name of the action set.
The localizedActionSetName value might be shown to the user in a system rebinding menu, and actionSetName might be stored by the system in a configuration file. That is why these values exist, even though the hello_xr sample app doesn't otherwise use them.
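A minimal sketch of creating such an action set, following the hello_xr naming conventions (the "gameplay" name is illustrative):

```cpp
// Create the app's action set. m_input.actionSet starts as XR_NULL_HANDLE and
// is filled in on success.
XrActionSetCreateInfo actionSetInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
strcpy_s(actionSetInfo.actionSetName, "gameplay");            // stored in config files
strcpy_s(actionSetInfo.localizedActionSetName, "Gameplay");   // shown in rebinding UIs
actionSetInfo.priority = 0;
CHECK_XRCMD(xrCreateActionSet(m_instance, &actionSetInfo, &m_input.actionSet));
```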
The app then retrieves the XrPath for two hands.
// Get the XrPath for the left and right hands - we will use them as subaction paths.
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/left", &m_input.handSubactionPath[Side::LEFT]));
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/right", &m_input.handSubactionPath[Side::RIGHT]));
Action Creation and Action State
Every app-specific action must be created explicitly. For example, the following code creates the grab-object action for the left and right hands by calling the xrCreateAction function. The call again refers to a specific action set. Notice that either hand can perform this action:
// Create an input action for grabbing objects with the left and right hands.
XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
actionInfo.actionType = XR_ACTION_TYPE_FLOAT_INPUT;
strcpy_s(actionInfo.actionName, "grab_object");
strcpy_s(actionInfo.localizedActionName, "Grab Object");
actionInfo.countSubactionPaths = uint32_t(m_input.handSubactionPath.size());
actionInfo.subactionPaths = m_input.handSubactionPath.data();
CHECK_XRCMD(xrCreateAction(m_input.actionSet, &actionInfo, &m_input.grabAction));
To retrieve the action state, the app fills in an XrActionStateGetInfo struct for each hand in a loop:
for (auto hand : {Side::LEFT, Side::RIGHT}) {
XrActionStateGetInfo getInfo{XR_TYPE_ACTION_STATE_GET_INFO};
getInfo.action = m_input.grabAction;
getInfo.subactionPath = m_input.handSubactionPath[hand];
...
}
The XrActionStateGetInfo struct supplies the action and subaction path when calling one of the xrGetActionState* functions, such as xrGetActionStateFloat.
The app polls the grabAction state and receives it as a float.
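A sketch of that polling call, continuing the loop above (getInfo is the XrActionStateGetInfo already filled in for the current hand):

```cpp
// Poll the float state of grabAction for this hand.
XrActionStateFloat grabValue{XR_TYPE_ACTION_STATE_FLOAT};
CHECK_XRCMD(xrGetActionStateFloat(m_session, &getInfo, &grabValue));
if (grabValue.isActive == XR_TRUE) {
    // grabValue.currentState holds how far the user squeezes, in [0, 1].
}
```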
Apps create XrSpace handles from pose actions. To define the position and orientation of the new space's origin, apps provide an XrPosef struct, which represents a pose (position and orientation) within the space.
Note: The app supplies a reference space every time it asks for the location of a space.
The first step is to create the space by calling the xrCreateActionSpace function. The app then attaches the action sets to the session by calling the xrAttachSessionActionSets function.
Important: Calling xrAttachSessionActionSets is essential to bind the actions and action sets to the session, but it doesn't directly affect action spaces. However, you must call this function before you can get any input. Creating action spaces is still possible after calling xrAttachSessionActionSets.
Because action sets have the instance as their parent handle while spaces have the session as theirs, there is no direct link between action sets and spaces, so a reference to the session is required.
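A sketch of both steps, assuming a pose action and hand space array named in the hello_xr style (m_input.poseAction, m_input.handSpace):

```cpp
// Create an action space for the left-hand pose action. The identity pose
// places the space origin at the tracked pose itself.
XrActionSpaceCreateInfo actionSpaceInfo{XR_TYPE_ACTION_SPACE_CREATE_INFO};
actionSpaceInfo.action = m_input.poseAction;
actionSpaceInfo.poseInActionSpace.orientation.w = 1.f;  // identity orientation
actionSpaceInfo.subactionPath = m_input.handSubactionPath[Side::LEFT];
CHECK_XRCMD(xrCreateActionSpace(m_session, &actionSpaceInfo, &m_input.handSpace[Side::LEFT]));

// Attach the action set to the session. After this call the set is immutable
// for the session and input can start flowing.
XrSessionActionSetsAttachInfo attachInfo{XR_TYPE_SESSION_ACTION_SETS_ATTACH_INFO};
attachInfo.countActionSets = 1;
attachInfo.actionSets = &m_input.actionSet;
CHECK_XRCMD(xrAttachSessionActionSets(m_session, &attachInfo));
```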
Note: Action sets are activated or deactivated each frame through the xrSyncActions function.
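A per-frame sync might be sketched as follows, again using the hello_xr naming; only action sets listed as active have their action state updated:

```cpp
// Each frame, sync the active action sets so the runtime updates action state.
// XR_NULL_PATH means the set is active for all subaction paths.
const XrActiveActionSet activeActionSet{m_input.actionSet, XR_NULL_PATH};
XrActionsSyncInfo syncInfo{XR_TYPE_ACTIONS_SYNC_INFO};
syncInfo.countActiveActionSets = 1;
syncInfo.activeActionSets = &activeActionSet;
CHECK_XRCMD(xrSyncActions(m_session, &syncInfo));
```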
Meta Quest Touch Controller Profile
The input and haptics interaction profile for the Meta Quest Touch controller follows.
Paths for both /user/hand/left and /user/hand/right:
.../input/squeeze/value
.../input/trigger/value
.../input/trigger/touch
.../input/thumbstick/x
.../input/thumbstick/y
.../input/thumbstick/click
.../input/thumbstick/touch
.../input/thumbrest/touch (not available on Quest 1 or Rift S, limited to Rift CV1 and Quest 2)
.../input/grip/pose
.../input/aim/pose
Hand specific:
Only on /user/hand/left:
.../input/x/click
.../input/x/touch
.../input/y/click
.../input/y/touch
.../input/menu/click
Only on /user/hand/right:
.../input/a/click
.../input/a/touch
.../input/b/click
.../input/b/touch
.../input/system/click (might be unavailable for app use)
Path for haptic output is .../output/haptic.
Meta Quest Touch Pro Controller Profile
This interaction profile represents the input sources and haptics on the Meta Quest Touch Pro controller. This is a superset of the existing Meta Quest Touch Controller Profile.
Additional supported paths enabled by this profile for both /user/hand/left and /user/hand/right:
.../input/thumbrest/force
.../input/stylus_fb/force
.../input/trigger/curl_fb
.../input/trigger/slide_fb
.../input/trigger/proximity_fb
.../input/thumb_fb/proximity_fb
.../output/trigger_haptic_fb
.../output/thumb_haptic_fb
Hand specific:
Only on /user/hand/left:
.../input/x/click
.../input/x/touch
.../input/y/click
.../input/y/touch
.../input/menu/click
Only on /user/hand/right:
.../input/a/click
.../input/a/touch
.../input/b/click
.../input/b/touch
.../input/system/click (might be unavailable for app use)
The code snippets in this document belong to the hello_xr sample app, which is developed by The Khronos Group Inc. and licensed under the Apache License, Version 2.0.