This documentation is no longer being updated and is subject to removal.
This guide helps you integrate Physical Keyboard Tracking Native APIs in your Meta Quest apps by using OpenXR.
There are several components that you must set up to provide users with a rich tracked keyboard experience. These components are:
Tracked Keyboard
Hands
Passthrough
The Tracked Keyboard is the keyboard itself in virtual space. This component looks for a keyboard in physical space and tries to match it with the user’s selected keyboard. If the component tracks a physical keyboard, it will render it in virtual space.
The Hands component assists in tracking and displaying hand models in virtual space that correspond to the user’s actual hands.
Passthrough allows the user's real hands to be seen in virtual space by using passthrough camera layers. When the hands are not near the keyboard, they are rendered as VR models. When the hands are near the keyboard, rendering switches to Passthrough mode, which displays the user's actual hands.
Make sure to use the latest version of the Meta Quest operating system and the Oculus OpenXR Mobile SDK. To verify this, do the following:
In the headset, go to Settings > System > Software Update.
Check the version.
If the version is not 37 or higher, update the software to the latest available version.
Android Setup
The AndroidManifest.xml file requires the following features and permissions to unlock the essential functionality for rendering physically tracked keyboards.
<!-- Tell the system this app can render passthrough -->
<uses-feature android:name="com.oculus.feature.PASSTHROUGH" android:required="true" />
<!-- Tell the system this app uses render model extensions -->
<uses-feature android:name="com.oculus.feature.RENDER_MODEL" android:required="true" />
<uses-permission android:name="com.oculus.permission.RENDER_MODEL" />
<!-- Tell the system this app can handle tracked keyboards -->
<uses-feature android:name="oculus.software.trackedkeyboard" android:required="false" />
<uses-permission android:name="com.oculus.permission.TRACKED_KEYBOARD" />
Your Android project files must include a NativeActivity that loads the OpenXR loader library manually, as in the following example:
public class MainActivity extends android.app.NativeActivity {
    static {
        System.loadLibrary("openxr_loader");
    }
}
Native Integration
Using XrApp
We recommend that your main application class inherit from XrApp. This gives you a good starting point and access to many helpful methods and objects, including the XrInstance object, which you can use in many OpenXR API calls.
class XrExampleApp : public OVRFW::XrApp
By deriving from this class, you can retrieve the XrInstance by calling GetInstance() or the XrSession by calling GetSession().
Override the GetExtensions method of XrApp and append the extensions this feature requires to the base list, to ensure all essential extensions are requested:
// Returns a list of OpenXR extensions needed for this app
virtual std::vector<const char*> GetExtensions() override {
    std::vector<const char*> extensions = XrApp::GetExtensions();
    extensions.push_back(XR_FB_KEYBOARD_TRACKING_EXTENSION_NAME);
    extensions.push_back(XR_FB_RENDER_MODEL_EXTENSION_NAME);
    extensions.push_back(XR_FB_PASSTHROUGH_KEYBOARD_HANDS_EXTENSION_NAME);
    extensions.push_back(XR_FB_PASSTHROUGH_EXTENSION_NAME);
    extensions.push_back(XR_FB_TRIANGLE_MESH_EXTENSION_NAME);
    extensions.push_back(XR_EXT_HAND_TRACKING_EXTENSION_NAME);
    extensions.push_back(XR_FB_HAND_TRACKING_MESH_EXTENSION_NAME);
    extensions.push_back(XR_FB_HAND_TRACKING_AIM_EXTENSION_NAME);
    extensions.push_back(XR_FB_COMPOSITION_LAYER_ALPHA_BLEND_EXTENSION_NAME);
    extensions.push_back(kbdExtension);
    return extensions;
}
Initialization
Use the XrInstance in calls to xrGetInstanceProcAddr to retrieve the function pointers for each component extension. To initialize keyboard tracking with OpenXR, you must retrieve the following functions:
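A sketch of that hookup, following the same xrGetInstanceProcAddr pattern used for the other extensions in this guide (the trailing-underscore member pointer names are this example's convention):

```cpp
/// Hook up extensions for keyboard tracking
oxr(xrGetInstanceProcAddr(
    instance,
    "xrQuerySystemTrackedKeyboardFB",
    (PFN_xrVoidFunction*)(&xrQuerySystemTrackedKeyboardFB_)));
oxr(xrGetInstanceProcAddr(
    instance,
    "xrCreateKeyboardSpaceFB",
    (PFN_xrVoidFunction*)(&xrCreateKeyboardSpaceFB_)));
```

These two entry points, defined by the XR_FB_keyboard_tracking extension, are the ones used by the query and tracking examples later in this guide.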
Similarly, to allow users to see their hands through passthrough when using a tracked keyboard, you must initialize the hand tracking and hand rendering functions by using xrGetInstanceProcAddr. For example:
/// Hook up extensions for hand tracking
oxr(xrGetInstanceProcAddr(
    instance, "xrCreateHandTrackerEXT", (PFN_xrVoidFunction*)(&xrCreateHandTrackerEXT_)));
oxr(xrGetInstanceProcAddr(
    instance, "xrDestroyHandTrackerEXT", (PFN_xrVoidFunction*)(&xrDestroyHandTrackerEXT_)));
oxr(xrGetInstanceProcAddr(
    instance, "xrLocateHandJointsEXT", (PFN_xrVoidFunction*)(&xrLocateHandJointsEXT_)));
/// Hook up extensions for hand rendering
oxr(xrGetInstanceProcAddr(
    instance, "xrGetHandMeshFB", (PFN_xrVoidFunction*)(&xrGetHandMeshFB_)));
Finally, add the extensions for render model assistance:
/// Hook up extensions for render models
oxr(xrGetInstanceProcAddr(
    instance,
    "xrEnumerateRenderModelPathsFB",
    (PFN_xrVoidFunction*)(&xrEnumerateRenderModelPathsFB_)));
oxr(xrGetInstanceProcAddr(
    instance,
    "xrGetRenderModelPropertiesFB",
    (PFN_xrVoidFunction*)(&xrGetRenderModelPropertiesFB_)));
oxr(xrGetInstanceProcAddr(
    instance, "xrLoadRenderModelFB", (PFN_xrVoidFunction*)(&xrLoadRenderModelFB_)));
You can initialize and manage these extensions however you wish. We recommend breaking them apart into helper classes that own their individual responsibilities, although this is not a requirement.
Query System for Keyboard Tracking Info
To receive tracked keyboard information so that you can update your keyboard model, you must query the system. The query tells you whether a keyboard exists and, if so, whether it can be tracked. Later on, you can also query for updated state values of the keyboard.
if (xrQuerySystemTrackedKeyboardFB_) {
    // current query
    {
        XrKeyboardTrackingQueryFB queryInfo{XR_TYPE_KEYBOARD_TRACKING_QUERY_FB};
        queryInfo.flags = XR_KEYBOARD_TRACKING_QUERY_LOCAL_BIT_FB;
        XrKeyboardTrackingDescriptionFB desc;
        if (oxr(xrQuerySystemTrackedKeyboardFB_(session_, &queryInfo, &desc))) {
            if ((desc.flags & XR_KEYBOARD_TRACKING_EXISTS_BIT_FB) != 0) {
                // found keyboard
                if (!systemKeyboardExists_ ||
                    systemKeyboardDesc_.trackedKeyboardId != desc.trackedKeyboardId ||
                    systemKeyboardDesc_.flags != desc.flags) {
                    ALOG(
                        "Found new system keyboard '%d' '%s'",
                        desc.trackedKeyboardId,
                        desc.name);
                    systemKeyboardExists_ = true;
                    systemKeyboardDesc_ = desc;
                    systemKeyboardConnected_ =
                        systemKeyboardDesc_.flags & XR_KEYBOARD_TRACKING_CONNECTED_BIT_FB;
                    if ((systemKeyboardDesc_.flags & XR_KEYBOARD_TRACKING_LOCAL_BIT_FB)) {
                        trackingSystemKeyboard_ = false;
                        if (trackSystemKeyboard_) {
                            if (systemKeyboardConnected_ ||
                                !requireKeyboardConnectedToTrack_) {
                                if (StartTrackingSystemKeyboard()) {
                                    trackingSystemKeyboard_ = true;
                                }
                            }
                        }
                        if (!trackingSystemKeyboard_) {
                            StopTracking();
                        }
                    } else {
                        ALOG(
                            "Found new system keyboard '%d' '%s', but not tracking because it isn't local",
                            desc.trackedKeyboardId,
                            desc.name);
                    }
                    systemKeyboardStateChanged_ = true;
                }
            } else {
                // no keyboard
                if (systemKeyboardExists_) {
                    systemKeyboardExists_ = false;
                    if (trackSystemKeyboard_) {
                        StopTracking();
                        trackingSystemKeyboard_ = false;
                    }
                    systemKeyboardStateChanged_ = true;
                }
            }
        }
    }
}
After the query, you can locate the keyboard space each frame to get the latest keyboard pose:
if (keyboardSpace_ != XR_NULL_HANDLE) {
    location_.next = nullptr;
    return oxr(
        xrLocateSpace(keyboardSpace_, currentSpace, predictedDisplayTime, &location_));
}
Start and Stop Tracking
The following code is an example of how you can tell the system to start and stop keyboard tracking.
bool StartTrackingSystemKeyboard() {
    /// delete old ...
    StopTracking();
    if (xrCreateKeyboardSpaceFB_ && systemKeyboardExists_) {
        XrKeyboardSpaceCreateInfoFB createInfo{XR_TYPE_KEYBOARD_SPACE_CREATE_INFO_FB};
        createInfo.trackedKeyboardId = systemKeyboardDesc_.trackedKeyboardId;
        if (XR_SUCCEEDED(
                oxr(xrCreateKeyboardSpaceFB_(session_, &createInfo, &keyboardSpace_)))) {
            size_ = systemKeyboardDesc_.size;
            return true;
        }
    }
    return false;
}

bool StopTracking() {
    bool result = false;
    if (keyboardSpace_ != XR_NULL_HANDLE) {
        result = oxr(xrDestroySpace(keyboardSpace_));
        if (result) {
            keyboardSpace_ = XR_NULL_HANDLE;
        } else {
            ALOG("Failed to destroy keyboardSpace_ %p", keyboardSpace_);
        }
    }
    return result;
}
Loading Keyboard Render Model
To load the keyboard render model, you must first enumerate all available render model paths. This involves making a few calls to retrieve the paths and their properties as shown in the example below.
XrRenderModelPropertiesFB (initialized with the XR_TYPE_RENDER_MODEL_PROPERTIES_FB structure type) is the struct that xrGetRenderModelPropertiesFB_ populates.
/// Enumerate available models
XrInstance instance = GetInstance();
if (xrEnumerateRenderModelPathsFB_) {
    /// Query path count
    uint32_t pathCount = 0;
    oxr(xrEnumerateRenderModelPathsFB_(session_, pathCount, &pathCount, nullptr));
    if (pathCount > 0) {
        XRLOG("XrRenderModelHelper: found %u models ", pathCount);
        paths_.resize(pathCount);
        /// Fill in the path data
        oxr(xrEnumerateRenderModelPathsFB_(session_, pathCount, &pathCount, &paths_[0]));
        /// Get properties
        for (const auto& p : paths_) {
            XrRenderModelPropertiesFB prop{XR_TYPE_RENDER_MODEL_PROPERTIES_FB};
            XrResult result = xrGetRenderModelPropertiesFB_(session_, p.path, &prop);
            if (result == XR_SUCCESS) {
                properties_.push_back(prop);
            }
        }
    }
}
Once the paths are discovered, you can query the keyboard render model(s). First, execute a two-call pattern to get the buffer for the render model(s). The first call retrieves the buffer size and the second call retrieves the buffer.
Once a path's properties are retrieved, the modelKey can be passed into an XrRenderModelLoadInfoFB object and used to load the model itself by calling xrLoadRenderModelFB_. Below is an example of this use.
std::vector<uint8_t> buffer;
XrInstance instance = GetInstance();
for (const auto& p : paths_) {
    char buf[256];
    uint32_t bufCount = 0;
    // OpenXR two call pattern: first call gets buffer size, second call gets the buffer
    // data
    oxr(xrPathToString(instance, p.path, bufCount, &bufCount, nullptr));
    oxr(xrPathToString(instance, p.path, bufCount, &bufCount, &buf[0]));
    std::string pathString = buf;
    if (pathString.rfind("/model_fb/keyboard", 0) == 0) {
        XrRenderModelPropertiesFB prop{XR_TYPE_RENDER_MODEL_PROPERTIES_FB};
        XrResult result = xrGetRenderModelPropertiesFB_(session_, p.path, &prop);
        if (result == XR_SUCCESS) {
            if (prop.modelKey != XR_NULL_RENDER_MODEL_KEY_FB) {
                XrRenderModelLoadInfoFB loadInfo = {XR_TYPE_RENDER_MODEL_LOAD_INFO_FB};
                loadInfo.modelKey = prop.modelKey;
                XrRenderModelBufferFB rmb{XR_TYPE_RENDER_MODEL_BUFFER_FB};
                rmb.next = nullptr;
                rmb.bufferCapacityInput = 0;
                rmb.buffer = nullptr;
                if (oxr(xrLoadRenderModelFB_(session_, &loadInfo, &rmb))) {
                    XRLOG(
                        "Loading modelKey %u size %u ",
                        prop.modelKey,
                        rmb.bufferCountOutput);
                    buffer.resize(rmb.bufferCountOutput);
                    rmb.buffer = (uint8_t*)buffer.data();
                    rmb.bufferCapacityInput = rmb.bufferCountOutput;
                    if (!oxr(xrLoadRenderModelFB_(session_, &loadInfo, &rmb))) {
                        XRLOG(
                            "FAILED to load modelKey %u on pass 2",
                            prop.modelKey);
                        buffer.resize(0);
                    }
                }
            }
        }
    }
}
If the above code runs successfully, you will have a data buffer containing the raw render model data for the user’s selected keyboard (if supported). This raw data will come from the model file and you must parse it before using it as seen in the following section.
Parsing and Rendering
To render the keyboard models, you must first parse the data. Because all keyboard models are delivered in the binary glTF (*.glb) format, you can call a glb parsing method directly.
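As a self-contained illustration of the *.glb container layout (this helper is hypothetical, not part of the SDK), a binary glTF file starts with a 12-byte header: the magic 0x46546C67 ("glTF"), the container version, and the total file length, followed by length-prefixed JSON and BIN chunks. A minimal header check looks like this:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical helper: validates the 12-byte binary-glTF header at the start
// of a raw render model buffer. Returns true when the buffer looks like GLB.
bool LooksLikeGlb(const std::vector<uint8_t>& data) {
    if (data.size() < 12) {
        return false;
    }
    uint32_t magic = 0, version = 0, length = 0;
    std::memcpy(&magic, data.data(), 4);       // 0x46546C67 == "glTF" (little-endian)
    std::memcpy(&version, data.data() + 4, 4); // glTF 2.0 uses container version 2
    std::memcpy(&length, data.data() + 8, 4);  // total file length in bytes
    return magic == 0x46546C67u && version == 2u && length == data.size();
}
```

A check like this is a cheap sanity guard before handing the buffer to a full glTF parser.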
Once the data is parsed, it may contain more than one model, and additional setup can configure all the returned models.
Finally, you can render the model by using the following method:
virtual void Render(const OVRFW::ovrApplFrameIn& in, OVRFW::ovrRendererOutput& out) override {
    if (ShowModel && KeyboardModel != nullptr) {
        for (auto& model : KeyboardModel->Models) {
            ovrDrawSurface controllerSurface;
            controllerSurface.surface = &(model.surfaces[0].surfaceDef);
            controllerSurface.modelMatrix = Transform;
            out.Surfaces.push_back(controllerSurface);
        }
    }
}
Updating
In addition to rendering the keyboard, you must track updates to the keyboard's state, such as its validity or its position in 3D space. Updating the keyboard ensures you display its correct state to the user in real time.
The following example retrieves the space and time of the given update frame. Next, it checks if the latest space and location of the queried keyboard is valid. If so, it updates the keyboard pose to the values returned by the system for the given frame.
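The validity part of that check can be sketched with the location flags that xrLocateSpace returns. This minimal sketch mirrors the relevant OpenXR flag bits locally so the snippet is self-contained; in a real app you would use the XR_SPACE_LOCATION_* constants from openxr.h:

```cpp
#include <cstdint>

// Flag bits mirrored from openxr.h for illustration.
constexpr uint64_t kOrientationValidBit = 0x00000001; // XR_SPACE_LOCATION_ORIENTATION_VALID_BIT
constexpr uint64_t kPositionValidBit    = 0x00000002; // XR_SPACE_LOCATION_POSITION_VALID_BIT

// Returns true when both the orientation and the position of the located
// keyboard space are valid, meaning the pose can be applied to the model.
bool PoseIsValid(uint64_t locationFlags) {
    const uint64_t required = kOrientationValidBit | kPositionValidBit;
    return (locationFlags & required) == required;
}
```

Only when this check passes should you copy the returned pose into the keyboard model's transform; otherwise, keep the last known good pose.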
By using the XR_FB_passthrough_keyboard_hands extension, the Meta Quest system can render the passthrough view of the hands over the keyboard. You accomplish this by creating a new passthrough layer of purpose XR_PASSTHROUGH_LAYER_PURPOSE_TRACKED_KEYBOARD_HANDS_FB. You can set the intensity of each hand through the xrPassthroughLayerSetKeyboardHandsIntensityFB function.
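A sketch of that setup, assuming passthrough has already been created and started (the passthrough_, keyboardHandsLayer_, and trailing-underscore function pointer names are this example's conventions):

```cpp
/// Create a passthrough layer dedicated to keyboard hands
XrPassthroughLayerCreateInfoFB layerInfo{XR_TYPE_PASSTHROUGH_LAYER_CREATE_INFO_FB};
layerInfo.passthrough = passthrough_; // previously created XrPassthroughFB
layerInfo.purpose = XR_PASSTHROUGH_LAYER_PURPOSE_TRACKED_KEYBOARD_HANDS_FB;
oxr(xrCreatePassthroughLayerFB_(session_, &layerInfo, &keyboardHandsLayer_));

/// Show both hands fully opaque over the keyboard
XrPassthroughKeyboardHandsIntensityFB intensity{
    XR_TYPE_PASSTHROUGH_KEYBOARD_HANDS_INTENSITY_FB};
intensity.leftHandIntensity = 1.0f;
intensity.rightHandIntensity = 1.0f;
oxr(xrPassthroughLayerSetKeyboardHandsIntensityFB_(keyboardHandsLayer_, &intensity));
```

Intensity values range from 0.0 (invisible) to 1.0 (fully visible), so you can fade the passthrough hands in and out as they approach the keyboard.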
Passthrough Window
Alternatively, you can use a GeometryRenderer to create a passthrough window cutout. Then, in the Update method, update the plane on which it should be rendered.
To support KTX2 textures, you must include the Khronos KTX library in your project and then explicitly declare the use of KTX2 by defining the SUPPORTS_KTX2 configuration parameter. This enables parsing and using KTX2 textures. Otherwise, they will be ignored.
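For reference, KTX2 files can be recognized by the 12-byte identifier that the Khronos KTX 2.0 specification requires at the start of every file. This hypothetical helper (not part of the SDK or the KTX library) shows the check:

```cpp
#include <cstddef>
#include <cstdint>

// The 12-byte identifier required at the start of every KTX2 file,
// per the Khronos KTX 2.0 specification.
static const uint8_t kKtx2Identifier[12] = {
    0xAB, 0x4B, 0x54, 0x58, 0x20, 0x32, 0x30, 0xBB, 0x0D, 0x0A, 0x1A, 0x0A};

// Hypothetical helper: returns true when the buffer begins with the KTX2 identifier.
bool IsKtx2(const uint8_t* data, size_t size) {
    if (data == nullptr || size < sizeof(kKtx2Identifier)) {
        return false;
    }
    for (size_t i = 0; i < sizeof(kKtx2Identifier); ++i) {
        if (data[i] != kKtx2Identifier[i]) {
            return false;
        }
    }
    return true;
}
```

When SUPPORTS_KTX2 is not defined, textures matching this identifier are the ones that will be skipped.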