To install and run the sample:

```shell
adb uninstall com.oculus.sdk.xrface
cd XrSamples/XrFace/Projects/Android
../../../../gradlew installDebug
```
Visual means that facial movements are estimated from inward-facing cameras, optionally supplemented with microphone data. Audio means that facial movements are estimated from microphone data only.
XR_FB_face_tracking2 is an extension that provides face tracking output. It takes the images from the device's sensors as input and outputs the blendshape weights corresponding to activity in different facial regions.
We highly encourage choosing XR_FB_face_tracking2 over the deprecated XR_FB_face_tracking extension, since XR_FB_face_tracking doesn't support tongue tracking or audio-driven face tracking. If you are interested in the old XR_FB_face_tracking extension, visit the Khronos OpenXR Registry for more details.

Your app's manifest must declare the relevant features and permissions, including the RECORD_AUDIO permission if you want facial movements to be estimated from microphone audio:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android" >
  <!-- Tell the system this app can handle face tracking -->
  <uses-feature android:name="oculus.software.face_tracking" android:required="true" />
  <uses-permission android:name="com.oculus.permission.FACE_TRACKING" />
  <!-- Tell the system this app can use audio for face tracking -->
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <!-- Tell the system this app can handle eye tracking -->
  <uses-feature android:name="oculus.software.eye_tracking" android:required="true" />
  <uses-permission android:name="com.oculus.permission.EYE_TRACKING" />
  ....
</manifest>
```
com.oculus.permission.EYE_TRACKING, com.oculus.permission.FACE_TRACKING, and android.permission.RECORD_AUDIO are runtime permissions, so the application should explicitly ask the user to grant them. For details about permissions, see Runtime permissions. The following example demonstrates how this can be handled.

```java
private static final String PERMISSION_FACE_TRACKING = "com.oculus.permission.FACE_TRACKING";
private static final String PERMISSION_EYE_TRACKING = "com.oculus.permission.EYE_TRACKING";
private static final String PERMISSION_RECORD_AUDIO = "android.permission.RECORD_AUDIO";
private static final int REQUEST_CODE_PERMISSION_FACE_AND_EYE_TRACKING = 1;

@Override
protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  requestFaceAndEyeTrackingPermissionIfNeeded();
}

private void requestFaceAndEyeTrackingPermissionIfNeeded() {
  List<String> permissionsToRequest = new ArrayList<>();
  if (checkSelfPermission(PERMISSION_EYE_TRACKING) != PackageManager.PERMISSION_GRANTED) {
    permissionsToRequest.add(PERMISSION_EYE_TRACKING);
  }
  if (checkSelfPermission(PERMISSION_FACE_TRACKING) != PackageManager.PERMISSION_GRANTED) {
    permissionsToRequest.add(PERMISSION_FACE_TRACKING);
  }
  if (checkSelfPermission(PERMISSION_RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
    permissionsToRequest.add(PERMISSION_RECORD_AUDIO);
  }
  if (!permissionsToRequest.isEmpty()) {
    String[] permissionsAsArray =
        permissionsToRequest.toArray(new String[permissionsToRequest.size()]);
    requestPermissions(permissionsAsArray, REQUEST_CODE_PERMISSION_FACE_AND_EYE_TRACKING);
  }
}
```
The sample keeps its core OpenXR handles together:

```cpp
XrInstance instance;
XrSystemId system;
XrSession session;
XrSpace sceneSpace;
```

These are declared in the SampleXrFramework/Src/XrApp.h header and initialized in SampleXrFramework/Src/XrApp.cpp. To create the XrInstance, collect the extension names you need and pass them to xrCreateInstance:

```cpp
std::vector<const char*> extensions;
XrInstance instance = XR_NULL_HANDLE;
XrInstanceCreateInfo instanceCreateInfo = {XR_TYPE_INSTANCE_CREATE_INFO};
....
instanceCreateInfo.enabledExtensionCount = extensions.size();
instanceCreateInfo.enabledExtensionNames = extensions.data();
....
OXR(initResult = xrCreateInstance(&instanceCreateInfo, &instance));
```

Instance creation takes place in SampleXrFramework/Src/XrApp.cpp. The session and stage space handles are declared as:

```cpp
#include <openxr/openxr.h>

XrSession Session;
XrSpace StageSpace;
```
These handles are declared in the SampleXrFramework/Src/XrApp.h header.

To enable face tracking, pass XR_FB_FACE_TRACKING2_EXTENSION_NAME as an extension name when creating the instance. Make sure you also enable XR_FB_EYE_TRACKING_SOCIAL_EXTENSION_NAME together with XR_FB_FACE_TRACKING2_EXTENSION_NAME, otherwise the eye-related blendshapes EYES_LOOK_* will not be provided.

Not every system supports face tracking, so after creating the XrInstance you must receive the system properties by calling the xrGetSystemProperties function to validate this. The next chain can include an XrSystemFaceTrackingProperties2FB struct that describes whether a system supports face tracking. Its definition follows.

```cpp
typedef struct XrSystemFaceTrackingProperties2FB {
    XrStructureType type;
    void* XR_MAY_ALIAS next;
    XrBool32 supportsVisualFaceTracking;
    XrBool32 supportsAudioFaceTracking;
} XrSystemFaceTrackingProperties2FB;
```
The following example demonstrates how to query support with XrSystemFaceTrackingProperties2FB.

```cpp
XrSystemFaceTrackingProperties2FB faceTrackingSystemProperties{
    XR_TYPE_SYSTEM_FACE_TRACKING_PROPERTIES2_FB};
XrSystemProperties systemProperties{
    XR_TYPE_SYSTEM_PROPERTIES, &faceTrackingSystemProperties};
OXR(xrGetSystemProperties(GetInstance(), GetSystemId(), &systemProperties));
if (faceTrackingSystemProperties.supportsAudioFaceTracking ||
    faceTrackingSystemProperties.supportsVisualFaceTracking) {
  // face tracking is supported!
}
```
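Once you know which data sources the system supports, you can decide which ones to request when creating the tracker. The following is a minimal sketch of that selection logic; it uses a stand-in enum instead of the real XrFaceTrackingDataSource2FB so it can run outside an OpenXR runtime, and the helper function is our own illustration, not part of the extension.

```cpp
#include <cassert>
#include <vector>

// Stand-ins for XR_FACE_TRACKING_DATA_SOURCE2_VISUAL_FB / _AUDIO_FB,
// so this sketch compiles without the OpenXR headers.
enum class DataSource { Visual, Audio };

// Request visual tracking (with audio as a supplement) when supported,
// and audio-only tracking otherwise.
std::vector<DataSource> PickDataSources(bool supportsVisual, bool supportsAudio) {
    std::vector<DataSource> sources;
    if (supportsVisual) sources.push_back(DataSource::Visual);
    if (supportsAudio) sources.push_back(DataSource::Audio);
    return sources;
}
```

With the real extension, you would copy the chosen values into the requestedDataSources array of XrFaceTrackerCreateInfo2FB.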
If the supportsAudioFaceTracking field of the XrSystemFaceTrackingProperties2FB struct returns true, audio-driven face tracking is supported. If the supportsVisualFaceTracking field returns true, the device supports face tracking using inward-facing cameras.

Because XR_FB_face_tracking2 is an extension, you must obtain pointers to its functions at runtime; see xrGetInstanceProcAddr in the OpenXR spec. The following example demonstrates how to do this.

```cpp
PFN_xrCreateFaceTracker2FB xrCreateFaceTracker2FB_ = nullptr;
PFN_xrDestroyFaceTracker2FB xrDestroyFaceTracker2FB_ = nullptr;
PFN_xrGetFaceExpressionWeights2FB xrGetFaceExpressionWeights2FB_ = nullptr;

OXR(xrGetInstanceProcAddr(
    GetInstance(),
    "xrCreateFaceTracker2FB",
    (PFN_xrVoidFunction*)(&xrCreateFaceTracker2FB_)));
OXR(xrGetInstanceProcAddr(
    GetInstance(),
    "xrDestroyFaceTracker2FB",
    (PFN_xrVoidFunction*)(&xrDestroyFaceTracker2FB_)));
OXR(xrGetInstanceProcAddr(
    GetInstance(),
    "xrGetFaceExpressionWeights2FB",
    (PFN_xrVoidFunction*)(&xrGetFaceExpressionWeights2FB_)));
```
To create and obtain an XrFaceTracker2FB handle to a face tracker, you must call the xrCreateFaceTracker2FB function, defined as:

```cpp
XrResult xrCreateFaceTracker2FB(
    XrSession session,
    const XrFaceTrackerCreateInfo2FB* createInfo,
    XrFaceTracker2FB* faceTracker);
```

The following example demonstrates how to use xrCreateFaceTracker2FB.

```cpp
XrFaceTracker2FB faceTracker_ = XR_NULL_HANDLE;
XrFaceTrackerCreateInfo2FB createInfo{XR_TYPE_FACE_TRACKER_CREATE_INFO2_FB};
createInfo.faceExpressionSet = XR_FACE_EXPRESSION_SET2_DEFAULT_FB;
createInfo.requestedDataSourceCount = 2;
XrFaceTrackingDataSource2FB dataSources[2] = {
    XR_FACE_TRACKING_DATA_SOURCE2_VISUAL_FB,
    XR_FACE_TRACKING_DATA_SOURCE2_AUDIO_FB};
createInfo.requestedDataSources = dataSources;
OXR(xrCreateFaceTracker2FB_(GetSession(), &createInfo, &faceTracker_));
```
To use face tracking, apps must declare the com.oculus.permission.FACE_TRACKING permission in their manifest and a user must grant this permission. The returned handle can be used in calls to xrGetFaceExpressionWeights2FB immediately upon return of this call, as seen in the next section.

Allocate buffers for the blendshape weights and confidences, sized by the XR_FACE_EXPRESSION2_COUNT_FB and XR_FACE_CONFIDENCE2_COUNT_FB enums.

```cpp
float weights_[XR_FACE_EXPRESSION2_COUNT_FB] = {};
float confidence_[XR_FACE_CONFIDENCE2_COUNT_FB] = {};
```
To obtain the blendshape weights, call the xrGetFaceExpressionWeights2FB function. This function obtains the weights and confidences for the 70 blendshapes that are tracked by the face tracker at a given point in time. Its definition follows:

```cpp
XrResult XRAPI_CALL xrGetFaceExpressionWeights2FB(
    XrFaceTracker2FB faceTracker,
    const XrFaceExpressionInfo2FB* expressionInfo,
    XrFaceExpressionWeights2FB* expressionWeights);
```
The XrFaceExpressionInfo2FB struct is an xrGetFaceExpressionWeights2FB function parameter that describes the time at which face expressions are being requested. Callers should request a time equal to the predicted display time for the rendered frame. The system will return the value at the closest possible timestamp to the requested one, and the timestamp of the estimation is always provided so that the caller can determine to what extent the system was able to fulfill the request. The system will employ appropriate modeling to provide expressions for this time. The definition of the XrFaceExpressionInfo2FB struct follows.

```cpp
typedef struct XrFaceExpressionInfo2FB {
    XrStructureType type;
    const void* XR_MAY_ALIAS next;
    XrTime time;
} XrFaceExpressionInfo2FB;
```
The XrFaceExpressionWeights2FB struct is an xrGetFaceExpressionWeights2FB function parameter that contains arrays describing the face tracking blendshape weights and confidences. Its definition follows.

```cpp
typedef struct XrFaceExpressionWeights2FB {
    XrStructureType type;
    void* XR_MAY_ALIAS next;
    uint32_t weightCount;
    float* weights;
    uint32_t confidenceCount;
    float* confidences;
    XrBool32 isValid;
    XrBool32 isEyeFollowingBlendshapesValid;
    XrFaceTrackingDataSource2FB dataSource;
    XrTime time;
} XrFaceExpressionWeights2FB;
```
The following example demonstrates how to call the xrGetFaceExpressionWeights2FB function.

```cpp
XrFaceExpressionWeights2FB expressionWeights{XR_TYPE_FACE_EXPRESSION_WEIGHTS2_FB};
expressionWeights.next = nullptr;
expressionWeights.weights = weights_;
expressionWeights.confidences = confidence_;
expressionWeights.weightCount = XR_FACE_EXPRESSION2_COUNT_FB;
expressionWeights.confidenceCount = XR_FACE_CONFIDENCE2_COUNT_FB;

XrFaceExpressionInfo2FB expressionInfo{XR_TYPE_FACE_EXPRESSION_INFO2_FB};
expressionInfo.time = GetPredictedDisplayTime();

OXR(xrGetFaceExpressionWeights2FB_(faceTracker_, &expressionInfo, &expressionWeights));

for (uint32_t i = 0; i < XR_FACE_EXPRESSION2_COUNT_FB; ++i) {
  // weights_[i] contains one specific weight
  ....
}
```
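Per-frame weights can be noisy, so many apps low-pass filter them before driving a rig. The following is a minimal, self-contained sketch of exponential smoothing over a weights buffer; the filter and its constant are an illustration of one common approach, not something the extension prescribes.

```cpp
#include <cstddef>

// Exponentially smooth newWeights into smoothed, in place.
// alpha = 1.0 keeps only the newest sample; smaller values smooth more.
void SmoothWeights(float* smoothed, const float* newWeights,
                   size_t count, float alpha) {
    for (size_t i = 0; i < count; ++i) {
        smoothed[i] = alpha * newWeights[i] + (1.0f - alpha) * smoothed[i];
    }
}
```

In a real app you would call this each frame with weights_ and XR_FACE_EXPRESSION2_COUNT_FB, keeping the smoothed buffer between frames.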
When you no longer need the face tracker, release its resources by calling the xrDestroyFaceTracker2FB function.

```cpp
OXR(xrDestroyFaceTracker2FB_(faceTracker_));
```
The XR_META_face_tracking_visemes extension can be used when you need visemes as a stop-gap solution before your rig supports blendshapes. Not every system supports visemes, so after creating the XrInstance you must receive the system properties by calling the xrGetSystemProperties function to validate this. The next chain can include an XrSystemFaceTrackingVisemesPropertiesMETA struct that describes whether a system supports visemes. Its definition follows.

```cpp
typedef struct XrSystemFaceTrackingVisemesPropertiesMETA {
    XrStructureType type;
    void* next;
    XrBool32 supportsVisemes;
} XrSystemFaceTrackingVisemesPropertiesMETA;
```
```cpp
XrSystemFaceTrackingVisemesPropertiesMETA faceTrackingVisemesSystemProperties{
    XR_TYPE_SYSTEM_FACE_TRACKING_VISEMES_PROPERTIES_META};
XrSystemProperties systemProperties{XR_TYPE_SYSTEM_PROPERTIES,
                                    &faceTrackingVisemesSystemProperties};
OXR(xrGetSystemProperties(instance, systemId, &systemProperties));
if (faceTrackingVisemesSystemProperties.supportsVisemes) {
  // visemes are supported!
}
```
To get visemes, call the xrGetFaceExpressionWeights2FB function while adding an XrFaceTrackingVisemesMETA struct to the next chain of the XrFaceExpressionWeights2FB structure. Make sure you check the validity of the returned data by checking the isValid flag after the call to xrGetFaceExpressionWeights2FB. The definition of the XrFaceTrackingVisemesMETA struct follows.

```cpp
typedef struct XrFaceTrackingVisemesMETA {
    XrStructureType type;
    const void* next;
    XrBool32 isValid;
    float visemes[XR_FACE_TRACKING_VISEME_COUNT_META];
} XrFaceTrackingVisemesMETA;
```
The following example demonstrates how to call the xrGetFaceExpressionWeights2FB function to get visemes instead of blendshapes.

```cpp
XrFaceExpressionWeights2FB expressionWeights{XR_TYPE_FACE_EXPRESSION_WEIGHTS2_FB};
expressionWeights.weightCount = 0;
expressionWeights.confidenceCount = 0;

XrFaceTrackingVisemesMETA visemeInfo{XR_TYPE_FACE_TRACKING_VISEMES_META};
expressionWeights.next = &visemeInfo;

XrFaceExpressionInfo2FB expressionInfo{XR_TYPE_FACE_EXPRESSION_INFO2_FB};
expressionInfo.time = GetPredictedDisplayTime();

OXR(xrGetFaceExpressionWeights2FB_(faceTracker_, &expressionInfo, &expressionWeights));

if (visemeInfo.isValid) {
  for (uint32_t i = 0; i < XR_FACE_TRACKING_VISEME_COUNT_META; ++i) {
    // visemeInfo.visemes[i] contains the weight of a specific viseme
  }
}
```
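If your rig animates one mouth shape at a time, a simple strategy is to pick the viseme with the largest weight. A minimal sketch, using a plain array in place of the visemes field of XrFaceTrackingVisemesMETA (the helper is our own illustration, not part of the extension):

```cpp
#include <cstddef>

// Returns the index of the largest viseme weight; with the real extension,
// the index identifies one of the visemes listed in the table below.
size_t DominantViseme(const float* visemes, size_t count) {
    size_t best = 0;
    for (size_t i = 1; i < count; ++i) {
        if (visemes[i] > visemes[best]) best = i;
    }
    return best;
}
```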
If you instead provide valid weightCount, weights, confidenceCount, and confidences values, rather than assigning 0 to weightCount and confidenceCount, the xrGetFaceExpressionWeights2FB function returns both blendshapes and visemes.

| Viseme | Phonemes | Examples |
|---|---|---|
| SIL | neutral | |
| PP | p, b, m | put, bat, mat |
| FF | f, v | fat, vat |
| TH | th | think, that |
| DD | t, d | tip, doll |
| KK | k, g | call, gas |
| CH | tS, dZ, S | chair, join, she |
| SS | s, z | sir, zeal |
| NN | n, l | lot, not |
| RR | r | red |
| AA | A: | car |
| E | e | bed |
| IH | ih | tip |
| OH | oh | toe |
| OU | ou | book |

Each viseme also has mild, emphasized, and rotated reference images (not reproduced here).
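The phoneme-to-viseme grouping in the table can be expressed as a simple lookup. The following sketch mirrors the table above; the function itself is our own illustration and not part of any extension, and unknown phonemes fall back to SIL (neutral).

```cpp
#include <map>
#include <string>

// Maps a phoneme (as written in the table above) to its viseme label.
// Unrecognized phonemes map to "SIL", the neutral mouth shape.
std::string VisemeForPhoneme(const std::string& phoneme) {
    static const std::map<std::string, std::string> kTable = {
        {"p", "PP"}, {"b", "PP"}, {"m", "PP"},
        {"f", "FF"}, {"v", "FF"},
        {"th", "TH"},
        {"t", "DD"}, {"d", "DD"},
        {"k", "KK"}, {"g", "KK"},
        {"tS", "CH"}, {"dZ", "CH"}, {"S", "CH"},
        {"s", "SS"}, {"z", "SS"},
        {"n", "NN"}, {"l", "NN"},
        {"r", "RR"},
        {"A:", "AA"}, {"e", "E"}, {"ih", "IH"},
        {"oh", "OH"}, {"ou", "OU"},
    };
    auto it = kTable.find(phoneme);
    return it == kTable.end() ? "SIL" : it->second;
}
```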