
Getting started

Updated: Oct 1, 2025
After completing this section, you should:
  1. Understand how to set up your Unreal project to use the Movement SDK, including the use of Link.
  2. Understand how to set up permissions for eye tracking, face tracking, or body tracking.
To set up a Movement SDK feature (body, face, or eye tracking) on your Unreal project for Meta Quest, you must first properly configure your project for Unreal development.

Unreal project setup

Prerequisites

To use Movement SDK for Unreal, the following are required:
  • A Meta Quest Pro headset for eye tracking and visual-based face tracking.
  • A Meta Quest 2, Meta Quest 3, or Meta Quest Pro headset for body tracking and audio-based face tracking.
  • Horizon OS v60.0 or higher.
  • Unreal Engine 5.0 or newer installed (Unreal 5.4 recommended).
  • An installed version of the Meta XR plugin for Unreal.
  • Membership in a developer organization, with your headset configured for development (see Mobile Device Setup).
If you are not familiar with setting up an Unreal project that builds and runs on your headset, follow the Creating Your First Meta Quest VR App in Unreal Engine tutorial. It walks through creating, configuring, and building a basic Unreal project from scratch that runs on your headset. After establishing the project, confirm the following:
  • The Meta XR plugin for Unreal is installed.
  • The Meta XR Project Setup Tool has been run, with all required and recommended rules applied.
    Project Setup Tool
Note: The OpenXR, OpenXREyeTracker, OpenXRHandTracking, OpenXRMsftHandInteraction, and OpenXRViveTracker plugins are not compatible with Movement SDK and should be disabled. The Meta XR plugin provides the OpenXR compatibility.
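You can disable those plugins from Edit > Plugins in the editor, or declare them directly in your project's .uproject file. The excerpt below is a sketch: merge the entries into your existing Plugins array, and note that the Meta XR plugin's internal name (OculusXR here) is an assumption you should confirm against the plugin's .uplugin file.

```json
{
  "Plugins": [
    { "Name": "OpenXR", "Enabled": false },
    { "Name": "OpenXREyeTracker", "Enabled": false },
    { "Name": "OpenXRHandTracking", "Enabled": false },
    { "Name": "OpenXRMsftHandInteraction", "Enabled": false },
    { "Name": "OpenXRViveTracker", "Enabled": false },
    { "Name": "OculusXR", "Enabled": true }
  ]
}
```

Declaring the plugins explicitly keeps the configuration in version control, so teammates who sync the project get the same plugin state without touching the editor UI.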

OculusXRMovement plugin

Finally, you must install the OculusXRMovement plugin, which provides support for the tracking services. You can read more about the details of this plugin in the OculusXRMovement Plugin Reference. Currently, the OculusXRMovement plugin is distributed as source, so you must recompile your project to include it. Follow these steps:
  1. Download the Movement SDK Sample for Unreal.
  2. Locate the OculusXRMovement plugin within Unreal-Movement/Plugins/ and copy it over to your own project’s /Plugins/ folder.
  3. Ensure that your Unreal Project is a C++ project. You can convert it to one by adding a new C++ class to your Blueprint project by using Tools > New C++ Class.
  4. Recompile your project with a C++ IDE (such as Visual Studio).
  5. Rebuild and open your project.
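If your own game modules call into the plugin's C++ API directly (rather than only through Blueprints), they also need a module dependency in their Build.cs. This is a sketch with an assumed project name (MyProject) and an assumed module name (OculusXRMovement); verify the module name against the .uplugin file you copied into /Plugins/.

```csharp
// MyProject.Build.cs -- MyProject is a placeholder for your project name.
using UnrealBuildTool;

public class MyProject : ModuleRules
{
    public MyProject(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;

        PublicDependencyModuleNames.AddRange(new string[]
        {
            "Core", "CoreUObject", "Engine", "InputCore"
        });

        // Assumed to match the module declared in the plugin's .uplugin;
        // only needed when C++ code references the plugin's types directly.
        PrivateDependencyModuleNames.Add("OculusXRMovement");
    }
}
```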

Movement SDK configuration

Step 1: Enable Tracking Features
  1. Enable the tracking features of the Meta XR plugin in the project settings. Go to Edit > Project Settings.
  2. Scroll down to the Plugins header and select the Meta XR subheading.
  3. Under Mobile, scroll down and enable the tracking features you want to use.
    Plugins menu with body, eye, and face tracking selected
  4. Under General, make sure that XR API is set to Oculus OVRPlugin + OpenXR backend.
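These Project Settings are serialized into Config/DefaultEngine.ini, which is useful for diffing changes or keeping the configuration under version control. The section name follows Unreal's standard /Script/<Module>.<SettingsClass> scheme; the excerpt below is a sketch, and the exact key names may vary by Meta XR plugin version, so prefer toggling the checkboxes in the UI and diffing the file over hand-editing it.

```ini
; Config/DefaultEngine.ini (excerpt) -- written by the Project Settings UI
[/Script/OculusXRHMD.OculusXRHMDRuntimeSettings]
XrApi=OVRPluginOpenXR
; The body, face, and eye tracking checkboxes serialize as additional
; keys in this section; confirm their names by toggling and diffing.
```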
Step 2: Set Up Android Permissions
  1. Unreal Engine exposes a Blueprint API to manage permissions (see the Unreal Engine documentation). Your application must be granted the following permissions to use the tracking features in Movement SDK:
    • Body tracking: com.oculus.permission.BODY_TRACKING
    • Face tracking: com.oculus.permission.FACE_TRACKING and android.permission.RECORD_AUDIO
    • Eye tracking: com.oculus.permission.EYE_TRACKING
  2. In your GameMode (or GameInstance), create a String variable and name it Permissions.
  3. Make the variable an Array container by selecting the icon to the right of the variable type in the Variable Details panel.
  4. Compile, then add the relevant permissions to the Permissions array.
    Request Permission Blueprint
  5. After the BeginPlay event, add a Request Android Permissions node to trigger the permission request.
  6. Compile and Save the GameMode (or GameInstance).
    Request Permission Blueprint
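If you prefer C++ over Blueprints, the same flow can be sketched with Unreal's AndroidPermission module (add "AndroidPermission" to your module's Build.cs dependencies). AMyGameMode is a placeholder class name, and the example requests all four permissions; trim the list to the features you actually enabled.

```cpp
// MyGameMode.cpp -- AMyGameMode is a placeholder for your own GameMode.
#include "MyGameMode.h"
#if PLATFORM_ANDROID
#include "AndroidPermissionFunctionLibrary.h"
#endif

void AMyGameMode::BeginPlay()
{
    Super::BeginPlay();

#if PLATFORM_ANDROID
    // Same permissions as the Blueprint setup above.
    const TArray<FString> Permissions = {
        TEXT("com.oculus.permission.BODY_TRACKING"),
        TEXT("com.oculus.permission.FACE_TRACKING"),
        TEXT("android.permission.RECORD_AUDIO"),
        TEXT("com.oculus.permission.EYE_TRACKING")
    };

    // Only request what has not been granted yet; the OS shows the
    // system consent dialog for the rest.
    TArray<FString> ToRequest;
    for (const FString& Permission : Permissions)
    {
        if (!UAndroidPermissionFunctionLibrary::CheckPermission(Permission))
        {
            ToRequest.Add(Permission);
        }
    }
    if (ToRequest.Num() > 0)
    {
        UAndroidPermissionFunctionLibrary::AcquirePermissions(ToRequest);
    }
#endif
}
```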
If you are using PCVR and want to use Link for play-in-editor:
  1. Open your Link app and go to Settings > Developer.
  2. Enable Developer Runtime Features.
  3. Enable Eye Tracking over Meta Horizon Link.
  4. Enable Natural Facial Expressions over Meta Horizon Link.
    Developer Runtime
See Meta Horizon Link for more details.

Test your setup

Before you continue, you should have a working setup. Verify it by:
  • Saving all assets in your project (File > Save All).
  • Playing in editor by selecting VR Preview while using Link, or building, deploying, and running on your headset.
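When testing on the headset, you can inspect or pre-grant the runtime permissions over adb. The package name below is a placeholder; replace it with your app's actual Android package name.

```shell
# Replace com.example.movementsample with your app's package name.
# List the permissions the installed build has requested and been granted:
adb shell dumpsys package com.example.movementsample | grep -i permission

# Pre-grant a tracking permission for testing (it must be declared in
# the app's manifest for pm grant to succeed):
adb shell pm grant com.example.movementsample com.oculus.permission.BODY_TRACKING
```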

What’s next

Next, implement this new functionality by following the Code Walkthrough.