Haptic Feedback

Updated: Nov 11, 2025

Overview

You can trigger vibrations on Meta Quest controllers using the xrApplyHapticFeedback() function in OpenXR. You can either use simple haptics with the XrHapticVibration struct from the OpenXR core specification, or use the PCM haptics or amplitude envelope haptics extensions for more advanced haptic effects. This page describes these extensions, as well as a way to trigger localized haptics on the thumb and trigger haptic elements of Meta Quest Touch Pro controllers.
Note that changing the vibration frequency with the simple haptics API is not supported: the XrHapticVibration::frequency member is ignored. To trigger high-fidelity haptic effects with a varying vibration frequency, use the parametric haptics or PCM haptics API. Varying the vibration frequency is supported on controllers with a VCM (Voice Coil Motor), which includes the Meta Quest Touch Pro, Touch Plus, and later controllers. Of the two APIs, parametric haptics is recommended for most use cases due to its ease of use, wider availability in other OpenXR runtimes, and device-agnostic data format.

Parametric haptics

Overview

Experimental API
The parametric haptics API is being released as an experimental feature. The Meta Quest Store will not accept any products that incorporate experimental features. These features are provided on an "as-is" basis, subject to all applicable terms set forth in the Meta Platform Technologies SDK License Agreement.

Using any experimental feature requires you to configure your app and Quest device. See the Manage experimental features page for enabling experimental features on the Quest device, and the Use Link for App Development page for enabling experimental features on Meta Horizon Link.
With the parametric haptics API, you can trigger a high-fidelity vibration with an intensity and frequency that vary over time. The vibration is described in a device-agnostic format.
The parametric haptics API is available with the XR_EXTX1_haptic_parametric OpenXR extension. As this is an experimental API, it is not part of the official OpenXR specification. Instead, you can find the extx1_haptic_parametric.h header for the extension API in the meta_openxr_preview/ directory in the Meta OpenXR SDK.

Data format

A parametric haptics vibration is described by a series of amplitude points, frequency points, and transients.
  • The amplitude points describe how the intensity of the vibration changes over time.
  • The frequency points describe how the frequency of the vibration changes over time. Frequency changes are supported on Meta Quest Touch Pro, Touch Plus, and later controllers; frequency points are ignored on older controllers.
  • A transient is a short burst with a strong, “clicky” character. Transients are useful for adding a layer of distinct, discernible, and emphasized points to the resulting vibration.
A parametric haptics vibration, translated to a PCM waveform that drives a VCM
A parametric haptics vibration, translated to amplitude steps that drive an LRA
The parametric haptics data is translated to a signal that drives the haptic motor. Meta Quest Touch Pro, Touch Plus and later controllers have a voice coil motor (VCM), which is driven by a PCM waveform. Previous controllers like the Meta Quest Touch have a linear resonant actuator (LRA), which is driven by stepped amplitude changes. On these controllers, the frequency points are ignored, as they vibrate at a fixed frequency.

Setup

Include the extx1_haptic_parametric.h header file from the Meta OpenXR SDK:
#include <openxr/openxr.h>
#include <meta_openxr_preview/extx1_haptic_parametric.h>
Check that the extension is supported on the headset by calling xrGetSystemProperties() and checking XrSystemHapticParametricPropertiesEXTX1::supportsParametricHaptics. Headsets before Quest 2 do not support parametric haptics.
XrSystemHapticParametricPropertiesEXTX1 systemHapticParametricProperties{
    XR_TYPE_SYSTEM_HAPTIC_PARAMETRIC_PROPERTIES_EXTX1};
XrSystemProperties systemProperties = {XR_TYPE_SYSTEM_PROPERTIES,
                                       &systemHapticParametricProperties};
xrGetSystemProperties(instance, systemId, &systemProperties);
if (systemHapticParametricProperties.supportsParametricHaptics == XR_FALSE) {
    // Parametric haptics not supported, don't use it
}

Triggering a parametric haptics vibration

You can trigger a parametric haptics vibration by making one call to the xrApplyHapticFeedback() function and passing the amplitude points, frequency points, and transients of the entire vibration:
std::vector<XrHapticParametricPointEXT> amplitudePoints
    {{{0, 0.0f}, {4000000000, 1.0f}, {10000000000, 1.0f}}};
std::vector<XrHapticParametricPointEXT> frequencyPoints
    {{{0, 1.0f}, {6000000000, 1.0f}, {10000000000, 0.0f}}};
std::vector<XrHapticParametricTransientEXT> transients
    {{{5000000000, 1.0f, 1.0f}}};

XrHapticParametricVibrationEXTX1 vibration{XR_TYPE_HAPTIC_PARAMETRIC_VIBRATION_EXTX1};
vibration.amplitudePointCount = amplitudePoints.size();
vibration.amplitudePoints = amplitudePoints.data();
vibration.frequencyPointCount = frequencyPoints.size();
vibration.frequencyPoints = frequencyPoints.data();
vibration.transientCount = transients.size();
vibration.transients = transients.data();
vibration.minFrequencyHz = XR_FREQUENCY_UNSPECIFIED;
vibration.maxFrequencyHz = XR_FREQUENCY_UNSPECIFIED;
vibration.streamFrameType = XR_HAPTIC_PARAMETRIC_STREAM_FRAME_TYPE_NONE_EXTX1;

XrHapticActionInfo actionInfo{XR_TYPE_HAPTIC_ACTION_INFO, nullptr};
actionInfo.action = action;
actionInfo.subactionPath = subactionPath;

xrApplyHapticFeedback(session, &actionInfo, reinterpret_cast<const XrHapticBaseHeader*>(&vibration));
The values for amplitude points, frequency points, and transients range from 0.0 to 1.0. The time values for these points are in nanoseconds since the start of the haptic vibration. The first amplitude point needs to be at time 0ns. Frequency points and transients are optional.
You can either define the amplitude points, frequency points, and transients in code, or use Meta Haptics Studio, export the haptic clip as a .haptic JSON file, and then read the data from that file.

Streaming

While you can trigger a vibration by passing the entire data upfront in one call to xrApplyHapticFeedback(), in some cases you need multiple calls to xrApplyHapticFeedback() over time, in which the data is passed piece-by-piece. This is called streaming. Streaming is needed in these cases:
  • When not all of the haptic data is known upfront, and is generated on-the-fly instead.
  • When the haptic data contains more than the maximum of XR_HAPTIC_PARAMETRIC_MAX_POINTS_TRANSIENTS_EXTX1 amplitude points, frequency points, or transients.
The haptic data passed in one API call is called a haptic frame. In the initial call to the API, you pass the first frame of haptic data. Before that frame has been fully played out, you call the API again with a new frame of haptic data. The first frame needs to contain at least two amplitude points, later frames need to contain at least one. For each call, set the streamFrameType member to the appropriate frame type. The example code above does not use streaming, so the frame type is set to XR_HAPTIC_PARAMETRIC_STREAM_FRAME_TYPE_NONE_EXTX1.
Call the xrHapticParametricGetPropertiesEXTX1() function to query the minimum duration the first frame needs to have, as well as the optimal timing interval for sending subsequent frames:
PFN_xrHapticParametricGetPropertiesEXTX1 xrHapticParametricGetPropertiesEXTX1 = nullptr;
xrGetInstanceProcAddr(instance, "xrHapticParametricGetPropertiesEXTX1",
                      (PFN_xrVoidFunction*)(&xrHapticParametricGetPropertiesEXTX1));

XrHapticParametricPropertiesEXTX1 properties{XR_TYPE_HAPTIC_PARAMETRIC_PROPERTIES_EXTX1};
xrHapticParametricGetPropertiesEXTX1(session, &actionInfo, &properties);
The minimum duration the first frame needs to have is available in XrHapticParametricPropertiesEXTX1::minimumFirstFrameDuration, and the optimal timing interval for sending subsequent frames is available in XrHapticParametricPropertiesEXTX1::idealFrameSubmissionRate.
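As a sketch of how these two values might drive a submission loop, assuming both properties are reported as nanosecond durations (verify this against the extx1_haptic_parametric.h header), the following hypothetical helpers check the first frame against the minimum and compute the relative times at which each frame would be submitted:

```cpp
#include <cstdint>
#include <vector>

// Check that the first frame covers at least the runtime-reported minimum.
bool firstFrameLongEnough(int64_t firstFrameDurationNs, int64_t minimumFirstFrameNs) {
    return firstFrameDurationNs >= minimumFirstFrameNs;
}

// Submit the first frame immediately, then one frame per ideal interval
// until the total vibration duration is covered. Returns submission times
// relative to the start of the vibration.
std::vector<int64_t> frameSubmitTimesNs(int64_t totalDurationNs, int64_t idealIntervalNs) {
    std::vector<int64_t> submitTimes;
    for (int64_t t = 0; t < totalDurationNs; t += idealIntervalNs) {
        submitTimes.push_back(t);
    }
    return submitTimes;
}
```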

Absolute frequencies

The amplitude and frequency values range from 0.0 to 1.0, which are automatically mapped to the full intensity and frequency range supported by the controller.
For frequencies, you can also specify the absolute frequency range in Hertz. The absolute frequency range is specified in the first frame, and used for the entire haptic vibration. To specify the absolute frequency range, set minFrequencyHz and maxFrequencyHz to the respective values. The example code above uses the maximum frequency range supported by the controller, so both values are set to XR_FREQUENCY_UNSPECIFIED.
To query the maximum frequency range supported by the controller, call xrHapticParametricGetPropertiesEXTX1(). The supported frequency range is available in the minFrequencyHz and maxFrequencyHz members of XrHapticParametricPropertiesEXTX1.
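For example, before filling in minFrequencyHz and maxFrequencyHz, you might clamp a desired range to the range the controller reports. This is a hypothetical helper operating on plain floats, not part of the extension API:

```cpp
#include <algorithm>
#include <utility>

// Clamp a desired [min, max] frequency range (in Hz) to the range the
// controller supports, as reported by xrHapticParametricGetPropertiesEXTX1().
std::pair<float, float> clampFrequencyRangeHz(
    float desiredMinHz, float desiredMaxHz,
    float supportedMinHz, float supportedMaxHz) {
    const float minHz = std::clamp(desiredMinHz, supportedMinHz, supportedMaxHz);
    const float maxHz = std::clamp(desiredMaxHz, minHz, supportedMaxHz);
    return {minHz, maxHz};
}
```

The returned pair can then be assigned to the vibration's minFrequencyHz and maxFrequencyHz members.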

PCM Haptics

With the PCM haptics API, you can trigger a vibration that is described by a PCM (Pulse Code Modulation) waveform. For controllers with a VCM (Meta Quest Touch Pro, Touch Plus and later), the PCM waveform directly drives the haptic motor. For other controllers (Meta Quest Touch and earlier), an equivalent haptic effect is played.
The PCM haptics API is available with the XR_FB_haptic_pcm OpenXR extension. Read its specification for a detailed description of the API.
To trigger a PCM haptics vibration, pass an XrHapticPcmVibrationFB struct to xrApplyHapticFeedback(). The struct contains a buffer with the PCM waveform that describes the vibration pattern. Here is an example of how to create an XrHapticPcmVibrationFB struct and pass it to xrApplyHapticFeedback:
uint32_t samplesUsed = 0;

XrHapticPcmVibrationFB vibration{XR_TYPE_HAPTIC_PCM_VIBRATION_FB, nullptr};
vibration.bufferSize = pcmSampleBuffer.size();
vibration.buffer = pcmSampleBuffer.data();
vibration.sampleRate = sampleRateHz;
vibration.append = XR_FALSE;
vibration.samplesConsumed = &samplesUsed;

XrHapticActionInfo actionInfo{XR_TYPE_HAPTIC_ACTION_INFO, nullptr};
actionInfo.action = action;
actionInfo.subactionPath = subactionPath;

xrApplyHapticFeedback(session, &actionInfo, reinterpret_cast<const XrHapticBaseHeader*>(&vibration));
To play a vibration at a constant amplitude and frequency, fill the buffer with samples describing a single sine wave:
#include <cmath>
#include <vector>

std::vector<float>
makeSineWave(float amplitude, float frequencyHz, float sampleRateHz, size_t numSamples) {
    double phase = 0.0;

    std::vector<float> samples(numSamples);
    const double angleIncrement = frequencyHz * (2.0 * M_PI) / sampleRateHz;

    for (size_t i = 0; i < numSamples; ++i) {
        samples[i] = amplitude * std::sin(phase);
        phase = std::fmod(phase + angleIncrement, 2.0 * M_PI);
    }

    return samples;
}

const float sampleRateHz = 2000.0f;
const float amplitude = 0.5f;
const float frequencyHz = 120.0f;
const float durationSecs = 1.1f;
const auto numSamples = static_cast<size_t>(sampleRateHz * durationSecs);
const auto pcmSampleBuffer = makeSineWave(amplitude, frequencyHz, sampleRateHz, numSamples);
The above example uses a sample rate of 2000Hz for the generated haptic signal, and the system resamples the waveform to the sample rate of the controller. If you prefer to match the signal sample rate to that of the controller (saving the system the need to resample), use the sample rate returned by the xrGetDeviceSampleRateFB() function to generate the haptic data:
PFN_xrGetDeviceSampleRateFB xrGetDeviceSampleRateFB = nullptr;
xrGetInstanceProcAddr(instance, "xrGetDeviceSampleRateFB", (PFN_xrVoidFunction*)(&xrGetDeviceSampleRateFB));

XrDevicePcmSampleRateGetInfoFB sampleRateGetInfo{XR_TYPE_DEVICE_PCM_SAMPLE_RATE_GET_INFO_FB};
xrGetDeviceSampleRateFB(session, &actionInfo, &sampleRateGetInfo);
const float optimalSampleRateHz = sampleRateGetInfo.sampleRate;
If the controller does not support PCM haptics, xrGetDeviceSampleRateFB() will return 0. The returned value will change if the connected controller changes, so call this function whenever necessary to ensure your application has the latest information on the device sample rate.
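Putting the pieces together, a sketch of regenerating the waveform at the device rate might look like the following. It falls back to a fixed 2000 Hz rate when the device reports 0 (PCM haptics unsupported); the sine generator mirrors the makeSineWave example above and is repeated here so the block stands alone:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

std::vector<float> makeSineWave(float amplitude, float frequencyHz,
                                float sampleRateHz, size_t numSamples) {
    const double kTwoPi = 2.0 * 3.14159265358979323846;
    std::vector<float> samples(numSamples);
    double phase = 0.0;
    const double angleIncrement = frequencyHz * kTwoPi / sampleRateHz;
    for (size_t i = 0; i < numSamples; ++i) {
        samples[i] = amplitude * static_cast<float>(std::sin(phase));
        phase = std::fmod(phase + angleIncrement, kTwoPi);
    }
    return samples;
}

// deviceSampleRateHz would come from xrGetDeviceSampleRateFB(); 0 means
// the controller does not support PCM haptics, so fall back to 2000 Hz.
std::vector<float> makeSineAtDeviceRate(float deviceSampleRateHz, float durationSecs) {
    const float sampleRateHz = deviceSampleRateHz > 0.0f ? deviceSampleRateHz : 2000.0f;
    const auto numSamples = static_cast<size_t>(sampleRateHz * durationSecs);
    return makeSineWave(0.5f, 120.0f, sampleRateHz, numSamples);
}
```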

Amplitude Envelope

With the amplitude envelope haptics API, you can trigger a vibration with an intensity that varies over time. How the intensity changes over time is described in the amplitude envelope that is passed upfront in a single API call.
Consider the following complex analog signal.
Analog signal
The amplitude envelope of a signal is a smooth curve outlining its extremes. The amplitude envelope for the above signal would look like this:
Touch Pro amplitude envelope
The amplitude envelope haptics API is available with the XR_FB_haptic_amplitude_envelope OpenXR extension. Read its specification for a detailed description of the API.
To trigger an amplitude envelope haptics vibration, pass an XrHapticAmplitudeEnvelopeVibrationFB struct to xrApplyHapticFeedback(). The struct contains a buffer of amplitudes. Here is an example of how to create an XrHapticAmplitudeEnvelopeVibrationFB struct and pass it to xrApplyHapticFeedback:
XrHapticAmplitudeEnvelopeVibrationFB vibration{XR_TYPE_HAPTIC_AMPLITUDE_ENVELOPE_VIBRATION_FB, nullptr};
vibration.duration = durationNanoSecs;
vibration.amplitudeCount = amplitudes.size();
vibration.amplitudes = amplitudes.data();

XrHapticActionInfo actionInfo{XR_TYPE_HAPTIC_ACTION_INFO, nullptr};
actionInfo.action = action;
actionInfo.subactionPath = subactionPath;

xrApplyHapticFeedback(session, &actionInfo, reinterpret_cast<const XrHapticBaseHeader*>(&vibration));
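The amplitudes buffer is simply a list of values in the 0.0 to 1.0 range, spread evenly across the vibration's duration. As an illustration (a hypothetical helper, not part of the extension), here is one way to build an exponentially decaying envelope:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Build an amplitude envelope that decays exponentially from 1.0,
// suitable for the amplitudes buffer of XrHapticAmplitudeEnvelopeVibrationFB.
std::vector<float> makeDecayEnvelope(size_t amplitudeCount, float decayRate) {
    std::vector<float> amplitudes(amplitudeCount);
    for (size_t i = 0; i < amplitudeCount; ++i) {
        const float t = static_cast<float>(i) / static_cast<float>(amplitudeCount);
        amplitudes[i] = std::exp(-decayRate * t);
    }
    return amplitudes;
}
```

A higher decayRate produces a sharper falloff, which reads as a quick "thud" rather than a sustained buzz.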

Localized Haptics

On the Meta Quest Touch Pro controller, there are three haptic elements: one VCM (Voice Coil Motor) and two LRAs (Linear Resonant Actuators) for the thumb and trigger. The LRAs can be triggered using simple haptics only, and vibrate at a fixed frequency.
The localized haptic elements are available in the XR_FB_touch_controller_pro interaction profile. Read its specification for a detailed description.
Here is an example of how to trigger a vibration on the thumb haptic element of the left controller for one second:
// Create paths
XrPath leftHandThumbHapticPath;
XrPath rightHandThumbHapticPath;
xrStringToPath(instance, "/user/hand/left/output/haptic_thumb_fb", &leftHandThumbHapticPath);
xrStringToPath(instance, "/user/hand/right/output/haptic_thumb_fb", &rightHandThumbHapticPath);
XrPath handSubactionPaths[2] = {leftHandThumbHapticPath, rightHandThumbHapticPath};

// Create action
XrActionCreateInfo actionCreateInfo{XR_TYPE_ACTION_CREATE_INFO, nullptr};
actionCreateInfo.actionType = XR_ACTION_TYPE_VIBRATION_OUTPUT;
strcpy(actionCreateInfo.actionName, "thumb_haptic");
strcpy(actionCreateInfo.localizedActionName, "Thumb Haptic");
actionCreateInfo.countSubactionPaths = 2;
actionCreateInfo.subactionPaths = handSubactionPaths;
XrAction thumbHapticAction;
xrCreateAction(actionSet, &actionCreateInfo, &thumbHapticAction);

// Create vibration
XrHapticVibration vibration{XR_TYPE_HAPTIC_VIBRATION, nullptr};
vibration.amplitude = 1.0f;
vibration.duration = 1000000000; // 1 second in nanoseconds
vibration.frequency = XR_FREQUENCY_UNSPECIFIED;

// Create haptic action info
XrHapticActionInfo actionInfo{XR_TYPE_HAPTIC_ACTION_INFO, nullptr};
actionInfo.action = thumbHapticAction;
actionInfo.subactionPath = leftHandThumbHapticPath;

// Trigger vibration
xrApplyHapticFeedback(session, &actionInfo, reinterpret_cast<const XrHapticBaseHeader*>(&vibration));

Known Issues

There is a known limitation in the current release: if haptics are triggered on the thumb LRA and on both controllers using XR_PATH_NULL as the subactionPath of XrHapticActionInfo, the haptics play for double the duration on the right controller’s thumb LRA. To mitigate this, call xrStopHapticFeedback() on the right controller.