Avatar Developer Guide (Retired)
Updated: Jul 21, 2021
The example project demonstrates the following:
- Using avatar classes to create and destroy UE avatar objects.
- Hooking up the lip-sync component to drive expressive features.
- Tagging objects in the scene as gaze targets for the expressive avatar’s eyes.
- Changing hand poses to custom hand poses.
- Recording local avatar movement packets and replaying the packets back on remote avatars (including voice visualizations).
Note: Meta Avatars for UE are for C++ projects. A Blueprints version is not available at this time.
Prerequisites
- Unreal Engine 4.21 or later, built from the Meta GitHub source release
- Microsoft Visual Studio 2015 or 2017 with C++
Architecture of a UE Avatar Project
Avatars for UE are implemented as a plugin. Avatars are embodied within UOvrAvatarActorComponents that you can attach to the UE actors you desire. This lets you keep your game-side code separate from our avatar implementation.
Notable files in the sample project folder (UnrealEngine\Samples\Meta\AvatarSamples) include the following:
- Config/DefaultEngine.ini: Contains the app ID and adds the Meta Horizon platform as an online subsystem. Each game has its own DefaultEngine.ini file where settings can be added and configured. When creating apps that use avatars, you must edit this file to add your App ID.
- Config/Android/AndroidEngine.ini: Contains Android-specific overrides that configure the online subsystem for Meta.
- Source/AvatarSamples/LocalAvatar.cpp and RemoteAvatar.cpp: Contain the “game-side” classes that demonstrate how to attach avatar components to actor classes.
- AvatarSamples.uproject: The Unreal project file for the sample.
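For reference, a minimal DefaultEngine.ini for an avatar-enabled app looks roughly like the following. The section and key names here follow the OnlineSubsystemOculus plugin’s conventions and the App ID is a placeholder; treat this as a sketch and check the sample’s own DefaultEngine.ini for the authoritative values:

```ini
[OnlineSubsystem]
; Route online services through the Oculus/Meta subsystem
DefaultPlatformService=Oculus

[OnlineSubsystemOculus]
bEnabled=true
; Placeholder -- replace with your own App ID from the developer dashboard
OculusAppId=0000000000000000
```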
Launching the Avatar Samples Unreal Project
- Download, build, and launch the Meta Source Distribution of the Unreal Engine.
- From the Unreal Project Browser window, click Browse and open AvatarSamples.uproject at UnrealEngine\Samples\Meta\AvatarSamples.
- Click Play > VR Preview.
- Put on your device.
You should see the hands of your avatar. This first-person view where you only see your hands is referred to as the local avatar.
The code that configures your avatar can be found in LocalAvatar.cpp:
void ALocalAvatar::LipSyncVismesReady()
{
if (UseCannedLipSyncPlayback)
{
AvatarComponent->UpdateVisemeValues(PlayBackLipSyncComponent->GetVisemes());
}
else
{
AvatarComponent->UpdateVisemeValues(LipSyncComponent->GetVisemes());
}
}
void ALocalAvatar::PreInitializeComponents()
{
Super::PreInitializeComponents();
if (UseCannedLipSyncPlayback)
{
FString playbackAssetPath = TEXT("/Game/Audio/vox_lp_01_LipSyncSequence");
auto sequence = LoadObject<UOVRLipSyncFrameSequence>(nullptr, *playbackAssetPath, nullptr, LOAD_None, nullptr);
PlayBackLipSyncComponent->Sequence = sequence;
FString AudioClip = TEXT("/Game/Audio/vox_lp_01");
auto SoundWave = LoadObject<USoundWave>(nullptr, *AudioClip, nullptr, LOAD_None, nullptr);
if (SoundWave)
{
SoundWave->bLooping = 1;
AudioComponent->Sound = SoundWave;
}
}
#if PLATFORM_WINDOWS
else
{
auto SilenceDetectionThresholdCVar = IConsoleManager::Get().FindConsoleVariable(TEXT("voice.SilenceDetectionThreshold"));
SilenceDetectionThresholdCVar->Set(0.f);
}
#endif
// TODO SW: Fetch Player Height from Meta Horizon platform?
BaseEyeHeight = 170.f;
AvatarComponent->SetVisibilityType(
AvatarVisibilityType == AvatarVisibility::FirstPerson
? ovrAvatarVisibilityFlag_FirstPerson
: ovrAvatarVisibilityFlag_ThirdPerson);
AvatarComponent->SetPlayerHeightOffset(BaseEyeHeight / 100.f);
AvatarComponent->SetExpressiveCapability(EnableExpressive);
AvatarComponent->SetBodyCapability(EnableBody);
AvatarComponent->SetHandsCapability(EnableHands);
AvatarComponent->SetBaseCapability(EnableBase);
AvatarComponent->SetBodyMaterial(GetOvrAvatarMaterialFromType(BodyMaterial));
AvatarComponent->SetHandMaterial(GetOvrAvatarMaterialFromType(HandsMaterial));
}
ALocalAvatar::ALocalAvatar()
{
RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("LocalAvatarRoot"));
PrimaryActorTick.bCanEverTick = true;
AvatarComponent = CreateDefaultSubobject<UOvrAvatar>(TEXT("LocalAvatar"));
PlayBackLipSyncComponent = CreateDefaultSubobject<UOVRLipSyncPlaybackActorComponent>(TEXT("CannedLipSync"));
AudioComponent = CreateDefaultSubobject<UAudioComponent>(TEXT("LocalAvatarAudio"));
LipSyncComponent = CreateDefaultSubobject<UOVRLipSyncActorComponent>(TEXT("LocalLipSync"));
}
void ALocalAvatar::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
LipSyncComponent->OnVisemesReady.RemoveDynamic(this, &ALocalAvatar::LipSyncVismesReady);
PlayBackLipSyncComponent->OnVisemesReady.RemoveDynamic(this, &ALocalAvatar::LipSyncVismesReady);
if (!UseCannedLipSyncPlayback)
{
LipSyncComponent->Stop();
}
Super::EndPlay(EndPlayReason);
}
void ALocalAvatar::BeginPlay()
{
Super::BeginPlay();
uint64 UserID = FCString::Strtoui64(*OculusUserId, NULL, 10);
#if PLATFORM_ANDROID
ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Three;
if (AvatarComponent)
{
AvatarComponent->RequestAvatar(UserID, lod, UseCombinedMesh);
}
#else
ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Five;
IOnlineIdentityPtr IdentityInterface = Online::GetIdentityInterface();
if (IdentityInterface.IsValid())
{
OnLoginCompleteDelegateHandle = IdentityInterface->AddOnLoginCompleteDelegate_Handle(0, FOnLoginCompleteDelegate::CreateUObject(this, &ALocalAvatar::OnLoginComplete));
IdentityInterface->AutoLogin(0);
}
#endif
if (UseCannedLipSyncPlayback)
{
PlayBackLipSyncComponent->OnVisemesReady.AddDynamic(this, &ALocalAvatar::LipSyncVismesReady);
}
else
{
LipSyncComponent->OnVisemesReady.AddDynamic(this, &ALocalAvatar::LipSyncVismesReady);
LipSyncComponent->Start();
}
}
Press the thumb sticks to cycle through the following hand poses:
- A built-in pose for gripping a sphere:
AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripSphere);
- A built-in pose for gripping a cube:
AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripCube);
- A custom hand gesture built from an array of joint transforms, gAvatarRightHandTrans:
AvatarComponent->SetCustomGesture(ovrHand_Right, gAvatarRightHandTrans, HAND_JOINTS);
- A built-in pose depicting Touch controllers:
AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_Default);
AvatarComponent->SetControllerVisibility(ovrHand_Right, true);
The code snippets above are from LocalAvatar.cpp and set the poses for the right hand. For the left hand, substitute the appropriate left hand functions and constants.
Adding Avatars to an Existing Project
Avatars are implemented as a plugin in Unreal. To add avatar support to a new or existing Unreal project, select Edit > Plugins while in the Unreal Editor. From the Plugins window, search for Meta and enable the Meta Avatar and Online Subsystem Meta plugins. The Online Subsystem Meta plugin is necessary to query and retrieve avatars from the Meta Horizon platform.
You can also add the Meta Avatar and Online Subsystem Meta plugins by editing the Modules and Plugins sections of your .uproject file. Remember to add a comma (,) after the last item in any existing Modules or Plugins sections before pasting the additional lines. This example is from AvatarSamples.uproject:
"Modules": [
{
"Name": "AvatarSamples",
"Type": "Runtime",
"LoadingPhase": "Default",
"AdditionalDependencies": [
"Engine",
"OnlineSubsystem",
"OnlineSubsystemUtils"
]
}
],
"Plugins": [
{
"Name": "OnlineSubsystemOculus",
"Enabled": true
},
{
"Name": "OculusAvatar",
"Enabled": true
}
]
Place your request to fetch the avatar wherever you have set up online login functionality. This example is from LocalAvatar.cpp:
void ALocalAvatar::OnLoginComplete(int32 LocalUserNum, bool bWasSuccessful, const FUniqueNetId& UserId, const FString& Error)
{
IOnlineIdentityPtr OculusIdentityInterface = Online::GetIdentityInterface();
OculusIdentityInterface->ClearOnLoginCompleteDelegate_Handle(0, OnLoginCompleteDelegateHandle);
bool UseCombinedMesh = true;
const uint64 localUserID = 10150022857753417;
#if PLATFORM_ANDROID
ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Three;
#else
ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Five;
#endif
if (AvatarComponent)
{
AvatarComponent->RequestAvatar(localUserID, lod, UseCombinedMesh);
}
}
This code also demonstrates how to enable combined meshes and set the level of detail (LOD) for avatars.
When requesting an avatar, set UseCombinedMesh to true to enable combined meshes, which reduce draw call overhead for the avatar’s body.
In this example, the LOD for the requested avatar depends on the platform. On PC, it is set to ovrAvatarAssetLevelOfDetail_Five (high LOD), which uses high-resolution meshes and textures but requires high performance. On Android, it is set to ovrAvatarAssetLevelOfDetail_Three (medium LOD), which improves performance by using lower-resolution meshes and textures. There is also a low LOD, ovrAvatarAssetLevelOfDetail_One, which conserves even more resources and is ideal for distant avatars or crowds.
Expressive Features
Expressive features give Meta Avatars more advanced facial expressions. They increase social presence and make interactions feel more natural and dynamic. Expressive features consist of the following:
- Realistic lip-syncing powered by Oculus Lipsync technology.
- Natural eye behavior, including gaze dynamics and blinking.
- Ambient facial micro-expressions when an avatar is not speaking.
Once you’ve completed your integration, you can test it by retrieving avatars in-engine. Use the following user IDs to test:
- 10150022857785745
- 10150022857770130
- 10150022857753417
- 10150022857731826