Vive Eye and Facial Tracking to control Unreal Metahumans' faces


Hi all,

 

I'm interested in using the Metahumans from Unreal Engine as avatars for a VR experience.

I was wondering whether the inputs from the Vive's eye tracker and facial tracker can be used to drive these characters' facial rig.

On the Unreal website, Vive compatibility is not mentioned:

"MetaHumans come with a full facial and body rig, ready to animate in Unreal Engine, either keyed, or using a performance capture solution like our own Live Link Face iOS app. There's also support in the works from the vendors of ARKit, DI4D, Digital Domain, Dynamixyz, Faceware, JALI, Speech Graphics, and Cubic Motion solutions."
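Since MetaHuman faces are driven through ARKit-style blendshape curves (that is what the Live Link Face app feeds them), one possible approach is to remap the Vive facial tracker's blendshape weights onto those ARKit curve names before sending them over Live Link. Here is a minimal sketch of that remapping step; the shape names below are illustrative, so check the SRanipal SDK enums and the MetaHuman Live Link curve list for the exact identifiers:

```python
# Hypothetical sketch: remap Vive facial-tracker blendshape weights to the
# ARKit-style curve names a MetaHuman face rig expects via Live Link.
# The specific names below are illustrative -- verify them against the
# SRanipal SDK and the MetaHuman Live Link curve list.

# Vive (SRanipal-style) shape name -> ARKit-style curve name
VIVE_TO_ARKIT = {
    "Jaw_Open": "jawOpen",
    "Mouth_Smile_Left": "mouthSmileLeft",
    "Mouth_Smile_Right": "mouthSmileRight",
    "Eye_Left_Blink": "eyeBlinkLeft",
    "Eye_Right_Blink": "eyeBlinkRight",
}

def remap_weights(vive_weights):
    """Translate a dict of tracker weights into ARKit curve weights,
    clamping each weight to [0, 1] and dropping unmapped shapes."""
    out = {}
    for name, weight in vive_weights.items():
        target = VIVE_TO_ARKIT.get(name)
        if target is not None:
            out[target] = max(0.0, min(1.0, weight))
    return out

# One example frame of tracker output
frame = {"Jaw_Open": 0.62, "Mouth_Smile_Left": 1.3, "Tongue_Up": 0.1}
print(remap_weights(frame))  # {'jawOpen': 0.62, 'mouthSmileLeft': 1.0}
```

In Unreal this remapping would sit inside a custom Live Link source (or an Animation Blueprint) that pushes the resulting curve values onto the MetaHuman face; the sketch only shows the name/weight translation, not the Live Link plumbing.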

 

Thanks in advance!
