Showing results for tags 'face tracking'.

Found 3 results

  1. Hi! I have been using the VIVE Pro facial tracker for a while. I want to get the name, the label, or at least the weight of the facial expression shown on screen in the Unity Engine when the avatar replicates it. I have edited line 39 of the script SRanipal_AvatarLipSample.cs. Here are some of the results in the console: the value 0 is shown when I make no facial expression, and other values appear when I make other facial expressions. Am I doing this the right way, i.e. are these the weights of the real-time facial expressions, or do they relate only to the Mouth_Smile_Right expression? Looking forward to a kind reply. Thanks!
  2. Are there any .csv files or data dumps publicly available for the face tracking streams?
  3. I am a VR developer and recently got the HTC Vive Lip Tracking module, and to my disappointment it does not track accurately for me at all, even when testing at different angles, in different lighting conditions, or with different avatar software. For me and a few other people it does not register smiling, frowning, or lip raising, and it too often pokes the tongue out even when the mouth is closed. The original engineers who developed the units had particular face types; however, many of my coworkers and I have different face shapes and sizes. The lip tracker very badly needs a calibration option, such as asking you to make certain mouth shapes: JawOpen, Bare Teeth (raise the upper lips and lower the lower lips), and Smile (pulling the corners of the mouth back and up only). We have found these mouth shapes to be the most inaccurately tracked, but with a calibrator the device could learn what our unique mouth shapes are relative to what it expects. Currently, the only shapes that are mildly accurate are JawOpen and Pout. As it stands, for its price, it's less accurate than a Snapchat filter. For many of us, until we can calibrate it, it is sadly unusable in its current condition. Please, HTC, create a calibration program, much like the one that exists for the eye tracking. @Dario
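Regarding the first question above: a minimal sketch of how all lip-shape weights could be read each frame, assuming the SRanipal Unity SDK v2 (the `LipWeightLogger` class name is hypothetical, and method/enum names may differ across SDK versions). Logging the whole dictionary, rather than one entry, shows whether a given console value belongs to Mouth_Smile_Right or to some other blend shape:

```csharp
using System.Collections.Generic;
using UnityEngine;
using ViveSR.anipal.Lip;

// Hypothetical helper: logs every non-zero lip-shape weight per frame,
// not just a single blend shape such as Mouth_Smile_Right.
public class LipWeightLogger : MonoBehaviour
{
    private Dictionary<LipShape_v2, float> weightings;

    void Update()
    {
        // GetLipWeightings fills the dictionary with a 0..1 weight
        // for every LipShape_v2 entry (assumed SDK v2 signature).
        if (SRanipal_Lip_v2.GetLipWeightings(out weightings))
        {
            foreach (var pair in weightings)
            {
                if (pair.Value > 0f)
                    Debug.Log($"{pair.Key}: {pair.Value:F3}");
            }
        }
    }
}
```

Printing the enum key alongside each value makes it unambiguous which expression a weight belongs to; a single float taken from one dictionary entry, as in an edited sample line, would reflect only that one shape.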