Pinkrush

Everything posted by Pinkrush

  1. Thank you, I tried it with the new plugin, and there it runs at 80 fps. I had used plugin 1.0.8, where the eye gaze data is fetched differently. I will try to rewrite my code according to the eye-tracking example in the new plugin; I hope that will solve the issue.
  2. Hi, I implemented eye tracking in Unity as proposed in the Vive tutorial (https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/integrate-facial-tracking-your-avatar/). But the program runs at only 45 fps, while without eye tracking the application runs at 90 fps. Any ideas why the frame rate is so low and how it can be improved?
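A drop from exactly 90 to 45 fps is often the VR compositor halving the refresh rate once the frame time exceeds the ~11 ms budget, so even a moderately expensive per-frame call can halve the displayed rate. One way to test whether the per-frame weighting query is the bottleneck is to throttle it and reuse the cached result in between. This is only a diagnostic sketch: the facialManager/GetWeightings names are taken from the code quoted elsewhere in this thread, while the concrete types and the 30 Hz interval are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class ThrottledEyeTracking : MonoBehaviour
{
    // Assumed to be the same manager used in the tutorial code; the exact
    // type and namespace depend on the installed plugin version.
    public FacialManager facialManager;

    private Dictionary<XrEyeShapeHTC, float> EyeWeightings;

    private float nextPollTime;
    private const float PollInterval = 1f / 30f; // assumption: 30 Hz is enough for avatar eyes

    void Update()
    {
        // Query the runtime a few times per second instead of every frame.
        if (Time.unscaledTime >= nextPollTime)
        {
            nextPollTime = Time.unscaledTime + PollInterval;
            facialManager.GetWeightings(out EyeWeightings);
        }
        // Frames in between reuse the cached EyeWeightings.
    }
}
```

If the frame rate recovers with the throttled query, the cost is in the weighting call itself; if it stays at 45 fps, the bottleneck is more likely in what is done with the data each frame.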
  3. Hi, I implemented eye tracking in Unity as proposed in the Vive tutorial (https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/integrate-facial-tracking-your-avatar/). The eye tracking works in principle, but it is slightly off: it gravitates to the left. Any ideas why this happens? I used this code to get the look-at position:

     if (NeededToGetData)
     {
         facialManager.GetWeightings(out EyeWeightings);
         Vector3 LeftEye = Vector3.zero;
         Vector3 RightEye = Vector3.zero;
         Vector3 GazeDirectionCombinedLocal = Vector3.zero;

         // Left eye: horizontal
         if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_IN_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_OUT_HTC])
         {
             GazeDirectionCombinedLocal.x = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_IN_HTC];
         }
         else
         {
             GazeDirectionCombinedLocal.x = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_OUT_HTC];
         }

         // Left eye: vertical
         if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_UP_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_WIDE_HTC])
         {
             GazeDirectionCombinedLocal.y = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_UP_HTC];
         }
         else
         {
             GazeDirectionCombinedLocal.y = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_WIDE_HTC];
         }

         GazeDirectionCombinedLocal.z = 1.0f;
         LeftEye = Quaternion.Euler(cam.transform.localEulerAngles) * GazeDirectionCombinedLocal;

         // Right eye: horizontal
         if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_IN_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC])
         {
             GazeDirectionCombinedLocal.x = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_IN_HTC];
         }
         else
         {
             GazeDirectionCombinedLocal.x = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC];
         }

         // Right eye: vertical
         if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_DOWN_HTC])
         {
             GazeDirectionCombinedLocal.y = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC];
         }
         else
         {
             GazeDirectionCombinedLocal.y = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_DOWN_HTC];
         }

         GazeDirectionCombinedLocal.z = 1.0f;
         RightEye = Quaternion.Euler(cam.transform.localEulerAngles) * GazeDirectionCombinedLocal;

         Debug.Log($"Left x {LeftEye.x} Right x {RightEye.x} diff = {LeftEye.x - RightEye.x} __ LeftEye ≈ Right {Mathf.Approximately(LeftEye.x, RightEye.x)}, {Mathf.Approximately(LeftEye.y, RightEye.y)}, {Mathf.Approximately(LeftEye.z, RightEye.z)}");
         lookAtDir = (LeftEye + RightEye) / 2;
     }
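One detail worth noting about the averaging step above: each per-eye vector has z fixed at 1 while x and y vary, so lookAtDir is not unit length, and its magnitude changes with gaze direction. Normalizing the combined direction keeps downstream math (ray casts, LookRotation) consistent. A minimal sketch; GazeUtil is a hypothetical helper name, not part of the plugin:

```csharp
using UnityEngine;

public static class GazeUtil
{
    // Average the two world-space eye directions and normalize, so the
    // result can be used directly as a unit gaze direction.
    public static Vector3 AverageGaze(Vector3 leftEye, Vector3 rightEye)
    {
        return ((leftEye + rightEye) * 0.5f).normalized;
    }
}
```

Usage in the code above would be lookAtDir = GazeUtil.AverageGaze(LeftEye, RightEye);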