DrAdamStreck

Posts posted by DrAdamStreck

  1. Hi,

    I'm testing eye tracking using OpenXR and the new VIVE Wave plugin.

    I'm able to run the demo as described here, and the tracking works as it should: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/integrate-facial-tracking-your-avatar/

    However, when using the OpenXR Unity gaze system, the device is not tracked. The following code prints 'Not tracked', meaning an EyeTracking device has been found but is not tracking.
     

    // Requires: using System.Collections.Generic; using UnityEngine; using UnityEngine.XR;
    private InputDevice eyeDevice;
    private static readonly List<InputDevice> Devices = new();

    private void Update()
    {
        if (!eyeDevice.isValid)
        {
            // Look up a device with the EyeTracking characteristic and cache it.
            InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.EyeTracking, Devices);
            if (Devices.Count > 0)
            {
                eyeDevice = Devices[0];
            }
            Debug.LogWarning("Not valid");
        }
        else if (!eyeDevice.TryGetFeatureValue(CommonUsages.isTracked, out bool isTracked))
        {
            Debug.LogWarning("Failed to access isTracked");
        }
        else if (!isTracked)
        {
            // The device was found, but it reports that it is not currently tracking.
            Debug.LogWarning("Not tracked");
        }
    }
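
    For reference, this is roughly how I plan to read the gaze pose once isTracked is true. It's just a sketch with a made-up helper name; I'm assuming the gaze device exposes its pose through the generic devicePosition / deviceRotation usages, and TryGetFeatureValue would simply return false if it doesn't:

    private bool TryGetGazeRay(out Vector3 origin, out Vector3 direction)
    {
        origin = default;
        direction = default;
        // Assumption: the eye-gaze device reports its pose via the generic
        // devicePosition / deviceRotation usages (values in tracking space).
        if (eyeDevice.isValid &&
            eyeDevice.TryGetFeatureValue(CommonUsages.devicePosition, out origin) &&
            eyeDevice.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            direction = rotation * Vector3.forward;
            return true;
        }
        return false;
    }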


    Vive Pro Eye
    Unity 2021.3.1f1
    SRanipal 1.3.5.4
    VIVE Wave OpenXR Plugin 1.0.4
    Both the Facial Tracking and Eye Gaze profiles are enabled:
    [screenshot showing both features enabled]


  2. On 11/16/2020 at 12:16 AM, Av said:

    Hi @Adam Streck

    Apologies for only now getting back to you; I missed your message. I don't have my HMD (it's in my research lab and it's tough to get to with COVID-19 restrictions), so it's hard for me to test your code and troubleshoot. What I can say is that we ended up using a different workflow that, with some tweaking, seems to work for our needs. I'm posting the steps we took below. Let me know if you have any questions.

    We started off by setting up a raycast with the camera position as the origin and the gaze direction as the ray direction. We took the hit point from the raycast and used WorldToScreenPoint to convert it into pixel coordinates.

    There's a very obvious limitation that for the raycast to work, there needs to be a game object for the ray to collide with. Your method avoids that limitation which is awesome!

    Next time I'm able to access the HMD, I'll give your code a try and see what I can find. Sorry I can't be more helpful; if you do decide to go with our solution and have questions, let me know.

    Avi 

    Thanks for the reply either way. A raycast is not an option for us, and I still don't see how it would help, since I'd need to do the inverse projection-matrix calculation and I don't know the projection matrix for the VR view.
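
    Just to check that I understand your workflow correctly, I read it as roughly the following. This is only a sketch with made-up names: I'm assuming gazeOrigin and gazeDirection are already world-space values from the eye tracker, and vrCamera is the camera rendering the VR view:

    // Rough sketch of the raycast + WorldToScreenPoint workflow described above.
    private Vector2? GazeToScreenPixels(Camera vrCamera, Vector3 gazeOrigin, Vector3 gazeDirection)
    {
        // The workflow only yields a point if the gaze ray actually hits a collider.
        if (Physics.Raycast(gazeOrigin, gazeDirection, out RaycastHit hit))
        {
            // Convert the world-space hit point into pixel coordinates for this camera.
            Vector3 screenPoint = vrCamera.WorldToScreenPoint(hit.point);
            return new Vector2(screenPoint.x, screenPoint.y);
        }
        return null;
    }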

    It seems that this whole issue is actually a Unity bug/behaviour, so I'm trying to resolve it with them now; see http://fogbugz.unity3d.com/default.asp?1304082_gsk09e85lf3m2k7i

  3. Once we start the calibration process from Unity, using the

    SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);

    call, the environment changes to the grey calibration screen and displays "Loading" with the loading animation. This disappears after about five seconds, and only the grey background remains. Previously, we would see an animation advising us on the correct positioning of the headset.

    If we try to restart our application once the calibration has been started, the application freezes.

    This issue appeared last week, seemingly right after the SRanipal software auto-updated. We have reproduced it on two different PCs.
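
    To show where the call sits in a script, here is a minimal sketch (the method name and guard flag are made up; the guard simply avoids launching calibration a second time in the same session):

    // Requires the SRanipal SDK (SRanipal_Eye_API) and using System; for IntPtr.
    private bool calibrationLaunched;

    // Hypothetical wrapper, called e.g. from a UI button.
    public void StartCalibration()
    {
        if (calibrationLaunched)
        {
            return;
        }
        // Guard so the calibration overlay is not launched twice.
        calibrationLaunched = true;
        SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);
    }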
