
Vive Focus 3 Eye Tracking, Vive Business Streaming, and OpenXR



When running the eye tracker on the Focus 3 over VBS, I don't see any input device signals that correspond to individual eyes, pupil dilation, or any other per-eye metrics. The only correct data we get is the pose of the combined eye ray. Some of the signals that are supposed to exist never change value (such as the flags that indicate the eye tracker is tracking or that the user is wearing the HMD). Is this information available in Unity via OpenXR/VBS? If not, would it be available over the Wave SDK? The documentation doesn't seem to indicate that using Wave over VBS is possible, and indeed many options are unavailable under the Standalone configuration but are available under Android when the SDK is installed.
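For reference, here is roughly how we read the gaze today. This is a minimal sketch assuming the standard OpenXR Eye Gaze Interaction profile is enabled and exposed through Unity's Input System; the "<EyeGaze>" binding paths are what we bind against, so adjust them if your plugin exposes something different.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: reads the combined gaze pose via the Input System.
// Assumes the OpenXR Eye Gaze Interaction profile is enabled so that the
// "<EyeGaze>/pose/..." binding paths resolve.
public class CombinedGazeReader : MonoBehaviour
{
    InputAction gazePosition;
    InputAction gazeRotation;
    InputAction gazeTracked;

    void OnEnable()
    {
        gazePosition = new InputAction(binding: "<EyeGaze>/pose/position");
        gazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        gazeTracked  = new InputAction(binding: "<EyeGaze>/pose/isTracked");
        gazePosition.Enable();
        gazeRotation.Enable();
        gazeTracked.Enable();
    }

    void Update()
    {
        // The combined ray updates correctly over VBS; the isTracked flag is
        // one of the signals that never changes value for us.
        Vector3 origin = gazePosition.ReadValue<Vector3>();
        Vector3 direction = gazeRotation.ReadValue<Quaternion>() * Vector3.forward;
        bool tracked = gazeTracked.ReadValue<float>() > 0.5f;
        Debug.Log($"gaze origin={origin} dir={direction} tracked={tracked}");
    }

    void OnDisable()
    {
        gazePosition.Disable();
        gazeRotation.Disable();
        gazeTracked.Disable();
    }
}
```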

My current platform is:

  • Unity 2021.3.16f1
  • VIVE OpenXR Plugin – Windows 1.0.10
  • Unity’s XR Plugin Management 4.2.1
  • Unity’s Input System 1.4.4

  • 3 weeks later...

When you say you're running the eye tracker on the Focus 3 over VBS, what does that mean exactly?

I've been trying for some time now to get any eye tracker data out of the Focus 3, but I don't know how to go about it...


We gave up on using OpenXR. It's buggy, missing information, and the documentation is actively wrong and outdated in some areas. Here are a few notes on what we put together to get things working:

With Unity 2021.3.16f1, I used the Eye and Facial Tracking SDK (SRanipal, version 1.3.6.8) from https://developer.vive.com/resources/vive-sense/eye-and-facial-tracking-sdk/download/latest/  This is different from the Wave SDK. The examples from the SDK work out of the box and only take minimal tweaking to include in your own project. One heads-up with the SDK: if the eye tracker isn't responding, initialization of the SDK can freeze your application for 30 to 45 seconds. Make sure to use version 2 of the API with the Focus 3; version 1 is for the Vive Pro Eye.
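Here's a rough sketch of the polling loop we ended up with. Type and field names (SRanipal_Eye_Framework, SRanipal_Eye_v2, VerboseData, SingleEyeData) are taken from the SDK samples, so double-check them against the SDK version you actually install:

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

// Rough sketch: polls per-eye openness and pupil diameter with version 2 of
// the eye API. Expects an SRanipal_Eye_Framework component in the scene with
// "Enable Eye" checked and the eye version set to 2.
public class EyeDataPoller : MonoBehaviour
{
    void Update()
    {
        // Don't poll until the framework reports WORKING; this also avoids
        // touching the SDK while it is stuck in its slow initialization.
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING)
            return;

        if (!SRanipal_Eye_v2.GetVerboseData(out VerboseData data))
            return;

        SingleEyeData left = data.left;
        SingleEyeData right = data.right;

        // Per-eye openness (0..1) and pupil diameter in millimetres -- the
        // values we could not get out of the OpenXR path.
        Debug.Log($"L open={left.eye_openness:F2} pupil={left.pupil_diameter_mm:F2} mm | " +
                  $"R open={right.eye_openness:F2} pupil={right.pupil_diameter_mm:F2} mm");
    }
}
```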

Because we aren't creating a native Android application for the Focus 3, we need to run VBS (https://business.vive.com/us/support/vbs/category_howto/vive-business-streaming.html) and SteamVR on the host PC. When running VBS with the Focus 3, make sure that "Compatibility mode" is turned off in the VBS settings under "Input"; otherwise the controllers will not track. You will also need to install the SR_anipal runtime, which doesn't have its own installer: install "VIVE Console" from Steam. SR_anipal triggers a lot of UAC prompts; we discovered that if you run it as administrator once, you can skip the UAC elevation afterwards. This is critical for us since our product is distributed to users without administrator access.
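Related to the initialization freeze mentioned above: before starting the SDK we check whether the SR_anipal runtime process is actually running and skip eye tracking if it isn't. The process name "sr_runtime" is what we observe on our machines, so treat it as an assumption and verify it on yours:

```csharp
using System.Diagnostics;
using UnityEngine;

// Guard sketch: only enable eye tracking if the SR_anipal runtime appears to
// be running, to avoid the 30-45 second freeze during SDK initialization.
// "sr_runtime" is the process name observed on our machines (an assumption).
public static class SRanipalGuard
{
    public static bool RuntimeLooksAvailable()
    {
        Process[] processes = Process.GetProcessesByName("sr_runtime");
        bool running = processes.Length > 0;
        foreach (Process p in processes)
            p.Dispose();

        if (!running)
            UnityEngine.Debug.LogWarning("SR_anipal runtime not found; skipping eye tracking init.");
        return running;
    }
}
```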

The headset only has two USB-C ports, and one of them is a low-bandwidth port meant for the facial tracker; you can't use it for either the eye tracker or a VBS streaming cable. So you will have to stream over Wi-Fi for VBS, which requires a Wi-Fi 5 or 6 router. We set up a dedicated Wi-Fi access point for this and connected the PC to the router via Ethernet. I've run into a number of transient connectivity issues even on this small isolated network; the software supporting VBS seems to be somewhat unreliable. We also haven't figured out whether there's a way to change which PC the VBS client on the headset connects to without stopping the feed from the currently connected PC.

We have not figured out how to automatically or programmatically trigger eye calibration on the Focus 3. The API method attempts to launch the calibration application on the PC, which fails and has to be killed from Task Manager.
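For completeness, this is the kind of call we tried (assuming SRanipal_Eye_API.LaunchEyeCalibration is the relevant entry point; the exact signature may differ in your SDK version). Over VBS it starts the PC-side calibration tool, which hangs for us:

```csharp
using System;
using UnityEngine;
using ViveSR.anipal.Eye;

// Sketch of the calibration attempt. On our VBS setup this launches the PC-side
// calibration application, which then has to be killed from Task Manager.
public class CalibrationTrigger : MonoBehaviour
{
    public void Calibrate()
    {
        // Passing IntPtr.Zero as the window handle; check the SDK headers for
        // the exact signature in the version you install.
        var result = SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);
        Debug.Log($"LaunchEyeCalibration returned {result}");
    }
}
```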


  • 7 months later...

Hi everyone,

I just want to follow up on this topic. I'm working on a research project and want to set up a simulation in Unity and Unreal. It will not be possible to run the simulation on the headset itself, so I need to use VBS too. The most important values in the simulation are the eye-tracking metrics, especially eye openness and pupil diameter. I have not ordered the headset yet, so I can't try out the data gathering myself. Has the SDK changed since then so that reading this eye-tracking data via VBS is now possible?

 

