
Assistance with Eye Tracking in Unity



I'm looking to use a Vive Pro Eye to obtain eye tracking information for my research. I've been wrestling with the SDKs and Unity to try to get something up and running, and I haven't been able to find a combination that works.

I've been assuming (incorrectly) that if I use the latest versions of both Unity and the HTC SDKs, I shouldn't encounter any errors. However, I do see that some people have had success getting these products integrated. Would someone be able to share which versions of the SDK and Unity they've successfully integrated and the steps they took? My research isn't on this specific topic, so this has been a significant distraction from my main work.

I've been able to use Vizard, but their academic pricing is extremely steep and we're unsure whether our lab can lay out that kind of capital.

I really appreciate any assistance that you can provide.

Cheers!


57 minutes ago, MarcusN said:

Hi @MaxJay,

Have you seen this documentation for eye tracking? https://developer.vive.com/resources/vive-sense/eye-and-facial-tracking-sdk/download/latest/
You should be able to download the latest SDK and obtain the latest documentation.

Can confirm the latest SDK works with SteamVR + Vive Console (Vive Console contains the latest SRanipal). I'm working on a software package called simpleXR, which is intended to streamline VR for research. I've been running into hiccups getting the Pro Eye to work with OpenXR, but I can host a working Pro Eye Unity project with the required dependencies. Give me ~an hour!


17 hours ago, MarcusN said:

Hi @MaxJay,

Have you seen this documentation for eye tracking? https://developer.vive.com/resources/vive-sense/eye-and-facial-tracking-sdk/download/latest/
You should be able to download the latest SDK and obtain the latest documentation.

Thank you @MarcusN

I have downloaded the latest versions of Unity and the SDK, and when I try to run the examples in Unity it crashes (similar to the experiences others have posted elsewhere). I'm not sure I'll be back in the lab before the beginning of next semester, but when I am I will post the specific errors Unity throws before it crashes. A co-researcher of mine in another department also experiences these crashes when she tries to run the VIVE SDK example files (from the latest SDK) in Unity.

I am able to run the latest SRanipal executable and calibrate it within SteamVR.


16 hours ago, Justin_K said:

Can confirm the latest SDK works with SteamVR + Vive Console (Vive Console contains the latest SRanipal). I'm working on a software package called simpleXR, which is intended to streamline VR for research. I've been running into hiccups getting the Pro Eye to work with OpenXR, but I can host a working Pro Eye Unity project with the required dependencies. Give me ~an hour!

Thank you @Justin_K

I can confirm that my installation of SRanipal is working within SteamVR and that it calibrates. I'm having issues specifically with Unity and the Unity SDK. I will post specific errors when I get to the lab (but that probably won't be until January). I can also confirm that I've got eye tracking working in the demo version of Vizard. This is very much an issue with the Unity SDK. Now I'll be the first to admit that I'm learning Unity too, but I've seen others having issues, and it would be nice to have a thread that solves the issue step by step.

Thanks!


20 hours ago, MaxJay said:

Thank you @Justin_K

I can confirm that my installation of SRanipal is working within SteamVR and that it calibrates. I'm having issues specifically with Unity and the Unity SDK. I will post specific errors when I get to the lab (but that probably won't be until January). I can also confirm that I've got eye tracking working in the demo version of Vizard. This is very much an issue with the Unity SDK. Now I'll be the first to admit that I'm learning Unity too, but I've seen others having issues, and it would be nice to have a thread that solves the issue step by step.

Thanks!

Sorry, this took much longer than an hour 😛

https://github.com/unity-sXR/sXR

It's not well documented yet, but I think all the main kinks have been worked out... It should be getting a lot of updates in the next week. You should be able to download the GitHub repository as a zip file, unzip it, and point Unity Hub at the project.

It was built in Unity 2023.1.0a14, so I'd recommend going with that version or newer. It also requires Microsoft's .NET 2.1, Vive Console (through Steam), and SteamVR. I believe all the required packages (including the SRanipal SDK) are already incorporated into the project.

The functionality you're looking for is in Assets/sxr/Backend/Singletons/GazeHandler.cs

To use SRanipal, click the sxr tab in Unity's menu bar, open the settings menu, and check "Use SRanipal". It will work right now since the conditional check is commented out, but I plan on having this work for all devices. In an upcoming update, you'll need that option checked to use SRanipal instead of OpenXR.

There's also an autosave option, which is pretty nice. The package can also handle a VR GUI (for displaying task instructions), writing data, full-screen shaders, camera tracking, sounds, controllers, and a bunch of other nifty stuff. Right now it's kind of spread out, but in the next week there will be simple one-line commands for everything, all within the main sxr class, e.g.:

if (sxr.ControllerButton(sxr.Buttons.RightTrigger)) {
	sxr.ShowImageGUI("instructions");
	sxr.StartTimer("instructionTimer");
	sxr.PlaySound("instructionsSound");
	sxr.WriteToTaggedFile("subjectEyeData", sxr.GetFullEyeData());
}

Right now, you have to access the GazeHandler info by declaring the verboseData variable as public and then using GazeHandler.Instance.verboseData. If you type that into your IDE, it will show you all the SRanipal data you can access, like pupil size, how open the eyes are, etc. I should have all of those implemented as sxr functions tomorrow (but I also thought it would take an hour to get this to you, so maybe not :P).

In the near future it's going to have more features (like the ability to replay trials with the user's gaze highlighted), plus Google Colab notebooks for visualizing paths, analyzing eye info, etc. I should be putting out a paper for it soon and would love feedback/feature requests. If you end up using it and run into questions, shoot me an email: "justin_kasowski@ucsb.edu".
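In case it helps in the meantime, here's a rough sketch of that verboseData access. This isn't from the sXR docs; it assumes GazeHandler exposes verboseData as a public ViveSR.anipal.Eye.VerboseData field, and the field names (pupil_diameter_mm, eye_openness, gaze_direction_normalized) come from the SRanipal Unity SDK, so double-check them against your SDK version:

// Hypothetical example script (not part of sXR); assumes GazeHandler.verboseData
// has been made public and is an SRanipal ViveSR.anipal.Eye.VerboseData struct.
using UnityEngine;
using ViveSR.anipal.Eye;   // SRanipal eye-tracking types (VerboseData, SingleEyeData)

public class EyeDataLogger : MonoBehaviour
{
	void Update()
	{
		// sXR's gaze singleton; verboseData must be declared public there, as described above.
		VerboseData data = GazeHandler.Instance.verboseData;

		// Per-eye measurements reported by SRanipal.
		SingleEyeData left = data.left;
		SingleEyeData right = data.right;

		Debug.Log($"Left pupil: {left.pupil_diameter_mm} mm, openness: {left.eye_openness}");
		Debug.Log($"Right gaze direction: {right.gaze_direction_normalized}");
	}
}

Logging every frame is just for illustration; in practice you'd write the values out with whatever data-logging you're using.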

Good luck!

