cluelou Posted April 2

Hello,

I would like to use the VIVE Focus 3 with its accompanying Eye Tracker add-on, and I have been trying to implement eye tracking in Unity. My main goal is to collect gaze data on a 360-degree video so I can understand where participants are looking at a specific view-out; see the example figure: Reference Picture.

I have been trying for hours and hours to get eye tracking working in Unity, but none of the tutorials for VIVE Wave or OpenXR work for me. Do any of you have recommendations or step-by-step videos/pictures that could help me out? This is my first time using Unity, so it is quite a task.

Best Regards,
Louis H
MarcusN Posted April 3

Hi @cluelou,

There is a sample project you can reference inside the Wave SDK, located at Assets/Wave/Essence/InputModule/5.2.0-r.8/Demo/EyeTracking.unity. You will also need to enable Eye Tracking under Project Settings > XR Plug-in Management > WaveXRSettings.
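Once Eye Tracking is enabled in WaveXRSettings, the gaze ray can be read at runtime through the Wave Essence eye API. The sketch below is from memory of that API (including the SDK's own "Combinded" spelling) and is an assumption, not verified documentation; check the scripts under Assets/Wave/Essence/Eye in your SDK version for the exact class and method names before relying on it.

```csharp
using UnityEngine;
using Wave.Essence.Eye; // Wave SDK "Essence" eye-tracking module

// Hedged sketch: read the combined gaze ray each frame via EyeManager.
// Method names are assumptions -- verify them against the scripts in
// Assets/Wave/Essence/Eye for your installed SDK version.
public class GazeReader : MonoBehaviour
{
    void Update()
    {
        if (EyeManager.Instance == null ||
            !EyeManager.Instance.IsEyeTrackingAvailable())
            return;

        // Combined (cyclopean) gaze origin and direction, in local tracking space.
        if (EyeManager.Instance.GetCombinedEyeOrigin(out Vector3 origin) &&
            EyeManager.Instance.GetCombindedEyeDirectionNormalized(out Vector3 dir))
        {
            Debug.Log($"Gaze origin {origin}, direction {dir}");
        }
    }
}
```

Attach this to any active GameObject in the demo scene to confirm that per-frame gaze data is actually arriving before building anything on top of it.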
cluelou Posted April 7 (Author)

Thank you for the tip @MarcusN 🙂

First issue: after creating an XR Rig, selecting Android as the platform, and pressing "Build and Run" in the Build Settings, the build only supports head movement, not eye tracking. That means I can only toggle the buttons when the middle of the HMD is pointed at them, not my eyes. Does that make sense, and do you know what I need to do to fix it?
cluelou Posted April 7 (Author)

I got it working with the eyes being tracked, but now I want to actually collect data, and use a video to collect data from. Any information on this?
Alex_HTC Posted April 7

Sounds like you want a video player like AVPro Video; here's an OpenXR example: https://github.com/hardcoded2/AVProTest/tree/openxr

Collecting data is then a matter of saving the output of raycasts from the user's eyes in whatever format makes sense to you (JSON/XML/FlatBuffers/whatever).
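The raycast-and-save approach above can be sketched as a small recorder component. This is a minimal illustration, not the Wave or AVPro API: the gaze ray here is faked with the camera's forward vector as a placeholder, and you would substitute the eye origin/direction from whichever eye-tracking API you use. The Unity calls themselves (Physics.Raycast, JsonUtility, Application.persistentDataPath) are standard.

```csharp
using System.Collections.Generic;
using System.IO;
using UnityEngine;

[System.Serializable]
public class GazeSample
{
    public float time;       // seconds since the app started
    public Vector3 hitPoint; // world-space point the gaze ray hit
    public string target;    // name of the object that was hit
}

[System.Serializable]
public class GazeLog
{
    public List<GazeSample> samples = new List<GazeSample>();
}

public class GazeRecorder : MonoBehaviour
{
    private readonly GazeLog log = new GazeLog();

    void Update()
    {
        // Placeholder gaze ray: head direction. Replace the origin and
        // direction with the values from your eye-tracking API.
        Ray gazeRay = new Ray(Camera.main.transform.position,
                              Camera.main.transform.forward);

        if (Physics.Raycast(gazeRay, out RaycastHit hit, 100f))
        {
            log.samples.Add(new GazeSample
            {
                time = Time.time,
                hitPoint = hit.point,
                target = hit.collider.name
            });
        }
    }

    void OnApplicationQuit()
    {
        // JsonUtility serializes the [Serializable] classes above.
        File.WriteAllText(
            Path.Combine(Application.persistentDataPath, "gaze_log.json"),
            JsonUtility.ToJson(log, true));
    }
}
```

One practical note for the 360-video case: raycasts only register against colliders, so the sphere the video is projected onto needs a MeshCollider (with inward-facing geometry, since the viewer sits inside it) for the hit points to land on the video surface.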