woody loks

Verified Members · 5 posts · Reputation: 0 Neutral

  1. I found the mistake I was making in Query 2: I hadn't enabled this in my own project: "If you want to use Eye tracking in Editor, you need to tick the “Enable Eye by Default” box in Project Settings -> Plugins -> SRanipal of your UE4 project." Now I'm getting the directions of both eyes, but the fixation and confidence values are still zero, as shown below. A new problem has also appeared: the tracking suddenly seems to switch off. I say this after comparing the values before and after the window from row 218 to 225 above (a simple way to flag such dropouts in the logged CSV is shown in the first sketch after this list). If you know how to get rid of that problem, please share here. @Daniel_Y @MadisV @MariosBikos @cte @HackPerception @AnanyaSairaj @komnidan @paco67 @mserdarozden @akali @Jori1110 @Franka Thanks & regards, Lokesh
  2. @Daniel_Y @MadisV I thought your insights could be valuable on my queries above...
  3. Hi @Marc Moukarzel, I tried to extract eye direction, fixation, and confidence data from SRanipal together with a Unix timestamp, using the Blueprint posted here by @cte, and I am getting the x, y, z values for the left-eye and right-eye directions in columns B and A respectively, as shown in the image attached below.
     BP:
     Query 1: Do you know why I am getting the fixation and confidence values (columns C and D) as zero every time? (I implemented the actor Blueprint I created in the sample level ("EyeSample") that came with the HTC Vive SRanipal SDK zip folder, the one with the dartboard and a human head for tracking eye movements.)
     Query 2: In that sample level I at least got the left- and right-eye directions, but when I implemented the same Blueprint in my own project (a VR project containing animations of several MetaHumans that mimic scenarios happening in an office environment, which I created for my academic research), after installing the SRanipal and Victory plugins and placing that actor BP inside the scene as mentioned here, I didn't even get the eye-direction data I had been getting in the SRanipal SDK sample level. The data I got is attached below. Should I copy any other files for it to work in other projects? What mistake am I making here?
     Query 3: After getting the directions of both eyes, how can I analyse them for my research? Does HTC VIVE have an eye-tracking data analytics tool similar to what the Varjo team has built? If not, could you share a method for proceeding once the data are in a .CSV file? (I am new to VR and programming, so I am facing these difficulties; see the second sketch after this list for one possible starting point.)
     Could you please help me out? @MariosBikos @cte @HackPerception @AnanyaSairaj @komnidan @paco67 @mserdarozden @akali @Jori1110 @Franka and anyone else working in the same domain: if you have successfully stored and analysed the eye-tracking data coming from the HTC VIVE PRO EYE, please share your insights here so they can help others who will use this SRanipal SDK in the future. It would also be very helpful if the HTC VIVE team provided their own analytics tool like other competitors do. Thanks in advance! Regards, Lokesh
  4. @cte Kudos for your effort. I tried to implement the same Blueprint and I am getting the x, y, z values for the left-eye and right-eye directions in the 2nd and 1st columns respectively, as shown in the image attached below. But do you know why I am getting the fixation and confidence values (the 3rd and 4th columns) as zero every time? I implemented the above Blueprint in the sample project ("EyeSample") that came with the Vive SRanipal SDK zip folder (the one with the dartboard and a human head for tracking eye movements). Were you able to get data other than zero for the fixation and confidence values? If yes, please share your insights. Thanks in advance! Have a great day!
  5. @patrikabroad @zws @Tony PH Lin Were you able to find any other sources (YouTube, Reddit, etc.) for using the eye-tracking option in Unreal Engine? I am a master's student who has just started exploring eye tracking with the HTC VIVE PRO EYE, and I can't find a proper source to get started. I want to know how to record and store the user's eye-gaze data from the HTC VIVE PRO EYE and analyse it for my research (see the second sketch below for one rough idea of the analysis side). If you could shed some light on this, it would be helpful. Thanks in advance! Lokesh
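
Below is a minimal sketch, in Python, of one way to flag the dropout rows described in the first post above once the Blueprint has written the samples to a CSV file. The file name eye_log.csv and the column names (timestamp, right_x … left_z) are assumptions about how the log is laid out, not anything defined by SRanipal; adjust them to match the actual header.

```python
import csv

# Columns holding the two gaze-direction vectors. These names are an
# assumption about the CSV written by the logging Blueprint -- rename
# them to match the real header (or switch to positional indices).
GAZE_COLS = ["right_x", "right_y", "right_z", "left_x", "left_y", "left_z"]

def find_dropouts(path):
    """Return (csv_row_number, timestamp) for rows where both gaze
    directions are all zero, i.e. where tracking appears to have
    dropped out (like rows 218-225 in the screenshot)."""
    dropouts = []
    with open(path, newline="") as f:
        for row_no, row in enumerate(csv.DictReader(f), start=2):  # row 1 = header
            values = [float(row[c]) for c in GAZE_COLS]
            if all(abs(v) < 1e-9 for v in values):
                dropouts.append((row_no, row.get("timestamp")))
    return dropouts

if __name__ == "__main__":
    for row_no, ts in find_dropouts("eye_log.csv"):
        print(f"tracking lost at CSV row {row_no} (timestamp {ts})")
```

Consecutive flagged rows can then be grouped into windows and compared against blink timings or HMD events to see what is actually interrupting the tracking.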
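For the analysis questions (Query 3 above and the last post): since the fixation column is coming back as zero, one common offline approach is to derive fixation-like events yourself from the raw gaze directions. The sketch below is a very small dispersion-style detector in Python; eye_log.csv, the column names, the assumption that timestamps are in seconds, and the 1-degree / 100 ms thresholds are all illustrative assumptions, not part of SRanipal.

```python
import csv
import math

ANGLE_THRESHOLD_DEG = 1.0   # how far gaze may wander within one fixation
MIN_DURATION_S = 0.1        # shortest run still counted as a fixation

def load_samples(path):
    """Read (timestamp, combined gaze direction) pairs from the logged CSV.

    Column names are assumptions -- rename them to match the Blueprint
    output. Timestamps are assumed to be in seconds. The combined
    direction is approximated as the mean of the two eye directions."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            gaze = [
                (float(row["right_x"]) + float(row["left_x"])) / 2.0,
                (float(row["right_y"]) + float(row["left_y"])) / 2.0,
                (float(row["right_z"]) + float(row["left_z"])) / 2.0,
            ]
            norm = math.sqrt(sum(v * v for v in gaze))
            if norm > 1e-9:  # skip all-zero rows (tracking dropouts)
                samples.append((float(row["timestamp"]), [v / norm for v in gaze]))
    return samples

def angle_deg(a, b):
    """Angle in degrees between two unit vectors."""
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))

def detect_fixations(samples):
    """A run of samples counts as a fixation if every sample stays within
    ANGLE_THRESHOLD_DEG of the first sample of the run and the run lasts
    at least MIN_DURATION_S. Returns (onset_time, duration) pairs."""
    fixations, start = [], 0
    for i in range(1, len(samples) + 1):
        if i == len(samples) or angle_deg(samples[start][1], samples[i][1]) > ANGLE_THRESHOLD_DEG:
            t0, t1 = samples[start][0], samples[i - 1][0]
            if t1 - t0 >= MIN_DURATION_S:
                fixations.append((t0, t1 - t0))
            start = i
    return fixations

if __name__ == "__main__":
    for onset, duration in detect_fixations(load_samples("eye_log.csv")):
        print(f"fixation at t={onset:.3f} s lasting {duration * 1000:.0f} ms")
```

This is not what SRanipal itself means by its fixation/confidence fields; it is just one way to get fixation counts and durations out of the raw directions once they are in a .CSV file, and the thresholds should be tuned (or replaced with a proper I-DT/I-VT implementation) for real research use.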