  1. Hi! As a control experiment I am trying to track a 'mock' eye (currently a picture of an eye), but I am not able to do it: all or almost all frames are marked as invalid. Do you have any tips on how to achieve good tracking of a non-real eye? (Currently it is an eye picture, but I am open to other suggestions.) Thanks! Neomi
  2. @Corvus @Tony PH Lin Is there a link where all SDK and runtime versions can be downloaded? The 'archive' tab at https://developer.vive.com/resources/vive-sense/eye-and-facial-tracking-sdk/download/archive/ is empty. Thanks
  3. @Corvus Just as a clarification to my previous post, I attach a picture of the histogram of time intervals between frames that I get. The intervals vary between 5 and 10 ms. As I am using SRanipal for research, I would like a much narrower range centered around 8.33 ms. Is that possible, and how can I achieve it? Thank you again! Neomi
  4. @Corvus Thank you for your code! I tried using it (as is, with minor modifications for saving the data to a CSV; see the attached file) in different Unity/SDK/runtime combinations (SDK 1.3.1 and 1.3.3; runtime 1.3.1 and 1.3.2; Unity 2019.0.4 to 2019.4.17). In none of the combinations was I able to get fixed 120 Hz sampling; usually about 45% of the frames have a time interval larger than 8.33 ms. What is the reason for this behavior? I attach the SteamVR system report. I would be glad to receive your response. Thank you in advance! Neomi SteamVR-2022-03-20-PM_03_17_29.txt Eyes_Corvus_Method.cs
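[Editor's note on the sampling-rate question above: polling eye data inside Unity's Update() ties the sampling to the render frame rate, which is why intervals drift away from 8.33 ms. The SRanipal SDK also exposes a native callback that fires at the tracker's own rate. A sketch of the callback pattern, assuming SDK v1.3.x; the class and method names EyeDataAt120Hz/EyeCallback are our own, not part of the SDK:]

```csharp
// Sketch: reading eye data from the SRanipal callback instead of Update().
// The callback runs on an SDK thread at the tracker's native rate (~120 Hz),
// independent of Unity's render loop.
using System.Runtime.InteropServices;
using UnityEngine;
using ViveSR.anipal.Eye;

public class EyeDataAt120Hz : MonoBehaviour
{
    private static EyeData_v2 eyeData = new EyeData_v2();
    private bool callbackRegistered = false;

    void Update()
    {
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING)
            return;

        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && !callbackRegistered)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
                Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            callbackRegistered = true;
        }
    }

    // Called from a non-Unity thread: avoid UnityEngine API calls here;
    // copy the struct out and buffer/log it instead.
    private static void EyeCallback(ref EyeData_v2 eye_data)
    {
        eyeData = eye_data;
        // eyeData.timestamp (ms) can be used to verify the ~8.33 ms spacing.
    }
}
```

[Remember to unregister the callback (WrapperUnRegisterEyeDataCallback) when the component is disabled, and enable "Enable Eye Data Callback" on the SRanipal_Eye_Framework component.]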
  5. Hi, I am working with eye tracking data that I collected with SRanipal and Unity 2019.4 a few months ago. I have noticed that frames which are marked as valid sometimes contain invalid data values (a series of zeros or -1 values). I attach a figure exemplifying this issue: it shows histograms of the data values of gaze origin and pupil dilation. All data values included in these distributions have a validity of four or greater (which, according to https://www.frontiersin.org/articles/10.3389/fpsyt.2020.572938/full, should indicate valid data). What is the reason for this issue, and how can I avoid it next time with SRanipal? Thanks a lot in advance! Neomi
  6. Hi qxxxb, First of all, thank you for sharing the fix to the bug and your code (https://gist.github.com/qxxxb/d1357828c16d27873751280d7222ea25); it helped me a lot in visualizing the gaze. I have one question: in your code you set the endpoint of the LineRenderer as follows (line 133): llr.SetPosition(1, gr.leftOrigin + gr.leftDir * 20); Why do you multiply the resulting eye direction by 20? When I convert gaze origin and direction to world coordinates as in your code, the resulting world-coordinate values seem to point in a slightly wrong direction. Am I missing a multiplication of the resulting direction? Why is this multiplication needed, and what determines the multiplier (why 20)? Thank you so much in advance! Neomi
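[Editor's note on the multiplier question above: gaze_direction_normalized is a unit vector, so origin + dir alone places the line's endpoint only one world unit (1 m in Unity's default scale) from the eye. Multiplying by a scalar simply extends the ray to a visible length; the 20 appears to be an arbitrary 20 m, not a calibration constant. A minimal sketch, with illustrative names:]

```csharp
using UnityEngine;

// Sketch: why a gaze ray endpoint is origin + direction * length.
// `direction` is a unit vector, so `length` is simply how far (in world
// units, i.e. meters at Unity's default scale) the drawn ray extends.
public static class GazeRay
{
    public static Vector3 Endpoint(Vector3 origin, Vector3 direction, float length = 20f)
    {
        return origin + direction.normalized * length; // 20 is just a visible ray length
    }
}
```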
  7. Hi, I have a few questions about the eye tracker output which I was not able to deduce from the SDK or the Unity C# sample. I would greatly appreciate your feedback on the following topics.
     Validity of the data: I currently check the condition (SRanipal_Eye_Framework.Status == SRanipal_Eye_Framework.FrameworkStatus.WORKING) within the update loop before collecting eye data. Then I use the bitmask value (verboseData.left.eye_data_validata_bit_mask) as an indication of data validity (I use only frames in which the bitmask equals 31, as described in https://www.frontiersin.org/articles/10.3389/fpsyt.2020.572938/full). Yet in some cases I still get values that seem incorrect. Are these two conditions the only ones needed to ensure data validity? What is the meaning of the bitmask value?
     Calculation of visual angle: I would be glad if you could confirm that I understand the eye tracker output and the necessary conversions correctly. In each frame I receive the normalized direction in x, y, z (verboseData.left.gaze_direction_normalized). To retrieve the horizontal and vertical angles between the eye and the lens, I use: horizontal_angle = arctan(norm_dir_x / norm_dir_z); vertical_angle = arctan(norm_dir_y / norm_dir_z).
     Calculation of screen position in pixels: It is mentioned that the field of view is 110 deg and the pixel resolution is 1440 x 1600 per lens (https://developer.vive.com/resources/vive-sense/hardware-guide/vive-pro-eye-specs-user-guide/). Does that mean the maximal horizontal angle is 110 deg, and the maximal vertical angle is also 110 deg? And does that mean that on the horizontal axis the pixels-per-degree ratio is 720 pixels / 55 deg = 13.09, while on the vertical axis it is 800 pixels / 55 deg = 14.5?
     Gaze origin and pupil position: As far as I understand, these variables are not necessary for finding and analyzing eye position. Is there a specific reason why they are provided? Are they used for calibration or for a different function? Thank you in advance for your help and feedback! Neomi
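[Editor's note on the bitmask and angle questions above: our understanding (to be confirmed against the SDK headers) is that each bit of eye_data_validata_bit_mask corresponds to one entry of the SDK's SingleEyeDataValidity enum (gaze origin, gaze direction, pupil diameter, eye openness, pupil position in sensor area), so 31 = 0b11111 means all five fields are valid. The arctan formulas are reasonable; Atan2 is the safer form near z = 0. A sketch under those assumptions:]

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

public static class EyeDataHelpers
{
    // True when the given validity bit is set in the bitmask.
    // Assumes one bit per SingleEyeDataValidity enum value, so a mask of
    // 31 (0b11111) means all five validity fields are set.
    public static bool IsValid(ulong mask, SingleEyeDataValidity field)
    {
        return (mask & (1UL << (int)field)) != 0;
    }

    // Horizontal/vertical gaze angles in degrees from a normalized direction.
    // Atan2 handles the sign and the z ~ 0 case better than Atan(x / z).
    public static Vector2 GazeAnglesDeg(Vector3 dir)
    {
        float horizontal = Mathf.Atan2(dir.x, dir.z) * Mathf.Rad2Deg;
        float vertical   = Mathf.Atan2(dir.y, dir.z) * Mathf.Rad2Deg;
        return new Vector2(horizontal, vertical);
    }
}
```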
  8. Hi, I have recently started working with Unity and SRanipal. I have tried to follow the steps described in the recent posts here, but I still have a few questions related to the conversion of gaze direction between screen and world coordinates... 1. How is gaze direction related to screen coordinates in Unity? Is this the correct conversion to pixels? Left_E_gaze_direction_normalized = verboseData.left.gaze_direction_normalized; x_pix_resolution = 1440; y_pix_resolution = 1600; left_eye_x_pixels = Left_E_gaze_direction_normalized.x / x_pix_resolution; left_eye_z_pixels = Left_E_gaze_direction_normalized.z / y_pix_resolution; 2. Conversion to world coordinates: Left_eye_position_world_coordinates = Camera.ScreenToWorldPoint(left_eye_x_pixels, left_eye_z_pixels, left_eye_y_pixels); 3. How is gaze origin related to these transformations? Or, if it isn't, why is it provided? I would greatly appreciate your help :) Neomi
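[Editor's note on the conversion question above: gaze_direction_normalized is a unit vector in eye/head space, not a pixel coordinate, so dividing it by the panel resolution does not yield pixels. One commonly suggested approach is to move the gaze ray into world space via the HMD camera transform and then project a point along that ray back to screen coordinates. A sketch, assuming the HMD camera is Camera.main and noting that the SDK's reported x-axis convention may need flipping; sampleDistance is a hypothetical parameter:]

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

// Sketch: one way to get a gaze point in screen pixels.
// (1) Transform the eye-space gaze ray into world space via the HMD camera,
// (2) pick a point some distance along the ray,
// (3) project that world point back into screen coordinates.
public class GazeToScreen : MonoBehaviour
{
    public float sampleDistance = 10f; // hypothetical distance along the gaze ray

    void Update()
    {
        if (!SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 dir))
            return;

        Camera cam = Camera.main;
        // SRanipal reports the ray relative to the head; move it into world space.
        Vector3 worldOrigin = cam.transform.TransformPoint(origin);
        Vector3 worldDir = cam.transform.TransformDirection(dir);
        Vector3 worldPoint = worldOrigin + worldDir * sampleDistance;

        Vector3 screenPixels = cam.WorldToScreenPoint(worldPoint); // (x, y) in pixels
        Debug.Log(screenPixels);
    }
}
```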
  9. Is it possible to record a video of the eyes while they are being tracked? If so, what is the best way to do it? Thanks, Neomi
  10. You are right, I didn't update verboseData with v2 (SRanipal_Eye_v2.GetVerboseData(out verboseData); // v2). I get non-zero values now. Thank you.
  11. Thanks for your help. I tried the code below, but still only zero values are returned. Does the code below implement your suggestion correctly?

      using UnityEngine;
      using ViveSR.anipal.Eye;

      public class save_eyes_tracking_v2 : MonoBehaviour
      {
          public VerboseData verboseData;
          public float pupil_diameter;

          void Update()
          {
              SRanipal_Eye.GetVerboseData(out verboseData);
              pupil_diameter = verboseData.left.pupil_diameter_mm;
          }
      }
  12. Do you mean the Unity sample scene? Yes, when I press play, the avatar's eyes move according to my eyes, and the SRanipal icon eyes are green. This is my code:

      using System.Collections;
      using System.Collections.Generic;
      using UnityEngine;
      using ViveSR.anipal.Eye;

      public class save_eyes_tracking_v2 : MonoBehaviour
      {
          public ViveSR.anipal.Eye.EyeData_v2 eyeData = new ViveSR.anipal.Eye.EyeData_v2();
          public float pupil_diameter;

          void Update()
          {
              pupil_diameter = eyeData.verbose_data.left.pupil_diameter_mm;
              Debug.Log(pupil_diameter);
          }
      }

      Thanks!