mrk88

Verified Members
Posts: 26
Reputation: 3 Neutral

  1. From physics, I am using this formula to calculate the distance between two gaze position vectors: Mathf.Sqrt(Mathf.Pow(curPos.x - previous.x, 2) + Mathf.Pow(curPos.y - previous.y, 2) + Mathf.Pow(curPos.z - previous.z, 2)); But this is not correct for displacement, because the vectors are normalized, so the distances between them are much smaller than the actual values and not at the right scale. For hundreds of eye samples I expect a figure like the top panel of this one: https://www.researchgate.net/publication/235788020/figure/fig5/AS:667218010456066@1536088578181/Example-of-eye-movement-data-Vertical-eye-position-upper-panel-and-eye-velocity-lower.ppm Instead, the plot I am getting looks like the attached figure. The vertical-axis values should change with respect to eye fixations and saccades; for example, when I fixate on a point for several seconds, the trace should remain constant there (not quickly drop to zero). (See the angular-displacement sketch after this list.)
  2. I would like to measure the displacement of eye movements between two eye samples. I was using the GazeDirection vector for that; however, since its values are normalized to [-1, 1], the values and the plot I am getting are wrong. Can you please explain how these values have been normalized, and how to calculate the displacement between two consecutive eye samples? @Corvus @Daniel_Y
  3. Okay, thank you very much for the info. Is there a way I can measure it, and then keep it at a fixed distance? I mean, do you know of any references or suggestions about that? I am trying to measure the angular magnitude of an object being displaced from one position to another, and for that I need to know the distance from the subject (because it makes a big difference in the calculations). (See the visual-angle sketch after this list.)
  4. Hi, does anybody know how far our eyes are from the displays when we put on the HTC Vive Pro Eye headset? (I'm guessing about a couple of inches, but I'd like to know the exact number for my research.) Thanks.
  5. Thanks, but my question is how to calculate gaze position from the gaze direction vector. For the velocity calculation I need start and end gaze positions.
  6. The problem with denormalizing is that I only have the height and width of the display for each eye, but the gaze direction vector is 3D. How do I map the Z coordinate to the display resolution?
  7. Hello, I am trying to calculate instantaneous gaze velocity using gaze position. I found out that pupil_position_in_sensor_area does not indicate the true gaze position, so I have to use the gaze direction vector, which is normalized. - How can I calculate instantaneous gaze velocity based on the normalized gaze direction vector? - Do I need to denormalize it? How? (See the gaze-point sketch after this list.) Thank you so much for any ideas/help! @Corvus @Daniel_Y
  8. Hi Somnath, I am also interested in knowing how to denormalize the gaze direction vector. Have you found out anything?
  9. So pupil position in the sensor area cannot be used for gaze velocity calculation? What gaze-position-related data can we use to calculate gaze speed? I am interested in implementing a saccade detection algorithm, which is based on gaze speed (for which I need gaze position). (See the I-VT sketch after this list.) @Daniel_Y
  10. Thanks, I'm using Unity Recorder to record the HMD (with Capture set to "GameView" and output resolution set to "Match Window Size").
  11. Also, why is the image resolution I have in the post above (536 x 1076) lower than the Vive Pro HMD resolution? Is that a Unity setting?
  12. Dear all, does the gaze position data (pupil_position_in_sensor_area) correspond to the exact gaze position on the image being viewed? - Even when I look at the rightmost position in the HMD, the gaze position values (R and L) don't go above 0.7XXX. - When I map the points onto the image, they don't match! Take a look at the attached image: the red crosses show the gaze position during that frame, but I was looking at the yellow sphere the whole time. - What does pupil position in sensor area actually mean? Is it different from the corresponding position on the image being viewed?
  13. Hello! I am trying to measure the eye tracker latency (the difference between when an eye sample is captured on the system sensor and the current time of the computer). Does anybody know what the device timestamp actually represents? I know it is a number in milliseconds, but what is its reference point? It should be comparable with the PC time in ms. - Does the headset's clock reset on every restart of the headset? - Is the timing handled similarly to Tobii? Does this document from Tobii apply to the timings in the Vive Pro as well? - Is there any more information on latency (not forum posts, but official, citable manuals/documents released by the Vive Pro Eye people)? (See the clock-offset sketch after this list.) Thanks. Many of us are using this headset for research, and things like timing are quite important (I am doing gaze-contingent work), yet there is nobody to clarify these things about the device. The manual is not very helpful (and has many typos!), and the last place to find more information is this forum, where I mostly find more people with the same questions and not many useful answers. @Daniel_Y @Corvus
  14. ViveProEyeProducerThread.cs Thanks. I attach this script to a GameObject in my scene, but I am not able to collect any verbose data while using a thread; I can only collect the verbose data in the Update function, and that is where I'm confused. If we can only collect verbose data in Update(), then we will be collecting it at 90 Hz, not 120 Hz. (See the callback sketch after this list.)
  15. Also, when collecting gaze data (such as below), should we collect it in a thread (same as above), or in Unity's Update function?

        gazeOriginLeft = eyeData.verbose_data.left.gaze_origin_mm;                    // gaze origin
        gazeOriginRight = eyeData.verbose_data.right.gaze_origin_mm;
        gazeDirectionLeft = eyeData.verbose_data.left.gaze_direction_normalized;      // gaze direction
        gazeDirectionRight = eyeData.verbose_data.right.gaze_direction_normalized;
        pupilDiameterLeft = eyeData.verbose_data.left.pupil_diameter_mm;              // pupil size
        pupilDiameterRight = eyeData.verbose_data.right.pupil_diameter_mm;
        pupilPositionLeft = eyeData.verbose_data.left.pupil_position_in_sensor_area;  // pupil positions
        pupilPositionRight = eyeData.verbose_data.right.pupil_position_in_sensor_area;
        eyeOpenLeft = eyeData.verbose_data.left.eye_openness;                         // eye openness
        eyeOpenRight = eyeData.verbose_data.right.eye_openness;
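Angular-displacement sketch (re posts 1-2): since gaze_direction_normalized is a unit vector, the Euclidean distance between two samples is not a meaningful displacement; the usual measure in eye-movement research is the angle between the two direction vectors. A minimal Unity C# sketch, assuming you have two consecutive normalized gaze directions and the time between them (all names here are illustrative, not from the SDK):

    using UnityEngine;

    public static class GazeAngles
    {
        // Angular displacement in degrees between two normalized gaze directions.
        // Vector3.Angle is equivalent to acos(dot(prev, cur)) converted to degrees.
        public static float AngularDisplacementDeg(Vector3 prevDir, Vector3 curDir)
        {
            return Vector3.Angle(prevDir.normalized, curDir.normalized);
        }

        // Angular velocity in degrees per second, given the sample interval.
        public static float AngularVelocityDegPerSec(Vector3 prevDir, Vector3 curDir, float deltaSeconds)
        {
            return AngularDisplacementDeg(prevDir, curDir) / deltaSeconds;
        }
    }

Plotting the angle from a fixed reference direction over time should give a position trace like the top panel of the linked figure (flat during fixations, steps during saccades), and plotting the per-sample angular velocity gives the lower panel.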
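Visual-angle sketch (re post 3): once the viewing distance is known (or held fixed), the angular magnitude of a displacement d viewed frontally at distance D is theta = 2 * atan(d / (2 * D)). A short sketch with illustrative names:

    using UnityEngine;

    public static class VisualAngle
    {
        // Angular size (degrees) of a displacement of length d viewed
        // frontally at distance D (both in the same unit, e.g. metres).
        public static float Degrees(float d, float D)
        {
            return 2f * Mathf.Atan(d / (2f * D)) * Mathf.Rad2Deg;
        }
    }

For example, a 5 cm displacement viewed from 60 cm subtends VisualAngle.Degrees(0.05f, 0.60f), roughly 4.77 degrees, which is why the subject distance matters so much.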
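Gaze-point sketch (re posts 5-7): the direction vector does not need "denormalizing". A gaze position can be recovered by casting a ray from the gaze origin along the gaze direction and intersecting it with the scene, or with a fallback point at a fixed depth. A sketch, assuming origin and direction have already been transformed from SRanipal's eye space into Unity world space (that transform step is scene-specific and not shown):

    using UnityEngine;

    public static class GazePoint
    {
        // Intersect the gaze ray with scene colliders; fall back to a point
        // at fallbackDepth metres along the ray if nothing is hit.
        public static Vector3 FromRay(Vector3 worldOrigin, Vector3 worldDirection, float fallbackDepth = 2f)
        {
            if (Physics.Raycast(worldOrigin, worldDirection, out RaycastHit hit))
                return hit.point;
            return worldOrigin + worldDirection.normalized * fallbackDepth;
        }
    }

Two consecutive gaze points plus their timestamps give the start and end positions needed for a velocity estimate, though for saccade work the angular velocity from the sketch above is usually what detection thresholds are defined on.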
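I-VT sketch (re post 9): a common starting point for saccade detection is the velocity-threshold (I-VT) classifier: compute the angular velocity per sample and label samples above a threshold (often around 30 deg/s, but tune it for your data and sampling rate) as saccades. A minimal sketch over arrays of normalized directions and timestamps:

    using UnityEngine;

    public static class SaccadeDetector
    {
        // I-VT: classify each sample as saccade (true) or fixation (false)
        // from normalized gaze directions and timestamps in seconds.
        public static bool[] Classify(Vector3[] dirs, float[] t, float thresholdDegPerSec = 30f)
        {
            var isSaccade = new bool[dirs.Length];
            for (int i = 1; i < dirs.Length; i++)
            {
                float dt = t[i] - t[i - 1];
                if (dt <= 0f) continue; // skip duplicate or bad timestamps
                float vel = Vector3.Angle(dirs[i - 1], dirs[i]) / dt;
                isSaccade[i] = vel > thresholdDegPerSec;
            }
            return isSaccade;
        }
    }

In practice you would also filter the velocity signal and merge very short saccade/fixation segments, but this is the core of the algorithm.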
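Clock-offset sketch (re post 13): I cannot confirm the epoch of the SRanipal EyeData timestamp from official documentation; it appears to be a device-side millisecond counter rather than wall-clock time, so it can only be related to the PC clock by aligning an offset yourself. Under that assumption, one way to watch the relative delay and drift between the two clocks:

    using System.Diagnostics;

    public class LatencyLogger
    {
        readonly Stopwatch sw = Stopwatch.StartNew();
        long offsetMs = long.MinValue;

        // Call once per new eye sample with the device timestamp in ms.
        // Returns (arrival time - device time) relative to the first sample.
        // Growth of this value over a session indicates clock drift or latency
        // jitter, not absolute latency: the two clocks have unrelated epochs.
        public long RelativeDelayMs(int deviceTimestampMs)
        {
            long nowMs = sw.ElapsedMilliseconds;
            if (offsetMs == long.MinValue) offsetMs = nowMs - deviceTimestampMs;
            return (nowMs - deviceTimestampMs) - offsetMs;
        }
    }

Absolute end-to-end latency still needs an external measurement (e.g. an artificial eye or photodiode rig), which is presumably why official numbers are hard to find.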
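Callback sketch (re posts 14-15): the usual way to get the full 120 Hz stream is not to poll in Update() (which runs at the 90 Hz render rate) but to register SRanipal's eye-data callback, which the runtime invokes from its own thread at the sensor rate. A sketch based on the pattern in the SRanipal SDK sample; exact class names depend on the SDK version (this uses the v1 EyeData API), and it assumes the framework component's eye-data-callback option is enabled:

    using System.Runtime.InteropServices;
    using UnityEngine;
    using ViveSR.anipal.Eye;

    public class EyeDataRecorder : MonoBehaviour
    {
        // Keep a reference to the delegate so it is not garbage-collected
        // while native code holds the function pointer.
        static readonly SRanipal_Eye.CallbackBasic callback = OnEyeData;
        static EyeData latest;                       // written on SRanipal's thread
        static readonly object gate = new object();
        bool registered;

        void Update()
        {
            // Register once the framework is running.
            if (registered || SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING)
                return;
            SRanipal_Eye.WrapperRegisterEyeDataCallback(
                Marshal.GetFunctionPointerForDelegate(callback));
            registered = true;
        }

        void OnDisable()
        {
            if (!registered) return;
            SRanipal_Eye.WrapperUnRegisterEyeDataCallback(
                Marshal.GetFunctionPointerForDelegate(callback));
            registered = false;
        }

        // Invoked by the SRanipal runtime at the sensor rate (~120 Hz) on its
        // own thread: no Unity API calls here; just copy the struct out.
        static void OnEyeData(ref EyeData data)
        {
            lock (gate) latest = data;   // e.g. also append to a queue or file
        }
    }

The fields from post 15 (gaze_origin_mm, gaze_direction_normalized, pupil_diameter_mm, pupil_position_in_sensor_area, eye_openness) can then be read from the copy taken in the callback rather than polled in Update().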