Everything posted by mrk88

  1. From physics, I am using this formula to calculate the distance between two gaze position vectors: Mathf.Sqrt(Mathf.Pow((curPos.x - previous.x), 2) + Mathf.Pow((curPos.y - previous.y), 2) + Mathf.Pow((curPos.z - previous.z), 2)); But this is not correct for displacement, because the vectors are normalized: the distance between two unit vectors is not in real-world units and is at the wrong scale (much smaller). I expect a plot like the one below (the top panel) for hundreds of eye samples: https://www.researchgate.net/publication/235788020/figure/fig5/AS:667218010456066@1536088578181/Example-of-eye-movement-data-Vertical-eye-position-upper-panel-and-eye-velocity-lower.ppm But the plot I am getting looks like the attached figure. The vertical-axis values should change with respect to eye fixations and saccades; when, for example, I fixate on a point for several seconds, the trace should stay constant there (not quickly drop to zero).
  2. I would like to measure the displacement of eye movements between two eye samples. I was using the GazeDirection vector for that. However, since these values are normalized to [-1, 1], the values and the plot I am getting are wrong. Can you please explain how these values have been normalized, and how to calculate the displacement between two consecutive eye samples? @Corvus @Daniel_Y
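     (A minimal sketch of the angle-based displacement I mean, since gaze_direction_normalized is a unit vector and the Euclidean distance between unit vectors is not a real-world displacement. The names here are my own, not from the SDK:)

     using UnityEngine;

     public static class GazeMath
     {
         // Angular displacement in degrees between two normalized gaze
         // direction samples. Vector3.Angle does the clamped acos internally.
         public static float AngularDisplacementDeg(Vector3 previousDir, Vector3 currentDir)
         {
             return Vector3.Angle(previousDir, currentDir);
         }
     }

     Plotting this angle per sample (or dividing by the sample interval for velocity) should give traces like the linked figure, with flat segments during fixations.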
  3. Okay, thank you very much for the info. Is there a way I can measure it and then keep it at a fixed distance? I mean, do you know of any references or suggestions for that? I am trying to measure the angular magnitude of an object being displaced from one position to another, and for that I need to know the distance from the subject (because it makes a big difference in the calculations).
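     (For concreteness, this is the calculation I mean; the helper names are mine. An object displaced from posA to posB subtends, from the eye at eyePos, the angle between the two position vectors, and for small displacements this is approximately 2*atan(d / (2*D)) for displacement d at viewing distance D:)

     using UnityEngine;

     public static class VisualAngle
     {
         // Angular magnitude (degrees) of an object moving from posA to posB,
         // as seen from eyePos. The same displacement subtends a smaller
         // angle the farther away it is, which is why distance matters.
         public static float DisplacementDeg(Vector3 eyePos, Vector3 posA, Vector3 posB)
         {
             return Vector3.Angle(posA - eyePos, posB - eyePos);
         }

         // Small-angle approximation: theta = 2 * atan(d / (2 * D)),
         // with displacement d and viewing distance D in the same units.
         public static float ApproxDeg(float displacement, float viewingDistance)
         {
             return 2f * Mathf.Atan(displacement / (2f * viewingDistance)) * Mathf.Rad2Deg;
         }
     }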
  4. Hi, does anybody know how far our eyes are from the displays when we put on the HTC Vive Pro Eye headset? (I'm guessing a couple of inches, but I'd like to know the exact number for my research.) Thanks.
  5. Thanks, but my question is how to calculate gaze position from the gaze direction vector. For the velocity calculation I need start and end gaze positions.
  6. The problem with denormalizing is that I only have the height and width of the display for each eye, but the gaze direction vector is 3D. How do I map the Z coordinate to the display resolution?
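     (A sketch of the kind of denormalization I'm asking about: intersect the unit gaze direction with a virtual plane at a known distance D in front of the eye, which turns the direction back into a position. D would be the eye-to-display distance I asked about above; this assumes the direction is expressed with +Z pointing forward:)

     using UnityEngine;

     public static class GazeProjection
     {
         // Projects a normalized gaze direction onto a plane 'planeDistance'
         // units in front of the eye, perpendicular to +Z. The returned 2D
         // point is a gaze position in real units, usable for displacement.
         public static Vector2 OnPlane(Vector3 gazeDirNormalized, float planeDistance)
         {
             // Scale the unit ray so its Z component reaches the plane
             // (assumes the user is looking forward, i.e. z is not near zero).
             float scale = planeDistance / gazeDirNormalized.z;
             return new Vector2(gazeDirNormalized.x * scale, gazeDirNormalized.y * scale);
         }
     }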
  7. Hello, I am trying to calculate instantaneous gaze velocity using gaze position. I found out that pupil_position_on_sensor does not indicate the true gaze position, so I have to use the gaze direction vector, which is normalized. - How can I calculate instantaneous gaze velocity from the normalized gaze direction vector? - Do I need to denormalize it? How? Thank you so much for any ideas/help! @Corvus @Daniel_Y
  8. Hi Somnath I am also interested in knowing how to denormalize the gaze direction vector. Have you found out anything?
  9. So pupil position in the sensor area cannot be used for gaze velocity calculation? What gaze-position-related data can we use to calculate gaze speed? I am interested in implementing a saccade detection algorithm, which is based on gaze speed (for which I need gaze position). @Daniel_Y
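     (For anyone with the same question: as far as I can tell, no denormalization is needed. gaze_direction_normalized is a unit vector, and the usual approach in the eye-movement literature is to work in angles: gaze speed is the angle between consecutive direction samples divided by the sample interval. A sketch, assuming timestamps in milliseconds:)

     using UnityEngine;

     public static class GazeVelocity
     {
         // Angular gaze velocity (degrees/second) from two consecutive
         // normalized gaze direction samples and their timestamps in ms.
         public static float DegPerSec(Vector3 prevDir, Vector3 curDir,
                                       long prevTimestampMs, long curTimestampMs)
         {
             float dtSec = (curTimestampMs - prevTimestampMs) / 1000f;
             if (dtSec <= 0f) return 0f; // guard against duplicate samples
             return Vector3.Angle(prevDir, curDir) / dtSec;
         }
     }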
  10. Thanks, I'm using Unity Recorder to record the HMD (with Capture set to "GameView" and output resolution set to "Match Window Size").
  11. Also, why is the image resolution in my post above (536 x 1076) lower than the Vive Pro HMD resolution? Is that a Unity setting?
  12. Dear all, does the gaze position data (pupil_position_in_sensor_area) correspond to the exact gaze position on the image being viewed? - Even when I look at the rightmost position of the HMD, the values for gaze position (R and L) don't go above 0.7XXX. - When I map the points onto the image, they don't match! Take a look at the attached image: the red crosses show the gaze position during that frame, but I was looking at the yellow sphere the whole time. - What does pupil position in sensor area actually mean? Is it different from the corresponding position on the image being viewed?
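     (My current understanding, in case it helps others: pupil_position_in_sensor_area is the pupil's location on the eye camera's sensor, not a point on the viewed image, which would explain the mismatch. To find what is being looked at in the scene, the SRanipal samples cast the gaze ray against scene colliders; a sketch of that pattern:)

     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class GazeHitLogger : MonoBehaviour
     {
         void Update()
         {
             // Combined-eye gaze ray in HMD-local space, as in the SDK samples.
             if (SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
             {
                 // Transform into world space via the HMD camera, then raycast.
                 Vector3 worldDir = Camera.main.transform.TransformDirection(direction);
                 if (Physics.Raycast(Camera.main.transform.position, worldDir, out RaycastHit hit, 100f))
                     Debug.Log("Looking at " + hit.collider.name + " at " + hit.point);
             }
         }
     }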
  13. Hello! I am trying to measure the eye tracker latency (the difference between when an eye sample is captured on the sensor and the current time of the computer). Does anybody know what this device timestamp actually represents? I know it is a number in milliseconds, but what is its reference point? It should be comparable with the PC time in ms. - Does the headset's clock reset on every restart of the headset? - Is the timing handled similarly to Tobii? Does this document from Tobii apply to the timings in the Vive Pro as well? - Is there any more information on latency (not forum posts, but official, citable manuals/documents released by the Vive Pro Eye team)? Thanks. Many of us are using this headset for research, and things like timing are quite important (I am doing gaze-contingent work), yet there is nobody to clarify these details about the device! The manual is not very helpful (and has many typos!), and the last place to find more information is this forum, where I mostly find more people with the same questions and not many useful answers. @Daniel_Y @Corvus
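     (Absent official documentation, this is how I've been trying to characterize the timing myself: log eyeData.timestamp alongside a high-resolution local clock and compare the deltas. If the two deltas track each other, the clocks run at the same rate and only the absolute offset is unknown. A sketch:)

     using System.Diagnostics;

     public class TimestampProbe
     {
         private readonly Stopwatch clock = Stopwatch.StartNew();
         private long lastDeviceMs = -1, lastLocalMs;

         // Call once per polled sample with eyeData.timestamp (in ms).
         public void Log(long deviceTimestampMs)
         {
             long localMs = clock.ElapsedMilliseconds;
             if (lastDeviceMs >= 0)
                 UnityEngine.Debug.Log("device dt = " + (deviceTimestampMs - lastDeviceMs) +
                                       " ms, local dt = " + (localMs - lastLocalMs) + " ms");
             lastDeviceMs = deviceTimestampMs;
             lastLocalMs = localMs;
         }
     }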
  14. ViveProEyeProducerThread.cs Thanks. I attach this script to a GameObject in my scene, but I am not able to collect any verbose data while using a thread. I can only collect the verbose data in the Update function, and that is where I'm confused: if we can only collect verbose data in Update(), then we are collecting it at 90 Hz, not 120 Hz.
  15. Also, when collecting gaze data (such as below), should we collect it in a thread (same as above), or in Unity's Update function?

     gazeOriginLeft = eyeData.verbose_data.left.gaze_origin_mm;                    // gaze origin
     gazeOriginRight = eyeData.verbose_data.right.gaze_origin_mm;
     gazeDirectionLeft = eyeData.verbose_data.left.gaze_direction_normalized;      // gaze direction
     gazeDirectionRight = eyeData.verbose_data.right.gaze_direction_normalized;
     pupilDiameterLeft = eyeData.verbose_data.left.pupil_diameter_mm;              // pupil size
     pupilDiameterRight = eyeData.verbose_data.right.pupil_diameter_mm;
     pupilPositionLeft = eyeData.verbose_data.left.pupil_position_in_sensor_area;  // pupil positions
     pupilPositionRight = eyeData.verbose_data.right.pupil_position_in_sensor_area;
     eyeOpenLeft = eyeData.verbose_data.left.eye_openness;                         // eye openness
     eyeOpenRight = eyeData.verbose_data.right.eye_openness;
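     (If I understand the SDK correctly, polling from Update() caps you at the render rate (~90 Hz). To receive every 120 Hz sample, the SDK's callback sample registers a native callback that fires on a non-Unity thread; a sketch of that registration, based on the SRanipal sample code:)

     using System.Runtime.InteropServices;
     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class EyeCallbackRecorder : MonoBehaviour
     {
         private static EyeData latest;

         void Start()
         {
             SRanipal_Eye_Framework.Instance.EnableEyeDataCallback = true;
             SRanipal_Eye.WrapperRegisterEyeDataCallback(
                 Marshal.GetFunctionPointerForDelegate((SRanipal_Eye.CallbackBasic)EyeCallback));
         }

         // Called by the SRanipal runtime at the tracker rate (120 Hz) on a
         // non-Unity thread: do not touch Unity APIs here, just copy or queue.
         private static void EyeCallback(ref EyeData eyeData)
         {
             latest = eyeData; // e.g. enqueue verbose_data to a thread-safe buffer
         }
     }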
  16. And this is the code I collect the data with:

     void QueryEyeData()
     {
         while (Abort == false)
         {
             // Poll once per loop; WORK indicates a successful read.
             ViveSR.Error error = SRanipal_Eye_API.GetEyeData(ref eyeData);
             if (error == ViveSR.Error.WORK)
             {
                 logResults(frameCount);
                 logResults(eyeData.timestamp);
                 logResults(eyeData.frame_sequence);
                 frameCount++;
                 logFile.WriteLine(" ");
                 if (frameCount % 120 == 0) frameCount = 0;
             }
             Thread.Sleep(FrequencyControl);
         }
     }
  17. I am trying to poll eyeData in a thread, and I am recording eyeData.timestamp and eyeData.frame_sequence. - It seems that the timestamps are not received consecutively (there is a difference of 7 or 8 between consecutive timestamps). - Also, frame_sequence has missing frames: for example, in the samples below, frames 3506, 3511, 3518, etc. are missing:

      Frame #   eyeData.timestamp   eyeData.frame_sequence
      0         599227              3500
      1         599235              3501
      2         599243              3502
      3         599251              3503
      4         599260              3504
      5         599268              3505
      6         599285              3507
      7         599293              3508
      8         599301              3509
      9         599310              3510
      10        599326              3512
      11        599335              3513
      12        599343              3514
      13        599351              3515
      14        599360              3516
      15        599368              3517
      16        599385              3519
      17        599393              3520
      18        599401              3521
      19        599410              3522
      20        599418              3523
      21        599435              3525
      22        599443              3526
      23        599451              3527
      24        599460              3528
      25        599468              3529
      26        599476              3530
      27        599493              3532
      28        599501              3533
      29        599510              3534
      30        599518              3535
      31        599535              3537
      32        599535              3537
      33        599551              3539
      34        599560              3540
      35        599568              3541
      36        599576              3542
      37        599593              3544
      38        599601              3545
      39        599610              3546
      40        599618              3547
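     (A small sanity check I can run on these logs: flag dropped samples by watching for gaps in frame_sequence. With Thread.Sleep-based polling, any sleep longer than one tracker period (~8.3 ms at 120 Hz) will inevitably skip frames, which matches the pattern above:)

     public class FrameGapDetector
     {
         private long lastSeq = -1;

         // Call with eyeData.frame_sequence for every polled sample.
         public void Check(long frameSequence)
         {
             if (lastSeq >= 0 && frameSequence != lastSeq + 1)
                 UnityEngine.Debug.Log("Skipped " + (frameSequence - lastSeq - 1) +
                                       " frame(s) before " + frameSequence);
             lastSeq = frameSequence;
         }
     }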
  18. Can you elaborate a bit more on your calculations? Are you using the difference in gaze positions to compute velocity? How did you measure the change in angle over 15 seconds?
  19. Hello! Thanks for sharing this. Can I ask how you actually measured these? Thanks
  20. Hello! Does anybody know whether the SRanipal API provides any functions for eye event detection, e.g. saccade detection or fixation detection? Or do we have to implement that ourselves using the gaze position data? @Corvus @Daniel_Y @zzy
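     (In case the API offers nothing, this is the sort of thing I mean by implementing it ourselves: a minimal velocity-threshold (I-VT) classifier over angular gaze velocity. The threshold is study-specific; values of roughly 30-100 deg/s appear in the literature:)

     using UnityEngine;

     public static class IVTClassifier
     {
         public enum EyeEvent { Fixation, Saccade }

         // I-VT: samples whose angular velocity exceeds the threshold are
         // labelled saccades; everything below it is treated as fixation.
         public static EyeEvent Classify(Vector3 prevDir, Vector3 curDir,
                                         float dtSec, float thresholdDegPerSec)
         {
             float velocityDegPerSec = Vector3.Angle(prevDir, curDir) / dtSec;
             return velocityDegPerSec > thresholdDegPerSec ? EyeEvent.Saccade : EyeEvent.Fixation;
         }
     }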
  21. I fixed my issue by upgrading my Windows to 10 and installing everything from scratch (including SteamVR and the Vive setup). This time I didn't have to install SRanipal separately; it was included in the Vive Pro setup.
  22. Hi, I have been using the Vive Pro on my current PC (Win7, Intel Core i7 CPU 3.4 GHz, 16 GB RAM, and an NVIDIA GeForce GTX TITAN X graphics card). Everything worked well until August, and I did not use it again until now. Now I just keep getting the "Initialization Failed" error on my calibration screen. I had a look at the solutions here and tried them all, but none of them seem to help. I suspect that all these updates resulted in this failure. Have there been any other updates in the past 3-4 months that I have missed? This is my school project and I really need to speed up my work; I can't afford to be stuck on a calibration failure for long. Thanks for any help/suggestions. @Corvus @Daniel_Y
  23. I have Win 7. I was able to do the calibration in July without any problems. When I try it now, I get the initialization error. Could it be because I need to upgrade to Win 8 or later? But it worked fine before on my current Windows! @Corvus @Daniel_Y
  24. Great, thanks! So this code returns all zero values. Do I need to do anything before reading this data?
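     (For anyone landing here with the same zeros: my understanding is that the framework has to be initialized and in the WORKING state before any of the getters return real data; an SRanipal_Eye_Framework component with eye tracking enabled must also exist in the scene. A sketch of the status check:)

     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class EyeDataGate : MonoBehaviour
     {
         void Update()
         {
             // Until the framework reports WORKING, every field reads as zero.
             if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING)
                 return;

             if (SRanipal_Eye.GetVerboseData(out VerboseData data))
                 Debug.Log("Left pupil (mm): " + data.left.pupil_diameter_mm);
         }
     }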
  25. Hi all, I've just got my hands on the Vive Pro and would like to get some gaze data out of it in Unity. I notice that in SRanipal_Eye.cs all the functions return booleans. Can somebody tell me how I can get gaze position data and other gaze features (such as the timestamp, pupil position, etc.) in my own C# file?
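     (For anyone else starting out, the minimal polling pattern I was looking for: the boolean/error-code functions signal success and hand the data back through ref/out parameters. A sketch:)

     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class SimpleGazeReader : MonoBehaviour
     {
         private static EyeData eyeData = new EyeData();

         void Update()
         {
             // Poll the full data struct; WORK means the call succeeded.
             if (SRanipal_Eye_API.GetEyeData(ref eyeData) != ViveSR.Error.WORK) return;

             long timestampMs = eyeData.timestamp;
             Vector3 gazeDirLeft = eyeData.verbose_data.left.gaze_direction_normalized;
             Vector2 pupilPosLeft = eyeData.verbose_data.left.pupil_position_in_sensor_area;
             Debug.Log("t=" + timestampMs + " dir=" + gazeDirLeft + " pupil=" + pupilPosLeft);
         }
     }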