How to get output from the eye-tracker faster than Update in Unity


Recommended Posts

@Asish @shomik

1. To be clear, the VR HMD screen refresh rate is not the same as the eye tracking data output frequency. If you're not using the callbacks or multi-threading, the eye tracking data will be restricted by the main thread, which will generally run at the HMD screen refresh rate. If you use the callbacks, the eye tracking data is output at 120Hz on average. Note that measuring the game frame rate will not reflect the callback frequency.

2. Yes, generally the VR HMD will have a different refresh rate than the monitors.

3. To measure the refresh rate for eye tracking you can record and compare the timestamp data.

4. Yes, on average the eye tracking data is output at 120Hz (120 samples per second). If using callbacks, the data output will not be limited by the refresh rate/fps of the game's main thread.
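As a rough illustration of point 3, the actual rate can be estimated from recorded timestamps by averaging the deltas. This is a minimal sketch (not from the SDK) that assumes the timestamps are in milliseconds; verify the unit against your SDK version:

```csharp
using System.Collections.Generic;

// Sketch: estimate the eye tracking output rate from recorded timestamps.
// Assumes timestamps are in milliseconds; check the unit for your SDK version.
public static class SampleRateEstimator
{
    // Pass the timestamps collected in the eye data callback, in order.
    public static double EstimateHz(IList<int> timestampsMs)
    {
        if (timestampsMs.Count < 2) return 0.0;
        // (n - 1) intervals over the total span, converted to per-second.
        double spanMs = timestampsMs[timestampsMs.Count - 1] - timestampsMs[0];
        return (timestampsMs.Count - 1) * 1000.0 / spanMs;
    }
}
```

At roughly 120 samples per second, consecutive deltas should come out around 8.3 ms.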

Link to comment
Share on other sites

Hello @Corvus, thanks for your clarification.

I recorded data with "Enable Eye Data Callback" enabled, using version 2. I modified the following code and recorded the data. I got around 15k rows for 70 seconds, but it should have recorded about 8.4k rows (70 sec * 120 Hz = 8400 rows) since the output frequency is 120Hz. Could you please tell me what's wrong with this?

 

 

using System.IO;
using System.Runtime.InteropServices;
using System.Text;
using UnityEngine;
using UnityEngine.Assertions;
using ViveSR.anipal.Eye;

public class SRanipal_GazeRaySample_v2 : MonoBehaviour
{
    public int LengthOfRay = 25;
    [SerializeField] private LineRenderer GazeRayRenderer;
    private static EyeData_v2 eyeData = new EyeData_v2();
    private bool eye_callback_registered = false;

    string strFilePath = @"D:\Unity_workspace\eye_tracking2020\Tutorial4\Assets\Data.csv";
    static StringBuilder sbOutput = new StringBuilder();

    private void Start()
    {
        sbOutput = new StringBuilder();
        if (!SRanipal_Eye_Framework.Instance.EnableEye)
        {
            enabled = false;
            return;
        }
        Assert.IsNotNull(GazeRayRenderer);
    }

    private void Update()
    {
        // GetKeyDown fires once per key press; GetKey fires on every frame the
        // key is held, which would call startRecord()/stopRecord() many times.
        if (Input.GetKeyDown("up"))
        {
            startRecord();
        }

        if (Input.GetKeyDown("down"))
        {
            stopRecord();
        }
    }

    private void OnDisable()
    {
        // Unity never calls Release() by itself; unregister the callback
        // when the component is disabled or the scene stops.
        Release();
    }

    private void Release()
    {
        if (eye_callback_registered)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }
    }

    private void stopRecord()
    {
        // Write the buffer exactly once. The original code called WriteAllText
        // and then AppendAllText with the same buffer, duplicating every row
        // (which is why ~70 s of 120 Hz data produced ~15k rows instead of ~8.4k).
        File.WriteAllText(strFilePath, sbOutput.ToString());
    }

    private void startRecord()
    {
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING &&
            SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.NOT_SUPPORT) return;

        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && !eye_callback_registered)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = true;
        }
        else if (!SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && eye_callback_registered)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }

        Vector3 GazeOriginCombinedLocal, GazeDirectionCombinedLocal;

        if (eye_callback_registered)
        {
            if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else return;
        }
        else
        {
            if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else return;
        }

        Vector3 GazeDirectionCombined = Camera.main.transform.TransformDirection(GazeDirectionCombinedLocal);
        GazeRayRenderer.SetPosition(0, Camera.main.transform.position - Camera.main.transform.up * 0.05f);
        GazeRayRenderer.SetPosition(1, Camera.main.transform.position + GazeDirectionCombined * LengthOfRay);
    }

    private static void EyeCallback(ref EyeData_v2 eye_data)
    {
        eyeData = eye_data;

        var L_dia = eyeData.verbose_data.left.pupil_diameter_mm;
        var L_openness = eyeData.verbose_data.left.eye_openness;
        var L_gaze_origin = eyeData.verbose_data.left.gaze_origin_mm;
        var L_gaze_dir_norm = eyeData.verbose_data.left.gaze_direction_normalized;
        var L_pupil_pos = eyeData.verbose_data.left.pupil_position_in_sensor_area;

        var R_dia = eyeData.verbose_data.right.pupil_diameter_mm;
        var R_openness = eyeData.verbose_data.right.eye_openness;
        var R_gaze_origin = eyeData.verbose_data.right.gaze_origin_mm;
        var R_gaze_dir_norm = eyeData.verbose_data.right.gaze_direction_normalized;
        var R_pupil_pos = eyeData.verbose_data.right.pupil_position_in_sensor_area;

        // Ten values need ten placeholders; the original format string stopped
        // at {7}, so the last two arguments were silently dropped from every row.
        var newLine = string.Format("{0},{1},{2},{3},{4},{5},{6},{7},{8},{9}",
            L_dia, L_openness, L_gaze_origin, L_gaze_dir_norm, L_pupil_pos,
            R_dia, R_openness, R_gaze_origin, R_gaze_dir_norm, R_pupil_pos);

        sbOutput.AppendLine(newLine);
    }
}
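One caveat worth noting about the code above (an observation, not from the original post): EyeCallback runs on a thread created by the SRanipal runtime, while stopRecord() reads sbOutput from Unity's main thread, and StringBuilder is not thread-safe. A small wrapper with a lock makes the shared buffer safe, for example:

```csharp
using System.Text;

// Illustrative sketch: a line buffer that is safe to append from the
// SRanipal callback thread and read from Unity's main thread.
public class ConcurrentLineBuffer
{
    private readonly StringBuilder sb = new StringBuilder();
    private readonly object gate = new object();

    public void AppendLine(string line)
    {
        lock (gate) { sb.AppendLine(line); }
    }

    // Returns a consistent copy of everything appended so far.
    public string Snapshot()
    {
        lock (gate) { return sb.ToString(); }
    }
}
```

Note also that Unity APIs (e.g. Camera.main, LineRenderer) must only be touched from the main thread, never from inside the callback itself.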

 

Link to comment
Share on other sites

Dear @Asish and @shomik

As @Corvus mentioned, we need to look at Unity FPS and eye sampling separately. These two parameters, the frame rate and the sampling frequency of the eye tracker, are independent of each other. For example, while the Unity FPS changes depending on the VR design and the computer specification (e.g. sometimes it is 50FPS and it changes to 100FPS at another time), the sampling frequency of the eye tracker stays essentially constant at a specific value. If you use the callback function, the sampling frequency should be more or less around 120Hz, whereas the Unity FPS may fluctuate. Specifically, while Unity's Update() runs on one thread, the eye callback function runs on another thread.

However, as I wrote in another thread, it seems that the current SRanipal SDK does not properly output the timestamp data. Thus, it may be difficult to evaluate the sampling interval or frequency of the eye tracker correctly using the timestamp data. I instead used the DateTime.Now.Ticks property in C# and observed that the sampling frequency was more or less around 120Hz.
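A measurement along those lines can be sketched as follows (an illustrative sketch, not the exact script from the thread; one tick is 100 ns, and DateTime.Now itself has limited resolution, so averaging over many samples is advisable):

```csharp
using System;
using System.Collections.Generic;

// Sketch: estimate the sampling frequency with DateTime.Now.Ticks
// (1 tick = 100 ns) instead of the SDK timestamp field.
public static class TickIntervalMeter
{
    private static readonly List<long> ticks = new List<long>();
    private static readonly object gate = new object();

    // Call once from inside the eye data callback.
    public static void Record()
    {
        lock (gate) { ticks.Add(DateTime.Now.Ticks); }
    }

    // Average frequency over all recorded samples.
    public static double AverageHz()
    {
        lock (gate)
        {
            if (ticks.Count < 2) return 0.0;
            double spanSec = (ticks[ticks.Count - 1] - ticks[0])
                             / (double)TimeSpan.TicksPerSecond;
            return (ticks.Count - 1) / spanSec;
        }
    }
}
```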

I hope this helps.

Best regards,

imarin18


  • 3 months later...

@Asish What issues did you need to solve? I am also getting around 60/70 Hz output. I added a timestamp in the log and there are 15 or 16 milliseconds between the entries in the csv. It makes no difference whether I run the scene in the editor or in a build. I have set the fixed Timestep in Unity to 0.008.

Also, the gaze ray does not work (which looks logical, as it is only set in startRecord, so it would only be set once, right?).
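For what it's worth, keeping the ray visible would mean moving the ray-rendering lines into a per-frame method. A rough sketch (hypothetical; the names eyeData, GazeRayRenderer, and LengthOfRay are assumed from the sample posted earlier in this thread):

```csharp
// Sketch: update the gaze ray every frame from the latest callback data,
// instead of setting it once inside startRecord(). Runs on the main thread,
// so Unity APIs like Camera.main are safe to use here.
private void LateUpdate()
{
    Vector3 originLocal, dirLocal;
    if (!SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out originLocal, out dirLocal, eyeData))
        return;

    Vector3 dirWorld = Camera.main.transform.TransformDirection(dirLocal);
    GazeRayRenderer.SetPosition(0, Camera.main.transform.position - Camera.main.transform.up * 0.05f);
    GazeRayRenderer.SetPosition(1, Camera.main.transform.position + dirWorld * LengthOfRay);
}
```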


  • 4 weeks later...

Dear @prajaktakhandve

As people have mentioned already, the Unity FPS and the sampling frequency of eye tracking should be handled and considered separately. The eye callback function that HTC provides enables you to sample eye data at 120Hz, while the FPS changes depending on the VR environment you design. The following article should be helpful for you.

https://www.frontiersin.org/articles/10.3389/fpsyt.2020.572938/full


@imarin18 Thanks a lot for this link. It helps me understand the eye tracking a lot better.

And I used the project as a check to see if I could get 120Hz. But even at the end of the output file I still see 16/17 ms between samples. So I think this indicates that my computer is simply too slow (it's a five-year-old laptop, high-end at the time, but not anymore 😃).

Strangely, I did get errors in Unity when I wanted to build the application to see if that made any differences in the sample speed compared to running it in the editor:
'Assets\Scripts\Saccade_measure_rev1.cs(227,17): error CS0234: The type or namespace name 'EditorApplication' does not exist in the namespace 'UnityEditor' (are you missing an assembly reference?)'

I used Unity 2019.2.17, since I had that already installed, but I find it weird that those errors don't show up when I run the application in the editor. I did notice that after running the application in the editor and clicking Play again to stop it, the editor hangs. I think I've seen that mentioned in another forum post somewhere.


Dear @jboss

As the paper reported, the longer sampling interval of 16/17 ms could be due to the clock timing of your laptop. Also, if you evaluated the sampling interval only over the initial 30 seconds, it would be better to check the data from more than 30 seconds after you start the recording.

According to the script provided in the paper, the recording continues in a while loop for a specific number of samples (see the line # 175) regardless of whether you click Play or not. Thus, the editor may have been frozen because the while loop was not completed.

Best regards,
imarin18

 


Thanks @imarin18

I did let the test run for a while. I did the practice tests and the first run (60 seconds, if I remember correctly). But even at the end of the output file the interval is 16/17 ms. Do you know if there is a way to check the clock cycle, besides running an empty while loop? Is that related to the CPU frequency?
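One way to inspect the timer situation (a sketch; the values are machine-dependent): .NET exposes the high-resolution timer via System.Diagnostics.Stopwatch. Note also that on Windows, DateTime.Now typically advances only about every 15.6 ms (the default system timer period), so if the log timestamps come from DateTime.Now, that alone can show up as 16/17 ms gaps regardless of the true sampling rate.

```csharp
using System;
using System.Diagnostics;

// Print the machine's high-resolution timer characteristics.
class TimerResolutionCheck
{
    static void Main()
    {
        Console.WriteLine("IsHighResolution: " + Stopwatch.IsHighResolution);
        Console.WriteLine("Frequency (ticks/s): " + Stopwatch.Frequency);
        Console.WriteLine("Smallest interval (ns): " + 1e9 / Stopwatch.Frequency);
    }
}
```

If IsHighResolution is true, intervals measured with Stopwatch (rather than DateTime.Now) should resolve well below a millisecond.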

And thanks for the explanation of the 'hanging'. I didn't actually examine the script, I just ran the test. I did notice that the output file size was still increasing when I thought the editor was hanging, so now I understand.

