Eye Tracking Using Video Player (HTC Vive Pro Eye, Unity, SRanipal)


Hi,
I am doing a research project where I would like to gather eye-tracking data while the user is watching a 3D side-by-side stereoscopic video using an HTC Vive Pro Eye.
So far, I am using the SRanipal SDK and the HTC Vive Media Decoder plugin. I tried using Unity's video component, but it seems that it cannot be focused on (the z-coordinates are very high -> does the user look through the video player?). If I add the Video Player component to a Game Object, the video player loses the option to play side-by-side 3D videos.
Additionally, when visualizing the gaze point with a dot on the video (using the decoder plugin), the position of the gaze direction always seems a bit off.

So my questions are:
1. Am I doing the transformation of the gaze position wrong? (see below - I took parts of the SRanipal GazeRay sample and Tobii's gaze visualizer sample)
2. How do I play a 3D side-by-side video on a Game Object?

Thank you!
Martina

 

    private void Update()
    {
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING &&
            SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.NOT_SUPPORT) return;

        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && !eye_callback_registered)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = true;
        }
        else if (!SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && eye_callback_registered)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }

        Vector3 GazeOriginCombinedLocal, GazeDirectionCombinedLocal;

        // Try the combined gaze ray first, then fall back to the left and right eyes.
        if (eye_callback_registered)
        {
            if (!SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData) &&
                !SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData) &&
                !SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) return;
        }
        else
        {
            if (!SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal) &&
                !SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal) &&
                !SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) return;
        }

        Vector3 GazeDirectionCombined = _mainCamera.transform.TransformDirection(GazeDirectionCombinedLocal);
        SetPositionAndScale(GazeOriginCombinedLocal, GazeDirectionCombined);
        _spriteRenderer.enabled = true;
    }

    private void SetPositionAndScale(Vector3 origin, Vector3 direction)
    {
        RaycastHit hit;
        var distance = _defaultDistance;
        if (Physics.Raycast(origin, direction, out hit))
        {
            distance = hit.distance;
        }

        //var interpolatedGazeDirection = Vector3.Lerp(_lastGazeDirection, direction, Time.unscaledDeltaTime);
        var usedDirection = direction.normalized;

        _spriteRenderer.transform.position = _mainCamera.transform.position + usedDirection * distance; //origin + usedDirection * distance;
        transform.localScale = Vector3.one * distance * 0.03f;
        transform.forward = usedDirection;

        _lastGazeDirection = usedDirection;
    }
    

 


Howdy @Martina_H

1) It looks like your gaze position is correct. The gaze direction is a vector that points from the eye to the point where the user is looking. When you transform this vector, you are converting it from eye coordinates to world coordinates, which is necessary in order to render the gaze point on the video.
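
For reference, a minimal sketch of that transform, assuming the same `_mainCamera` field and SRanipal calls as in your snippet above (note the distinction between transforming a position and a direction - this is only an illustration, not a drop-in fix):

```csharp
// Sketch only - assumes _mainCamera, GazeIndex, and SRanipal_Eye_v2 as used above.
Vector3 gazeOriginLocal, gazeDirectionLocal;
if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out gazeOriginLocal, out gazeDirectionLocal))
{
    // TransformPoint converts a position (the origin) into world space;
    // TransformDirection converts a direction vector.
    Vector3 gazeOriginWorld = _mainCamera.transform.TransformPoint(gazeOriginLocal);
    Vector3 gazeDirectionWorld = _mainCamera.transform.TransformDirection(gazeDirectionLocal);

    RaycastHit hit;
    if (Physics.Raycast(gazeOriginWorld, gazeDirectionWorld, out hit))
    {
        // hit.point is the world-space gaze point on whatever collider was hit.
    }
}
```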

2) To play a 3D side-by-side video (also referred to as stereoscopic video) on a Game Object, you can use the following steps:

  1. Create a new Game Object and add a Video Player component to it.
  2. Set the Video Player component's source to the 3D side-by-side video that you want to play.
  3. In the Video Player component's inspector, enable the "Stereoscopic" option and select the "Side by Side" option.
  4. Set the Video Player component's z-position to a negative value. This will cause the video to be rendered in front of the user.

An example code snippet for stereo:
 

        // Create a new Game Object and add a Video Player component to it.
        GameObject videoObject = new GameObject();
        VideoPlayer videoPlayer = videoObject.AddComponent<VideoPlayer>();

        // Set the Video Player component's source to the 3D side-by-side video that you want to play.
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = "https://www.youtube.com/watch?v=6ZfuNTqbHE8"; // alternatively "file://" + Application.streamingAssetsPath + "/3Dtest.mp4"

        // Enable the stereoscopic layout - side by side in this example.
        videoPlayer.targetCamera3DLayout = Video3DLayout.SideBySide3D;

        // Set the Video Player component's z-position to a negative value.
        videoObject.transform.position = new Vector3(0, 0, -1);

Note that the path to the video will be relative to the build - so if you want to build and run the app and always have it include the video, you can put the video inside the StreamingAssets folder in your project and use Application.streamingAssetsPath to reference it, as described here: https://docs.unity3d.com/ScriptReference/Application-streamingAssetsPath.html . Interestingly enough, that link also includes an example showing how to use the new Unity video player component.

You could also use the standard Vive Media Decoder, split the material down the middle yourself, and show one half to each eye using layer masks: build a VR rig with two cameras, one for the left eye and one for the right, each with a culling mask that excludes the other eye's layer.
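
A rough sketch of that rig - the layer names ("LeftEye"/"RightEye") and the `leftEyeCamera`/`rightEyeCamera` references are illustrative assumptions, and the material is assumed to already show the full side-by-side frame:

```csharp
// Sketch only - assumes two user-defined layers "LeftEye" and "RightEye"
// and existing leftEyeCamera/rightEyeCamera references.
GameObject leftQuad = GameObject.CreatePrimitive(PrimitiveType.Quad);
GameObject rightQuad = GameObject.CreatePrimitive(PrimitiveType.Quad);
leftQuad.layer = LayerMask.NameToLayer("LeftEye");
rightQuad.layer = LayerMask.NameToLayer("RightEye");

// Show only the left half of the video texture on the left quad,
// and only the right half on the right quad.
Renderer leftRenderer = leftQuad.GetComponent<Renderer>();
Renderer rightRenderer = rightQuad.GetComponent<Renderer>();
leftRenderer.material.mainTextureScale = new Vector2(0.5f, 1f);
leftRenderer.material.mainTextureOffset = new Vector2(0f, 0f);
rightRenderer.material.mainTextureScale = new Vector2(0.5f, 1f);
rightRenderer.material.mainTextureOffset = new Vector2(0.5f, 0f);

// Each eye camera culls the other eye's layer.
leftEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("RightEye"));
rightEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("LeftEye"));
```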

Thanks,
Alex


@Alex_HTC Thank you for your quick reply! I tried the script and steps you provided to play a stereoscopic video on a Game Object, but it does not seem to work. I am seeing only parts of the Game Object (and the gaze point) behind the playing video, or not at all. Also, I do not see an option to adapt the Video Player component's size. Is there maybe a way to work with render textures? I have already tried this one (https://github.com/Unity-Technologies/SkyboxPanoramicShader), but I cannot attach it to a GameObject and the quality of the video is very poor.

The HTC ViveMediaDecoder plugin (https://developer.vive.com/resources/vive-sense/tools/vive-media-decoder/) seems to have the ability to split the material on its own and also requires one to use layer masks etc. (as specified in its readme). I followed the instructions, but sadly the quality is not good and the video does not seem to be 3D. Do you know why this is the case?


Hi, I have the same research goal as you. But first of all, I have a question: is there any open-source SDK that allows me to obtain the trajectory of head movement and eye gaze? My device is the HTC VIVE Pro Eye.


@llz Hi! I am using the SRanipal SDK: https://developer.vive.com/resources/downloads/
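
For head movement plus gaze, a minimal per-frame logging sketch could look like this - it assumes SRanipal is set up as in the snippets earlier in this thread, and the CSV-style format is just illustrative:

```csharp
// Sketch only - logs head pose and the combined gaze direction each frame.
private void Update()
{
    Transform head = Camera.main.transform; // HMD pose
    Vector3 gazeOriginLocal, gazeDirectionLocal;
    if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out gazeOriginLocal, out gazeDirectionLocal))
    {
        Vector3 gazeDirectionWorld = head.TransformDirection(gazeDirectionLocal);
        Debug.Log($"{Time.time};{head.position};{head.eulerAngles};{gazeDirectionWorld}");
    }
}
```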


@Alex_HTC I have another question regarding the transformation from local to world coordinates: because I am using the HTC ViveMediaDecoder, I have two cameras to create the 3D illusion and no main camera. For the transformation of the eye data, does it matter if I use the left_eye_camera or the right_eye_camera? Can I use both? If yes, how?

Thanks!

Martina

