Martina_H

Verified Members
  • Posts: 3
  • Joined
  • Last visited

Reputation: 0 Neutral
  1. @llz Hi! I am using the SRanipal SDK: https://developer.vive.com/resources/downloads/ @Alex_HTC I have another question regarding the transformation from local to world coordinates: because I am using the HTC ViveMediaDecoder, I have two cameras to create the 3D illusion and no main camera. For the transformation of the eye_data, does it matter whether I use the left_eye_camera or the right_eye_camera? Can I use both? If yes, how? Thanks! Martina
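     (For context, one possible answer sketch. In a rigid two-camera stereo rig, both eye cameras share the head rotation, so either one gives the same world-space gaze direction; the origin can be approximated from the midpoint of the two camera positions. The class and field names here are illustrative assumptions, not part of SRanipal:)

     ```csharp
     using UnityEngine;

     // Sketch only: assumes leftEyeCamera and rightEyeCamera are the two
     // cameras of the stereo rig and are rigidly attached to the same head pose.
     public class GazeToWorld : MonoBehaviour
     {
         public Camera leftEyeCamera;
         public Camera rightEyeCamera;

         // Converts a gaze ray reported in head-local coordinates to world space.
         public Ray ToWorld(Vector3 localOrigin, Vector3 localDirection)
         {
             // Both eye cameras share the head rotation, so either transform
             // yields the same world direction.
             Vector3 worldDir = leftEyeCamera.transform.TransformDirection(localDirection);

             // The midpoint between the two cameras approximates the combined-eye
             // position; the local origin is rotated into world space and offset
             // from that midpoint.
             Vector3 headPos = 0.5f * (leftEyeCamera.transform.position + rightEyeCamera.transform.position);
             Vector3 worldOrigin = headPos + leftEyeCamera.transform.rotation * localOrigin;
             return new Ray(worldOrigin, worldDir);
         }
     }
     ```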
  2. @Alex_HTC Thank you for your quick reply! I tried the script and steps you provided to play a stereoscopic video on a GameObject, but it does not seem to work. I see only parts of the GameObject (and the gaze point) behind the playing video, or not at all. I also do not see an option to adjust the size of the Video Player component. Is there maybe a way to work with render textures? I have already tried this one (https://github.com/Unity-Technologies/SkyboxPanoramicShader), but I cannot attach it to a GameObject and the video quality is very poor. The HTC ViveMediaDecoder plugin (https://developer.vive.com/resources/vive-sense/tools/vive-media-decoder/) seems to be able to split the material on its own and also requires one to use layer masks etc. (as specified in its readme). I followed the instructions, but sadly the quality is not good and the video does not appear to be 3D. Do you know why this is the case?
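     (On the render-texture question, a minimal sketch of the usual pattern with Unity's built-in VideoPlayer: render the clip into a RenderTexture and assign that texture to the quad's material. The component and field names here are placeholders, and the resolution is an assumption that should match the actual clip:)

     ```csharp
     using UnityEngine;
     using UnityEngine.Video;

     // Sketch only: routes a VideoPlayer into a RenderTexture and shows it
     // on a quad. videoPlayer and quadRenderer are assigned in the Inspector.
     public class VideoToQuad : MonoBehaviour
     {
         public VideoPlayer videoPlayer;
         public Renderer quadRenderer;

         void Start()
         {
             // Resolution should match the source clip (e.g. 3840x1080 for SBS).
             var rt = new RenderTexture(3840, 1080, 0);
             videoPlayer.renderMode = VideoRenderMode.RenderTexture;
             videoPlayer.targetTexture = rt;
             quadRenderer.material.mainTexture = rt;
             videoPlayer.Play();
         }
     }
     ```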
  3. Hi, I am doing a research project in which I would like to gather eye-tracking data while the user watches a 3D side-by-side stereoscopic video on an HTC Vive Pro Eye. So far I am using the SRanipal SDK and the HTC ViveMediaDecoder plugin. I tried Unity's Video Player component, but it seems it cannot be focused on (the z-coordinates are very high - does the user look through the video player?). If I add the Video Player component to a GameObject, the video player loses the option to play side-by-side 3D videos. Additionally, the position of the gaze direction always seems a bit off when I visualize the gaze point with a dot on the video (when using the decoder plugin). So my questions are: 1. Am I doing the transformation of the gaze position wrong? (See below - I took parts of the GazeRay sample and Tobii's gaze visualizer sample.) 2. How do I play a 3D side-by-side video on a GameObject? Thank you! Martina

     ```csharp
     private void Update()
     {
         if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING &&
             SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.NOT_SUPPORT) return;

         // Register or unregister the eye-data callback to match the framework setting.
         if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
         {
             SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
                 Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
             eye_callback_registered = true;
         }
         else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
         {
             SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(
                 Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
             eye_callback_registered = false;
         }

         // Get the gaze ray in local (head) coordinates, preferring the combined gaze.
         Vector3 GazeOriginCombinedLocal, GazeDirectionCombinedLocal;
         if (eye_callback_registered)
         {
             if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
             else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
             else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
             else return;
         }
         else
         {
             if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
             else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
             else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
             else return;
         }

         // Transform the gaze direction from local to world space.
         Vector3 GazeDirectionCombined = _mainCamera.transform.TransformDirection(GazeDirectionCombinedLocal);
         SetPositionAndScale(GazeOriginCombinedLocal, GazeDirectionCombined);
         _spriteRenderer.enabled = true;
     }

     private void SetPositionAndScale(Vector3 origin, Vector3 direction)
     {
         // Raycast along the gaze to find the distance at which to place the dot.
         RaycastHit hit;
         var distance = _defaultDistance;
         if (Physics.Raycast(origin, direction, out hit))
         {
             distance = hit.distance;
         }
         //var interpolatedGazeDirection = Vector3.Lerp(_lastGazeDirection, direction, Time.unscaledDeltaTime);
         var usedDirection = direction.normalized;
         var tr_pos = _mainCamera.transform.position + usedDirection * distance;
         _spriteRenderer.transform.position = tr_pos; //origin + usedDirection * distance;
         transform.localScale = Vector3.one * distance * 0.03f;
         transform.forward = usedDirection;
         _lastGazeDirection = usedDirection;
     }
     ```
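     (For context on question 2, one common way to show a side-by-side frame in 3D with a two-camera rig: duplicate the video quad onto two layers, crop each copy's material UVs to one half of the frame, and let each eye camera cull the other layer. All object, layer, and field names below are illustrative assumptions:)

     ```csharp
     using UnityEngine;

     // Sketch only: assumes two quads on layers "LeftEyeOnly"/"RightEyeOnly",
     // both textured with the full side-by-side video frame, plus the two
     // eye cameras from the scene.
     public class SideBySideSplit : MonoBehaviour
     {
         public Camera leftEyeCamera, rightEyeCamera;
         public Renderer leftQuad, rightQuad;

         void Start()
         {
             // Crop each quad's UVs to one half of the side-by-side frame.
             leftQuad.material.mainTextureScale = new Vector2(0.5f, 1f);
             leftQuad.material.mainTextureOffset = Vector2.zero;           // left half
             rightQuad.material.mainTextureScale = new Vector2(0.5f, 1f);
             rightQuad.material.mainTextureOffset = new Vector2(0.5f, 0f); // right half

             // Each camera renders only its own eye's quad.
             leftEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("RightEyeOnly"));
             rightEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("LeftEyeOnly"));
         }
     }
     ```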