Posts posted by Asish
-
Dear All,
I'm trying to calculate the distance between the gaze origin (provided by the API) and a target object using the following code, but the two points are not in the same units or coordinate system. How can I express them in the same coordinate system?
var d = GameObject.Find("CubeObject").transform.position;
var combine_gaze = eyeData.verbose_data.combined.eye_data.gaze_origin_mm;
float dist = Vector3.Distance(d, combine_gaze);
Thanks in advance
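A minimal sketch of one way to reconcile the two points (assuming gaze_origin_mm is reported in millimeters in the headset's local frame and that Camera.main tracks the HMD pose; SRanipal's axis conventions can differ from Unity's, so the transform below is an assumption to verify against the SDK documentation):

```csharp
// Bring both points into the same world-space, meter-based coordinate system
// before measuring distance.
Vector3 targetWorld = GameObject.Find("CubeObject").transform.position; // meters, world space

Vector3 originMm = eyeData.verbose_data.combined.eye_data.gaze_origin_mm;
Vector3 originLocal = originMm * 0.001f; // millimeters -> meters

// Transform the headset-local gaze origin into world space (assumes
// Camera.main follows the HMD).
Vector3 originWorld = Camera.main.transform.TransformPoint(originLocal);

float dist = Vector3.Distance(targetWorld, originWorld); // meters
```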
-
@Corvus @Tony PH Lin Why is the timestamp feature more important than other features such as eye_diameter, eye_openness, and eye_wideness? I used SRanipal 1.1.0.1 and the chi-square feature-selection technique to identify the important features, and it shows that timestamp is the most important one. I need to understand why.
I also found that the timestamp has negative values, but I don't know why. Could you please clarify?
-
@Annabell Did you get head rotation working? If so, how did you detect it?
-
@Corvus @HackPerception How can I get a standard head-rotation value using the VIVE Pro Eye while the user watches different objects in the scene? Any ideas?
-
Hello @lawwong @h-shirakami How can I get the head-rotation value using the VIVE Pro Eye while watching an object in the scene? Any ideas for getting a standard head-rotation value?
Thanks in advance
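One common way to read head rotation in Unity is to sample the HMD camera's transform each frame. This is a sketch, assuming Camera.main is driven by the headset pose:

```csharp
using UnityEngine;

public class HeadRotationLogger : MonoBehaviour
{
    void Update()
    {
        // World-space head orientation as Euler angles (degrees):
        // x = pitch, y = yaw, z = roll.
        Vector3 headEuler = Camera.main.transform.rotation.eulerAngles;
        Debug.Log($"pitch={headEuler.x:F1} yaw={headEuler.y:F1} roll={headEuler.z:F1}");
    }
}
```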
-
@Corvus I did not find it in the SRanipal SDK documentation. Could you please send me a link to this documentation? Thanks in advance.
-
I'm trying to measure gaze angles this way from the scene. Please let me know what you think, @Corvus.
-
Thanks @Corvus. I also need to measure gaze angles with respect to different objects in the scene. Any suggestions on how to measure this in a standard way?
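One way to define the gaze angle to an object is the angle between the world-space gaze direction and the vector from the gaze origin to the object. A sketch, where `gazeOriginWorld`, `gazeDirectionWorld`, and `target` are hypothetical names for values obtained from GetGazeRay (transformed into world space as in the SRanipal samples) and the object of interest:

```csharp
// Angle (degrees) between where the user is looking and the direction to the object.
Vector3 toObject = target.transform.position - gazeOriginWorld;
float gazeAngleDeg = Vector3.Angle(gazeDirectionWorld, toObject);
```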
-
Hello @Corvus @zhaoyanbo I got eye data using the API, but how did you get head motion? How can I get head-rotation velocity using the SDK API?
Thanks in advance
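Angular head velocity can be approximated by comparing the HMD rotation between consecutive frames. A sketch, assuming Camera.main tracks the headset:

```csharp
using UnityEngine;

public class HeadVelocitySampler : MonoBehaviour
{
    private Quaternion lastRotation;

    void Start()
    {
        lastRotation = Camera.main.transform.rotation;
    }

    void Update()
    {
        Quaternion current = Camera.main.transform.rotation;
        // Smallest angle (degrees) between last frame's pose and this frame's,
        // divided by the frame time, gives degrees per second.
        float degPerSec = Quaternion.Angle(lastRotation, current) / Time.deltaTime;
        lastRotation = current;
        Debug.Log($"head rotation velocity: {degPerSec:F1} deg/s");
    }
}
```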
-
Thanks @Corvus
Is the Tobii license free or paid? Can I get a free license as a student? Do you know whom I should contact to get a Tobii license?
Thanks again
-
Hello @Corvus @VIVE_Jason_Lu
Is there any way to capture eye images using the VIVE Pro? I searched but did not find an answer. Any suggestions?
Thanks in advance
Asish
-
Hello @Corvus @VibrantNebula @Hank_Li
Would you please let me know how to calculate the number of blinks per minute from eye-openness data? Is there a standard way to get it correctly?
Thanks
-
Dear All,
Is there any standard way to calculate the number of blinks per minute from eye-openness data?
Thanks in advance
Asish
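One simple approach is to count closed-to-open transitions of the eye-openness signal (0 = fully closed, 1 = fully open). This is a sketch, not SRanipal's method; the 0.5 threshold is an assumption to tune, and real pipelines usually also filter out very short dropouts:

```csharp
using System.Collections.Generic;

public static class BlinkCounter
{
    // Counts blinks as closed -> open transitions of the openness signal.
    public static int CountBlinks(IReadOnlyList<float> openness, float threshold = 0.5f)
    {
        int blinks = 0;
        bool closed = false;
        foreach (float o in openness)
        {
            if (!closed && o < threshold)
            {
                closed = true;           // eye just closed
            }
            else if (closed && o >= threshold)
            {
                closed = false;          // eye reopened: one completed blink
                blinks++;
            }
        }
        return blinks;
    }
}
```

Blinks per minute would then be `CountBlinks(samples, 0.5f) / (durationSeconds / 60f)`.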
-
Hello @Corvus, Here is the log file you requested. Please take a look and let me know how to fix the issue.
-
Hello @Corvus, @VibrantNebula
I created a project and imported Vive SR (SRanipal SDK 1.1.0.1), but EyeSample_v2 does not run in Unity 2017.4. When I run it, Unity simply crashes and shuts down.
The log file shows: Unknown caused an Access Violation (0xc0000005) in module Unknown at 0033:10740380. I also tried Unity 2018.2 and 2018.3, but the sample project does not run there either.
Any suggestions?
-
Hello @Hank_Li I tried it many times and it failed. If you are saying that the Vive Pro can't collect eye data in the last part, that does not make sense to me, because calibration succeeded many times last week while I was testing my project. Could you please provide any other suggestions?
Thanks
-
Hello @VibrantNebula @Corvus
Did you check my SR_RUNTIME log file to identify why calibration is not initializing? I tried all the solutions mentioned in this forum, but calibration is still not working. Please take a look at my log file in the previous post and suggest how to fix this issue.
Thanks
-
Hello @VibrantNebula Here is the SR_RUNTIME log file: ViveSR_Log.cab
-
I restarted my PC and ran SR_RUNTIME and Unity as admin, but calibration still fails. I tried several times with no luck, even though it was working last week. Please take a look at the log and let me know how I can fix it.
Thanks.
-
@Daniel_Y The link in the previous post no longer exists. Here is a similar question about gaze origin: https://community.viveport.com/t5/Vive-SRanipal-SDK/Vive-Pro-Eye-Finding-a-single-eye-origin-in-world-space/gpm-p/31592#M20. The diagram is updated; the gaze origin is given with respect to the system origin.
@iuilab Did you get your answer? If so, please let me know about gaze_origin_mm in terms of the left and right eyes. If I take the average of these two points, does it represent the position of the object in the scene where the user looked?
Thanks
-
Hello,
I solved my issue. Now I'm getting 120 Hz output frequency.
Thanks
-
Hello @Corvus, Thanks for your clarification.
I recorded data with "Enable Eye Data Callback" enabled, using version 2. I modified the following code and recorded data. I got around 15k rows for 70 seconds, but it should record about 8400 rows (70 s × 120 Hz = 8400) since the output frequency is 120 Hz. Could you please tell me what's wrong with this?
using System.IO;
using System.Runtime.InteropServices;
using System.Text;
using UnityEngine;
using UnityEngine.Assertions;
using ViveSR.anipal.Eye;

public class SRanipal_GazeRaySample_v2 : MonoBehaviour
{
    public int LengthOfRay = 25;
    [SerializeField] private LineRenderer GazeRayRenderer;
    private static EyeData_v2 eyeData = new EyeData_v2();
    private bool eye_callback_registered = false;
    string strFilePath = @"D:\Unity_workspace\eye_tracking2020\Tutorial4\Assets\Data.csv";
    string strSeperator = ",";
    static StringBuilder sbOutput = new StringBuilder();

    private void Start()
    {
        sbOutput = new StringBuilder();
        if (!SRanipal_Eye_Framework.Instance.EnableEye)
        {
            enabled = false;
            return;
        }
        Assert.IsNotNull(GazeRayRenderer);
    }

    private void Update()
    {
        // GetKeyDown fires once per key press; GetKey fires every frame the key
        // is held, which would call startRecord/stopRecord repeatedly.
        if (Input.GetKeyDown("up"))
        {
            startRecord();
        }
        if (Input.GetKeyDown("down"))
        {
            stopRecord();
        }
    }

    private void OnDisable()
    {
        Release();
    }

    private void Release()
    {
        if (eye_callback_registered == true)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }
    }

    private void stopRecord()
    {
        // Write the buffered rows once; writing and then appending the same
        // buffer would store every row twice.
        File.WriteAllText(strFilePath, sbOutput.ToString());
    }

    private void startRecord()
    {
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING &&
            SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.NOT_SUPPORT) return;

        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = true;
        }
        else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }

        Vector3 GazeOriginCombinedLocal, GazeDirectionCombinedLocal;
        if (eye_callback_registered)
        {
            if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else return;
        }
        else
        {
            if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else return;
        }

        Vector3 GazeDirectionCombined = Camera.main.transform.TransformDirection(GazeDirectionCombinedLocal);
        GazeRayRenderer.SetPosition(0, Camera.main.transform.position - Camera.main.transform.up * 0.05f);
        GazeRayRenderer.SetPosition(1, Camera.main.transform.position + GazeDirectionCombined * LengthOfRay);
    }

    private static void EyeCallback(ref EyeData_v2 eye_data)
    {
        eyeData = eye_data;
        var L_dia = eyeData.verbose_data.left.pupil_diameter_mm;
        var L_openness = eyeData.verbose_data.left.eye_openness;
        var L_gaze_origin = eyeData.verbose_data.left.gaze_origin_mm;
        var L_gaze_dir_norm = eyeData.verbose_data.left.gaze_direction_normalized;
        var L_pupil_pos = eyeData.verbose_data.left.pupil_position_in_sensor_area;
        var R_dia = eyeData.verbose_data.right.pupil_diameter_mm;
        var R_openness = eyeData.verbose_data.right.eye_openness;
        var R_gaze_origin = eyeData.verbose_data.right.gaze_origin_mm;
        var R_gaze_dir_norm = eyeData.verbose_data.right.gaze_direction_normalized;
        var R_pupil_pos = eyeData.verbose_data.right.pupil_position_in_sensor_area;
        // Ten values need ten placeholders ({0}..{9}).
        var newLine = string.Format("{0},{1},{2},{3},{4},{5},{6},{7},{8},{9}",
            L_dia, L_openness, L_gaze_origin, L_gaze_dir_norm, L_pupil_pos,
            R_dia, R_openness, R_gaze_origin, R_gaze_dir_norm, R_pupil_pos);
        sbOutput.AppendLine(newLine);
    }
}
Gaze origin and direction issue from GetGazeRay in Unreal Plugin
in VIVE Eye and Facial Tracking SDK
Posted
@Krattrym If I divide by 1000 to convert to meters, I get all zeros, since the gaze-origin vector values are small. I actually need to calculate the distance between an object's position in the scene and the gaze origin, but with this conversion I get a mostly static distance. Is there any other way to handle it?
Thanks in advance.
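Two things may be at play here, both assumptions to verify. First, in many Unity versions the default Vector3.ToString() rounds to one or two decimal places, so a few centimeters (e.g. 0.031 m) can print as "(0.0, 0.0, 0.0)" even though the values are not actually zero. Second, if the gaze origin is expressed in the headset's local frame, its distance to a point in that same local frame will be nearly constant; transforming it into world space first should give a distance that changes as the head moves. A sketch:

```csharp
// Convert millimeters to meters and log with enough precision to see the values.
Vector3 originM = eyeData.verbose_data.combined.eye_data.gaze_origin_mm / 1000f;
Debug.Log(originM.ToString("F4")); // four decimal places instead of the default rounding

// Transform the headset-local origin into world space before measuring distance
// (assumes Camera.main tracks the HMD).
Vector3 originWorld = Camera.main.transform.TransformPoint(originM);
float dist = Vector3.Distance(targetObject.transform.position, originWorld);
```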