Posts posted by Hank_Li
-
Hi @patrickabroad:
Version 2 has two more BlendShapes than Version 1 (eye_wide and eye_squeeze).
eye_wide is a value representing how widely the eye is opened.
eye_squeeze is a value representing how tightly the eye is closed.
Thank you.
-
-
Hi @goose_r_s
We will fix it in the next version.
In the meantime, you may refer to this article.
Thank you.
-
Hi Asish,
We do not provide an API for calculating the number of blinks per minute.
However, you can calculate it yourself from the value of the blink blendshape. Thanks
-
Hi @banila2
Please send me your log file, which is located at C:\Users\user_name\AppData\LocalLow\HTC Corporation\SR_Logs.
Thanks
-
Hi @nbhatia
In the current version there is no way yet to get system timestamps.
We will implement that in the next version.
Thanks
-
According to the log message you provided, it seems the device could not collect your eye data in the last part.
Please make sure you can clearly see the point. Thanks
-
Hi
According to the log file you provided, it seems the device could not collect your eye data in the last part.
Please make sure you can clearly see the point. Thanks
-
Hi @hollandera
Please refer to this Unity code; with it you can get the nose tracking you want in Unity.
// Cast a ray from the main camera along its forward direction (max distance 20),
// hitting only objects on the "NoReflection" layer.
int dart_board_layer_id = LayerMask.NameToLayer("NoReflection");
Vector3 direction = Vector3.forward;
Ray rayGlobal = new Ray(Camera.main.transform.position, Camera.main.transform.TransformDirection(direction));
RaycastHit hit;
Physics.Raycast(rayGlobal, out hit, 20, (1 << dart_board_layer_id));
You can download the Unity sample here:
https://developer.vive.com/resources/knowledgebase/vive-sranipal-sdk/
Thank you.
-
Hi @hollandera
Do you want to use the SRanipal functions in Unity code on a Vive Pro without eye-tracking hardware?
Thanks.
-
Hi Nikki:
If you want to initialize the Eye V2 engine:
1. Please do not modify any header file we provide. E.g. in SRanipal_Eye.h, the following line should remain:
const int ANIPAL_TYPE_EYE_V2 = 2;
2. You need a separate flag to check whether Eye V2 is enabled, e.g.:
int error = ViveSR::anipal::Initial(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE_V2, NULL);
if (error == ViveSR::Error::WORK) {
    EnableEyeV2 = true;
}
Then, in the function void streaming(), you need separate code to handle Eye V2 data, e.g.:
if (EnableEyeV2) {
    int result = ViveSR::anipal::Eye::GetEyeData_v2(&eye_data_v2);
    if (result == ViveSR::Error::WORK) {
        float *gaze = eye_data_v2.verbose_data.left.gaze_direction_normalized.elem_;
        printf("[Eye v2] Wide: %.2f %.2f\n", eye_data_v2.expression_data.left.eye_wide, eye_data_v2.expression_data.right.eye_wide);
    }
}
Access to system timestamps (in VIVE Eye and Facial Tracking SDK)
Hi @VGagliano
Yes, it was.
Please follow the sample code below to get system timestamps.
int result = ViveSR::anipal::Eye::SRanipal_UpdateTimeSync();
if (result == ViveSR::Error::WORK) {
    int64_t time;
    ViveSR::anipal::Eye::SRanipal_GetSystemTime(&time);
    printf("%lld\n", (long long)time);
}