Posts posted by nbhatia
-
Hi All,
Where can I find the changelogs for the recent SRAnipal SDK and runtime? I am still using the older SDK and was wondering what has changed in the last few releases.
Many Thanks,
Nitesh
-
On 8/10/2020 at 11:02 PM, michael_wang said:
Hi @michael_wang, would you be able to confirm whether the said functionality has been added in the recent SRanipal release 1.3.0.9? Many thanks. @Tony PH Lin
-
10 minutes ago, cte said:
Trying to work through it now. I see the different levels in Unreal but don't know where to start with the data extraction. I examined the dartboard actor but am unsure how to use what I'm looking at here (DartBoard.h)
---------------------------------
Do I need to modify this code to use it as a kind of template?
- If so, where do I modify it, and what structure do I use?
// ========= Copyright 2019, HTC Corporation. All rights reserved. ===========
#pragma once

#include "SRanipal_FunctionLibrary_Eye.h"
#include "SRanipal_Eye.h"
#include "Engine/Classes/Camera/PlayerCameraManager.h"
#include "Runtime/Engine/Classes/Materials/Material.h"
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "DartBoard.generated.h"

UCLASS()
class SRANIPAL_API ADartBoard : public AActor
{
    GENERATED_BODY()

public:
    // Sets default values for this actor's properties
    ADartBoard();

    UPROPERTY(EditAnywhere)
    FVector Position; // The dartboard's position
    UPROPERTY(EditAnywhere)
    UStaticMeshComponent* BoardMesh;
    UPROPERTY(EditAnywhere)
    UMaterial* ParentMaterial;
    UPROPERTY()
    APlayerCameraManager* PlayerCameraRef;

private:
    FVector FocusPosition;
    UMaterialInstanceDynamic* DartBoardMaterial;

protected:
    // Called when the game starts or when spawned
    virtual void BeginPlay() override;

public:
    // Called every frame
    virtual void Tick(float DeltaTime) override;

private:
    // Focus-related parameters
    FFocusInfo FocusInfo;
    FVector GazeOrigin, GazeDirection;
    float RayLength = 1000.0f;
    float RayRadius = 1.0f;

    // Move dart board
    FVector InitialPosition;
    bool FirstUpdate = true;
    bool AlreadyMoveDart = false;
    void MoveDartBoard();
    float SignedAngle(FVector v1, FVector v2, FVector v_forward);
};
Sorry if this is a basic question but I'm brand new to C++ 👶
Hi,
I am away from my PC at the moment, but I remember that the examples written in C were the easiest way to understand the API. One of them had a file writer, if I remember correctly.
Also check some older posts in this forum; there is one post for Unity in C# where the data is handled in a separate thread. Otherwise, consider writing a new post. The people in this forum are very helpful.
-
3 minutes ago, cte said:
@nbhatia how are you going about getting the data from the SDK? I'm new to this and trying to figure out how to pull that data into a writable export file of some kind (ideally a CSV).
Hi, have you checked the SDK? There are some examples already.
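For illustration only, a CSV export along the lines discussed above could look like the sketch below. Note that GazeSample here is a simplified, hypothetical stand-in for the SDK's real EyeData struct from SRanipal_Eye.h; the field names are assumptions, not the SDK's.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical, simplified stand-in for the SDK's eye data.
// The real struct (ViveSR::anipal::Eye::EyeData) has many more fields.
struct GazeSample {
    int   frame;
    float timestamp;     // tracker timestamp
    float origin[3];     // combined gaze origin
    float direction[3];  // normalized combined gaze direction
};

// Format one sample as a CSV row.
std::string ToCsvRow(const GazeSample& s) {
    char buf[160];
    std::snprintf(buf, sizeof(buf), "%d,%.3f,%.3f,%.3f,%.3f,%.5f,%.5f,%.5f",
                  s.frame, s.timestamp,
                  s.origin[0], s.origin[1], s.origin[2],
                  s.direction[0], s.direction[1], s.direction[2]);
    return std::string(buf);
}

// Write a batch of samples to a CSV file with a header row.
bool WriteCsv(const char* path, const std::vector<GazeSample>& samples) {
    std::FILE* f = std::fopen(path, "w");
    if (!f) return false;
    std::fprintf(f, "frame,timestamp,ox,oy,oz,dx,dy,dz\n");
    for (const GazeSample& s : samples)
        std::fprintf(f, "%s\n", ToCsvRow(s).c_str());
    std::fclose(f);
    return true;
}
```

In a real application you would copy fields out of the SDK's struct into something like GazeSample inside the polling loop or callback, and batch the file writes so disk I/O does not block data collection.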
-
Hi All,
I am thinking of getting the Vive Wireless Adapter for my Vive Pro Eye headset. I wanted to know whether it also wirelessly links the additional USB accessory port inside the Vive Pro to the PC.
Many thanks.
-
Yup.
The Tobii XR SDK is free, but the EULA has to be accepted.
Hi,
I was looking into the timestamps generated by SRAnipal as well as by the Tobii SDK. In the case of SRAnipal, the timestamps correspond to the local timestamp of the eye tracker. In the case of Tobii, the SDK provides two timestamps: a local timestamp (timestamp_tracker_us), similar to what we get in SRAnipal, and an additional system timestamp (timestamp_system_us) synced with the computer clock. Is there a way to get system timestamps in the SRAnipal SDK via some sort of patch, etc.?
Regards,
Nitesh
-
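Lacking a built-in system timestamp, one workaround is to estimate a fixed offset between the tracker clock and the system clock yourself: sample both "simultaneously" once, then shift later tracker timestamps by that offset. A minimal sketch of the idea (this is an approximation, limited by how close together the two clocks can actually be read, and it ignores clock drift):

```cpp
#include <cstdint>

// One-shot clock alignment: capture a tracker timestamp and the system
// clock as close together in time as possible, store the difference,
// then map later tracker timestamps onto system time.
// All values are in microseconds.
struct ClockOffset {
    int64_t offset_us;  // system_time - tracker_time at the calibration moment
};

ClockOffset CalibrateOffset(int64_t tracker_now_us, int64_t system_now_us) {
    return ClockOffset{system_now_us - tracker_now_us};
}

int64_t TrackerToSystem(const ClockOffset& c, int64_t tracker_ts_us) {
    return tracker_ts_us + c.offset_us;
}
```

For long recordings it may be worth re-sampling the offset periodically, since the two clocks can drift apart.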
-
Ok. That clears up a lot of confusion. Thank you @Daniel_Y 🙂
-
Thanks a lot @Daniel_Y 🙂
In the Vive Pro Eye tech specs, it is stated that the eye tracker supports a 120 Hz sampling frequency. Is it steady or variable? I am hoping that the callback works at a similar sampling rate.
-
Hi,
In the function SetEyeParameter(EyeParameter parameter), declared in SRAnipal_Eye.h, the EyeParameter takes a GazeRayParameter, which has sensitive_factor as a member. To my understanding, this is meant for smoothing the gaze data, similar to a moving average, etc. Am I getting it right? I was wondering if someone could explain the internal workings. What is the default value if we do not set it?
Many thanks,
Nitesh
-
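The internals of sensitive_factor are not documented publicly, but one common interpretation of such a parameter is an exponential moving average, where a factor of 1 passes the raw data through and smaller values weight history more heavily. A hedged sketch of that interpretation (this is an assumption about the behavior, not the SDK's actual implementation):

```cpp
#include <cmath>

// Exponential moving average over a 3-component gaze vector.
// factor in (0, 1]: factor == 1 means no smoothing (raw passthrough);
// smaller values trade latency for stability.
struct GazeSmoother {
    float factor;
    float state[3];
    bool  initialized = false;

    void Update(const float raw[3]) {
        if (!initialized) {
            // Seed the filter with the first sample.
            for (int i = 0; i < 3; ++i) state[i] = raw[i];
            initialized = true;
            return;
        }
        // Blend the new sample with the running state.
        for (int i = 0; i < 3; ++i)
            state[i] = factor * raw[i] + (1.0f - factor) * state[i];
    }
};
```

If the built-in smoothing does something like this, a larger sensitive_factor would make the gaze ray more responsive but noisier, and a smaller one smoother but laggier.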
I see. Thank you. Is this the only difference between v1 and v2? I was wondering whether there would be any difference in terms of efficiency, precision, or sampling rate.
Many thanks,
Nitesh
-
Hi,
I am using the SRAnipal SDK in C. In SRAnipal_Eye.h, I see that some APIs (pasted below) are declared to register callbacks. Can anyone help me by providing an example of using this callback?
/* Register a callback function to receive eye camera related data when the module has new outputs.
   [in] function pointer of callback
   [out] error code. please refer to Error in ViveSR_Enums.h
*/
SR_ANIPAL int RegisterEyeDataCallback(VivesrEyeDataCallback callback);

/* Unregister a callback function to stop receiving eye camera related data.
   [in] function pointer of callback
   [out] error code. please refer to Error in ViveSR_Enums.h
*/
SR_ANIPAL int UnregisterEyeDataCallback(VivesrEyeDataCallback callback);
Many Thanks,
Nitesh -
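For anyone finding this later, the general pattern is: define a free function matching the callback signature, register it, and collect data inside it. The sketch below uses stub types in place of the SDK so it is self-contained; the real EyeData struct and VivesrEyeDataCallback typedef come from SRanipal_Eye.h, and the real callback fires on the SDK's own thread, so a real handler needs synchronization (a mutex or lock-free queue) around shared state.

```cpp
#include <vector>

// --- Stub of the SDK side (for illustration; the real declarations ---
// --- live in SRanipal_Eye.h and have more fields/parameters).      ---
struct EyeData { float timestamp; float gaze[3]; };   // simplified stand-in
typedef void (*EyeDataCallback)(const EyeData& data);

static EyeDataCallback g_callback = nullptr;

int RegisterEyeDataCallback(EyeDataCallback cb)  { g_callback = cb; return 0; }
int UnregisterEyeDataCallback(EyeDataCallback)   { g_callback = nullptr; return 0; }

// Fake runtime tick: in the real SDK this happens internally,
// on the SDK's own thread.
void FakeRuntimePushSample(const EyeData& d) { if (g_callback) g_callback(d); }

// --- Application side: collect samples as they arrive. ---
static std::vector<EyeData> g_log;

void OnEyeData(const EyeData& data) {
    // In real code, guard this with a mutex or push to a lock-free queue,
    // since the SDK invokes the callback from another thread.
    g_log.push_back(data);
}
```

Typical usage: call RegisterEyeDataCallback(OnEyeData) after starting the eye framework, and UnregisterEyeDataCallback(OnEyeData) before shutting it down.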
-
This function seems to work fine with the Tobii XR SDK. For your work, Tobii's G2OM implementation for finding gaze-focused objects might help as well. I still have to check whether the two SDKs (SRAnipal and Tobii) can work together.
-
Dear All,
I am trying to find out whether support for the following parameters has been added in the latest SDK, i.e. Eye Tracking SDK (SRanipal) v1.1.0.1 as of 2020. In previous posts on this forum, it has been reported that they are not supported.
public bool convergence_distance_validity;
public float convergence_distance_mm;
Regards,
Nitesh
Strabismus calibration and individual eye tracking
in VIVE Eye and Facial Tracking SDK
Posted
Hi Davey, I see what you are trying to achieve. To answer your first question: in the API there is a method, "ViveSR::anipal::Eye::TrackingImprovements", to modify calibration data; however, I am not sure how helpful it would be. Using the raw gaze data for each eye, one can write a custom calibrator. There is no built-in method to calibrate each eye's gaze individually.
Since the default calibration process involves a six-point calibration on a 2D plane, in my opinion, even if the gaze is misaligned, it would still lead to a reasonably proper calibration, at least for part of the population. The user would still see a double image, as is natural.
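To make the custom-calibrator idea concrete: one simple approach is to fit a per-axis linear correction (gain and offset) from pairs of measured vs. target gaze angles collected while the user fixates known points, separately for each eye. A minimal sketch using an ordinary least-squares fit of target = a * measured + b (all names here are hypothetical, not SDK API):

```cpp
#include <cmath>
#include <cstddef>

struct LinearFit { float a, b; };  // correction: corrected = a * raw + b

// Least-squares fit over n calibration pairs for one axis of one eye.
LinearFit FitAxis(const float* measured, const float* target, std::size_t n) {
    float sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (std::size_t i = 0; i < n; ++i) {
        sx  += measured[i];
        sy  += target[i];
        sxx += measured[i] * measured[i];
        sxy += measured[i] * target[i];
    }
    float denom = n * sxx - sx * sx;  // assumes at least two distinct points
    LinearFit f;
    f.a = (n * sxy - sx * sy) / denom;
    f.b = (sy - f.a * sx) / n;
    return f;
}

// Apply the fitted correction to a raw gaze angle.
float Correct(const LinearFit& f, float raw) { return f.a * raw + f.b; }
```

You would run one such fit per axis per eye, which is also where a strabismus case could get independent corrections for each eye. A full calibrator would of course need more fixation points and outlier rejection.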