Showing results for tags 'sranipal'.

Found 20 results

  1. Vive Pro Eye Calibration Initialization Error Troubleshooting

     Problem: Launching calibration shows the error "Initialization Failed".

     Solutions:
     - Run "EyeCalibration.exe" manually from C:\Program Files\VIVE\SRanipal\tools\eye_calibration.
     - Power the PC and Link Box off and on.
     - Run 'sr_runtime.exe' as Administrator. Default install path: 'C:\Program Files (x86)\VIVE\SRanipal'.
     - Update the SteamVR runtime.
     - Update the graphics card drivers.
     - Open "Change User Account Control settings" via the Windows Settings search or Start menu search, change it to "Never Notify", and click "OK".
     - A possible issue with some models of MSI laptops is fixed by rolling back to an earlier NVIDIA driver: do a fresh install of NVIDIA driver 417.71 (the GPU BIOS is not updated and does not support the latest NVIDIA driver).
     - Uninstall 'VIVE_SRanipalInstaller' and 'Tobii VRU02 Runtime'. Restart and install the latest 'VIVE_SRanipalInstaller_1.1.0.1.msi'. Plug in the HMD and wait for any updates to complete.
     - Update the motherboard's integrated Intel graphics driver (fixes the system referencing the wrong OpenCL.dll (Intel) instead of NVIDIA's).
     - Disable the integrated graphics card.
     - A possible issue with the wireless adapter; try running calibration wired.
     - Possible issues with early dev kits.

     Check system requirements:
     - Use a DisplayPort, mini-DisplayPort, or USB-C output from the dedicated GPU.
     - Windows 8.1 or later (64-bit).

     Problem: The logs show the error "This HMD doest NOT has eye-tracking feature, Error : -5".

     Solutions:
     - Launch "SR_Runtime" and start calibration or an eye tracking application.
     - Check Device Manager for "EyeChip" under "Universal Serial Bus Devices".
     - Check Services for "Tobii VRU02 Runtime".
     - Remove any other Tobii services installed.
     - Disconnect any other eye tracking devices attached.
  2. Hi! I'm using the HTC Vive Pro Eye in Unity with SRanipal to track users' ocular movements. I can save the gaze direction from both eyes at 120 Hz. The gaze direction is a Vector3 whose components range between -1 and 1. Is it possible to convert this value to pixels? Best Regards, Mattia
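The conversion Mattia asks about can be sketched engine-agnostically: project the normalized direction onto an image plane one unit in front of the eye, then scale by the half-FOV tangent and the panel resolution. The FOV and resolution values below are illustrative placeholders, not the Vive Pro Eye's actual panel specs, and the coordinate conventions (x right, y up, z forward, pixel v growing downward) are assumptions to match against your own setup.

```cpp
#include <cassert>
#include <cmath>
#include <cstdio>

// Map a normalized gaze direction (components in [-1, 1], z pointing
// forward) onto pixel coordinates of a virtual image plane.
struct Pixel { double u, v; };

Pixel GazeToPixel(double x, double y, double z,
                  double fovXDeg, double fovYDeg,
                  int width, int height) {
    const double kPi = 3.14159265358979323846;
    double halfX = std::tan(fovXDeg * 0.5 * kPi / 180.0);
    double halfY = std::tan(fovYDeg * 0.5 * kPi / 180.0);
    // Project the direction onto the z = 1 plane, then scale to pixels.
    Pixel p;
    p.u = (x / z / halfX * 0.5 + 0.5) * width;
    p.v = (0.5 - y / z / halfY * 0.5) * height;  // pixel v grows downward
    return p;
}
```

A straight-ahead gaze (0, 0, 1) lands at the image center; a direction whose x/z ratio equals the half-FOV tangent lands on the right edge.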
  3. Hello Everyone, I recently got the Vive Pro Eye and downloaded everything required, including VIVE_SRanipalInstaller_1.3.2.0. However, this does not seem to be working. Every time I run the installer, it gives me the following error: "Error 1001. An exception occurred in the OnAfterInstall event handler of System.ServiceProcess.ServiceInstaller --> Cannot start service SRanipalService on computer '' --> The system cannot find the file specified". Any help would be appreciated; I am trying to collect data for my dissertation and hope to get this running asap. Thanks, Karan.
  4. I want to do something like the raycasting video, where, using raycasting, I could get an object's display name in Unreal. In that video, I used the first-person camera to get the world location and world rotation. Now I would like to do the same using the user's gaze data, to observe which objects the user is currently gazing at. I have tried many blueprints using GetGazeData, but nothing seems to work as efficiently as the video above. I have attached a blueprint as an example of what I am currently trying to do: Gazedata blueprint. Please tell me how I can achieve something like the video above using SRanipal eye tracking data with blueprints in Unreal.
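The blueprint being asked for boils down to one operation: cast a ray from the camera along the gaze direction and report the first object hit. In Unreal itself this would be a line trace (e.g. a LineTraceByChannel node) from the camera location along the camera rotation applied to the gaze direction; the self-contained sketch below replaces the engine's collision query with a simple ray-sphere test, and all object names are made up for illustration.

```cpp
#include <cassert>
#include <cmath>
#include <string>
#include <vector>

// Engine-agnostic stand-in for a gaze line trace: intersect a ray with a
// set of spheres and return the name of the nearest one hit.
struct Vec3 { double x, y, z; };
struct SphereObject { std::string name; Vec3 center; double radius; };

// Returns the name of the nearest sphere hit by the ray, or "" if none.
// Assumes 'dir' is normalized.
std::string GazeRaycast(const Vec3& origin, const Vec3& dir,
                        const std::vector<SphereObject>& objects) {
    std::string hitName;
    double bestT = 1e300;
    for (const auto& obj : objects) {
        Vec3 oc = { origin.x - obj.center.x,
                    origin.y - obj.center.y,
                    origin.z - obj.center.z };
        double b = oc.x * dir.x + oc.y * dir.y + oc.z * dir.z;
        double c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z
                 - obj.radius * obj.radius;
        double disc = b * b - c;
        if (disc < 0.0) continue;          // ray misses this sphere
        double t = -b - std::sqrt(disc);   // nearest intersection distance
        if (t > 0.0 && t < bestT) { bestT = t; hitName = obj.name; }
    }
    return hitName;
}
```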
  5. Hi! I have been using the VIVE Pro facial tracker for a while. I want to get the name, the label, or at least the weight (i.e., the unique data) of the facial expression shown on the Unity screen when the avatar replicates it. I have edited the code in the script named SRanipal_AvatarLipSample.cs at line 39. Here are some of the results in the console: the value '0' is shown when I make no facial expression, and other values are shown when I make other facial expressions. Am I doing this the right way, i.e., are the respective values of the real-time facial expressions being shown, or does it show something related to the Mouth_Smile_Right expression only? Looking forward to a kind reply. Thanks!
  6. I am trying to get the pupil diameter data in Unreal/C++, which is not included as a function in SRanipal_FunctionLibrary_Eye. That's why I tried to get the VerboseData or EyeData directly through SRanipal_Eye, e.g. like this: int result = SRanipal_Eye::Instance()->GetEyeData_(&eye_data); For some reason I always get the linker error "LNK2019: unresolved external symbol". So far I could not work out my mistake, so I would be very grateful for a short example/code snippet on how to access the pupil diameter data in Unreal/C++ with SRanipal (not Tobii!). Thanks in advance @MariosBikos_HTC
  7. Hello! I am developing with Unity 2017.4.34f1 using the Vive Pro Eye. In my project, scene transitions occur frequently, and when they do, the error [SRanipal] Initial Eye: DEVICE_NOT_FOUND occurs and the eye tracking function stops responding. Please help me if you know the cause of or a solution to this problem. Thank you! @Daniel_Y @zzy
  8. I installed SRanipal in UE4 and was able to move the character's expression morph targets. However, when I recorded with Take Recorder, the expression morph targets were not recorded in the sequence; only the eye movements were recorded. Can you please tell me how to record the morph targets?
  9. Hi, Is there a way to start the eye calibration without using the controller and having to go through the menu? I'm planning to use it in an experiment and want to start it directly (ideally from Unreal 4.23 because I'm using that) without having to tell my participants which buttons they have to click all the time. In the SRanipal Unreal SDK document the function 'LauchEyeCalibration' is listed, which sounds promising, but this does not show up in the list of available functions in the Unreal Blueprint. How do I run this function, then? Thanks @Corvus @Daniel_Y @zzy
  10. Hi! I've been trying to get the Vive Pro Eye to work for a while now but am running into issues. In particular, I am unable to get the SRanipal robot tray icon to turn its eyes from orange to green, and I've run out of ideas. I've been following the Getting Started guide, but Step 3 says to "Make sure the VIVE Sranipal SDK works before going to the next step" and I am not entirely certain what that means in practice. It seems to compress a lot (like a potential whole other Getting Started guide) into that one short sentence. I've downloaded and unzipped the SDK. I then downloaded and installed Unity and imported the package into the sample scene, as instructed in the SDK guide. I tried playing, but nothing really happens, so I can't verify whether it's actually working or the sample is just plain. I've also had a bear of a time calibrating the Pro Eye. Out of more than fifty attempts (over several days), I've succeeded only once, somewhere around the 30th attempt. Every other time, after following the dot, I get a "Calibration Failed" message. It would be helpful to know what exactly is causing the failure, since the routine gets very frustrating if I don't know why it's failing. I've cleaned the lens, etc., but to no avail. I'm wondering if I'm missing something, the Getting Started guide needs some serious editing, or the technology isn't really ready for use. Probably a bit of each... Thanks in advance for any help!
  11. Hi everyone, I'm struggling to reach the specified sampling rate of 120 Hz (8.33 ms) on a VIVE Pro Eye HMD. We use the SRanipal eye callback SRanipal_Eye_v2.WrapperRegisterEyeDataCallback() in a script derived from MonoBehaviour. The registered callback is only called every 14-16 ms, which leads to approx. 62 Hz, way below the targeted 120 Hz. I think the PC specs are quite decent and should allow 120 Hz sampling: Windows 10 Pro, Intel i7-10750H (specs can be found here), 32 GB RAM, GeForce RTX 2070 with Max-Q Design. The following tool versions are used: SRanipal SDK & Runtime, Unity 2019.4.18f1. Please note that I am aware of these threads and articles, but did not find an explanation/solution that fits my case: Getting VerboseData at the fastest rate possible. - Vive Eye Tracking SDK - Community Forum; Assessing Saccadic Eye Movements With Head-Mounted Display Virtual Reality Technology (nih.gov). Many thanks in advance, Scot
  12. Hello Forum, We are a group of students using the HTC Vive Pro Eye for eye tracking as part of our academic project. We need to get the eye tracking data at its maximum capture frequency, but having tried the following methods we still fail to capture data at 120 Hz:
      1. We tried setting Application.targetFrameRate to more than the default, but it did not work: the number of records did not change. We added it to the Start function as in this link: https://docs.unity3d.com/ScriptReference/Application-targetFrameRate.html
      2. We tried implementing threads as mentioned in this link: https://forum.vive.com/topic/5897-getting-verbosedata-at-the-fastest-rate-possible/page/2/ but Unity froze and the application stopped working. There are two scripts: Sample_MainThread.cs creates a new thread, which creates an object of type Sample_GetDataThread.cs to get the data. There was a suggested solution to add a counter to the while loop in the Sample_GetDataThread.cs class to keep Unity from freezing.
      3. Currently, we are using the version 2 EyeDataCallback in SRanipal. We get 30-80 records per second, and it varies.
      Could someone please suggest whether this can be done and, if so, how we can achieve it? Help will be much appreciated. Thank you!
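The threading approach discussed in the last two posts can be sketched engine-agnostically: collect samples on a dedicated thread into a lock-protected queue and drain it once per render frame, so the capture rate is decoupled from the engine's frame rate. The SRanipal callback API itself is not reproduced here; EyeSample and its fields are illustrative stand-ins, not the SDK's real struct.

```cpp
#include <cassert>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// A producer thread (standing in for the SRanipal eye-data callback)
// pushes samples into a lock-protected queue; the render thread drains it
// once per frame without blocking the producer for long.
struct EyeSample { int sequence; float gazeX, gazeY, gazeZ; };

class SampleQueue {
public:
    void Push(const EyeSample& s) {
        std::lock_guard<std::mutex> lock(mutex_);
        queue_.push(s);
    }
    // Drain everything collected since the last call; cheap enough to run
    // once per render frame even if the producer runs at 120 Hz.
    std::vector<EyeSample> DrainAll() {
        std::lock_guard<std::mutex> lock(mutex_);
        std::vector<EyeSample> out;
        while (!queue_.empty()) {
            out.push_back(queue_.front());
            queue_.pop();
        }
        return out;
    }
private:
    std::mutex mutex_;
    std::queue<EyeSample> queue_;
};
```

The key design point is that the callback/polling thread does no engine work at all (no Unity API calls, which are main-thread-only); it only timestamps and enqueues, and the main thread consumes at its own pace.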
  13. Hello all, I am desperate: since Monday our eye tracking is not working anymore. As you can see in the attached Tobii logfile, SRanipal requires a firmware update of the eye tracking camera. This fails and is immediately retried, resulting in an endless loop of failed firmware updates. Unfortunately this firmware update is required and prevents regular functioning. The EyeChip device is also repeatedly ejected and recognized again during this process. The HTC Vive headset and its cameras are working fine. I have already tried many things: several SRanipal versions; several USB PCIe cards (mainboard Asmedia, Inateck KTU3FR-4P (Fresco chip), Inateck KT4004 (Fresco chip)); several USB drivers (Microsoft 10.0.18362.693, Fresco, Fresco 3.8.35514.0); several USB cables. Is there a way to update the firmware manually? Is there a way to disable the firmware update? Any help is appreciated. Thank you, Flo Tobii-VR4U2P2.zip SteamVR-2020-10-09-PM_04_44_35.zip ViveSR_Log.zip
  14. Hi, Since the SRanipal documentation is not that great, and it took me a long time to get things up and running, I thought I'd share my solution here to help other people get started. I'm not claiming it's the best solution, and it's definitely not the only one, so if there are suggestions to improve it, let me know 🙂. What I'm trying to do here is send the location and rotation of the HMD as OSC messages, as well as the eye angle measured by the HTC Vive Pro Eye. These OSC messages can be logged, or even processed in Matlab in real time, which is very useful when doing research, for example if you have to wait for the participants in an experiment to look back to the front before starting the next trial. To get them into Matlab, a tool called LabStreamingLayer (LSL) can be used (https://github.com/sccn/labstreaminglayer/wiki/Tutorial-4-a.-Receive-Data-streams-in-MATLAB). OSC messages can be converted into LSL streams using this tool: https://github.com/gisogrimm/osc2lsl. To check my implementation, I tried putting an object in the measured gaze direction. I used the following software: - Unreal Engine 4.23.1 - SRanipal_SDK_1.1.0.1 - SR_Runtime - OSC plugin for Unreal Engine 4 Blueprints (https://github.com/monsieurgustav/UE4-OSC) To set this up, first create a new C++ Basic Code project in Unreal. Close it, and install the SRanipal and OSC plugins as explained in their documentation (create a 'Plugins' folder in your project folder and copy the plugin folders there); also copy the SRanipal Content folder to the Content folder of your project. Open the project again, check that the two plugins show up and are enabled (Edit --> Plugins), and import the new content. Now duplicate the EyeSample2 level in the SRanipal content, and rename it. Open the new level and delete all the unnecessary things, leaving only: Atmospheric Fog, Light Source, PlayerStart, Sky Sphere and SRanipal_Eye_Framework.
Now VR has to be added to the project: click Add New --> Add feature or content pack... --> Blueprint Feature (Virtual Reality) --> Add to project. Find the Motion Controller Pawn in the newly added VirtualRealityBP --> Blueprints and make a copy of it; I named mine 'VR_controller'. Add the VR_controller to the level and put it at (0,0,0). Click 'Edit VR_controller' in the World Outliner to open the Blueprint. In the Event Graph, add the components shown in the picture (I hope it's readable), then Compile (ignore the warnings) and Save. Don't forget to enter the IP address of the PC you want to send your OSC messages to in the Add Send Osc Target block (I'm sending to two PCs simultaneously); it should also be sent to the PC you're running Unreal on for things to work later on. Now we'll add an object that will point in the measured gaze direction. To do this, click Add New --> Blueprint Class --> Actor, and name it 'GazeSteeredObject'. Add it to the level, also at (0,0,0), and click Edit. Add an OscReceiver (Add Component --> OSC Receiver), and also add an object to sit in the direction of the gaze (I added a sphere). Give it a nice material if you like. The Sphere should be a child of the DefaultSceneRoot. Now edit the Blueprint. First, we need to give the object the same location and rotation as the HMD; to do this, add the following to the blueprint. Now the GazeSteeredObject should follow the head direction. To put it in the gaze direction, we need to get the eye angle and add it. We'll also send this data as an OSC message. Add the components to the blueprint as shown in the picture. Make sure the initial X location of the Sphere is set to 100 (or some other positive non-zero value). The 'Get Gaze Ray' function sets the combined 'Gaze Direction', which is a vector of 3. It is not documented what this is exactly, but the Y-value seems to be the horizontal eye angle and the Z-value the vertical eye angle, in degrees.
The X-value is always close to 100; I don't know what it is. After compiling and saving, you can play the level in VR; the sphere should now move with your gaze direction (of course, calibrate the eye tracker first). The sampling rate is determined by the time value in the 'Set Timer by Event' block. I hope this was useful. Finally, I have some questions for the VIVE staff: - Is it true that the 'Direction' of the 'Get Gaze Ray' function gives the combined horizontal eye angle as the Y-value and the combined vertical eye angle as the Z-value, in degrees? And what is the X-value, some confidence value? - Using the time value in 'Set Timer by Event' I can determine the sampling rate with which my OSC messages are sent. Values of 100 Hz and 125 Hz seem to work fine for the GazeDirection, and I even tried 200 Hz for the HMD location and rotation. The data points that come out do have the set sampling rate, and it's not sending the same value multiple times, so this looks good. But what is the maximum sampling rate the sensors can deliver, and what happens when I set my sampling rate higher? Are these actual measured data points or are the values extrapolated somehow? Best, Maartje @Daniel_Y @zzy
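One way to sanity-check the interpretation of the 'Get Gaze Ray' Direction vector in the post above is to compute the angles explicitly. The reading below is an assumption, not confirmed by the SDK docs: X is the forward component (scaled to roughly 100 Unreal units, which would explain it being "always close to 100") and Y/Z are the lateral offsets, so the angles come out of atan2 rather than by reading Y/Z directly as degrees; for small angles atan2(Y, 100) in degrees is about 0.573*Y, which is close to but not the same as Y itself.

```cpp
#include <cassert>
#include <cmath>

// Convert a gaze direction vector (X forward, Y right, Z up - an assumed
// convention) into horizontal/vertical eye angles in degrees.
struct GazeAngles { double horizontalDeg, verticalDeg; };

GazeAngles DirectionToAngles(double x, double y, double z) {
    const double kRadToDeg = 180.0 / 3.14159265358979323846;
    GazeAngles a;
    a.horizontalDeg = std::atan2(y, x) * kRadToDeg;  // left/right eye angle
    a.verticalDeg   = std::atan2(z, x) * kRadToDeg;  // up/down eye angle
    return a;
}
```

Whether this matches what the SDK actually returns is exactly the open question posed to the VIVE staff; the snippet only shows how to test the hypothesis against logged data.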
  15. Hello. Is it possible to display the eyes of the user wearing the Vive Pro Eye as an infrared image on the monitor? I would appreciate it if someone could tell me.
  16. I would like to confirm the behavior of GetValidity() in the SRanipal Unreal SDK. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime, SRanipalSDK.) I checked the source code of GetValidity() in the SDK (see TEXT1 below) and it appears to be incorrect. If the validity passed to GetValidity() is SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY and its flag is set in eye_data_validata_bit_mask, the function should return TRUE; however, it returns FALSE, because SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY is defined as 0 in the SingleEyeDataValidity enum (TEXT1). Therefore the source code should be corrected as in TEXT2, or SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY should not be 0. What do you think?
      -TEXT1 (This is SRanipal_Eyes_Enums.h)-
      enum SingleEyeDataValidity {
          SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY,
          SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY,
          SINGLE_EYE_DATA_PUPIL_DIAMETER_VALIDITY,
          SINGLE_EYE_DATA_EYE_OPENNESS_VALIDITY,
          SINGLE_EYE_DATA_PUPIL_POSITION_IN_SENSOR_AREA_VALIDITY
      };
      typedef struct SingleEyeData {
          /** The bits containing all validity for this frame.*/
          uint64_t eye_data_validata_bit_mask;
          bool GetValidity(SingleEyeDataValidity validity) {
              return (eye_data_validata_bit_mask & (uint64)validity) > 0;
          }
      } SingleEyeData;
      -TEXT2-
      bool GetValidity(SingleEyeDataValidity validity) {
          return (eye_data_validata_bit_mask & ((uint64)1 << (uint64)validity)) > 0;
      }
      Thank you. @Daniel_Y @zzy
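The mismatch described in TEXT1/TEXT2 can be reproduced standalone. The snippet below copies the enum and both GetValidity variants into free-standing functions (no SDK needed): the shipped expression uses the enum value itself as the mask, so it can never detect bit 0 and misreads the other bits, while the shifted version tests the intended bit position.

```cpp
#include <cassert>
#include <cstdint>

// Same values as SRanipal_Eyes_Enums.h: implicit 0, 1, 2, 3, 4.
enum SingleEyeDataValidity {
    SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY,
    SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY,
    SINGLE_EYE_DATA_PUPIL_DIAMETER_VALIDITY,
    SINGLE_EYE_DATA_EYE_OPENNESS_VALIDITY,
    SINGLE_EYE_DATA_PUPIL_POSITION_IN_SENSOR_AREA_VALIDITY
};

// TEXT1 behavior: the enum value is used directly as a mask.
bool GetValidityShipped(uint64_t mask, SingleEyeDataValidity v) {
    return (mask & (uint64_t)v) > 0;
}

// TEXT2 behavior: the enum value is used as a bit position.
bool GetValidityFixed(uint64_t mask, SingleEyeDataValidity v) {
    return (mask & ((uint64_t)1 << (uint64_t)v)) > 0;
}
```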
  17. Help me! The line of sight is no longer displayed on the screen after calibration completes. It was displayed without problems until last week. If anyone knows the cause or a solution, please let me know. Thank you. @Corvus @zzy @Daniel_Y
  18. Hello. I want to calculate the gaze point in Unreal Engine and have some questions about EyeData. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime, SRanipalSDK.)
      1. Shouldn't gaze_origin_mm be used to calculate the gaze point? Why doesn't EyeFocusSample use gaze_origin_mm?
      2. Is eye_data_validata_bit_mask a value whose bits correspond to SingleEyeDataValidity? For example, if only gaze_origin_mm and gaze_direction_normalized are valid, is eye_data_validata_bit_mask ((1<<SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY) | (1<<SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY))? (SingleEyeData::GetValidity() seems to be wrong.)
      3. convergence_distance_validity seems to always be FALSE. Is convergence_distance_mm supported now?
      Thank you. @Corvus @Daniel_Y
  19. Hi, I'm new to this and I'm trying to set up SRanipal in Unreal so that I can get the eyetracking data. I have followed the manual and can get to the point where I have to enable the plugin, but it does not show up in the list of plugins in Unreal. Any ideas what could be wrong? I tried copying the unzipped 'Plugins' folder into the project folder and also tried copying the 'SRanipal' folder into the 'Plugins' folder of the game engine. Best, Maartje @Daniel_Y @zzy @Corvus
  20. Cory, from HTC VIVE, will conduct a free workshop on eye tracking development using HTC VIVE's SRanipal SDK. Topics will include eye tracking, foveated rendering, variable-rate shading, and using eye tracking for avatar animations. If you are interested in using eye tracking or foveated rendering in your VR content, then come to learn, network, ask questions, and try a demo! This workshop is free and open to the public. You will not need a SIGGRAPH badge to attend. RSVP for your free ticket here. This workshop is in partnership with LA Unity 3D Meetup, XRLA Meetup, and Two Bit Circus. There's going to be a strong and passionate community of XR developers attending, and it'll be a great opportunity to connect and network. Location: Two Bit Circus, 634 Mateo St, Los Angeles, California. Date: Tuesday, 7/30/2019. Time: 6:30pm - 9:00pm. Hope to see you there! Check back here on our developer blog for more workshop dates in the future. - Global Developer Relations Team