
Streaming and real-time processing of data with Unreal (my solution)



Hi,

Since the SRanipal documentation is not that great, and it took me a long time to get things up and running, I thought I'd share my solution here to help other people get started. I'm not claiming it's the best solution, and it's definitely not the only one, so if you have suggestions to improve it, let me know 🙂. What I'm trying to do here is send the location and rotation of the HMD as OSC messages, as well as the eye angles measured by the HTC Vive Pro Eye. These OSC messages can be logged, or even processed in Matlab in real time, which is very useful for research, for example when you have to wait for a participant to look back to the front before starting the next trial. To get them into Matlab, a tool called LabStreamingLayer (LSL) can be used (https://github.com/sccn/labstreaminglayer/wiki/Tutorial-4-a.-Receive-Data-streams-in-MATLAB). OSC messages can be converted into LSL streams with this tool: https://github.com/gisogrimm/osc2lsl (see the sketch after the software list for what receiving such a stream looks like). To check my implementation, I put an object in the measured gaze direction. I used the following software:
- Unreal Engine 4.23.1
- SRanipal_SDK_1.1.0.1
- SR_Runtime 1.1.2.0
- OSC plugin for Unreal Engine 4 Blueprints (https://github.com/monsieurgustav/UE4-OSC)
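
As an illustration of the receiving end, here is a minimal sketch in Python using pylsl (the Python counterpart of the MATLAB receiver described in the linked tutorial). It assumes osc2lsl has already turned the OSC messages into an LSL stream; which stream is the right one depends on what osc2lsl advertises on your machine, so the "first stream" choice below is just a placeholder.

```python
# Minimal sketch: pull samples from the LSL stream created by osc2lsl.
# Which stream to pick is an assumption; inspect the names/types printed
# below and select the one that osc2lsl actually advertises.
from pylsl import StreamInlet, resolve_streams

# List all LSL streams visible on the network.
streams = resolve_streams()
for info in streams:
    print(info.name(), info.type())

inlet = StreamInlet(streams[0])  # assuming the osc2lsl stream is the first one

while True:
    sample, timestamp = inlet.pull_sample()  # blocks until a sample arrives
    print(timestamp, sample)                 # e.g. HMD location/rotation values
```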

To set this up, first create a new C++ Basic Code project in Unreal. Close it, and install the SRanipal and OSC plugins as explained in their documentation (create a 'Plugins' folder in your project folder and copy the plugin folders there); also copy the SRanipal Content folder to the Content folder of your project. Open the project again, check that the two plugins show up and are enabled (Edit --> Plugins), and import the new content. Now duplicate the EyeSample2 level in the SRanipal content and rename it. Open the new level and delete all the unnecessary things; keep only Atmospheric Fog, Light Source, PlayerStart, Sky Sphere and SRanipal_Eye_Framework.

Next, VR has to be added to the project: click Add New --> Add feature or content pack... --> Blueprint Feature (Virtual Reality) --> Add to project. Find the Motion Controller Pawn in the newly added VirtualRealityBP --> Blueprints and make a copy of it; I named mine 'VR_controller'. Add the VR_controller to the level and place it at (0,0,0). Click 'Edit VR_controller' in the World Outliner to open the Blueprint. In the Event Graph, add the components shown in the picture (I hope this is readable), then Compile (ignore the warnings) and Save. Don't forget to enter the IP address of the PC you want to send your OSC messages to in the Add Send Osc Target block (I'm sending it to two PCs simultaneously); it should also be sent to the PC you're running Unreal on for things to work later on.

[Image: VR_controller Blueprint event graph]
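
To check on the receiving PC that the OSC messages are actually arriving, a small listener helps. Below is a minimal sketch using the python-osc package; the port (8000) and the address patterns are assumptions and have to match whatever you configured in the Add Send Osc Target block and your Send Osc nodes.

```python
# Minimal sketch: print every OSC message that arrives on port 8000.
# Port and address patterns are assumptions; match them to your Blueprint setup.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def print_message(address, *args):
    # address is e.g. "/hmd/location" (whatever you chose in the Blueprint),
    # args are the values packed into that message
    print(address, args)

dispatcher = Dispatcher()
dispatcher.set_default_handler(print_message)  # catch all addresses

server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
print("Listening for OSC on port 8000...")
server.serve_forever()
```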

Now we'll add an object that will point in the measured gaze direction. To do this, click Add New --> Blueprint Class --> Actor, and name it 'GazeSteeredObject'. Add it to the level, also at (0,0,0), and click Edit. Add an OscReceiver (Add Component --> OSC Receiver), and also add an object that will be placed in the gaze direction (I added a sphere). Give it a nice material if you like. The Sphere should be a child of the DefaultSceneRoot. Now edit the Blueprint. First of all, we need to give the object the same location and rotation as the HMD. To do this, add the following to the blueprint:

[Image: GazeSteeredObject Blueprint, set location and rotation to the HMD]

Now the GazeSteeredObject should follow the head direction. To put it in the gaze direction, we need to get the eye angle and add it. We'll also send this data as OSC messages. Add the components to the blueprint as shown in the picture. Make sure the initial X location of the Sphere is set to 100 (or some other positive non-zero value). The 'Get Gaze Ray' function outputs the combined 'Gaze Direction', which is a 3-component vector. It is not documented what this is exactly, but the Y-value seems to be the horizontal eye angle and the Z-value the vertical eye angle, in degrees. The X-value is always close to 100; I don't know what it represents. After compiling and saving you can play the level in VR, and the sphere should move with your gaze direction (of course calibrate the eye tracker first). The sampling rate is determined by the time value in the 'Set Timer by Event' block.

[Image: GazeSteeredObject Blueprint, add eye angle]
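
For reference, here is the geometry of what the Blueprint does, written out as a small Python/NumPy sketch. It assumes (as described above) that the Y- and Z-values of 'Gaze Direction' are the horizontal and vertical eye angles in degrees, that Unreal's X-forward/Z-up convention applies, and that simply adding the eye angles to the head yaw/pitch is a good enough approximation (head roll is ignored). It only illustrates the computation; it is not code that runs inside Unreal.

```python
# Sketch of the GazeSteeredObject geometry, assuming the Y/Z components of
# 'Gaze Direction' are horizontal/vertical eye angles in degrees.
import numpy as np

def sphere_world_position(hmd_location, hmd_rotation_deg, gaze_y_deg, gaze_z_deg,
                          distance=100.0):
    """Place a point 'distance' units in front of the HMD, offset by the eye angles.

    hmd_location: (x, y, z) HMD position in world coordinates (Unreal units).
    hmd_rotation_deg: (roll, pitch, yaw) HMD rotation in degrees.
    gaze_y_deg / gaze_z_deg: horizontal / vertical eye angle in degrees.
    """
    roll, pitch, yaw = np.radians(hmd_rotation_deg)  # roll is ignored below

    # Simple approximation: add the eye angles to the head yaw and pitch.
    total_yaw = yaw + np.radians(gaze_y_deg)
    total_pitch = pitch + np.radians(gaze_z_deg)

    # Unit vector in the combined gaze direction (X forward, Y right, Z up).
    direction = np.array([
        np.cos(total_pitch) * np.cos(total_yaw),
        np.cos(total_pitch) * np.sin(total_yaw),
        np.sin(total_pitch),
    ])
    return np.asarray(hmd_location, dtype=float) + distance * direction

# Example: head at 180 cm eye height looking straight ahead, eyes 10 degrees to the side.
print(sphere_world_position((0, 0, 180), (0, 0, 0), gaze_y_deg=10, gaze_z_deg=0))
```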

I hope this was useful. Finally, I have some questions for the VIVE staff:
- Is it true that the 'Direction' of the 'Get Gaze Ray' function gives the combined horizontal eye angle as Y-value and the combined vertical eye angle as Z-value, in degrees? And what is the X-value, some confidence value?
- Using the time value in 'Set Timer by Event' I can determine the sampling rate with which my OSC messages are sent. Values of 100 Hz and 125 Hz seem to work fine for the GazeDirection, and I even tried 200 Hz for the HMD location and rotation. The data points that come out do have the set sampling rate, and it's not sending the same value multiple times, so this looks good. But what is the maximum sampling rate that the sensors can deliver, and what happens when I set my sampling rate higher? Are these the actual measured data points or are the values extrapolated somehow?

Best,

Maartje

 

@Daniel_Y @zzy


Hi @GestureLab,

Thank you for sharing your process for other people to find. I'm working on something quite similar, so it has been very useful. Specifically, I'm setting up LSL to record data from OptiTrack, UE4 and some external sensors.

Would you mind sharing how you are using the osc2lsl tool? Unless I'm missing something, there is no clear documentation. Any help would be really appreciated.

Best of luck with your project,

Nic
