UMNWalterLab · Posted August 5, 2018

Hello,

We need to grab the dual camera images and simply show them on the Vive Pro, and later possibly add some NPR filters. We got it to work, partially, with the SRWorks SDK, OpenCV, and OpenVR, but we are wondering if there is a simpler way to show the images right away after grabbing the frames from the SDK. Our solution so far is to grab the OpenCV Mat, convert it to an OpenGL texture, and show it in OpenVR. It is very slow and does not feel like real time. If we could eliminate OpenVR and OpenGL, we might get better performance.

By the way, the official DLL version 0.7.5.0 was a nightmare; we had to debug our code for three days to find out it won't work simultaneously with OpenGL. The post on this forum with the attached 0.7.5.1 DLLs helped and works perfectly.

So, my questions essentially are:

1. Is there any way to show a texture/image/picture directly on the Vive Pro screens?
2. Is there a one-to-one mapping between the front cameras and the Vive Pro screens? I am assuming not, because of the asymmetric nature of the Vive lenses. You cannot simply swap the rendering buffer with textures filled by the left and right cameras' distorted images in an OpenGL application, right? I've read somewhere that the center of each eye leans inward, so are the cameras the same?
3. What happens when you change the IPD on the Vive? Since the front cameras are fixed, is the picture shifted by software automatically before rendering, or does the user need to shift the images manually?

Thank you for any feedback.
Daniel_Y · Posted August 5, 2018

May I ask why you don't use SRWorks's Unity plugin, which handles all of these concerns directly?
UMNWalterLab (Author) · Posted August 5, 2018

We thought it would be faster if we used C++ and SRWorks directly. When we tested the Unity plugin, the camera feed had about 120–140 milliseconds of lag/latency.
Daniel_Y · Posted August 6, 2018

A time-space warping technique is applied in SRWorks, so the perceived latency for a mostly static scene is reduced to a minimum when you move your head quickly with the HMD. I suspect your concern is latency with moving objects, but the programming language is not a key factor contributing to latency; it mostly depends on how your pipeline responds to your use case. The fastest approach we have tested is to render your camera texture to an overlay using OpenVR directly.
UMNWalterLab (Author) · Posted August 6, 2018

Thank you, DanY. So, by your last sentence, do you mean we can get the camera feeds directly inside OpenVR? Right now we are using OpenVR, but in conjunction with SRWorks: SRWorks > distorted images > convert to texture > show with OpenVR.
Daniel_Y · Posted August 7, 2018

You could refer to the sample $(openvr)\samples\tracked_camera_openvr_sample included in the OpenVR SDK for front-camera access.
UMNWalterLab (Author) · Posted August 7, 2018

Thank you, I will give it a try.
LS_Tpowell · Posted August 20, 2018

On 1: Let me know if you find this out; it would be great to know.

On 2: The closest I've been able to get to one-to-one is using the undistorted images with VRTextureBounds of (left: 0.2, 0.0, 0.78, 1.0 | right: 0.22, 0.0, 0.8, 1.0). This is not ideal: as in the Unity and Unreal samples, you should be projecting the image further away from the eyes and accounting for the motion of the user's HMD between capture frames.

On 3: I noticed no discernible difference across IPD settings while using my HMD in pure VR, AR, or passthrough. I'm not sure if there's an issue with my HMD, but I would presume the IPD setting does typically affect the view as a whole, since it should be a universal change across all applications regardless of the application's intent.