waveSk Posted September 22

I'm using the Vive Wave XR Plugin 5.3.2 and Unity 2020 to develop a Unity application. In a Unity C# script I need to use native APIs to submit custom frames. The API I want is WVR_SubmitFrame, but it turns out I need to call WVR_RenderInit before it, and WVR_RenderInit is declared internal in wvr.cs, so I cannot call it. Why is it internal? Also, could someone show me an example of how to render a custom frame (for example, one streamed to me from the cloud) and display it on screen from a Unity C# script, using Unity and the Vive Wave XR Plugin?
Alex_HTC Posted September 22

Howdy @waveSk,

Some parts of the plugin are meant to be used together in a certain way. For remote rendering, there are a few approaches you can take depending on what you're doing. The simplest is to warp the received frame in Unity and feed it through the normal rendering pipeline; if the content is stereo, use an eye mask so each image shows up only on its intended eye. That would be a good starting place. If you need to go deeper, the next step is writing a Unity native graphics plugin for hybrid rendering or similar uses; see https://docs.unity3d.com/Manual/NativePluginInterface.html and https://github.com/Unity-Technologies/NativeRenderingPlugin

Hope this helps!
-Alex
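If you do end up on the native-plugin route, the usual pattern from the NativeRenderingPlugin sample is to trigger a callback on Unity's render thread once per frame and do the low-level work there. A rough sketch of just the C# side, where the library name "CustomFrameNative" and its GetRenderEventFunc export stand in for whatever your own plugin exposes:

```csharp
using System;
using System.Collections;
using System.Runtime.InteropServices;
using UnityEngine;

public class NativeFrameEvent : MonoBehaviour
{
    // Hypothetical native entry point, following the NativeRenderingPlugin sample's pattern.
    // The native side returns a UnityRenderingEvent function pointer.
    [DllImport("CustomFrameNative")]
    static extern IntPtr GetRenderEventFunc();

    IEnumerator Start()
    {
        while (true)
        {
            // Wait until Unity has finished rendering this frame, then run the
            // native callback on the render thread with event id 1.
            yield return new WaitForEndOfFrame();
            GL.IssuePluginEvent(GetRenderEventFunc(), 1);
        }
    }
}
```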
waveSk Posted September 23 (Author)

I'm calling WVR_SubmitFrame to submit custom frames from a Unity C# script, but it does not work: the custom frames never show up on the screen. How can I solve this?
Alex_HTC Posted September 25

@waveSk The way to submit custom frames to one eye or both eyes is through standard Unity mechanisms, not through device-specific code. Use a camera with an appropriate eye mask that renders only your texture for the left or right eye, and put the texture you're trying to submit on an object that camera can see. In practice this means left-eye-only and right-eye-only layers combined with per-eye cameras. That would be my first try; if it didn't work, I'd look at similar things in the multi-pass stereo setup. You'll find many examples that do something like this with https://docs.unity3d.com/ScriptReference/Camera-stereoTargetEye.html, using Left/Right target eyes together with layer masks; a sketch of that setup follows below. If performance concerns rule out the per-eye camera approach, the low-level graphics APIs are another route, described at https://docs.unity3d.com/Manual/NativePluginInterface.html and https://github.com/Unity-Technologies/NativeRenderingPlugin
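Here is a minimal sketch of that camera/layer setup for the built-in render pipeline. It assumes two project layers named "LeftEye" and "RightEye" already exist and that the cloud frames have already been decoded into Unity textures; the class and field names are illustrative, not Wave SDK APIs:

```csharp
using UnityEngine;

public class PerEyeFrameDisplay : MonoBehaviour
{
    public Texture leftFrame;   // decoded frame intended for the left eye
    public Texture rightFrame;  // decoded frame intended for the right eye

    void Start()
    {
        SetUpEye(StereoTargetEyeMask.Left,  "LeftEye",  leftFrame);
        SetUpEye(StereoTargetEyeMask.Right, "RightEye", rightFrame);
    }

    void SetUpEye(StereoTargetEyeMask eye, string layerName, Texture frame)
    {
        int layer = LayerMask.NameToLayer(layerName);

        // Quad that carries the custom frame, visible only on its own layer.
        var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.layer = layer;
        quad.transform.SetParent(transform, false);
        quad.transform.localPosition = new Vector3(0f, 0f, 1f);
        quad.GetComponent<Renderer>().material.mainTexture = frame;

        // Camera that renders only that layer into the matching eye.
        var camObj = new GameObject(layerName + "Camera");
        camObj.transform.SetParent(transform, false);
        var cam = camObj.AddComponent<Camera>();
        cam.stereoTargetEye = eye;      // render into the left or right eye only
        cam.cullingMask = 1 << layer;   // see only the matching quad
        cam.depth = 10;                 // draw after the main camera if it also renders
    }
}
```

Depending on the scene, you may also want to exclude the "LeftEye"/"RightEye" layers from the main camera's culling mask so the quads don't show up twice.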
Alex_HTC Posted September 25

It depends a bit on your render pipeline, but in the standard built-in render pipeline the cameras can be set up in the editor for the left or right eye, with each camera's culling mask set to a new "left eye" or "right eye" layer.
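To drive those per-eye quads with frames arriving from the cloud, a small script along these lines could update a texture in place whenever a new frame is decoded. The RGBA32 format and the OnFrameReceived entry point are assumptions about the streaming side, not Wave plugin APIs:

```csharp
using UnityEngine;

public class RemoteFrameTexture : MonoBehaviour
{
    public Renderer targetQuad;     // the per-eye quad set up in the editor or by script
    public int frameWidth = 1920;
    public int frameHeight = 1080;

    Texture2D frameTexture;

    void Awake()
    {
        frameTexture = new Texture2D(frameWidth, frameHeight, TextureFormat.RGBA32, false);
        targetQuad.material.mainTexture = frameTexture;
    }

    // Call this from your networking/decoding code whenever a new frame is ready.
    // rgbaPixels must be frameWidth * frameHeight * 4 bytes.
    public void OnFrameReceived(byte[] rgbaPixels)
    {
        frameTexture.LoadRawTextureData(rgbaPixels);
        frameTexture.Apply(false);  // upload to the GPU, no mipmap regeneration
    }
}
```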