
Rendering overlays with the Unity plugin



There's some UI stuff that I always want to render on top of everything else in my scene, regardless of whether it's in front of or behind other objects. In a normal Unity setup, I would put all of those objects on a dedicated layer, and then have a second camera whose culling mask renders only that layer, and then give that camera a greater depth than the main camera.
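For reference, the standard two-camera setup described above looks roughly like this (the "Overlay" layer name is an assumption; any dedicated layer works):

```csharp
using UnityEngine;

// Bootstrap script: creates a second camera that renders only the
// "Overlay" layer on top of the main camera's output.
public class OverlayCameraSetup : MonoBehaviour
{
    void Start()
    {
        var overlayCam = new GameObject("OverlayCamera").AddComponent<Camera>();
        overlayCam.cullingMask = LayerMask.GetMask("Overlay");    // render only the overlay layer
        overlayCam.clearFlags = CameraClearFlags.Depth;           // keep the main camera's color buffer
        overlayCam.depth = Camera.main.depth + 1;                 // draw after the main camera

        Camera.main.cullingMask &= ~LayerMask.GetMask("Overlay"); // main camera skips the overlay layer
    }
}
```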


This approach doesn't seem to do anything when using the Wave SDK. After looking at the code a bit, it seems that the reason is that the Wave SDK bypasses Unity's regular rendering system and instead renders the two main cameras manually to render textures and passes those on to the individual eyes. And this code completely ignores any other cameras I might have in the scene.


Now my question is, is there any way to hook into that without modifying the plugin code, so that I can still render my UI stuff on top of the images provided by the left and right eye camera?


Hi,


Thanks for the valuable input.

Currently the only way to do this is through WVR_Overlay, which is available to native developers only; it has not been implemented on the engine side yet.

We will try this internally first and then plan the implementation.

In the meantime, for your use case, do you need to interact with this top overlay (e.g. gaze mode or controller)?

You're welcome to share your expectations for the use case so we can plan it into the SDK.



Thanks for the reply. Yes, I'd need to be able to interact with the overlay, usually via the controller (gaze is usually not precise enough for our use case). These are essentially some additional controls that are only available in the dev build, but it's important that they can't be obscured by regular scene geometry and that they can always be interacted with (think of it as in-headset level editor, and while you're editing the level you might accidentally move an object between you and these editing controls).


More feedback:


Unity suggests not using Screen Space - Camera UI in virtual reality.


Please use World Space UI instead.


A solution for your purpose:

  1. Do off-screen rendering: set a RenderTexture as the Camera.targetTexture of your UI camera.
  2. Create a UI or GameObject to show the RenderTexture in world space.
  3. Make the UI or GameObject follow the head. Keep it in front of the eyes.
  4. Optional: give the UI or GameObject a material that turns off depth testing.
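The steps above can be sketched as follows. `uiCamera`, `head`, and `overlayMaterial` are placeholders for your own scene references, and the texture size is arbitrary:

```csharp
using UnityEngine;

// Sketch of the suggested workaround: render the UI camera off-screen
// and display the result on a head-locked quad.
public class UiOverlayQuad : MonoBehaviour
{
    public Camera uiCamera;          // camera that renders only the UI layer
    public Transform head;           // head / HMD transform
    public Material overlayMaterial; // material whose shader disables depth testing

    void Start()
    {
        // 1. Off-screen rendering: the UI camera draws into a RenderTexture.
        var rt = new RenderTexture(1024, 1024, 24);
        uiCamera.targetTexture = rt;

        // 2. A quad in world space displays the RenderTexture.
        var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        overlayMaterial.mainTexture = rt;
        quad.GetComponent<Renderer>().material = overlayMaterial;

        // 3. Parent the quad to the head so it stays in front of the eyes.
        quad.transform.SetParent(head, false);
        quad.transform.localPosition = new Vector3(0f, 0f, 1.5f);
    }
}
```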

I am using world-space UI, which is why that UI can be obscured by other scene geometry, and why I need to make sure it always renders on top.


Your suggested solution only helps if I can ensure that there's never anything between the eye camera and the object displaying the RenderTexture; but if I could guarantee that, I wouldn't need a RenderTexture in the first place.


Basically, there are some objects that must be guaranteed visible and interactable, but I have no guarantee that other objects won't move between them and the eye camera.


I'll try switching off the depth test, but I think that would require making sure that this object is rendered last? Otherwise the obscuring object would still be rendered on top of it, right?
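On the ordering question: within a single camera, draw order is controlled by the material's render queue, so placing the material in the Overlay queue (4000) makes it draw after regular opaque and transparent geometry; combined with a disabled depth test it then appears on top. A hedged C# sketch; note that the `_ZTest` property is an assumption about the shader, since many built-in Unity shaders do not expose it and would instead need `ZTest Always` written into the shader itself:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Forces this object's material to draw last and ignore the depth buffer.
public class DrawOnTop : MonoBehaviour
{
    void Start()
    {
        var mat = GetComponent<Renderer>().material;
        mat.renderQueue = (int)RenderQueue.Overlay;        // 4000: drawn after everything else
        mat.SetInt("_ZTest", (int)CompareFunction.Always); // skip depth test, if the shader exposes _ZTest
    }
}
```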



This topic is now archived and is closed to further replies.
