Solution Proposal for DisplayPort Transmission of VR Camera Renders


nightcat33


The following is my proposed solution to this problem, and I would like to know whether it is correct:

In Unity's virtual scene, there are two virtual cameras, one per eye, each rendering a different view. Two separate images are therefore generated at the same time: one for the left eye and one for the right, corresponding to the left and right displays of the VR headset (HTC VIVE Focus Vision). The images are transmitted over the DisplayPort interface. The questions are:

  1. How should the rendered results be transmitted? Should the left and right camera renders be sent alternately over DisplayPort, or combined into a single image for transmission and then split on the VR side?
  2. Can Unity control the rendering order of the virtual cameras?
  3. Can Unity directly or indirectly control the data sent to DisplayPort?
  4. How should the VR headset client handle the data received from DisplayPort?

Transmission Options:

  1. Alternating transmission of the left and right camera renders:
    In this method, the rendered results of the left and right cameras are transmitted alternately over DisplayPort. The advantage is relatively simple logic and a steady transmission flow: Unity renders the left- and right-eye images sequentially and sends them in turn. The drawbacks are that the two eye images arrive at different times, so the headset must buffer one of them to present both eyes simultaneously, and the link must sustain twice the per-eye frame rate. It is essential to verify that DisplayPort has enough bandwidth for the required frame rate and resolution.

  2. Merging both camera renders into one image:
    This approach combines the rendered results of the left and right cameras into a single large image (for example, side by side) and transmits it in one pass over DisplayPort. The benefits are lower transmission complexity, since only one image is sent per frame, and better frame synchronization, because both eyes' content travels in the same frame. In exchange, the VR headset must decode the combined image and split it back into individual left-eye and right-eye images.
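The second option can be sketched in Unity C# by rendering each eye camera into its own texture and packing both halves into one side-by-side RenderTexture with Graphics.CopyTexture. The camera fields, class name, and the 1920x1920 per-eye resolution are illustrative assumptions, not the headset's actual values:

```csharp
using UnityEngine;

// Sketch of the combined-image approach: both eye cameras render into
// one side-by-side RenderTexture. Resolution values are assumptions.
public class SideBySideComposer : MonoBehaviour
{
    public Camera leftEyeCamera;
    public Camera rightEyeCamera;
    public RenderTexture combined;   // 2 * eyeWidth wide

    const int eyeWidth = 1920;       // assumed per-eye resolution
    const int eyeHeight = 1920;

    void Start()
    {
        combined = new RenderTexture(eyeWidth * 2, eyeHeight, 24);

        // Each camera renders into its own half-size texture first.
        leftEyeCamera.targetTexture  = new RenderTexture(eyeWidth, eyeHeight, 24);
        rightEyeCamera.targetTexture = new RenderTexture(eyeWidth, eyeHeight, 24);
        leftEyeCamera.enabled = false;   // rendered manually below
        rightEyeCamera.enabled = false;
    }

    void LateUpdate()
    {
        // Render both eyes for the same simulation frame, then copy them
        // into the left and right halves of the combined texture.
        leftEyeCamera.Render();
        rightEyeCamera.Render();

        Graphics.CopyTexture(leftEyeCamera.targetTexture, 0, 0, 0, 0, eyeWidth, eyeHeight,
                             combined, 0, 0, 0, 0);
        Graphics.CopyTexture(rightEyeCamera.targetTexture, 0, 0, 0, 0, eyeWidth, eyeHeight,
                             combined, 0, 0, eyeWidth, 0);
    }
}
```

Because both Render() calls happen in the same LateUpdate, the two halves of the combined texture are guaranteed to come from the same simulation frame, which is the synchronization benefit described above.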

Can Unity control the rendering order of virtual cameras?

Unity lets you control the rendering order of cameras through scripts and custom render pipelines. The simplest mechanism is the Camera.depth property: cameras with a lower depth value render first. For full manual control you can disable the cameras and call Camera.Render yourself in the order you need. A Scriptable Render Pipeline (SRP) allows further customization of the rendering process, giving complete control over when and how each camera renders.
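Both ordering mechanisms can be shown in a short sketch; the camera fields and class name are illustrative:

```csharp
using UnityEngine;

// Sketch: two ways to enforce left-then-right camera rendering order.
public class EyeRenderOrder : MonoBehaviour
{
    public Camera leftEyeCamera;
    public Camera rightEyeCamera;

    void Start()
    {
        // Option A: let Unity order the cameras; lower depth renders first.
        leftEyeCamera.depth = 0;
        rightEyeCamera.depth = 1;

        // Option B: take full manual control instead — disable the
        // cameras so Unity never renders them automatically.
        // leftEyeCamera.enabled = false;
        // rightEyeCamera.enabled = false;
    }

    // With option B, call Render() yourself in the exact order required:
    // void LateUpdate()
    // {
    //     leftEyeCamera.Render();
    //     rightEyeCamera.Render();
    // }
}
```

Option A is sufficient when the order alone matters; option B is needed when the renders must happen at a precise point in the frame, for example just before handing the result to a transmission plugin.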

Can Unity control the data sent to DisplayPort?

Unity has no direct control over DisplayPort transmission. It can influence it only indirectly, for example through a native plugin or external library that communicates with the display driver. Unity's job is to generate the rendered images; the operating system and GPU drivers handle the actual DisplayPort output.
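The indirect route usually means handing the rendered texture's native GPU pointer to a native plugin. The sketch below assumes a hypothetical plugin named "dp_bridge" with a SubmitFrame entry point; neither exists as a real library, they only illustrate the interop shape:

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Sketch: Unity passes the combined frame's native texture pointer to a
// native plugin, which talks to the driver/OS. The plugin "dp_bridge"
// and its SubmitFrame function are hypothetical placeholders.
public class DisplayPortBridge : MonoBehaviour
{
    public RenderTexture combined;   // the packed side-by-side frame

    [DllImport("dp_bridge")]         // hypothetical native plugin
    static extern void SubmitFrame(IntPtr nativeTexture, int width, int height);

    void LateUpdate()
    {
        // GetNativeTexturePtr exposes the underlying GPU texture handle
        // (e.g. a D3D11 texture on Windows) for native-side consumption.
        SubmitFrame(combined.GetNativeTexturePtr(),
                    combined.width, combined.height);
    }
}
```

In practice such a plugin would also need to synchronize with Unity's render thread (Unity's low-level native plugin interface provides callbacks for this), since the texture may not be finished when LateUpdate runs on the main thread.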

How should the VR headset client handle the data received from DisplayPort?

After receiving data over DisplayPort, the HTC VIVE Focus Vision processes the images through its internal display pipeline. With the combined-image approach, the client must split the large image into individual left-eye and right-eye images and present each on the corresponding display. The headset's API or drivers must therefore be able to recognize the combined layout and route the left- and right-eye content to the correct screen.
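The client-side split is the mirror image of the packing step. Written here as Unity C# purely for illustration; a real headset client would do the equivalent in its own compositor:

```csharp
using UnityEngine;

// Sketch of the client-side split for the combined-image approach:
// the received side-by-side frame is cut back into per-eye textures.
public class SideBySideSplitter : MonoBehaviour
{
    public RenderTexture received;   // side-by-side frame, 2 * eyeWidth wide
    public RenderTexture leftEye;    // eyeWidth x eyeHeight
    public RenderTexture rightEye;   // eyeWidth x eyeHeight

    void LateUpdate()
    {
        int eyeWidth = received.width / 2;
        int eyeHeight = received.height;

        // Left half -> left-eye display, right half -> right-eye display.
        Graphics.CopyTexture(received, 0, 0, 0, 0, eyeWidth, eyeHeight,
                             leftEye, 0, 0, 0, 0);
        Graphics.CopyTexture(received, 0, 0, eyeWidth, 0, eyeWidth, eyeHeight,
                             rightEye, 0, 0, 0, 0);
    }
}
```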
