Showing results for tags 'rendering'.

Found 5 results

  1. In my project it is very important to use a separate camera for each eye. Any camera with Target Eye set to Both or Left works as intended, but when it is set to Right, the result on the Android HMD behaves like a left-eye camera: only the left half of the display renders, and the right half does not render at all. My project has successfully used per-eye cameras on other platforms before; this is my first time targeting the Wave platform. I can't see how to fix this issue. It is probably not a concern for most developers, but for me it is critical. I am using Wave XR Plugin 1.0.0 (com.htc.upm.wave.xrsdk) on Unity 2019.4.8 LTS. @Tony PH Lin
  2. I've got a project that I am deploying to a Focus 3, but I cannot get the controllers to render on the device (they render correctly in the editor). I've tried overriding the RenderModelHook, but nothing seems to work. Input works, and both the reticle and the guideline render correctly. Using Unity 2020.3.11 and VIU (Vive Input Utility) 1.13.2. An otherwise empty project that reproduces the error is available. Is the Focus 3 simply not supported yet for this feature?
  3. When using Vive Wireless (not when tethered) with a new Dell G5, we observe a slight but continuous judder, along with some stutter when moving the head. This happens with all four of our new G5s with Nvidia 3070s; it did not happen with Dell 8940s or 8930s with Nvidia 1080s and 16GB RAM. The judder and stutter, even though slight, make it unusable for all users who have tried it. There are no issues when tethered, and when we swapped the G5 for an 8940 using all the same Vive hardware, there were no issues either. Computer specs: NVIDIA GeForce RTX 3070 8GB GDDR6; 10th Gen Intel Core i7-10700F (8-core, 16M cache, 2.9GHz to 4.8GHz); Dell proprietary motherboard; 512GB M.2 PCIe NVMe solid-state drive; 16GB (1x16GB) DDR4 2933MHz. Vive specs (proven out on the Dell 8940s): Vive Pro Eye, Vive Wireless.
  4. Hi there, we are developing an eye test on a Vive Pro, and for comparison against an existing test we need to capture exactly what the viewport is rendering, at the native resolution (1440x1600 per eye). Does anybody have a solution or an approach for this? Putting another camera in the rig is not a solution; we need to know exactly which pixels the user is seeing and which they are not. @Corvus @MariosBikos_HTC
  5. Hi there :) I'm a graduate student studying ray tracing in VR with Unity. My program renders an image for each eye and sends both to Unity, which then displays each image on the corresponding camera. My question is: how can I generate a proper stereoscopic rendered image? I found a mention of an official HTC Vive paper, but I couldn't locate it. For now I am using the off-axis projection stereoscopic method. If you know of an official paper or thesis on this question, please give me some feedback. Thank you 🙂 @Tony PH Lin @Cotta
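For readers unfamiliar with the off-axis projection method mentioned in the last post: each eye is offset horizontally from the screen centre, so its view frustum is asymmetric rather than the usual symmetric perspective frustum. A minimal, stdlib-only Python sketch is below, assuming an OpenGL-style glFrustum convention (right-handed, looking down -z); the function names and parameters are illustrative, not from any HTC SDK:

```python
def off_axis_projection(l, r, b, t, n, f):
    """Row-major 4x4 asymmetric-frustum (off-axis) projection matrix,
    following the OpenGL glFrustum convention."""
    return [
        [2*n/(r-l), 0.0,        (r+l)/(r-l),  0.0],
        [0.0,       2*n/(t-b),  (t+b)/(t-b),  0.0],
        [0.0,       0.0,       -(f+n)/(f-n), -2*f*n/(f-n)],
        [0.0,       0.0,       -1.0,          0.0],
    ]

def eye_frustum(half_w, half_h, eye_offset, screen_dist, n, f):
    """Frustum for one eye viewing a screen of extents [-half_w, half_w]
    x [-half_h, half_h] at distance screen_dist.  eye_offset is the
    eye's horizontal offset from the screen centre (e.g. +/- half the
    interpupillary distance); a nonzero offset makes the frustum
    asymmetric, which is the 'off-axis' part."""
    s = n / screen_dist  # project the screen extents onto the near plane
    return off_axis_projection(
        (-half_w - eye_offset) * s, (half_w - eye_offset) * s,
        -half_h * s, half_h * s, n, f)
```

With `eye_offset = 0` this reduces to an ordinary symmetric perspective projection (the `(r+l)/(r-l)` skew term becomes zero); rendering the scene twice with `+ipd/2` and `-ipd/2` yields the stereo pair.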