francescosolera
Hello everyone,

Using Unity and the SteamVR plugin, I was able to grab the passthrough images of the left and right cameras on the Vive Pro 2 as a Texture2D object and display them on a plane. To do this I used SteamVR_TrackedCamera.VideoStreamTexture from the Valve.VR namespace (a rough code sketch is at the end of this post).

The texture returned is a single one composed of the left and right images stacked vertically, as shown in the attached image. It looks like the same stacked image that is shown when you run the camera test from Vive. The resolution of this texture is 1840x1224, so once I consider only one camera it reduces to 920x1224. If I only consider the largest square crop not affected by undistortion artifacts - which accounts for about 50% of the height of the image - all I am left with is an image of about 450x450 pixels. This seems consistent with the low quality I observe both when i) I display this image through Unity on the 5K headset display, and ii) I activate the "Room View Camera Mode" that is natively implemented by HTC Vive.

Is this the best resolution we can get for undistorted images out of the headset cameras? Does anyone know the true camera resolution? It does not seem to be documented in the official specs. Or am I grabbing the camera images the wrong way?

Thanks in advance,
Francesco
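For reference, here is roughly the code I am using. It is a minimal sketch modelled on the tracked-camera test example that ships with the SteamVR plugin; the class name is a placeholder and my actual plane/material setup differs slightly.

using UnityEngine;
using Valve.VR;

// Minimal sketch (placeholder class name): acquire the tracked-camera stream
// and display the stacked stereo texture on this object's material.
public class PassthroughDisplay : MonoBehaviour
{
    public bool undistorted = true;   // request the undistorted stream
    private Material material;

    void Awake()
    {
        material = GetComponent<Renderer>().material;
    }

    void OnEnable()
    {
        // The stream must be acquired/released symmetrically so SteamVR can
        // shut it down once there are no more consumers.
        var source = SteamVR_TrackedCamera.Source(undistorted);
        source.Acquire();
        if (!source.hasCamera)
        {
            Debug.LogError("No tracked camera available on this HMD.");
            enabled = false;
        }
    }

    void OnDisable()
    {
        material.mainTexture = null;
        SteamVR_TrackedCamera.Source(undistorted).Release();
    }

    void Update()
    {
        var source = SteamVR_TrackedCamera.Source(undistorted);
        Texture2D texture = source.texture;   // single texture holding both camera views
        if (texture == null)
            return;

        // Assign the full stacked texture; isolating one camera view is done
        // separately by adjusting the material tiling/offset or copying pixels.
        material.mainTexture = texture;
    }
}

Isolating a single camera view then just means sampling half of that stacked texture, either by adjusting the material's tiling/offset or by copying the relevant pixels into a second Texture2D.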