
Bipul

Verified Members
  • Posts: 14
  • Joined
  • Last visited

Reputation: 1 Neutral

1 Follower


  1. Hi! I have a Vive Pro Eye on Windows 10. Following this instruction set (Installation of the eye tracking software (vive.com)), I installed the latest VIVE_SRanipalInstaller_1.3.2.0.msi. After running SR_Runtime, the tray icon first appears orange, then turns completely black. After launching SteamVR and selecting VIVE Pro Eye in the System Dashboard in VR, there is actually nothing, only a dark screen, so I cannot accept the user's terms and conditions or do the eye calibration. I have uninstalled and reinstalled SteamVR several times and tried different SR_Runtime versions, but it is always the same. Any suggestions?
  2. You attached a link, Computer Keyboard Not Working? (keyboardtestt.com), which says a computer keyboard is not working. How is that relevant to this thread about eye tracking?
  3. I am also having the same issue, and as usual the Vive team has not stepped in to assist. Did anyone solve the problem?
  4. Was it a recorded session? Is there any link to watch it?
  5. Hello, I am developing a C++ project that uses the OptiX API for real-time rendering. I want to send the rendered scene to my Vive Pro Eye and am exploring all the possibilities for doing that. I checked the Wave Native SDK, and as I understand it, this native SDK only works on Android since it uses GLES, so it won't work for desktop VR visualization, am I right? (I also found that its source code contains both Java and C++.) If I have a scene rendered in real time with a graphics API like OptiX and want to send it to the Vive Pro Eye, how can I do that? Using OpenXR? Can you suggest any good tutorial or documentation? Thanks!
  6. For the Vive Pro Eye there is no better option than SRanipal: Tobii XR does not work with it, Vive OpenXR only supports Unity and Unreal, and the OpenVR SDK is deprecated, so use OpenXR to stay hardware-agnostic. Not at this moment (OpenXR might be) -- not sure yet.
  7. Hello Dario! I am running a simple C++ project without any game engine. My primary goal is, after rendering the scene (using the NVIDIA OptiX API), to send it as easily as possible to the HTC Vive Pro Eye. Thanks for your previous suggestions; here I might ask some more naive questions (sorry, I am new to the VR domain): The Vive OpenXR download page says it is only available as a plugin for Unity and Unreal Engine. Does that mean there is no way I can use it in my C++ project the way I am using the SRanipal SDK (native C code)? If so, that leaves only OpenVR or OpenXR, and since OpenVR has already been deprecated, there is no option other than OpenXR, right? Currently I am using SteamVR; the OpenXR for PC VR page says the Vive Pro series can use the Vive Console utility for PC VR. Can I use it with the Pro Eye headset? Is there any additional benefit of using that application over SteamVR?
  8. Hello, to connect a C++ project to the Vive Pro Eye and send it rendered data, which SDK should I use: OpenVR, OpenXR, or Vive OpenXR? How are OpenXR and Vive OpenXR different? Is there any benefit to using OpenXR over OpenVR?
  9. Hello! I have a Vive Pro Eye VR headset, and I am doing real-time rendering in a C++ project. I want to display the rendered scene on the headset, and I also need the eye-tracking data to render my scene accurately. I have three questions: (1) I found that the SRanipal Native SDK (C language) can be used for real-time eye tracking; is there any better or alternative way to access the eye-tracking data? (2) How do I send the rendered data to the Vive Pro Eye? I found several options, e.g., Tobii XR, OpenXR, Vive OpenXR, and the OpenVR SDK; however, the Tobii XR SDK probably does not work with the Vive Pro Eye, which leaves the last three. Which SDK could be used to send data to my headset, and can you suggest any documentation? If one SDK could handle both the eye tracking and the communication with the headset, that would be even better. (3) The SDKs (OpenXR, OpenVR) only handle the communication, not any rendering, right? I mean, since I need to send a stereo-rendered scene to the headset, I have to render it in my own rendering engine; OpenXR or OpenVR would take no part in the rendering itself, right? Thanks in advance for your suggestions.
  10. Thank you so much for your instant reply. I am using the NVIDIA OptiX ray-tracing framework; if eye tracking could work with that, it would be amazing.
  11. Hi @paul_hhhh, I see nobody answered your query, and congratulations, you solved the problem yourself. I believe your solution would also be helpful for me. May I ask you a question? I have an HTC Vive Pro Eye HMD and I am trying to use the eye-tracking feature in my C++ project. I know SRanipal works with both Unity3D and Unreal Engine, but I want to have eye tracking in my own C++ code. Is there any documentation or tutorial on how I can do that?
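For reference, the SRanipal native (C) flow in a plain C++ project looks roughly like the sketch below. This assumes the SRanipal SDK headers are on the include path and SR_Runtime is running, so it is not compilable standalone; the identifiers follow my reading of the SDK headers, and the PDF documentation bundled with the SDK download has the authoritative signatures.

```cpp
// Sketch only: requires the SRanipal native SDK and a running SR_Runtime.
#include "SRanipal.h"
#include "SRanipal_Eye.h"
#include "SRanipal_Enums.h"

int main() {
    // Start the eye-tracking framework (SR_Runtime must already be running).
    int err = ViveSR::anipal::Initial(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE, NULL);
    if (err != ViveSR::Error::WORK) return err;

    ViveSR::anipal::Eye::EyeData data;
    for (;;) {  // in practice, poll once per frame from the render loop
        if (ViveSR::anipal::Eye::GetEyeData(&data) == ViveSR::Error::WORK) {
            // Combined gaze direction, normalized, in head space.
            auto g = data.verbose_data.combined.eye_data.gaze_direction_normalized;
            // ... feed g into the renderer ...
        }
    }

    ViveSR::anipal::Release(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE);
    return 0;
}
```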
  12. Hello! I am a newbie in VR as well as real-time ray tracing, but I would like to see real-time ray-traced rendering on my HTC Vive Pro Eye, and if possible also use the gaze-tracking feature for gaze-varied ray tracing. I have already purchased an RTX 3090 graphics card for this purpose. Here are some of the problems I am facing. I have come to the conclusion that neither Unreal Engine nor Unity3D is fully developed for real-time ray tracing in VR; Unity3D VR is still tied to DX11. There are a few other options for implementing ray tracing in VR, e.g., DX12, OptiX, Vulkan Ray Tracing, the Falcor framework, or OpenCL. I know OpenCL and DX12 are probably the commonly used APIs for real-time ray tracing; however, as a beginner, I do not actually know which API would be easier to use without hitting a dead end. I would really appreciate it if the experts here could guide me on how to proceed toward real-time ray tracing in VR. Is there any native plugin from HTC for that? Kind regards, Bipul
  13. I have two naive questions regarding the foveated rendering plugin: (1) When I import the foveated rendering plugin from the Unity Asset Store, I am only able to manually adjust the shading pattern and rate, but in some documentation I found, the developer can even adjust the radius of each region (e.g., fovea, periphery). Why can I not see that; what could be my probable mistake? (2) Instead of importing the foveated rendering plugin from the Unity Asset Store, I can also download it from Git, but how do I import the Git repository into my project? Additionally, I have imported a 3D scene (Sponza) and attached ViveFoveatedRendering.cs to the main camera; however, when I hit the Play button and test with my HTC Vive Pro Eye HMD, the camera stays static in the scene and does not move around.