
Corvus

Verified Members
  • Posts: 313
  • Joined
  • Last visited

Everything posted by Corvus

  1. This post has been moved. Follow the link to the new location. Thanks!
  2. Currently the eye calibration runs as a separate program and does not expose any access for developers to modify it. I can share your feedback with the SDK teams regarding calibration order and advancing the process. Are you encountering issues with the calibration accuracy, or are you asking for another reason?
  3. Like Position X, the number keeps changing between 0.37271587 and 0.3728456. The position info looks accurate to me. The changes you're seeing between those values are less than a 100th of a centimeter, and I would expect such small variations even if the base stations and controller were solidly mounted. 1. The velocity keeps changing up and down slightly around the value. Like Velocity Y, the number keeps changing between -0.004792385 and -0.004795785. Similar answer as for the position accuracy. If you want, you can round the values to fewer significant digits and you won't see the variations. 2. Sometimes really big values appear. Like Velocity X, it reached around 9.35334. Again, this happens when the controller is firmly placed on the table without moving. Position/velocity values may jump around when tracking is lost or interrupted. Is your controller placed in a location with clear line of sight to both base stations? Did you step between the controller and a base station, or is there any other possible occlusion?
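If the flickering readout is the problem, one option (a sketch, not SDK functionality) is to round the values before displaying them, as suggested above:

```csharp
public static class PoseDisplay
{
    // Sketch: round tracked pose values for display so sub-millimeter sensor
    // noise doesn't make an on-screen readout flicker. Not an SDK function.
    public static float DisplayValue(float raw, int decimals = 3)
    {
        return (float)System.Math.Round(raw, decimals);
    }
}
// PoseDisplay.DisplayValue(0.37271587f) and PoseDisplay.DisplayValue(0.3728456f)
// both display as 0.373
```

Rounding only affects what you show; keep the raw values if you need them for physics or analytics.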
  4. Thanks for catching this! We'll get this fixed in an update.
  5. This post has been moved. Follow the link to the new location. Thanks!
  6. Checking whether the HMD is a Pro Eye was added in SDK 1.0.1 with this function: ViveSR.anipal.Eye.SRanipal_Eye.IsViveProEye(); You can find more info in the HTML docs here: SRanipal_SDK_1.0.1.0_Eye/Eye/02_Unity/Document/html/class_vive_s_r_1_1anipal_1_1_eye_1_1_s_ranipal___eye.html#aa965380861092964677bec0a39a93844 This will allow you to check for the Pro Eye without initializing.
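For example, a minimal sketch of gating features on this check in Unity (assumes the SRanipal Unity plugin v1.0.1+ is imported; the fallback logic is illustrative):

    // Sketch: check for a Pro Eye HMD before enabling eye tracking features.
    // IsViveProEye() can be called without initializing the framework first.
    if (ViveSR.anipal.Eye.SRanipal_Eye.IsViveProEye())
    {
        // Pro Eye detected: safe to start eye tracking / calibration flows.
    }
    else
    {
        // Illustrative fallback: use head-gaze or controllers on other HMDs.
    }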
  7. We are looking into this issue and would like to share a possible workaround in the meantime. If anyone else is experiencing this calibration "initialization failure" error, please post here or PM me so we can follow up if necessary. Temporary workaround for the calibration "initialization failure" error:
  - Uninstall "VIVE_SRanipalInstaller" & "Tobii VRU02 Runtime"
  - Install VIVE_SRanipalInstaller_1.0.1.0.msi
  - Run the SRanipal runtime
  - Attempt calibration, receive the error
  - Wait for SRanipal_update_service to prompt for a new version update
  - Let it update to the latest (1.0.3.0)
  - Restart Steam & SteamVR
  - Restart the SRanipal runtime
  - Test if calibration initializes
  8. 1. What are the accuracy and the unit of the eye position obtained from eyeData.verbose_data.left.gaze_origin_mm? I printed the value and found that I get five or six decimal places. Do they really represent the accuracy of the eye position, or are they just noise?
Units for eye position are reported in millimeters (mm). I will follow up with more info about accuracy.
2. What does SetEyeParameter() really do? What does the sensitive factor of the gaze ray in [0,1] mean, and how can it affect the user?
I will follow up with more info about SetEyeParameter and how it works with GazeRayParameter to set the "sensitive_factor".
3. The Focus Sample in the SDK presents 3 dartboards that show the eye focus point in different areas, but I am curious whether the tremble that occurs at the border of different areas is caused by a real physiological eye reflex such as nystagmus, or just because the device is not accurate enough?
The 3 dartboards are placed at different distances to highlight the ease/difficulty of focusing on small points from different distances.
4. I am using EyeFocusSample to get the eye focus point in the Sci-Fi style environment from the Unity Asset Store. However, after adding Focus(), performance drops enough that the frame latency becomes very noticeable. Is this caused by the large number of objects or by the Raycast() used in Focus()? If so, what's the limitation or baseline of using Focus()?
Can you share a code snippet showing how you're using Focus()? And how many objects with colliders are involved in the scene? I haven't heard of any developers reporting large performance problems from using Focus.
5. I need to get eye focus points in different realistic VR environments, but due to Q4, the latency actually has a huge negative effect on my experiment. So, is it possible to use the two directions from both eyes to calculate the eye focus point? Or would the result be very inaccurate and very different from the FocusInfo.point returned by Focus()?
You can use the gaze rays or a custom solution to calculate the focus point, but the Focus function should provide good results.
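If you do experiment with a custom solution, one common approach (an illustrative sketch, not an SDK API; the names are made up) is to estimate the convergence point as the midpoint of the shortest segment between the two gaze rays:

    using UnityEngine;

    public static class GazeMath
    {
        // Illustrative: estimate a convergence point from two gaze rays as the
        // midpoint of the shortest segment between them (closest-points formula).
        public static Vector3 EstimateConvergence(Vector3 leftOrigin, Vector3 leftDir,
                                                  Vector3 rightOrigin, Vector3 rightDir)
        {
            Vector3 w0 = leftOrigin - rightOrigin;
            float a = Vector3.Dot(leftDir, leftDir);
            float b = Vector3.Dot(leftDir, rightDir);
            float c = Vector3.Dot(rightDir, rightDir);
            float d = Vector3.Dot(leftDir, w0);
            float e = Vector3.Dot(rightDir, w0);
            float denom = a * c - b * b;
            if (Mathf.Abs(denom) < 1e-6f)             // near-parallel rays
                return leftOrigin + leftDir * 2f;     // fall back to a fixed distance
            float s = (b * e - c * d) / denom;
            float t = (a * e - b * d) / denom;
            return (leftOrigin + s * leftDir + rightOrigin + t * rightDir) * 0.5f;
        }
    }

This avoids physics raycasts entirely, so it is cheap, but it will be noisier than Focus() since it depends directly on per-frame gaze direction noise.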
  9. After looking into this issue I have some good news to share with developers. In January HTC Vive announced a partnership deal with Mozilla for Firefox Reality to be the default internet browser: https://blog.mozilla.org/blog/2019/01/08/mozilla-announces-deal-to-bring-firefox-reality-to-htc-vive-devices/ https://blog.vive.com/us/2019/01/07/htc-vive-teams-mozilla-aws/ This partnership means that with the next system update (no confirmed schedule yet) the Firefox Reality browser will be integrated as the default browser for Vive Focus and Focus Plus! Firefox Reality is a great VR browser, and developers who have it installed may have noticed that it works with WebVR, Android intents, and Unity functions such as Application.OpenURL("http://www.google.co.uk"); Stay tuned to the community forums for more information and updates.
  10. No, mono 360 video will not have stereoscopic 3D. Over/under or SBS 3D video provides different images to each eye for stereo depth, which cannot be achieved with a mono 360 video where each eye is receiving the same video (even if offset for lenses/fov).
  11. 1. Please check with Amazon about Sumerian support for Vive Pro Eye. Vive Pro and Vive Pro Eye share similar HMD specs but for eye tracking support there are different requirements and a runtime to handle the eye tracking data. 2. Eye tracking requires the SR_Runtime application which transmits the eye data to the VR application which has integrated the SDK. There is no WebVR/Sumerian support that I'm aware of. 3. The eye tracking runtime requires Windows 8.1 or later (64 bit) and will not run on a Raspberry Pi or browser extension.
  12. Hey Chris, Are you having an issue with the android manifest or with running the sample apps?
  13. Sorry to hear you are having some issues on specific PCs. Let's try to narrow down the issue and get it working for you! Here are some questions and suggestions to get started:
  - Are you using Windows 8.1 or later (64-bit)?
  - What version of SteamVR are you using?
  - Do you have admin rights on the logged-in account?
  - Do you see and accept the Windows UAC popup?
  - Do you have the robot face icon in the notification tray?
  - With the Pro Eye plugged in and the link box powered on, does the notification icon robot have black eyes or are they green/red?
  - If you right-click the notification icon and select "About", what eye camera version is shown, or is it "N/A"?
  Can you try uninstalling these:
  - Tobii VRU02 Runtime
  - VIVE_SRanipalInstaller
  Restart the PC. Install the latest runtime (VIVE_SRanipalInstaller_1.0.3.0.msi). Plug in the HMD and wait for any updates to complete. If you are still having issues, can you post or PM me a SteamVR system report? SteamVR > Settings > General > Create System Report
  14. Have you thought of using a text config file instead of command line arguments? What specific functionality are you trying to expose to arcade operators?
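As a sketch of the config-file approach (the file name and keys here are made up for illustration, not part of any Vive SDK):

```csharp
using System.Collections.Generic;
using System.IO;

public static class ArcadeConfig
{
    // Illustrative: parse simple "key = value" lines from a text file that
    // arcade operators can edit, instead of passing command line arguments.
    public static Dictionary<string, string> Load(string path)
    {
        var settings = new Dictionary<string, string>();
        foreach (var rawLine in File.ReadAllLines(path))
        {
            var line = rawLine.Trim();
            if (line.Length == 0 || line.StartsWith("#")) continue; // skip blanks/comments
            int eq = line.IndexOf('=');
            if (eq > 0)
                settings[line.Substring(0, eq).Trim()] = line.Substring(eq + 1).Trim();
        }
        return settings;
    }
}
```

An operator could then edit, say, a "session_minutes = 10" line without rebuilding or re-launching with different arguments.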
  15. This bug will be fixed in the next SDK release, thanks again!
  16. There is a "Viveport Video" app for playing videos on the Focus (currently it does not have a movie theater environment): https://www.viveport.com/mobileapps/8117abd4-b7e7-4c74-bdc5-b9e271e86b19
  17. Included with the SDK are PDF guides for Unity & Unreal ("VIVE 3DSP SDK guide for Unity plugin - 0.10.0.pdf"), and there is also this talk from our GDC Developer Day with useful information to help you get started:
  18. Hi Andy, Currently leaderboards are created and reset via the online dev console but I have also heard from other devs recently asking for dynamic leaderboards. Thanks for sharing your feedback and I will work with the SDK team to put it on the roadmap.
  19. PM sent. FYI, the lip tracking module is not a consumer device and is not currently planned for sale. We are working with developers and enterprises that are interested in experimenting with it for potential applications.
  20. The VIVE Foveated Rendering Plugin has been released for developers to integrate:
  Unity: https://github.com/ViveSoftware/ViveFoveatedRendering
  Unreal: https://github.com/ViveSW/UnrealEngine/tree/VariableRateShading-4.23.1
  You are required to link your Epic Games account to your GitHub account and be authorized by Epic Games to see the repo. Step-by-step tutorial: https://www.unrealengine.com/en-US/ue4-on-github
  Getting started with Foveated Rendering using Unreal Engine: https://forum.vive.com/topic/7434-getting-started-with-vrs-foveated-rendering-using-htc-vive-pro-eye-unreal-engine/
  Introduction
  Vive Foveated Rendering is a Unity rendering plugin which reduces the rendering workload through cutting-edge GPU technologies. This plugin supports both fixed and eye-tracked foveated rendering. Developers can easily apply foveated rendering to their VR applications and adjust shading rate and region size for either better performance or better quality, according to their requirements.
  Features
  - Foveated Rendering
  - Variable Rate Shading (VRS) Support
  - Eye Tracking (for VIVE Pro Eye only)
  System Requirements:
  - Operating System: Windows 7 with DirectX 11
  - Graphics Card: NVIDIA Turing-based GPUs
  - Driver Version: 430 and later
  - (Optional) Eye Tracking: SRanipal v1.0.0.0
  21. Introducing The VIVE Pro Eye The VIVE Pro Eye has launched. Developers will now be able to create more immersive experiences using precision eye tracking and foveated rendering. The headset features 120Hz tracking and 0.5°–1.1° accuracy for amazing eye tracking performance and is the preferred VR headset for NVIDIA Variable Rate Shading (VRS). Needless to say, eye tracking is an exciting new technology. It allows for enhanced gameplay interactivity and simplified input and navigation—with our SDKs, you can make hand controllers entirely optional. Plus, eye tracking enables foveated rendering, which empowers developers to improve performance and enhance visual fidelity in virtual and extended reality experiences. Let’s dive in and see what it can do for you. SRanipal SDK To get started building experiences for the VIVE Pro Eye, developers can download the SRanipal SDK and SR runtime here: https://hub.vive.com/en-US/profile/material-download The SRanipal SDK allows developers to track and integrate users’ eye and lip movements, empowering developers to read intention and model facial expression. The SRanipal SDK requires the SR runtime which runs in the notification tray to show the current eye tracking status for VIVE Pro Eye. Plugins for Unreal and Unity are included in the eye tracking SDK along with sample code for native C development. If you have any questions, you can join the VIVE SRanipal SDK forum to engage directly with our developer relations team and other VIVE Pro Eye developers: https://community.viveport.com/t5/Vive-SRanipal-SDK/gp-p/ViveSRanipalSDK Foveated Rendering Compatible with: Turing based GPUs. (GeForce RTX and Quadro RTX) Foveated rendering allows VR developers to increase performance and visual quality by supersampling where the user's eyes are focused and undersampling the peripheral vision. Pushing the limits of VR rendering is important to developers—that’s why we made integrating the plugin straightforward. 
VIVE is releasing a separate foveated rendering plugin for Unity, and NVIDIA will launch the Variable Rate Shading (VRS) Wrapper, which is part of the VRWorks Graphics SDK. The new VRS Wrapper NVAPIs assist developers in integrating foveated rendering within their applications more easily, with less code.
[Foveated Rendering Unity Plugin]
https://github.com/ViveSoftware/ViveFoveatedRendering
https://assetstore.unity.com/packages/tools/particles-effects/vive-foveated-rendering-145635
Here is a quick example of foveated rendering integration with the Unity plugin:
1. Import the VIVE Foveated Rendering plugin.
2. Attach ViveFoveatedRendering.cs to your main VR camera.
3. Click play. Foveated rendering should be automatically enabled if your system meets the requirements.
NOTE: Requires the Eye Tracking Plugin: Vive-SRanipal-Unity-Plugin.unitypackage
For more information regarding NVIDIA Variable Rate Shading and Foveated Rendering:
https://devblogs.nvidia.com/vrs-wrapper
https://developer.nvidia.com/vrworks/graphics/variablerateshading
https://devblogs.nvidia.com/turing-variable-rate-shading-vrworks/
SDK Features & Use Cases
Eye Tracking SDK Features:
- Pupil Position
- Pupil Diameter
- Eye Openness
- Focus Object
- Gaze Ray
- Eye Blendshape Morphs
- Foveated Rendering (requires plugin)
Enterprise Use Cases:
- Training
- 3D Design Tools
- Medical
- Cognitive Function Testing
- Analytics
- Expressive Avatars / Character AI
- Visualization
- UI Controls
Gaming Use Cases:
- Foveated Rendering
- Gameplay (Aim Assist, Object & Menu Selection)
- Avatar Eyes
- Recording Eye Animations
- Analytics & Visualizations
To purchase a VIVE Pro Eye and see official specs, please visit https://www.vive.com/us/pro-eye/
  22. Thanks for the bug fix! We will pass this info along to the SDK team.
  23. Hey KospY, I see what you're saying, and thanks for sharing the sample code. We're working on improving the Viveport SDK, samples, and documentation, and this is a great example of how it can be improved. One thing I want to check with you: what parts of the Viveport API will you be using? If it's only for the DRM, then it's worth noting that you can use your existing Steam build, and we provide a DRM wrapper that you can use without integrating the Viveport SDK. Regarding the sample code, I've written a version that checks for successful initialization after 10 seconds and quits if the callback has not completed:

    using UnityEngine;
    using System;
    using Viveport;

    public class InitTest : MonoBehaviour
    {
        static string VIVEPORT_ID = "bd67b286-aafc-449d-8896-bb7e9b351876";
        private static bool bInitComplete = false;

        void Start()
        {
            Api.Init(InitStatusHandler, VIVEPORT_ID);
            Invoke("CheckInitStatus", 10);
        }

        private static void InitStatusHandler(int nResult)
        {
            Debug.LogWarning("INIT: " + nResult);
            Viveport.Core.Logger.Log("Init(): " + nResult);
            if (nResult == 0)
                bInitComplete = true;
        }

        static void CheckInitStatus()
        {
            Debug.Log("check init status");
            if (!bInitComplete)
            {
                Debug.LogWarning("init failed");
                Application.Quit();
            }
        }
    }
  24. Can you try importing and enabling OpenVR in Unity: Import OpenVR Package Window > Package Manager > OpenVR (Desktop) > Install Enable OpenVR Edit > Project Settings > Player > XR Settings > Enable Checkbox "Virtual Reality Supported"
  25. Hello KospY, The ViveportDemo.cs script is an example to help devs get started. Currently the script does not handle an Init or DRM failure; that handling is expected to be implemented by the developer. Usually the developer will have the application quit or show a message to the user about the issue. In the examples on the DRM page there are the lines "// Handle error and close your game." Here is where you can call "Application.Quit();". Otherwise, if the DRM and Init are successful, you can then launch into the game. https://developer.viveport.com/documents/sdk/en/api_drm.html
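As a minimal sketch of that pattern (the callback shape here is illustrative; check the DRM page above for the exact LicenseChecker interface and parameters):

    // Sketch: where the sample says "// Handle error and close your game.",
    // quit on failure and continue into the game on success.
    public override void OnFailure(int errorCode, string errorMessage)
    {
        Debug.LogError("DRM failed: " + errorCode + " " + errorMessage);
        Application.Quit(); // close the game, as the docs suggest
    }

    public override void OnSuccess(/* license details omitted */)
    {
        // DRM passed: load your first scene / continue startup here.
    }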