Showing results for tags 'hand tracking'.

Found 17 results

  1. Hi, I've been trying to get hand detection going for my project. This will be a bit lengthy because I've already tried so many things, so here goes.

     First I started with hand tracking in OpenXR, following the tutorial here: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/how-integrate-hand-tracking-data-your-hand-model/ I had to fix a couple of problems in the tutorial code, notably adding a declaration for locations: private static XrHandJointLocationsEXT locations; and a few other issues. The script now compiles and runs, but when I call res = feature.xrLocateHandJointsEXT(feature.m_leftHandle, locateInfo, ref locations);, locations.isActive is always 0. I've debugged as far as I can, but I just don't seem to be getting hand tracking updates. Any suggestions would be very appreciated, as I am not very familiar with OpenXR or Unity. I'm using Unity 2021.3.5f1. The camera is enabled in SteamVR settings and tested; it looks like either the hands are not detected or the camera input is not reaching OpenXR.

     I also created a completely new project and just ran the sample code, launching HandTrackingScene.unity from the samples folder that comes with VIVE OpenXR for Windows. That didn't work either: the hands don't even appear, and the same thing happens in Unity 2020.3.13f1. All I do is follow this tutorial: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/installing-vive-openxr-pc-vr-plugin/ and then the OpenXR setup part of this tutorial: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/how-integrate-hand-tracking-data-your-hand-model/?site=tw& Then I run the sample project and it doesn't do anything. If I add a debug statement, it is obvious that line 46 in UpdateHandData.cs always evaluates to FALSE, so the branch of if (locations.isActive == 1) never executes.

     The source of public int xrLocateHandJointsEXT(ulong handTracker, XrHandJointsLocateInfoEXT locateInfo, ref XrHandJointLocationsEXT locations) inside HandTracking_OpenXR_API seems to be closed source, so I don't know what happens beyond it.

     One more useful detail: I can get hand tracking to work on Unity 2019.4.30f1 using the Hand Tracking SDK: https://hub.vive.com/storage/tracking/unity/setup.html But that requires using OpenVR, which is deprecated in later Unity versions, and Unity 2019 is not an option for me because I also need to run ROS-TCP-endpoint, which only works on Unity 2020.2.x and above. I also tried importing SteamVR plugin Version 2.7.3 (sdk 1.14.15) - February 23, 2021 into Unity 2021.3.5f1, which lets me use the old Hand Tracking SDK with OpenVR. But that gives me two new (albeit less severe) problems: 1) Unity crashes the moment I stop my app. 2) When I start my app, SteamVR displays a complaint in the headset about a missing actions.json file in the "Assets\StreamingAssets\SteamVR" folder. The error says: the action manifest for [Testing]<my project name> was not found. I created an empty actions.json file there and it still can't find it, so I don't know what else to try.

     I desperately need help on this, please! I am working with a VIVE Pro headset on Windows 11. Thanks.
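     [Note on the missing actions.json complaint in the post above: SteamVR parses the action manifest as JSON, so a zero-byte actions.json will still fail to load even though the file exists. A minimal sketch of a syntactically valid, empty manifest is below. The top-level key names follow the SteamVR Input manifest format; whether empty arrays are accepted by a given SteamVR version is an assumption worth verifying against the SteamVR Input documentation.]

     ```json
     {
       "default_bindings": [],
       "action_sets": [],
       "actions": [],
       "localization": []
     }
     ```

     In practice, opening Window > SteamVR Input in the Unity Editor with the SteamVR plugin installed will generate a populated actions.json rather than requiring one to be written by hand.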
  2. Hey, we're trying to use hand tracking on the Vive Pro 2 in Unreal. We have it working with a regular Vive, but on the Pro 2 it gives these errors in the logs: LogVHTEngine: Error: Start camera failed, retrying... LogViveHandTrackingComponent: Error: VHTEngine start failed: Camera The cameras themselves are working correctly, since they work when tested in SteamVR. According to the Aristo log it's trying to start a Cosmos camera (which fails because we're not using a Cosmos), so it hangs for a while looping while looking for the camera: Start Cosmos high resolution camera error: NoCameraDevice Start Cosmos camera error: NoCameraDevice It then loops through NoCameraDevice a number of times before giving up. I've attached the log. I'm assuming this is because the Vive Pro 2 is new and isn't supported yet. When will support be added, and is there a possible workaround for now? Thanks, Alex Aristo.log
  3. Hello. I am using Wrist Trackers to track external elements in my app (the trackers are not placed on my wrists: they are placed on a tracked extinguisher). I am also using hand tracking. It was working fine until recently. However, since the last Wrist Tracker update, whenever I use them they also change the position of the hands (the positions returned by the hand tracking system) and place them next to the tracker. Please note that I see the same behavior in Unity using the Wave SDK and in the Focus 3 main menu (the virtual hands are placed next to their corresponding trackers). In the previous version there was a popup letting us choose whether the Wrist Trackers were used to enhance hand tracking or as external trackers (the second option being what we were using). However, this popup is no longer displayed when I pair the trackers, and I couldn't find an option to disable this feature anywhere. Is there any way to go back to the old behaviour? I have a headset with firmware version 3.3.999.446 and the Wrist Trackers are on Thank you in advance for your help. @C.T.
  4. Hey, I've been trying to get hand tracking working on my Vive Focus 3 in Unity while using VBS. These are the docs I'm currently using: Setup — Vive Hand Tracking SDK 1.0.0 documentation. I've not been able to get SteamVR to detect the headset's cameras through VBS. Am I right in assuming that this is not currently supported by VBS and that hand tracking is Android-only? Thanks in advance for any responses.
  5. Hi. Is it possible for Focus 3 devices to use hand tracking with .exe files via VIVE Business Streaming, or is hand tracking on the Focus 3 only available in an .apk? If there is a way to run it on PC, we would like to know. For example, is it possible to build for PC with the Wave Unity SDK and use hand tracking, or to build for PC with the Vive Hand Tracking SDK and use hand tracking? Thank you for your time.
  6. I imported the SteamVR Plugin and the Vive Hand Tracking SDK from the following URLs into an empty project created in Unity Editor 2020.3.19f1:

     Vive Hand Tracking SDK 1.0.0: https://hub.vive.com/storage/tracking/unity/setup.html
     SteamVR Plugin 2.7.3 (sdk 1.14.15): https://assetstore.unity.com/packages/tools/integration/steamvr-plugin-32647

     When I opened the sample scene (Assets/Samples/VIVE Hand Tracking SDK/1.0.0/Sample/Sample.unity) and entered Play Mode it worked fine, but when I exited Play Mode, Unity crashed. Please let me know if there is a way to exit Play Mode normally.

     Operation procedure:
     1. Connect the VIVE Pro and launch SteamVR from Steam.
     2. Create an empty project in Unity Editor 2020.3.19f1.
     3. Import SteamVR Plugin 2.7.3.
     4. Import Vive Hand Tracking SDK 1.0.0.
     5. Follow the instructions in the window to complete the settings.
     6. Open the sample scene (Assets/Samples/VIVE Hand Tracking SDK/1.0.0/Sample/Sample.unity) and enter Play Mode.
     7. Exit Play Mode after confirming that virtual hands are displayed.

     Expected result: Unity exits Play Mode normally.
     Actual result: the Unity Editor crashes.
  7. Vive Wave SDK: starting from 4.1.0, the Wave SDK officially supports hand tracking interaction on the Vive Focus 3 for content creation. You can find detailed instructions below on enabling hand tracking. Welcome to the new adventure of future VR.

     Getting started for Unity developers: https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRHand.html
     Getting started for Unreal developers: https://hub.vive.com/storage/docs/en-us/UnrealPlugin/UnrealHand.html

     How to enable hand tracking: in the "Connectivity" page of the VRS 2.0 settings, switch "Enable Hand Tracking" to ON; a manual restart of the device is then required. Launch hand-tracking-enabled content and set the controllers aside on a stable surface. Hands will show up once you put them in front of the cameras.

     Vive Wave SDK 4.1.0 is available at https://developer.vive.com/resources/vive-wave/sdk/ and supports limited access to the camera texture API.

     ROM version information: hand tracking is effective on ROM version 2.0.999.114/3.0.999.116 and later.
  8. Hi everyone, I was wondering whether it is possible to track more than two hands with the SDK. I want to track my own hands and additionally the hands of another user. Is this possible, and if so, how can I set it up? Thanks for every answer!
  9. Virtual reality wouldn't be where it is today without the incredible talent and creativity of our developer community. At HTC VIVE, we are always working to enable developers to build the content and applications that power experiences across the spectrum of reality, and we eagerly listen to their feedback. We're excited to share some updates to our Hand Tracking SDK and give you a look at what's coming next.

     Now in early access, the VIVE Hand Tracking SDK is a cross-platform tool for hand position tracking and gesture recognition using the front camera(s) of the VIVE, VIVE Cosmos, VIVE Pro, VIVE Focus Plus and VIVE Focus. Finger tracking (21 points) is available for VIVE Pro Eye, VIVE Cosmos, VIVE Cosmos Elite, VIVE Focus Plus and VIVE Focus. Recent improvements add a confidence value to hand results, add pinch level, and reduce jitter.

     Here are some of the release notes for VIVE Hand Tracking SDK v0.9.3 [Early Access]:
     • Updated deep learning models for all platforms; improved accuracy while maintaining the same latency.
     • Improved hand depth calculation to better match see-through, including Cosmos see-through and the SRWorks SDK.
     • Sped up detection on Windows (up to 30%).
     • Added support for Valve Index.
     • Updated the mesh for the hand model.
     • Updated the auto-rig script to support more hand models.
     • Added support for defining custom gestures in skeleton mode.
     • Added more samples and utilities.

     The complete list is found in the documentation included with the download. Additionally, there's a short presentation on the VIVE Hand Tracking SDK from the VRTO 2020 conference in June here: https://www.youtube.com/watch?v=YnAP0LxM0UE

     Later this month, we'll also have more to share on the blog about the new features and improvements in the recently updated Wave 3.2 SDK for all-in-one headsets such as the VIVE Focus Plus. Meanwhile, if you'd like to learn more about developing with us, please go to https://developer.vive.com/ and follow us on Twitter @htcdev
  10. Wave SDK 3.1.94 Early Access and Experimental ROM 3.07.623.336 - How to Request Access

      Find Wave SDK 3.1.94 Early Access here: https://developer.vive.com/resources/2020/04/01/vive-wave-sdk-early-access-updated-3-1-94/ To receive ROM access, read on.

      This SDK has a medley of new features, including:

      Pass Through: passthrough cameras are triggered when exiting the virtual safety wall. Use HelloVR to test this effect. You can also press the power button twice to enable Pass Through, and you should see that the launch time is very quick.

      Hand Gesture & Tracking: a gesture sample, Gesture_Test, is provided in wvr_unity_samples.unitypackage. You can find it under Assets/Samples/Gesture_Test. https://hub.vive.com/storage/docs/en-us/TheGestureUsage.html

      Direct Preview for Unreal: https://hub.vive.com/storage/docs/en-us/UnrealPlugin/Unreal_wavevr_DirectPreview.html

      Adaptive Quality (now working in ROM update 3.07.623.336): adjusts CPU/GPU rendering performance levels to improve FPS. https://hub.vive.com/storage/docs/en-us/WaveVR_AdaptiveQuality.html

      REQUEST ROM ACCESS: to use the new features in Wave SDK 3.1.94 you'll need to upgrade to the latest ROM (go to Settings > System Update). Please write to @Tony PH Lin and provide your name, company, device type (Focus or Focus Plus), HMD serial number, and your current ROM version. These are the ROMs required to receive the special ROM:

      Focus Plus: 3.07.623.3
      Focus (dev kit): 2.04.1400.2

      Expect a two-business-day turnaround. Once your request is processed, you will receive the FOTA for ROM version 3.07.623.316 (Focus Plus) or 2.11.1400.216 (Focus). You will then need to run another system update to reach ROM version 3.07.623.336. If you have any feedback, please share it in the comments. Cheers, The Vive Wave Team
  11. Hi, as the title implies, I would like to use Unity 2019 (version 2019.3.8f1) for my project, an application using hand tracking on the Vive Focus Plus. However, a few seconds after I start the application on the headset, it freezes and crashes. The same code works perfectly on a 2018 version of Unity. Has anyone encountered these issues with the Vive Focus Plus and hand tracking on Unity 2019, and if so, what steps did you take to solve them? Have a good day!
  12. First, I'm not very good at English, so bear with me. I'm currently working on the sample scene in the Hand Tracking SDK in Unity. I'm using the XR settings of Google Cardboard, which doesn't support Skeleton mode (so I suppose I'm in 2D point mode). Can I still render the skeleton model with the 21 points returned by GestureInterface.cs (with a fake z coordinate)? I know it will look weird, but I would really like to show that skeleton.
  13. I'm currently working on the sample scene in Vive Hand Tracking in Unity. Is there a way to render the hand skeleton with the 2D-mode points returned by the Gesture Interface script (with a fake z factor)?
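      [Note on the two questions above about rendering 2D points with a fake z: one way to sketch this in Unity is to re-project each 2D point through the camera at an assumed fixed distance. Camera.ViewportToWorldPoint is a standard Unity API; the assumption that the SDK's 2D-mode points are normalized 0..1 viewport coordinates, the fixedDepth value, and the UpdateJoints entry point are all illustrative and would need adapting to what GestureInterface actually returns.]

      ```csharp
      using UnityEngine;

      // Sketch: place 21 hand joints from a 2D-only detection mode by
      // assuming a fixed depth in front of the camera.
      public class FakeDepthSkeleton : MonoBehaviour
      {
          public float fixedDepth = 0.5f;   // assumed hand distance from camera, in meters
          public GameObject jointPrefab;     // e.g. a small sphere per joint
          private GameObject[] joints;

          void Start()
          {
              joints = new GameObject[21];
              for (int i = 0; i < 21; i++)
                  joints[i] = Instantiate(jointPrefab, transform);
          }

          // Call each frame with the 21 points returned by the 2D detection mode,
          // assumed here to be normalized viewport coordinates (0..1).
          public void UpdateJoints(Vector2[] points2d)
          {
              for (int i = 0; i < 21; i++)
              {
                  // Re-project the 2D point into world space at the fake depth.
                  Vector3 viewport = new Vector3(points2d[i].x, points2d[i].y, fixedDepth);
                  joints[i].transform.position = Camera.main.ViewportToWorldPoint(viewport);
              }
          }
      }
      ```

      Bones between joints could then be drawn with LineRenderers; since every joint shares the same faked depth, the hand will look flat, which matches what the posters expect.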
  14. Good morning, I am currently trying to use the ViveHandTracking plugin for Unity with a custom hand model (I have simply sliced the default FBX above the wrist with Blender). The model is rigged in exactly the same way as the default one. I am able to display the new model and see it moving; the issue is that the finger animation does not work. Is there a way to have the new model properly animated? Thank you very much.
  15. I have a Cosmos for demos and development and am evaluating the options for hand tracking, with these questions: 1) I read somewhere that the APIs for the Cosmos are different from those for the Pro, but I didn't find any specific reference to the Cosmos in the API documentation. Is that true, or am I wrong? 2) I'm interested in integrating this into applications developed on Unigine. Can I expect an engine-level integration, or will I need to ask developers to re-integrate hand tracking support at every engine release or every new application? In other words, how complicated is it to create a new engine-specific plugin? 3) Related to the previous question: will it be possible to simply emulate the controllers' behavior, since we don't need complex interactions? 4) Will it be possible to integrate Leap Motion, if that would guarantee a more stable controller emulation experience?
  16. Hi, I am trying to create a SteamVR driver for Vive hand tracking to emulate Index controllers, but there seems to be a loop that takes over SteamVR, so I can't make the tracking work in a background process. Are there any plans to implement this on your side, so that hands can be used instead of controllers? Thank you.
  17. Hi, I'm working for an enterprise that needs to visualize hands using the HTC Vive Pro, but we can't get the Hand Tracking SDK installed. I'm practically new at this and don't have much idea of how to rebuild my project. When we copy the folder, paste it into the indicated location, and relaunch the project, it says: "Could not be compiled. Try rebuilding from source manually". What can we do? Is there a way to do it automatically? Best regards,