Everything posted by zzy

  1. Hi @Tima To solve this problem, please add the following lines to interface_gesture.hpp:

     struct GestureVector3 { float x, y, z; };
     struct GestureQuaternion { float x, y, z, w; };
     INTERFACE_GESTURE_EXPORTS void SetCameraTransform(GestureVector3 position, GestureQuaternion rotation);

     Please call the SetCameraTransform function every VR frame (i.e. 90 times per second) to pass the HMD transform to the native side. Like GestureResult, position and rotation are in Unity axes. After calling SetCameraTransform, you no longer need to apply the HMD transform to GestureResult. A minimal usage sketch follows below. Note: this change will also be available starting from the next release.
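     Here is a minimal sketch of driving SetCameraTransform from a per-frame callback, using the declarations above. The OnVRFrame entry point and the GetHmdPositionUnity/GetHmdRotationUnity helpers are hypothetical placeholders for however your engine exposes the HMD pose; the pose must already be in the Unity axis convention.

         #include "interface_gesture.hpp"

         // Hypothetical helpers: fetch the current HMD pose in Unity axes.
         extern GestureVector3 GetHmdPositionUnity();
         extern GestureQuaternion GetHmdRotationUnity();

         // Call once per VR frame (~90 times per second).
         void OnVRFrame() {
           GestureVector3 position = GetHmdPositionUnity();
           GestureQuaternion rotation = GetHmdRotationUnity();
           // After this call, GestureResult no longer needs the HMD transform applied.
           SetCameraTransform(position, rotation);
         }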
  2. Hi @Tima Do you call UseExternalTransform(true) before StartGestureDetection? This could be the cause of the problem, as you are applying the HMD transform to GestureResult yourself, making your hands move with your HMD. A short call-order sketch follows below. By the way, since the raw camera fps is 60, setting maxFPS above 60 has no effect.
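     As a rough sketch of the call order discussed above (the GestureOption field name maxFPS and the pointer-style StartGestureDetection signature are assumptions; check interface_gesture.hpp for the exact declarations):

         #include "interface_gesture.hpp"

         void StartDetection() {
           // Must be called before StartGestureDetection when the HMD transform is
           // handled outside the native detection (applied by the caller or passed
           // via SetCameraTransform).
           UseExternalTransform(true);

           GestureOption option;            // default options
           option.maxFPS = 60;              // raw camera fps is 60, so higher caps have no effect
           StartGestureDetection(&option);  // check the returned GestureFailure in real code
         }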
  3. Hi @Tima The current plan for the next release is early February. About grab strength, do you have any special use case that cannot be covered by pinch strength? Also, do you have any suggestions on how to calculate grab strength from the skeleton? I would think of using the average closing angle of each finger, but I'm not sure how to handle the case where some fingers are closed while others are straight (e.g. the point/two gesture). A rough sketch of the averaging idea follows below.
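     To make the averaging idea concrete, here is a small self-contained sketch. The 21-joint layout (wrist first, then four joints per finger from thumb to pinky), the choice of measuring the bend at the middle joint, and the open/closed angle thresholds are all assumptions for illustration, not the SDK's actual definition.

         #include <algorithm>
         #include <cmath>

         struct Vec3 { float x, y, z; };

         static Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
         static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
         static float Len(const Vec3& a) { return std::sqrt(Dot(a, a)); }

         // Angle at joint b formed by the segments b->a and b->c, in radians.
         static float JointAngle(const Vec3& a, const Vec3& b, const Vec3& c) {
           Vec3 u = Sub(a, b), v = Sub(c, b);
           float d = Dot(u, v) / std::max(Len(u) * Len(v), 1e-6f);
           return std::acos(std::min(std::max(d, -1.0f), 1.0f));
         }

         // points: 21 joints, index 0 = wrist, then 4 joints per finger (thumb to pinky).
         // Returns 0 (fully open) to 1 (fully closed), averaged over the non-thumb fingers.
         float EstimateGrabStrength(const Vec3 points[21]) {
           const float kOpenAngle = 3.0f;    // ~172 deg: finger straight
           const float kClosedAngle = 1.6f;  // ~92 deg: finger fully curled (tunable)
           float sum = 0.0f;
           for (int finger = 1; finger < 5; finger++) {  // skip the thumb, which is ambiguous for grab
             int base = 1 + finger * 4;                  // first joint of this finger
             // Bending angle at the proximal interphalangeal joint.
             float angle = JointAngle(points[base], points[base + 1], points[base + 2]);
             float closed = (kOpenAngle - angle) / (kOpenAngle - kClosedAngle);
             sum += std::min(std::max(closed, 0.0f), 1.0f);
           }
           return sum / 4.0f;
         }

     Note that with a plain average, a point/two gesture still reports a moderate strength; taking the minimum over fingers, or gating on the detected gesture class, would push such cases toward zero instead.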
  4. @Siyo As my colleague mentioned, this is a developer feature. If you want to see it in any of the games you are playing, you need to suggest it to the developers. Neos VR adopted our SDK this September after requests from their users: https://github.com/Frooxius/NeosPublic/issues/339
  5. Hi @Tima Thanks for the suggestion. We plan to add support for joint rotation/pinch direction in the C++ interface in the next release.
  6. Hi @Tima Thanks for spotting the bug. We have fixed this internally.
  7. Hi @Tima After some debugging this week, I found the problem is caused by a calling convention error in the OpenVR library. After upgrading OpenVR from 1.0.14 to 1.3.22, the problem is solved. Please find the attached patch for win32 (it only includes aristo_interface.lib and aristo_interface.dll). Let me know if this works on your side. 0.9.4_win32_patch.zip
  8. Hi @Tima Thanks for the minidump.
  9. Hi @Tima Thanks for the feedback. I'll check this week to see if I can reproduce it on my side.
  10. Hi @kilik For the first question, another thing I can think of is to install the MSVC 2015 Update 3 runtime on your PC. The DLL-not-found problem might be caused by a failure to load a dependency DLL. For the second question, if your image is not correct, then it's possible that hand detection fails. If you see a hand in the Editor, is it coming from EditorEngine (which generates fake hand data)? For the camera problem, I would suggest following the steps here: https://hub.vive.com/storage/tracking/misc/steamvr.html
  11. Hi @kilik Here are some points I can think of to check. Make sure you are using the latest 0.9.4 version. Make sure you are running the Windows Unity Editor; Linux/Mac is not supported. You can also try running the pre-built Windows sample and see if that works.
  12. Hi @Fangh The Hand Tracking SDK provides some extra functionality compared with the Hand package in WaveVR. You can see the document here for details: https://hub.vive.com/storage/tracking/unity/advanced.html Since the Hand package in WaveVR might change its API in the future, I would suggest using the Hand Tracking SDK for your application.
  13. Hi @ericvids Yes, it's not normal that the program picked up the Intel graphics instead of the NVIDIA one. I would suggest checking whether your NVIDIA driver and its OpenCL support are installed correctly. I recommend downloading the latest driver from https://www.nvidia.com/Download/index.aspx?lang=en-us Also, thanks for your suggestion about the log. I'll add it in the next version. For now, you can modify `Assets\ViveHandTracking\Scripts\Engine\ViveHandTrackingEngine.cs` (about line 127) and add the log:

      if (index <= lastIndex) return;
      if (index > 0 && lastIndex == 0)
        Debug.Log("Detection initialization completed");
      lastIndex = index;
  14. Hi @ericvids Indeed, there is no error message in the Unity log, so I assume the hand tracking is running (or still initializing). To run on Windows, our SDK needs to compile OpenCL kernels at runtime, which can be quite time consuming on the first run. It might take a few minutes before hands can be detected, so please let the program run for a few minutes and check whether hands are detected. There can also be other reasons for detection failure; I would suggest upgrading your GPU driver to the latest version from the NVIDIA/AMD official site.
  15. Hi @Tima We have released v0.9.4 today. x86 build is included in the C++ plugin.
  16. Hi everyone, Vive Hand Tracking SDK has a new release, v0.9.4, today. You can download it from here. Highlights of the changes:

      Update deep learning models for all platforms.
      Add the option to limit max raw detection fps to save computing resources.
      (Unity) Add assembly definition files as a separate package for Unity 2019 or newer.
      (Unity) Add support for Wave XR Plugin.
      (Unreal) Move the native binary to a module in the project. All the contents of the plugin are inside the project folder.
      (Unreal) Experimental support for WaveVR Hand API (requires WaveVR 3.2.0).
      (C/C++) Add x86 binary.

      For detailed changes, please refer to the release notes.
  17. Hi @stefy_rzv Although some newer phones have multiple cameras, the Hand Tracking SDK only supports one camera on Android phones, so it always runs in 2D point mode on any Android phone. The statement about 3D position is only valid for VR HMDs.
  18. Hi @ericvids After a quick test, I cannot reproduce this problem locally. My steps are listed below; please let me know if I missed something.

      Unity version 2018.4.19f, Vive Hand Tracking SDK 0.9.3, SteamVR plugin 2.6.1 (latest from the Asset Store), SteamVR version 1.14.16.
      I used the SteamVR_TestTrackedCamera example scene as a starting point, since it already has a camera texture. Please first make sure this scene alone works fine.
      Attach the GestureProvider script to the [CameraRig]/Camera game object.
      For display purposes, add the LeftHandRenderer and RightHandRenderer prefabs to the scene. The prefabs are inside the Assets/ViveHandTracking/Prefab folder.
      Play the scene; you should see both the hand skeleton and the camera texture.

      Tested on both Vive and Vive Pro.
  19. Hi @ericvids From my previous experience, I think the cameras from SteamVR can be accessed by multiple callers. A quick example: when hand tracking is running, you can still see the camera preview in the camera tab of the SteamVR settings. As for how to access the camera frames/textures, you can refer to the OpenVR sample code here: https://github.com/ValveSoftware/openvr/tree/master/samples/tracked_camera_openvr_sample A short sketch of the same calls is below. I'll check next week to see if I can reproduce this on my machine.
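      For reference, here is a rough sketch of polling the HMD camera through OpenVR's IVRTrackedCamera, along the lines of the sample linked above. It assumes the VR runtime is already initialized via VR_Init; error handling is minimal and the undistorted frame type is just one possible choice.

          #include <openvr.h>
          #include <vector>

          void PollCameraFrame() {
            vr::IVRTrackedCamera* camera = vr::VRTrackedCamera();
            if (camera == nullptr) return;

            bool hasCamera = false;
            camera->HasCamera(vr::k_unTrackedDeviceIndex_Hmd, &hasCamera);
            if (!hasCamera) return;

            uint32_t width = 0, height = 0, bufferSize = 0;
            camera->GetCameraFrameSize(vr::k_unTrackedDeviceIndex_Hmd,
                                       vr::VRTrackedCameraFrameType_Undistorted,
                                       &width, &height, &bufferSize);

            vr::TrackedCameraHandle_t handle = INVALID_TRACKED_CAMERA_HANDLE;
            camera->AcquireVideoStreamingService(vr::k_unTrackedDeviceIndex_Hmd, &handle);

            // Multiple clients can stream at the same time, which is why the SteamVR
            // camera preview keeps working while hand tracking is running.
            std::vector<uint8_t> buffer(bufferSize);
            vr::CameraVideoStreamFrameHeader_t header;
            camera->GetVideoStreamFrameBuffer(handle, vr::VRTrackedCameraFrameType_Undistorted,
                                              buffer.data(), bufferSize, &header, sizeof(header));

            camera->ReleaseVideoStreamingService(handle);
          }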
  20. Hi @毛.sy Good to know it's working. I will include this change in the next release.
  21. Hi @毛.sy Have you tried the solution I posted in another thread? https://forum.vive.com/topic/8921-hand-tracking-use-in-unity/?do=findComment&comment=37867
  22. Hi @毛.sy Can you please check whether the following change solves your problem? Please change the UpdateResult function in Assets\ViveHandTracking\Scripts\Engine\ViveHandTrackingEngine.cs to:

      public override void UpdateResult() {
        var transform = GestureProvider.Current.transform;
        IntPtr ptr;
        int index;
        var size = GestureInterface.GetGestureResult(out ptr, out index);
        if (index < 0) {
          Debug.LogError("Gesture detection stopped");
          State.Status = GestureStatus.Error;
          State.Error = GestureFailure.Internal;
          return;
        }
        if (index <= lastIndex) return;
        lastIndex = index;
        if (State.Status == GestureStatus.Starting) State.Status = GestureStatus.Running;
        if (size <= 0) {
          State.LeftHand = State.RightHand = null;
          return;
        }
        bool isLeft = false;
        var structSize = Marshal.SizeOf(typeof(GestureResultRaw));
        for (var i = 0; i < size; i++) {
          var gesture = (GestureResultRaw)Marshal.PtrToStructure(ptr, typeof(GestureResultRaw));
          ptr = new IntPtr(ptr.ToInt64() + structSize);
          for (int j = 0; j < 21; j++)
            gesture.points[j] = transform.TransformPoint(gesture.points[j]);
          State.SetRaw(gesture);
          isLeft = gesture.isLeft;
        }
        if (size == 1) {
          if (isLeft) State.RightHand = null;
          else State.LeftHand = null;
        }
      }
  23. @davide445 All existing headset support will continue in v1.0
  24. Hi @davide445 Yes, we are definitely maintaining backward compatibility, as we have ever since the first release of the SDK.
  25. Hi @毛.sy We don't support changing the camera scale in the current version. Can you please share your use case and why you need to scale the camera in a VR application? Also, can you please verify whether setting the camera scale to (1,1,1) solves the problem?