
OpenXR Unity - Focus 3 - Hand Input - No Grab?



Hello, 

I am trying to use the Focus 3 with OpenXR.

I can do a lot, but I am not able to grab objects.

I am trying to grab an object in the Hands Demo Scene (XR Interaction Toolkit 2.3.2 package + Hands Interaction Demo => Hands Demo Scene).

- What am I missing?

- Why is the grab action (or pinch) not recognized?

In the OpenXR settings (under XR Plug-in Management), this is what I have:

[Screenshot: OpenXR settings in XR Plug-in Management]

 

From there, I thought I would go and create a new input action for my Focus 3.

On the OpenXR Developer Resources page in the Vive documentation, there is a section on using hand tracking to interact with objects (https://developer.vive.com/resources/openxr/openxr-mobile/tutorials/unity/hand-tracking/interact-objects-remotely/).

1) I cannot find the Hand Interaction Profile to add under Project Settings > XR Plug-in Management > OpenXR.

[Screenshot: OpenXR interaction profile list in Project Settings]

2) I cannot find the Focus 3 in the Input Action device list 

[Screenshot: Input Action device list]

I suppose this would only show up if the VIVE Focus 3 Hand Interaction profile were added.

Setup: Unity 2021.3.15f

I am currently using these packages from the VIVE registry:

[Screenshot: VIVE registry packages installed in the project]

 

Any help would be appreciated. 

 

 

Thanks in advance!

 

Julien 


@JulienActiveMe

I hear that these things can be a little complicated. 

Make sure to install the "VIVE OpenXR Toolkit - Android" package, as that is what adds support for the "selected" feature; you should then see the "VIVE XR Hand Interaction" profile:
[Screenshot: VIVE XR Hand Interaction profile shown in the OpenXR settings]

With this, it is my understanding that we support the "selected" event (by way of a pinch gesture) using the OpenXR runtime at the moment. Additional hand gestures are in progress under the OpenXR hand gestures work; I may be a bit behind on this, and maybe @Tony PH Lin knows of any additional gestures supported by OpenXR in the XR Interaction Toolkit.
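
For reference, the select/pinch will only grab something that is set up as an XRI grab interactable (with a collider and rigidbody). Here is a minimal sketch of that setup, assuming the standard XR Interaction Toolkit 2.3 API; the component name and the debug listener are just illustrative, and the Hands Interaction Demo prefabs already include the equivalent:

// Minimal sketch: make a GameObject grabbable so the XRI "select" event
// (the pinch gesture in this case) can pick it up. Assumes the scene already
// has an XR Origin with hand interactors, as in the Hands Interaction Demo.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // Interactors need a collider to hover/select the object.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        // XRGrabInteractable responds to the select event and attaches the
        // object to the interactor; it also adds a Rigidbody automatically
        // because it requires one.
        var grab = GetComponent<XRGrabInteractable>();
        if (grab == null)
            grab = gameObject.AddComponent<XRGrabInteractable>();

        // Log when select (pinch) fires, to confirm the gesture is recognized.
        grab.selectEntered.AddListener(_ => Debug.Log($"{name} selected (pinch)"));
    }
}

Attaching that to a cube and watching the console is a quick way to tell whether the pinch/select event is firing at all, or whether only the grab setup on the object is missing.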

If you switch to our Wave native hand gestures, there are some additional hand gestures documented here: https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRHand.html


 


@JulienActiveMe

On second thought:
Typically, grab is implemented using physics, i.e. setting up colliders/rigidbodies on the hand joints themselves. The joint data is exposed by the OpenXR API itself, and application-level logic will dictate if there is a collision and/or grab.
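
As a rough sketch of that idea (assuming the com.unity.xr.hands package is installed and providing joint poses; the joint selection, collider size, and left-hand-only handling are all illustrative, and poses come back in XR Origin space, so this component should live under the XR Origin):

// Rough sketch: kinematic sphere colliders that follow a few hand joints,
// so Unity physics can report contacts that your own grab logic can act on.
// Assumes the com.unity.xr.hands package is installed and a hand subsystem
// is running; left hand only, for brevity.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandJointColliders : MonoBehaviour
{
    static readonly XRHandJointID[] k_TrackedJoints =
    {
        XRHandJointID.ThumbTip,
        XRHandJointID.IndexTip,
        XRHandJointID.Palm,
    };

    XRHandSubsystem m_Subsystem;
    readonly Dictionary<XRHandJointID, Transform> m_JointColliders =
        new Dictionary<XRHandJointID, Transform>();

    void Start()
    {
        // Grab the first running hand subsystem.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
            m_Subsystem = subsystems[0];

        foreach (var id in k_TrackedJoints)
        {
            var go = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            go.name = "LeftHand_" + id;
            go.transform.SetParent(transform, false);        // parent under the XR Origin
            go.transform.localScale = Vector3.one * 0.02f;   // ~2 cm, illustrative
            go.AddComponent<Rigidbody>().isKinematic = true; // driven by tracking, not physics
            m_JointColliders[id] = go.transform;
        }
    }

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.leftHand.isTracked)
            return;

        // Joint poses are reported in XR Origin (session) space.
        foreach (var pair in m_JointColliders)
        {
            var joint = m_Subsystem.leftHand.GetJoint(pair.Key);
            if (joint.TryGetPose(out Pose pose))
            {
                pair.Value.localPosition = pose.position;
                pair.Value.localRotation = pose.rotation;
            }
        }
    }
}

The actual grab decision (pinch distance between thumb and index tips, OnTrigger events against grabbable objects, etc.) stays at the application level, as described above.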

The deeper specifics of this device are in the ViveHandInteraction.cs file in the standard OpenXR plugin, so make sure that the project is compiling as well; otherwise this profile may not show up for you.

