Use XR Interaction Toolkit for XR Elite Hand Tracking

Is it somehow possible to use Unity's XR Interaction Toolkit together with the new XR Elite?
When I start the Hands Interaction Demo from the XR Interaction Toolkit with the XR Elite, I can see my hands and move them. I can also press buttons, but it is not possible to grab an object. I think I need an interaction profile for the XR Elite to be able to grab things.
Has anyone been able to try this out yet?

@Schreppy Grabbing objects tends to be done at the application layer, by creating colliders on the finger joints provided by the device APIs. Friends in the VR industry tend to customize this very heavily based on what's going on in the application, and they usually tell me they start with some of the assets from the Asset Store for this purpose. We provide the bone positions, and the application uses those to determine whether something should be "grabbed".
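To illustrate the collider-on-joints idea described above, here is a minimal Unity sketch (the class name, the fingertip references, and the `grabRadius` value are all hypothetical illustrations, not part of any SDK): the application places small colliders on the tracked thumb-tip and index-tip joints and treats an object as grabbed while both fingertips are in contact with it.

```csharp
using UnityEngine;

// Hypothetical sketch: the hand-tracking API supplies joint positions;
// the application attaches small colliders to the fingertip joints and
// decides on its own what counts as a "grab".
public class PinchGrabDetector : MonoBehaviour
{
    public Transform thumbTip;        // driven by the tracked thumb-tip joint
    public Transform indexTip;        // driven by the tracked index-tip joint
    public float grabRadius = 0.015f; // ~1.5 cm contact radius (tuning value)

    // True while both fingertips are within grabRadius of the target's collider.
    public bool IsGrabbing(Collider target)
    {
        float r2 = grabRadius * grabRadius;
        bool thumbTouching = target.bounds.SqrDistance(thumbTip.position) <= r2;
        bool indexTouching = target.bounds.SqrDistance(indexTip.position) <= r2;
        return thumbTouching && indexTouching;
    }
}
```

Real implementations usually go further (per-finger colliders, physics joints, hysteresis so the grab doesn't flicker), which is why this is typically customized per application.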

Another approach is to use the interactables in the Unity XR Interaction Toolkit with the 'select' gesture (pinch), which works at the system level with OpenXR, or to use our Wave Native XR SDK and our hand gesture recognizer.
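For the XR Interaction Toolkit route, a sketch of making an object respond to the pinch/select gesture might look like the following (the helper name is hypothetical; it assumes the scene already has a hand-tracking interactor set up, e.g. from the Hands Interaction Demo):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: make a scene object grabbable through the XR Interaction Toolkit,
// so the system-level pinch ("select") gesture picks it up.
public static class GrabbableSetup
{
    public static void MakeGrabbable(GameObject obj)
    {
        // The interactor needs a collider to hit the object.
        if (obj.GetComponent<Collider>() == null)
            obj.AddComponent<BoxCollider>();

        // XRGrabInteractable responds to select; it requires a Rigidbody,
        // which Unity adds automatically when the component is added.
        if (obj.GetComponent<XRGrabInteractable>() == null)
            obj.AddComponent<XRGrabInteractable>();
    }
}
```

With this in place, no per-joint collider logic is needed: the toolkit's interactor raises the select event when the system recognizes the pinch, and the `XRGrabInteractable` handles attach/detach.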

Thanks for the answers. But to register this pinch, there are interaction profiles, right?
That would mean I need something like an interaction profile from VIVE for the XR Elite to register the pinch.

In the OpenXR "Android" settings I have profiles to choose from, but I need one in the "Windows, Mac..." settings:
[Screenshots: interaction profile lists in the OpenXR Android and Windows/Mac settings]
