
Focus 3 development with Wave and Unity's XR Interaction Toolkit


aseton


I can't get my Focus 3 to work with the Action-based components of the XR Interaction Toolkit; only the Device-based ones generate any input, whether positional/rotational or buttons.
I am using the latest Wave SDK 4.1.1 and the newest XR Interaction Toolkit 1.0.0-pre.5 on Unity 2020.3 LTS, with the Universal Render Pipeline and the new Input System.
Exactly the same setup works on Oculus Quest 1 & 2 as well as the Pico Interactive Neo 2.

I have seen comments here and there about using Device-based components to get it to work, but no explanation of why Action-based would not work.
I know it's preview software, but it would be nice to know whether there is a misconfiguration on my side, or whether my expectations are off.

Would be grateful for any insightful pointers,
Cheers


Hello, I have the same issue that @aseton is describing.

I'm porting a prototype that works well on Quest 1 & 2 and the Pico Neo 3 to the Focus 3, but we are using the Action-based XRController and, like aseton said, no positional, rotational, or button inputs are generated with it.

On our side, the main advantage of the Action-based XRController is that it takes advantage of the new Input System and its integration with the XR Interaction Toolkit. All of our interactions are built on the Action-based XRController, so having a single type of VR rig that works the same way with every headset is a huge advantage for us. Adapting all our interactions to the Device-based XRController for the Focus 3 is definitely possible, but it seems like a lot of work when everything works out of the box with the other headsets 😄.

Best,
Romain

Same issue here. Having debugged it a little, I think I found two possible workarounds.

A.) Accessing the devices via common usages seems to work. E.g. use "CenterEyePosition" for the camera position.

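For illustration, here's a minimal sketch of what binding through the usages could look like in code. Only "CenterEyePosition" is something I've confirmed; "{CenterEyeRotation}" is my assumption by analogy, so treat it accordingly:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: bind to controls via their common usages (the "{...}" path syntax)
// instead of layout-specific control names, so the bindings can resolve even
// when the device only got the generic fallback layout.
public class HmdPoseViaUsages : MonoBehaviour
{
    InputAction headPosition;
    InputAction headRotation;

    void OnEnable()
    {
        headPosition = new InputAction(binding: "<XRHMD>/{CenterEyePosition}");
        headRotation = new InputAction(binding: "<XRHMD>/{CenterEyeRotation}");
        headPosition.Enable();
        headRotation.Enable();
    }

    void Update()
    {
        transform.localPosition = headPosition.ReadValue<Vector3>();
        transform.localRotation = headRotation.ReadValue<Quaternion>();
    }

    void OnDisable()
    {
        headPosition.Disable();
        headRotation.Disable();
    }
}
```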

 

B.) (Probably better.) Register your own layouts. Here's an example I created for our own project (largely untested): https://gist.github.com/simon-meer/c4de3103daf62756cc13d6739f25acd5
Simply add it somewhere in your project and it should work.
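For anyone who doesn't want to open the gist right away: the core of it is registering layout classes with the Input System before the devices connect. A rough, simplified sketch of the registration part -- the product strings in the matchers are placeholders here, not the exact ones from the gist, and in the editor you may additionally want an [InitializeOnLoad] hook so the layouts exist in edit mode:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.XR;
using UnityEngine.Scripting;

// Stub layout classes; the real ones (with the aliased controls) are in the gist.
[Preserve, InputControlLayout(displayName = "Wave HMD")]
public class WaveHmd : XRHMD { }

[Preserve, InputControlLayout(displayName = "Wave Controller")]
public class WaveController : XRControllerWithRumble { }

public static class WaveLayoutRegistration
{
    // Runs before the first scene loads, so the layouts are already registered
    // by the time the Wave runtime reports its devices to the Input System.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void RegisterLayouts()
    {
        InputSystem.RegisterLayout<WaveHmd>(
            matches: new InputDeviceMatcher()
                .WithInterface(XRUtilities.InterfaceMatchAnyVersion)
                .WithProduct("Vive Focus 3")); // placeholder product string

        InputSystem.RegisterLayout<WaveController>(
            matches: new InputDeviceMatcher()
                .WithInterface(XRUtilities.InterfaceMatchAnyVersion)
                .WithProduct("Vive Controller")); // placeholder product string
    }
}
```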

 


Hi @smeer, thanks a lot, workaround B is working perfectly on my side.

I just have a question to better understand how the layout works. In FinishSetup(), when we get the child control with GetChildControl<ButtonControl>(path), what does the path argument refer to? It does not seem to be a specific path that leads to specific controller data, so I was wondering how the layout links the controller inputs to a specific InputControl. I don't know if that's really clear 😅, but I mean: how does the layout know that the primaryButton ButtonControl must get its data from the A/X buttons and the secondaryButton ButtonControl from the B/Y buttons, for example?

Regarding workaround A: on my side I was already using the common usages, and the position and rotation of the headset were working well, but I had nothing on the controller side: no button inputs, and the position and rotation were not updated.


@Romain

Disclaimer: Most of this I found through debugging and reading the docs, so take it with a grain of salt.

The devices (HMD + controllers) are recognized by Unity, but because no layout exists for them, the default layouts (XRHMD and XRController) are used as a base. Unity then goes through the list of device capabilities, looks for a matching InputControl in the inherited base layout, and if none is found, creates a new one. The HMD's capability list, for example, shows that the names do not match the ones commonly used.

What I did was simply add aliases for those properties. When building the managed representation of the device (and its InputControls), Unity then uses the common names instead of the Wave SDK names, so GetChildControl() will find them by their common name. The mapping of primary/secondary etc. is probably already done on the native side that provides the capabilities.
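To make that concrete, here's a stripped-down version of the pattern. The Wave-side capability name ("aButton") is invented for illustration; the real names are the ones you see in the device's capability list:

```csharp
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.XR;
using UnityEngine.Scripting;

[Preserve, InputControlLayout(displayName = "Wave Controller (sketch)")]
public class WaveControllerSketch : XRControllerWithRumble
{
    // The control keeps the common name ("primaryButton") but carries the
    // Wave-side capability name as an alias. When Unity walks the device
    // capabilities, the alias matches, so the existing control is reused
    // instead of a new control being created under the Wave name.
    [Preserve, InputControl(aliases = new[] { "aButton" })]
    public ButtonControl primaryButton { get; private set; }

    protected override void FinishSetup()
    {
        base.FinishSetup();
        // The path is resolved against this device's controls by name,
        // display name, or alias -- which is why the common name works here.
        primaryButton = GetChildControl<ButtonControl>("primaryButton");
    }
}
```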

I based those classes on the layouts I found in the Oculus XR Plugin, by the way; they seem to be doing something similar.
Also take a look at XRLayoutBuilder#Build() -- that's where the InputControls are created.

Regarding workaround A: I'm not sure how you would access them. The controllers do also use common usages (like "GripButton"), though, so maybe you would have to create the paths manually using the path syntax.
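Something along these lines, maybe (untested; the combination of device usage and control usage in one path is my assumption):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: address the controllers by device usage ({LeftHand}/{RightHand})
// plus control usage ({GripButton}), avoiding layout-specific control names.
public class GripViaUsages : MonoBehaviour
{
    InputAction gripLeft;
    InputAction gripRight;

    void OnEnable()
    {
        gripLeft  = new InputAction(binding: "<XRController>{LeftHand}/{GripButton}");
        gripRight = new InputAction(binding: "<XRController>{RightHand}/{GripButton}");
        gripLeft.Enable();
        gripRight.Enable();
    }

    void OnDisable()
    {
        gripLeft.Disable();
        gripRight.Disable();
    }
}
```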



Hi @smeer,

I tried your solution B and got it partially working. The controllers are correctly tracked and can be seen in the headset, yet the position and rotation of the headset aren't tracked. We use this setup:

  • Vive Focus 3
  • Vive Wave XR Plugin 4.1.1
  • Input System 1.0.2
  • Unity 2020.3 LTS
  • XR Interaction Toolkit
  • WaveDeviceLayouts.cs from the GitHub gist

The scene contains a simple Action-based XR Rig that works just fine with an Oculus Quest 2. Do you have an idea why the headset is not tracked?

