aseton Posted August 23, 2021
I can't get my Focus 3 to work with the Action-based components of the XR Interaction Toolkit; only the Device-based components generate input, whether positional/rotational or buttons. I am using the latest Wave SDK 4.1.1 and the newest XR Interaction Toolkit 1.0.0-pre.5 on Unity 2020.3 LTS, with the Universal Render Pipeline and the new Input System. Exactly the same setup works for Oculus Quest 1 & 2 as well as the Pico Interactive Neo 2. I have seen comments here and there about using Device-based to get it to work, but no explanation of why Action-based would not work. I know it's preview software, but it would be nice to know if there is a misconfiguration on my side, or if my expectations are off. Would be grateful for any insightful pointers. Cheers
Tony PH Lin Posted August 27, 2021
Hi @aseton, As you mention, it's still a preview version, and the Unity Input System already existed before it. Could you elaborate on the advantages that led you to choose it? This can help us prioritize the importance of supporting it. Thanks.
Romain Posted August 27, 2021 (edited)
Hello, I have the same issue that @aseton is describing. I'm porting to the Focus 3 a prototype that works well on Quest 1 & 2 and Pico Neo 3, but we are using the Action-based XRController and, like aseton said, with it we don't get any positional, rotational, or button input. On our side, the main advantage of the Action-based XRController is that it takes advantage of the new Input System and its integration with the XR Interaction Toolkit. All of our interactions are built on the Action-based XRController, so having a single type of VR rig that works the same way with every different headset is a huge advantage for us. Adapting all our interactions to the Device-based XRController just for the Focus 3 is definitely possible, but it seems like a lot of work when Action-based works out of the box with the other headsets 😄. Best,
Edited August 27, 2021 by Romain
iiidefektiii Posted August 27, 2021
I am having the same issue as @Romain: I can only get Device-based to work across platforms. I am also using some third-party plugins that work with Action-based but not Device-based. Why is Wave so far the only SDK that doesn't support Action-based?
smeer Posted September 1, 2021
Same issue here. Having debugged it a little, I think I found two possible workarounds. A) Accessing the devices via common usages seems to work, e.g. use "CenterEyePosition" for the camera position. B) (Probably better.) Register your own layouts. Here's an example I created for our own project (largely untested): https://gist.github.com/simon-meer/c4de3103daf62756cc13d6739f25acd5 Simply add it to your project and it should work.
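To illustrate the shape of workaround B, here is a minimal, untested sketch of a custom layout registration (the class name, control names, and the product matcher pattern are assumptions for illustration, not the actual contents of the gist or the real Wave device strings):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.XR;

// Hypothetical layout for a Wave controller. The alias "primaryButton"
// is what lets common-usage bindings resolve against this device.
[InputControlLayout(displayName = "Wave Controller (sketch)", commonUsages = new[] { "LeftHand", "RightHand" })]
public class WaveControllerSketch : XRControllerWithRumble
{
    [InputControl(aliases = new[] { "primaryButton" })]
    public ButtonControl primaryButton { get; private set; }

    protected override void FinishSetup()
    {
        base.FinishSetup();
        // The string is the child control's path within this layout;
        // Unity resolves it by name or alias.
        primaryButton = GetChildControl<ButtonControl>("primaryButton");
    }
}

public static class WaveLayoutsSketch
{
    // Register the layout before any devices are created.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.SubsystemRegistration)]
    static void Register()
    {
        InputSystem.RegisterLayout<WaveControllerSketch>(
            matches: new InputDeviceMatcher()
                .WithInterface(XRUtilities.InterfaceMatchAnyVersion)
                .WithProduct("WVR.*Controller")); // assumed product pattern
    }
}
```

For workaround A, the equivalent idea is to bind actions directly to common control names, e.g. `new InputAction(binding: "<XRHMD>/centerEyePosition")`, rather than relying on a device-specific layout.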
Romain Posted September 1, 2021
Hi @smeer, thanks a lot, workaround B is working perfectly on my side. I just have a question to better understand how the layout works: in FinishSetup(), when we get the child control with GetChildControl<ButtonControl>(path), what does the argument path refer to? It does not seem to be a specific path that leads to specific controller data, so I was wondering how the layout links the controller inputs to a specific InputControl. I don't know if that is really clear 😅, but I mean: how does the layout know that the ButtonControl primaryButton must get its data from the A/X buttons and the ButtonControl secondaryButton its data from the Y/B buttons, for example? Regarding workaround A, on my side I was already using the common usages, and the position and rotation of the headset were working well, but I had nothing on the controller side: no button inputs, and the position and rotation were not updated.
smeer Posted September 2, 2021 (edited)
@Romain Disclaimer: most of this I found through debugging and reading the docs, so take it with a grain of salt. The devices (HMD + controllers) are recognized by Unity, but because no layout exists, the default layouts (XRHMD and XRController) are used as the base. Unity then goes through the list of device capabilities, looks for a matching InputControl in the inherited base layout, and if none is found creates a new one. Here are the capabilities of the HMD, for example -- and as you can see, the names do not match the ones commonly used. What I did was simply add aliases for those properties. When building the managed representation of the device (and the InputControls), Unity then uses the common names instead of the Wave SDK names, hence GetChildControl() will find them by their common name. The mapping of primary/secondary etc. is probably already done on the native side that provides the capabilities. I based those classes on layouts I found in the Oculus XR Plugin, by the way; they seem to be doing something similar. Also take a look at XRLayoutBuilder#Build() -- that's where the InputControls are created. Regarding workaround A: not sure how you would access them. The controllers do also use common usages (like "GripButton"), though, but maybe you would have to create the paths manually using the path syntax.
Edited September 2, 2021 by smeer
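The alias mechanism described above can be sketched as follows (an untested illustration; the Wave-side capability name "HMDPosition" is an assumed example, not a verified string from the SDK):

```csharp
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.XR;

// Sketch of aliasing: when Unity builds the device from its reported
// capabilities, a capability named e.g. "HMDPosition" (assumed) is
// matched to this control via the alias, so the control keeps the
// common name "centerEyePosition" and common-usage bindings resolve.
[InputControlLayout(displayName = "Wave HMD (sketch)")]
public class WaveHMDSketch : XRHMD
{
    [InputControl(aliases = new[] { "HMDPosition" })]
    public new Vector3Control centerEyePosition { get; private set; }

    protected override void FinishSetup()
    {
        base.FinishSetup();
        // Found by its common name even though the raw capability
        // reported a different (SDK-specific) name.
        centerEyePosition = GetChildControl<Vector3Control>("centerEyePosition");
    }
}
```

The key point is that the alias lives on the managed layout, while the primary/secondary button mapping itself is, as smeer suggests, most likely decided on the native side before Unity ever sees the capability list.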
Romain Posted September 2, 2021
Thanks for the explanation @smeer!
Remio1 Posted September 9, 2021
Thanks @smeer and @aseton, solution B worked for us as well.
pseltmann Posted September 30, 2021
Hi @smeer, I tried your solution B and got it partially working. The controllers are correctly tracked and can be seen in the headset, yet the position and rotation of the headset aren't tracked. We use this setup: Vive Focus 3, Vive Wave XR Plugin 4.1.1, Input System 1.0.2, Unity 2020.3 LTS, XR Interaction Toolkit, and WaveDeviceLayouts.cs from the gist on GitHub. The scene contains a simple Action-based XR Rig that works just fine with an Oculus Quest 2. Do you have an idea why the headset is not tracked?