
Linkone


Posts posted by Linkone

  1. @Alex_HTC Thank you for your reply and suggestions. My answers are below, inline in your quoted message.

    17 hours ago, Alex_HTC said:
    1. Are the controllers otherwise accessible? Do rendermodels show up and/or other visualizations on the relevant tracked pose driver/controller?
      Yes, the controllers are otherwise accessible, using the Input System and a customized Input Actions asset that runs in parallel with the XRI Default Input Actions. Rendermodels show up and are tracked correctly. I am using the XR Controller (Action-based), and I am able to press the trigger button and the A/B + X/Y buttons. I got the RenderModel from the VIVE OpenXR Toolkit Samples (screenshots attached).

    2. Is this OpenXR or the Wave SDK we're talking about here?
      It is purely OpenXR, and I would avoid installing a new package just for this.
    3. Is the vive-specific controller profile on the input profile, or only the generic XR controller or "Vive Focus 3 Controller"?
      I have both the HTC Vive Controller profile, when available (trigger), and the Vive Focus 3 Controller profile (trigger/A/B).
    4. What devices are listed in Inputdevices.GetDevices https://docs.unity3d.com/2020.1/Documentation/ScriptReference/XR.InputDevices.GetDevices.html --the event for registering the device may be firing before the connection, and this call will let you know all devices that are connected, regardless of when the script runs or events fire
      Thanks for this suggestion. I might actually use it if my current solution does not work.
    5. Is there an active device on the relevant profile?
      Are you referring to the interaction profiles?
      (screenshot attached)
    6. As a sanity check - do these repos work for you:
      1. https://github.com/ViveDeveloperRelations/MinimalRepo/tree/wave_controller_examples - does the "wave_controller_examples" branch with the wave controller sample scene work for you?
      2. if on OpenXR: https://github.com/hardcoded2/Updated-XR-Interaction-Toolkit-Examples/tree/openxr - does the "openxr" branch work for you?
      3. if on Wave: https://github.com/ViveDeveloperRelations/XR-Interaction-Toolkit-Examples/tree/wave_openxr_xr_origin - does the "wave" branch work for you in the "vr" project?

    Anyway, I am not sure why InputDevices does not work, but I may be thinking about it wrongly. In my experiment the user only uses one controller, which is why I wanted to register for the connect/disconnect events.
    In the end I found a hacky way of checking the tracking status to assess whether a controller is tracked, and I use that to detect whether it's on or not.
    Here is the code in case someone runs into a similar problem.

    using UnityEngine;
    using UnityEngine.InputSystem;

    public class ControllerTrackingWatcher : MonoBehaviour
    {
        // Bind these to the controllers' integer tracking-state controls in the Inspector.
        [SerializeField]
        InputActionProperty m_LeftControllerTrackingStateAction = new InputActionProperty(new InputAction("Tracking State", expectedControlType: "Integer"));

        [SerializeField]
        InputActionProperty m_RightControllerTrackingStateAction = new InputActionProperty(new InputAction("Tracking State", expectedControlType: "Integer"));

        private void Awake()
        {
            m_LeftControllerTrackingStateAction.action.performed += OnLeftControllerDetectedTracking;
            m_LeftControllerTrackingStateAction.action.canceled += OnLeftControllerDetectedTracking;
            m_LeftControllerTrackingStateAction.action.Enable();

            m_RightControllerTrackingStateAction.action.performed += OnRightControllerDetectedTracking;
            m_RightControllerTrackingStateAction.action.canceled += OnRightControllerDetectedTracking;
            m_RightControllerTrackingStateAction.action.Enable();
        }

        private void OnLeftControllerDetectedTracking(InputAction.CallbackContext context)
        {
            int trackingState = context.ReadValue<int>();
            Debug.Log("Left controller tracking state is " + trackingState);
            if (trackingState > 0)
            {
                // Send event to ack that the controller is currently tracked
            }
            else if (trackingState == 0)
            {
                // Send event to ack that the controller is currently not tracked
            }
            else
            {
                Debug.LogError("Left Controller - Invalid tracking state: " + trackingState);
            }
        }

        private void OnRightControllerDetectedTracking(InputAction.CallbackContext context)
        {
            int trackingState = context.ReadValue<int>();
            Debug.Log("Right controller tracking state is " + trackingState);
            if (trackingState > 0)
            {
                // Send event to ack that the controller is currently tracked
            }
            else if (trackingState == 0)
            {
                // Send event to ack that the controller is currently not tracked
            }
            else
            {
                Debug.LogError("Right Controller - Invalid tracking state: " + trackingState);
            }
        }
    }
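
    As a side note, the two actions above have no bindings in code; I assign them in the Inspector. If you prefer to create the bindings in code, a minimal sketch (assuming the generic Input System XRController layout, which exposes an integer "trackingState" control) could look like this:

    // Sketch only: binding the actions in code instead of the Inspector.
    // Assumes the generic XRController layout and its "trackingState" control.
    m_LeftControllerTrackingStateAction = new InputActionProperty(
        new InputAction("Tracking State", expectedControlType: "Integer",
            binding: "<XRController>{LeftHand}/trackingState"));
    m_RightControllerTrackingStateAction = new InputActionProperty(
        new InputAction("Tracking State", expectedControlType: "Integer",
            binding: "<XRController>{RightHand}/trackingState"));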


  2. I am developing an application targeting the Vive XR Elite with Unity 2021.3.30 LTS. My application needs to detect when a motion controller is connected or disconnected. I've implemented this functionality using Unity's InputDevices.deviceConnected and InputDevices.deviceDisconnected events.

    The code works as expected when running on my PC with a Vive Pro headset + controllers, but it fails to detect controller connections and disconnections in the Vive XR Elite build. I am logging errors and there aren't any; it seems the events are simply never fired.

    Here's a simplified version of my code:

    Can someone help me understand whether there is something else I can use to achieve this?


    using System;
    using UnityEngine;
    using UnityEngine.XR;

    public class ControllerConnectionWatcher : MonoBehaviour
    {
        public event Action<ControllerName> OnControllerConnected;
        public event Action<ControllerName> OnControllerDisconnected;

        public enum ControllerName
        {
            Left,
            Right
        }

        private void OnEnable()
        {
            InputDevices.deviceConnected += OnDeviceConnected;
            InputDevices.deviceDisconnected += OnDeviceDisconnected;
        }

        private void OnDisable()
        {
            InputDevices.deviceConnected -= OnDeviceConnected;
            InputDevices.deviceDisconnected -= OnDeviceDisconnected;
        }

        private void OnDeviceConnected(InputDevice device)
        {
            Debug.Log("Device connected: " + device.name);
            if ((device.characteristics & InputDeviceCharacteristics.Left) != 0)
            {
                Debug.Log("Left Controller Connected");
                OnControllerConnected?.Invoke(ControllerName.Left);
            }
            else if ((device.characteristics & InputDeviceCharacteristics.Right) != 0)
            {
                Debug.Log("Right Controller Connected");
                OnControllerConnected?.Invoke(ControllerName.Right);
            }
        }

        private void OnDeviceDisconnected(InputDevice device)
        {
            Debug.Log("Device disconnected: " + device.name);
            if ((device.characteristics & InputDeviceCharacteristics.Left) != 0)
            {
                Debug.Log("Left Controller Disconnected");
                OnControllerDisconnected?.Invoke(ControllerName.Left);
            }
            else if ((device.characteristics & InputDeviceCharacteristics.Right) != 0)
            {
                Debug.Log("Right Controller Disconnected");
                OnControllerDisconnected?.Invoke(ControllerName.Right);
            }
        }
    }
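
    In case the events really never fire on device, a polling fallback built on InputDevices.GetDevices (the call mentioned earlier in this thread) might work around it. This is only a sketch of that idea, not a confirmed fix, and the class name is mine:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch of a polling fallback: instead of relying on the
    // deviceConnected/deviceDisconnected events, check every frame
    // which controllers are currently present and report changes.
    public class ControllerPresencePoller : MonoBehaviour
    {
        private bool m_LeftPresent;
        private bool m_RightPresent;
        private readonly List<InputDevice> m_Devices = new List<InputDevice>();

        private void Update()
        {
            // GetDevices clears the list and fills it with all connected devices.
            InputDevices.GetDevices(m_Devices);

            bool left = false, right = false;
            foreach (var device in m_Devices)
            {
                if ((device.characteristics & InputDeviceCharacteristics.Controller) == 0)
                    continue;
                if ((device.characteristics & InputDeviceCharacteristics.Left) != 0)
                    left = true;
                else if ((device.characteristics & InputDeviceCharacteristics.Right) != 0)
                    right = true;
            }

            if (left != m_LeftPresent)
            {
                m_LeftPresent = left;
                Debug.Log("Left controller " + (left ? "connected" : "disconnected"));
            }
            if (right != m_RightPresent)
            {
                m_RightPresent = right;
                Debug.Log("Right controller " + (right ? "connected" : "disconnected"));
            }
        }
    }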


  3. I am thinking of getting a new development machine, and I was wondering if anyone is developing on a laptop. I haven't yet managed to get a simple Unity project working with the Direct Streaming that comes with WaveXR, and for me testing without making a build is fundamental (I cannot wait 20 minutes every time I want to test something). For this reason I am thinking of upgrading the machine I am using.

    I am thinking of getting something high-end (https://www.scan.co.uk/products/173-3xs-vengeance-4080-240hz-qhd-169-12gb-nvidia-rtx-4080-intel-core-i9-13900hx-64gb-ddr5-2tb-ssd-wi), but I am worried that laptops might have problems with the WaveXR Direct Streaming setup.

    Does anyone have any experience using a laptop with WaveXR Direct Streaming?

    Thank you for your help.


  4. I am using OpenGL to develop for the VIVE XR Elite standalone (so I don't have much choice besides OpenGL). I haven't managed to get my setup fully working with Vive Streaming either, but I can stream despite having some errors in the editor. So I think the problem might simply be the Unity version you are using. What version are you using?

  5. As far as I know, and as @Huxxley said, you can only use the OpenXR or Wave SDK to make a build that works standalone. In particular, if you use the Wave SDK, it will pop up the recommended changes to the Player Settings, which is convenient for making a quick build. Things to check are similar to builds for the Focus or Oculus Quest: use OpenGLES3, set ARM64 as the target architecture, remove any post-processing if using URP, and so on; a rough editor-script sketch of these settings follows.
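
    For reference, a minimal editor-script sketch of those settings (the class name and menu path are mine, and the exact values should be double-checked against the Wave/OpenXR docs):

    using UnityEditor;
    using UnityEngine.Rendering;

    // Sketch only: applies the Android player settings mentioned above.
    public static class ViveStandaloneBuildSettings
    {
        [MenuItem("Tools/Apply Vive Standalone Settings")]
        public static void Apply()
        {
            // IL2CPP is required for ARM64 Android builds.
            PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android, ScriptingImplementation.IL2CPP);
            PlayerSettings.Android.targetArchitectures = AndroidArchitecture.ARM64;

            // Force OpenGLES3 instead of the default graphics API list.
            PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
            PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
                new[] { GraphicsDeviceType.OpenGLES3 });
        }
    }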

  6. I haven't used the Vive trackers personally, but the passthrough is achievable using the Vive OpenXR plugin, and that's exactly the guide you are going through. I am using the Unity 2021.3 LTS version. Also, I am using the XR Elite, so I am not sure how it will be on the Focus 3. It will take time to adjust the settings to make it work, as unfortunately the documentation from HTC is not really good and things are scattered all around, but it is doable.

  7. Hello everyone, 

    I am trying to use a sample project to see how best to develop for the XR Elite, and I am having a really hard time setting up Direct Preview using the Wave Vive registry. I believe I have the correct setup (laptop with an NVIDIA GTX 1060).

    I have mainly followed this guide: https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRDirectPreview.html

    1. WaveXR correctly selected in XR Plug-in Management
    2. HTC device drivers correctly installed, Windows Firewall set with the correct permissions
    3. adb path correctly set
    4. Install Device APK -> fails every time (see the adb note below this list)
      1. Ensured that the full path does not contain any spaces (D:\Projects\DiamondDiggers_XR_noURP\Library\PackageCache\com.htc.upm.wave.xrsdk@5.3.1-r.2\Runtime\DirectPreview\Binary\RRClient\Vive_rrClient.apk)
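
    One thing that might help narrow down the failing install (just a guess, not a confirmed fix): try installing that Vive_rrClient.apk manually from a command prompt with adb install -r "<path to the apk>", since adb prints the actual failure reason rather than a generic error.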

    I have set up SteamVR on the laptop, and I am able to connect to the laptop correctly using VIVE Business Streaming.

    Can someone please help me troubleshoot this?
    Also, how does the Unity plugin recognize whether it needs to start a Wi-Fi connection?

    Is there any Discord community that helps with VR development?

    Any help will be greatly appreciated,

    Bye

  8. Hi everyone,

    I have developed in the past for the Vive / Vive Pro using Unity, and I have been happily using the SteamVR package for that.
    I recently got a Vive XR Elite, and I am wondering if I need a different package for development.

    At the moment I will still develop full VR. With this device in mind, and considering it will be an Android application, what is the SDK I should use in Unity to get started?

    Also, are there any Patreon creators/YouTubers you would recommend browsing for references? I am not expecting to do more than controller pointer interaction to start with.

    Any ideas and suggestions would be much appreciated.

    Thank you!
