Posts posted by Alex_HTC

  1. Howdy @Maple Bay

    There are two options:
    1) Kiosk mode - set boundaries larger than needed and drop users into 'kiosk mode'; they will not see the boundaries of the space and have no option to redraw them.
    2) LBSS - our location-based software suite has a) ways to lock headsets down, control them, and reset them to a consistent state, as well as b) tracking options and modes that avoid anything boundary/map related: https://business.vive.com/mea-en/support/vive-lbss/category_howto/location-based-entertainment-(lbe)-mode.html
    LBSS is what most large arcades and event operators use, since it also provides a device management suite, software updates, system version control, and other features that are useful when managing more than a few headsets at a time.

    -Alex

  2. @1099

    Preventing aliasing is a big subject in gamedev forums, and there may be some good hints there

    I can't tell for sure, but try disabling 'foveation mode', as it seems like this happens disproportionately on the left side of the image. Fixed foveated rendering increases performance, but can create additional artifacts outside of where the user's eye looks 'forward'.

    Some of the other solutions from the gamedev community may also be useful - avoiding high-contrast straight lines like that, and maybe adjusting the shadow quality.
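    If you want to poke at those from script, here's a minimal sketch using Unity's built-in quality settings (values are just examples; with URP, the MSAA sample count lives on the URP asset instead):

    using UnityEngine;

    public class AntiAliasingTweaks : MonoBehaviour
    {
        void Start()
        {
            // MSAA sample count for the built-in pipeline (0, 2, 4 or 8)
            QualitySettings.antiAliasing = 4;

            // Raise shadow quality/resolution to reduce shimmer on shadow edges
            QualitySettings.shadows = ShadowQuality.All;
            QualitySettings.shadowResolution = ShadowResolution.High;
        }
    }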

  3. @1099 I'm guessing there is some confusion - I think he meant to mention

    Interop.GetRenderTargetSize(ref uint width, ref uint height)

     

    Under the hood, Tony was referring to this API, which is not in that script:
    https://hub.vive.com/storage/docs/en-us/WVR_GetRenderTargetSize.html
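    For reference, a minimal sketch of querying it from a script (assuming the Wave native wrapper lives in the Wave.Native namespace; depending on the SDK version the wrapper name may carry a WVR_ prefix, i.e. Interop.WVR_GetRenderTargetSize):

    using UnityEngine;
    using Wave.Native;   // assumption: namespace of the Wave Interop wrapper

    public class RenderTargetSizeLogger : MonoBehaviour
    {
        void Start()
        {
            uint width = 0, height = 0;
            // Signature as quoted above; reports the recommended per-eye render target size
            Interop.GetRenderTargetSize(ref width, ref height);
            Debug.Log($"Recommended render target size: {width} x {height}");
        }
    }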


    In regards to:

    Quote

    UnityEngine.XR.XRSettings.eyeTextureResolutionScale = scale;


    The answer to this depends on your render pipeline and, I believe, a few other things.

    From a quick look, I believe URP doesn't support changing it through this API; only the built-in render pipeline does.
    I believe URP exposes this instead as the "Render Scale" slider on the URP asset, described here: https://docs.unity3d.com/Packages/com.unity.render-pipelines.universal@10.4/manual/universalrp-asset.html
    It's also discussed in this thread: https://forum.unity.com/threads/windows-mixed-reality-eyetextureresolutionscale-not-working.1097992/
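    If you need to adjust it at runtime under URP, a minimal sketch (assuming the active pipeline asset is a UniversalRenderPipelineAsset; 0.8 is just an example value):

    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class RenderScaleSetter : MonoBehaviour
    {
        void Start()
        {
            // Grab the active URP asset and adjust its Render Scale
            if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urpAsset)
                urpAsset.renderScale = 0.8f;
        }
    }

    Note that in the editor this modifies the URP asset itself, so the value persists after you leave play mode.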



    There are also versions of the pipeline with known issues - it looks like there is an issue with URP 10.0.x and earlier: https://docs.unity3d.com/Packages/com.unity.render-pipelines.universal@16.0/changelog/CHANGELOG.html

    Perhaps I'm misremembering, but I seem to recall that some more unusual configurations require finding a reference to the XR display subsystem and setting "scaleOfAllViewports": https://docs.unity3d.com/ScriptReference/XR.XRDisplaySubsystem-scaleOfAllViewports.html
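    Roughly, that would look like the following sketch (the 0.8 value is just an example):

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class ViewportScaleSetter : MonoBehaviour
    {
        void Start()
        {
            // Find the active XR display subsystem(s) and scale every viewport
            var displays = new List<XRDisplaySubsystem>();
            SubsystemManager.GetInstances(displays);
            foreach (var display in displays)
                display.scaleOfAllViewports = 0.8f;
        }
    }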


    That said, typically aliasing is reduced through MSAA settings and other filtering (including using proper blending and MIP maps)

  4. @waveSk The way to submit custom frames to each eye (or to both eyes) is through standard Unity mechanisms, not through device-specific code:

    1. Use a camera with an appropriate eye mask that shows only your texture for the left or right eye, and put the texture you're trying to submit on it. This can lean on left-eye-only and right-eye-only layer masks on the camera. That would be my first try; then I would look at similar things in a multipass setup if that didn't work. You'll find many examples that do something like this using this API https://docs.unity3d.com/ScriptReference/Camera-stereoTargetEye.html with left-only/right-only target eyes and layer masks (see the sketch after this list).
    2. I'm sure there are more, though certainly the low-level graphics APIs are another way if performance concerns rule out the left/right eye flags on cameras. These are described at https://docs.unity3d.com/Manual/NativePluginInterface.html and https://github.com/Unity-Technologies/NativeRenderingPlugin
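    A minimal sketch of option 1 (the "LeftEyeOnly" layer name and the overall setup are illustrative - you would put the quad carrying your texture on that layer):

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class LeftEyeOverlayCamera : MonoBehaviour
    {
        void Start()
        {
            var cam = GetComponent<Camera>();
            cam.stereoTargetEye = StereoTargetEyeMask.Left;      // submit to the left eye only
            cam.cullingMask = LayerMask.GetMask("LeftEyeOnly");  // render only the layer holding your texture quad
        }
    }

    A second camera set to StereoTargetEyeMask.Right with a "RightEyeOnly" layer covers the other eye.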
  5. Hi @Linkone

    Just for a sanity check:

    1. Are the controllers otherwise accessible - do render models and/or other visualizations show up on the relevant tracked pose driver/controller?
    2. Is this OpenXR or the Wave SDK we're talking about here?
    3. Does the interaction profile list include the Vive-specific "Vive Focus 3 Controller" profile, or only the generic XR controller?
    4. What devices are listed in InputDevices.GetDevices https://docs.unity3d.com/2020.1/Documentation/ScriptReference/XR.InputDevices.GetDevices.html? The event for registering the device may be firing before the connection, and this call reports all devices that are currently connected, regardless of when the script runs or events fire (see the sketch after this list).
    5. Is there an active device on the relevant profile? 
    6. As a sanity check - do these repos work for you:
      1. https://github.com/ViveDeveloperRelations/MinimalRepo/tree/wave_controller_examples - does the wave controller sample scene on the "wave_controller_examples" branch work?
      2. If on OpenXR: https://github.com/hardcoded2/Updated-XR-Interaction-Toolkit-Examples/tree/openxr - does the "openxr" branch work for you?
      3. If on Wave: https://github.com/ViveDeveloperRelations/XR-Interaction-Toolkit-Examples/tree/wave_openxr_xr_origin - does that branch work for you in the "vr" project?
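    Regarding point 4, a minimal sketch of listing everything currently connected:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class ConnectedDeviceLogger : MonoBehaviour
    {
        void Start()
        {
            // Lists every XR input device Unity currently sees, regardless of when it connected
            var devices = new List<InputDevice>();
            InputDevices.GetDevices(devices);
            foreach (var device in devices)
                Debug.Log($"{device.name} | {device.characteristics}");
        }
    }

    Call it again later (e.g., from a button press) if the controllers might connect after startup.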

     

  6. Howdy @BanjoCollie

    I hear you're having issues getting OpenXR support working on your XR Elite using Unity.

    From a top level, to rule out environmental issues - try downloading https://github.com/ViveDeveloperRelations/MinimalRepo/tree/openxr_hand_tracking the "openxr_hand_tracking" example and see if that works.

    There are two things that come to mind:
    1) Make sure you have an XR rig and not a regular camera.
    2) Try removing the Meta plugin from the project and do a clean build (remove the Library folder). The Meta OpenXR loader is non-standard and causes issues with every other OpenXR platform at the moment, and last I checked its Android manifest post-processing also runs unconditionally. I think this may be patched in some versions, but regardless it is a source of errors for a lot of folks.

    -Alex
     

  7. Howdy @waveSk

    Some parts are meant to be used together in a certain way.

    For remote rendering, there are a few approaches you can take depending on what you're doing.

    You could just warp the frame in Unity and use it in the normal rendering pipeline. If it's stereo, use the eye mask to make sure each image shows up in the intended eye. This would be a good starting place.
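    For the warp step itself, a rough sketch - remoteFrame, warpMaterial, and eyeQuad are placeholders for the texture you receive, a material whose shader applies your warp, and a quad sitting on a left- or right-eye-only layer (none of these come from a specific SDK):

    using UnityEngine;

    public class RemoteFrameWarp : MonoBehaviour
    {
        public Texture remoteFrame;     // placeholder: frame received from the remote renderer
        public Material warpMaterial;   // placeholder: shader that applies your reprojection/warp
        public Renderer eyeQuad;        // placeholder: quad on a per-eye layer

        RenderTexture warped;

        void Update()
        {
            if (warped == null)
                warped = new RenderTexture(remoteFrame.width, remoteFrame.height, 0);

            Graphics.Blit(remoteFrame, warped, warpMaterial);   // GPU-side warp into a render texture
            eyeQuad.material.mainTexture = warped;              // fed back into the normal rendering path
        }
    }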

    If you're going deeper, you can keep going until you hit writing native Unity graphics plugins for hybrid rendering or other uses, like https://docs.unity3d.com/Manual/NativePluginInterface.html and https://github.com/Unity-Technologies/NativeRenderingPlugin

    Hope this helps!
    -Alex

  8. @Laksh


    The hand visualizer is up and running here with the "hand tracking subsystem" - just not the hand interaction poses yet.

    (screenshot of the hand visualizer running)

    i think the "hand interaction poses" should work in this repo in the test/hand_gestures branch -- I haven't tested the main scene in the xri stuff yet, but the hand visualizer test in the xr hands package is up in working --= the hands get visualized. There's some issue with the main hand shader (probably due to some import issue - i don't think this test started on a clean branch) so it doesn't show, but the hand visualizer does show the bones/etc correctly - it's just the skinned hand doesn't render properly. : 
    https://github.com/hardcoded2/Updated-XR-Interaction-Toolkit-Examples/tree/test/hand_gestures

    Looking at the "hand interaction poses" and the "palm poses" to see how much work it might be. It looks like it depends the non-standard openxr loader as a dependency right now (which could can be modified for demo purposes) and might need some minor glue code, so i'm not sure about timelines. That said, hopefully this gets you a little further than you are at the moment 🙂

    Always looking for more feedback - I'm personally excited about the XR Hands work and am a big fan of one of the package's authors, Eric Provencher - genuinely a nice person.
     



  9. It is my understanding that the hand demo is a little new and technically may not have cross-vendor support. I'm interested to see what is needed; I can't imagine it's too much work, since AFAIK it may just be a matter of adding a profile to the interaction profile list or making sure the correct OpenXR extension is enabled. The way the OpenXR extension has been implemented has shifted a bit from my understanding, which may be part of what you're seeing.

  10. @1099

    Howdy!

    It seems there was an assumption in the tutorial that the ANDROID_HOME variable is set. This is common for native Android folks, but less so for Unity Android folks.

    You can fix this by using the fully qualified path to the platform tools on your machine (or by pointing ANDROID_HOME at your SDK install).

    For me, I've installed the Android tools in "E:\env\AndroidSDK\platform-tools"
    Most others have installed the Android tools along with the Unity editor itself, in which case the path would look similar to (but with a different version number) "C:\Program Files\Unity\Hub\Editor\2022.2.15f1\Editor\Data\PlaybackEngines\AndroidPlayer\SDK\platform-tools"


    Thanks,
    Alex

  11. Howdy @throni3,

    I'm unsure what the issue would be, but I would guess that config files and services can be in different places on different distros. Or maybe there's a driver incompatibility, or a dependency like libcurl or libpangoft2 isn't being satisfied. I am unsure who runs this configuration internally, but it's an interesting one! Information about Steam config files/setup might be more readily available on the SteamVR Linux forums or similar: https://steamcommunity.com/app/250820/discussions/5/

    SteamVR is needed for setup and integration. The device is part of the SteamVR suite, which also covers other parts of the setup like boundary, onboarding, rendering, the user interface, and managing other devices. I think there are some fairly unsupported ways of getting things to work without SteamVR, but to my understanding those are typically for very narrow use cases.

    There may be another way to get this sort of setup by using Vive trackers and/or third-party accessories. We do expose electrical pins that can simulate much of the controller, which might be another approach, apart from side-stepping SteamVR.

    -Alex

  12. Error 001 in Vive Console usually means it thinks the headset is not connected, which can be caused by a physical connection problem or by a driver (software) issue. If the driver for this device is not loaded correctly, the Vive Cosmos will not work properly.

    Here are some things you can try to fix the driver issue and get your Vive Cosmos working again:

    -Update the drivers for your Vive Cosmos. You can download the latest drivers from the HTC website.
    -Make sure that the Vive Cosmos is plugged into a USB 3.0 port. USB 2.0 ports may not have enough power to power the Vive Cosmos properly.
    -Try a different USB cable. The USB cable that came with your Vive Cosmos may be damaged.
    -Reset your Vive Console. To do this, close the Vive Console app and then unplug the power cable from the link box. Wait for a few seconds, and then plug the power cable back in.
    -Try rebooting to ensure that the latest driver loads
    -If you are still having trouble, contact HTC support for help here https://contact.vive.com/hc/en-us/requests/new

  13. On 7/31/2023 at 12:43 PM, sejet86196 said:

    How is it that the XR Elite costs over a grand and you can't turn off the boundary? Or even easily reconfigure it without going into another area that you don't have a recently configured playspace for.

    Having to go "to the lobby" to access the full settings menu is already an unnecessary pain, but then there's hardly useful fleshed out settings there once you do.

    You can turn off the boundary by using 'instant mode' in any system software past 4.0.
    It was first described here:

     

  14. @1099 Currently, the Oculus loader uses non-standard modifications and is incompatible with all other OpenXR platforms. This is slowly being addressed in the OpenXR spec. For you, this means devs need to make sure to disable the Oculus extension, open Unity, and enable our extension; otherwise the non-standard OpenXR loader remains in use.
    Also make sure that the Oculus Android manifest post-processing isn't getting executed, which happened with a lot of the older versions of the plugin; that can be resolved by removing the Oculus package from the Package Manager as well (see the sketch below).
    I believe the opposite may be true as well, so typically I would keep different platforms on different branches.
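    If you want to script that removal, a quick editor-only sketch (assuming the package in question is com.unity.xr.oculus; the Package Manager window works just as well):

    using UnityEditor;
    using UnityEditor.PackageManager;

    public static class RemoveOculusPackage
    {
        [MenuItem("Tools/Remove Oculus XR Package")]
        static void Remove()
        {
            // Assumption: the Oculus XR plugin's package id is com.unity.xr.oculus
            Client.Remove("com.unity.xr.oculus");
        }
    }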
