
Alex_HTC

Moderators
  • Posts

    291
  • Joined

  • Last visited

Posts posted by Alex_HTC

1. @nekton Howdy!

    It looks like there is some warping around the hands in the image above.
There are some trade-offs between hand tracking and passthrough that go into the image composition. Currently our OpenXR plugin does not expose this yet, but our Wave SDK exposes a function that lets the developer prioritize either scale or position, which affects the artifacts pointed out above. https://hub.vive.com/storage/app/doc/en-us/UnityXR/UnityXRPassthrough.html#set-passthrough-image-focus

Right now, I'm unsure if there is any way to do the same in OpenXR, but I'll inquire further.
    Thanks,
    Alex

    • Thanks 1
  2. @vethetheth

    Howdy!

It looks like there may be an issue - I was able to reproduce it on my end and we'll see what we can do. @Lawrence may have some better suggestions as well.

For starters - I'm not sure you should be seeing "pink". I was able to get the hands up and running, but they are invisible.
So it looks like the materials in your project haven't undergone the standard upgrade process -

To convert your materials to URP:
1. Go to Window > Rendering > Render Pipeline Converter.
2. In the Render Pipeline Converter window, select Built-in to URP from the Convert From dropdown menu.
3. Check the Material Upgrade checkbox and click the Initialize Converters button.
4. Click the Convert Assets button to convert your materials to URP.


    You should see something like this
[screenshot]

If you can zoom in a bit (or squint hard) you may see that there is no material converter for the hands. This should at least get you to a spot where the hands are invisible. Or if you want, you can start from the project where I got to this point, for reference: https://github.com/hardcoded2/viu_test So at least we should be on the same page now.

    Ok, for a workaround until the team can look at it -- 

Go to the slider next to the RenderModelHook component and apply a pre-configured version of it as a preset:
[screenshots]

And then you can save that new RenderModelHook preset as a default, so when you create a new component it will be set up this way. You can do this by clicking the "add to RenderModelHook default" button.
[screenshot]
     
Note that if you want to change the model instead of the override shader (since you indicated this was the desired behavior), it should work all the same.

    Thanks for reaching out!

    -Alex

     

  3. @Mike Ratcliffe Hi Mike!

    I'm not sure about the older versions of the sdk at the moment.

    Which bug are you looking at?

I am currently looking at Wolvic code and doing builds for it - so I guess you got the right person here 🙂 If you haven't seen the 1.3.2 release, it is on the store here https://www.viveport.com/apps/142c54df-b271-4a23-900f-21bb045829db

The current version of Wolvic that builds against a more recent version of Wave is at https://github.com/hardcoded2/wolvic/tree/asink/132mergemain - I went ahead in this branch and checked in the artifacts, since the previous scripts and structure were quite different. The next steps are to look into moving all of our code to OpenXR, which should be interesting.

    -Alex

    • Like 1
  4. @MercJones
I'm guessing you're talking about a PC-based device?

It's worth noting that the plugin referred to hasn't been updated in 2 years, so this is likely part of the trouble.

Make sure that you're using a VR rig, that the XR management subsystem is enabled for the target device, and that the project works with some known-good examples. We provide samples for both Wave and OpenXR for this reason, and other toolkits likely have similar scenes. I also seem to remember that some older SDKs are incompatible with the newer XR plugin system introduced after Unity 2019, so that may also be a blocker here, but maybe my memory doesn't serve me.

If Wave isn't to your liking, OpenXR is the standard cross-platform toolkit and we support that - try https://developer.vive.com/resources/openxr/

    And we're always here to address questions or help with any issues if you choose to revisit using wave as well!
     

  5. @Linkone

    Howdy!

    We recommend using wave or openxr.

    Wave getting started: https://developer.vive.com/resources/vive-wave/tutorials/getting-started-wave-unity-developers/
    OR
    Openxr getting started: https://forum.vive.com/topic/12959-tutorial-focus3-openxr-open-beta-getting-started-in-unity/

Both work well for basic interaction. Install the samples and you'll see examples that show the controllers and how they can be used to interact with objects and/or UI 🙂 And don't hesitate to ask questions!
     

  6. @vanontip 
    Howdy!

At quick glance - it looks like you have an old version of the Vive Console (the current one is 2.1.23.2 and the version in your screenshot is much lower at 2.0.23.2) - try updating the "Vive Console" software.

    Either here https://www.vive.com/us/setup/pc-vr/ or through steam here https://store.steampowered.com/app/1635730/VIVE_Console_for_SteamVR/


USB issues can be tricky to diagnose, but can frequently be figured out with some trial and error. In the worst cases, adding an external PCI card for USB connectivity has solved everything outside of normal bugs - e.g. an old 'haunted' machine that I had once.

    Our official docs for this seem to be at https://www.vive.com/us/support/vive/category_howto/headset-not-detected-due-to-usb-issue.html

My shorthand for issues I've seen:
1. Make sure you're using the supplied cables - there are different USB specs now that can be difficult to tell apart as a user, and using the provided cables ensures you don't run into this issue.
2. Plug into different USB ports on your computer.
3. Reboot and make sure all of the software components are up to date (including Windows/display drivers as well as SteamVR and Vive Console).

    In the worst case, there are some techniques described by users in threads like this:
    https://www.reddit.com/r/Vive/comments/767coj/my_computer_cant_seem_to_find_the_htc_vive_usb/
    and articles like this:

    https://www.drivethelife.com/htc-vive-usb-errors-and-troubleshooting/

      Thanks,
    Alex

  7. Howdy @Martina_H

1) It looks like your gaze position is correct. The gaze direction is a vector that points from the eye to the point on the screen where the user is looking. When you transform this vector, you are essentially converting it from eye coordinates to world coordinates. This is necessary in order to render the gaze point on the video.
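
A minimal sketch of that eye-to-world transform (assuming the gaze direction arrives in eye/camera-local coordinates; the localGazeDirection variable here is hypothetical, standing in for whatever your eye-tracking API returns):

        // Hypothetical input: gaze direction in eye (camera-local) space.
        Vector3 localGazeDirection = Vector3.forward;

        // Convert the direction into world space using the eye camera's
        // transform (direction only - no translation is applied).
        Transform eye = Camera.main.transform;
        Vector3 worldGazeDir = eye.TransformDirection(localGazeDirection);

        // A world-space gaze point some distance along that ray:
        Vector3 gazePoint = eye.position + worldGazeDir * 2.0f;

TransformDirection ignores the camera's position, which is what you want for a direction vector; use TransformPoint instead if your tracker gives you a point rather than a direction.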

2) To play a 3D side-by-side video (also referred to as stereoscopic video) on a Game Object, you can use the following steps:

    1. Create a new Game Object and add a Video Player component to it.
    2. Set the Video Player component's source to the 3D side by side video that you want to play.
3. Set the Video Player's 3D layout to "Side By Side" (the targetCamera3DLayout property when rendering to a camera plane).
4. Position the video object in front of the user.

    An example code snippet for stereo is 
     

        // Create a new Game Object and add a Video Player component to it.
        GameObject videoObject = new GameObject("StereoVideo");
        VideoPlayer player = videoObject.AddComponent<VideoPlayer>();

        // Point the player at the video. Note that this needs to be a direct
        // video file URL - a YouTube page URL will not play here.
        player.source = VideoSource.Url;
        player.url = "file://" + Application.streamingAssetsPath + "/3Dtest.mp4";

        // Render onto the camera plane and mark the footage as side-by-side 3D.
        player.renderMode = VideoRenderMode.CameraNearPlane;
        player.targetCamera = Camera.main;
        player.targetCamera3DLayout = Video3DLayout.SideBySide3D;

        // Position the object in front of the user (positive z for a
        // default camera at the origin).
        videoObject.transform.position = new Vector3(0, 0, 1);

Note that the path to the video will be relative to the build - so if you want to build and run the app and always have it include the video, you could put the video inside the StreamingAssets folder in your project and then use Application.streamingAssetsPath to reference it, as described here https://docs.unity3d.com/ScriptReference/Application-streamingAssetsPath.html . Interestingly enough, there is an example at that link showing how to use the Unity video player component as well.

You could also use the standard Vive Media Decoder, split the material down the middle yourself, and show one half to each eye using layer masks: make a VR rig with two cameras, one for the left eye and one for the right, each with a culling mask excluding the other eye's layer.
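
A rough sketch of that two-camera rig (the layer names here are assumptions - you'd create them under Tags & Layers and put each half of the split material on the matching layer):

        // One camera per eye; each one culls the other eye's layer.
        Camera left = new GameObject("LeftEyeCam").AddComponent<Camera>();
        left.stereoTargetEye = StereoTargetEyeMask.Left;
        left.cullingMask = ~(1 << LayerMask.NameToLayer("RightEyeOnly"));

        Camera right = new GameObject("RightEyeCam").AddComponent<Camera>();
        right.stereoTargetEye = StereoTargetEyeMask.Right;
        right.cullingMask = ~(1 << LayerMask.NameToLayer("LeftEyeOnly"));

        // Then put each per-eye quad on its own layer, e.g.:
        // leftQuad.layer = LayerMask.NameToLayer("LeftEyeOnly");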

    Thanks,
    Alex

  8. @jiarong701
    Howdy!

    This is a great question!

We introduced APIs to let developers decide which is more important for them - scale or alignment - for their specific application: https://hub.vive.com/storage/app/doc/en-us/UnityXR/UnityXRPassthrough.html#alignment While all of the modes will get better over time as we fine-tune things, we want you to be able to make decisions that work for your app.

    -Alex

9. @Schreppy Grabbing objects tends to be done at the application layer, by creating colliders on the finger joints provided by the device APIs. Friends in the VR industry tend to customize this very heavily based on what's going on in the application, and usually they tell me that they start with some of the assets off the Asset Store for this purpose. We provide the bone positions, and the application uses those to determine if something should be "grabbed".
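
A minimal sketch of that collider-on-joints idea (this component and its threshold are illustrative, not part of our SDK - you'd attach something like it to each joint transform the hand-tracking API gives you, and the grab decision itself stays application-specific):

        public class FingerJointCollider : MonoBehaviour
        {
            void Start()
            {
                // Small trigger sphere so the joint can overlap grabbables.
                SphereCollider c = gameObject.AddComponent<SphereCollider>();
                c.isTrigger = true;
                c.radius = 0.01f; // ~1 cm, tune per application

                // Kinematic body: joints follow tracking, not physics.
                Rigidbody rb = gameObject.AddComponent<Rigidbody>();
                rb.isKinematic = true;
            }

            void OnTriggerStay(Collider other)
            {
                // Application-level grab logic goes here, e.g. grab when
                // enough joints overlap one object while the hand is closed.
            }
        }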

Another approach is to use the interactables in the Unity XR Interaction Toolkit and the 'select' gesture (pinch), which works at the system level with OpenXR, or use our Wave native XR SDK and our hand gesture recognizer.

     

    • Like 1
  10. @JulienActiveMe

On second thought,
Typically grab is implemented using physics - setting up colliders/rigidbodies on the hand joints themselves. The joint poses are exposed by the OpenXR API itself, and application-level logic will dictate if there is a collision and/or grab.

The deeper specifics for this device are in the ViveHandInteraction.cs file in the standard OpenXR plugin - so make sure that the project is compiling as well, as otherwise this profile may not show up for you.

  11. @Vector B

These differences between various controllers are part of why we offer packages like VIU ( https://github.com/ViveSoftware/ViveInputUtility-Unity or https://assetstore.unity.com/packages/tools/integration/vive-input-utility-64219 ) - since the origin and other attributes of controllers are a little different.

There are other approaches, but the raw offsets are available in that package, and there are a few other toolkits to better manage the differences in origin/etc. between various controllers for more specific use cases.

  12. @JulienActiveMe

    I hear that these things can be a little complicated. 

Make sure to install the "Vive OpenXR ToolKit - Android" package, as that is what adds support for the "selected" feature; you should then see the "Vive XR Hand Interaction" profile:
[screenshot]

With this, my understanding is that we support the "selected" event (by way of a pinch gesture) using the OpenXR runtime at the moment, and that additional hand gestures are in progress under the OpenXR hand gestures work. I may be a bit behind on this, and maybe @Tony PH Lin knows of any additional gestures supported by OpenXR in the XR Interaction Toolkit.

    If you switch to our wave native hand gestures, there are some additional hand gestures documented here https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRHand.html


     
