Everything posted by Alex_HTC

  1. Try updating SRanipal to the latest version at https://developer.vive.com/resources/vive-sense/eye-and-facial-tracking-sdk/download/latest/ I'm aware of a few changes in the past year that would be resolved by a newer SRanipal SDK.
  2. @nekton Howdy! It looks like there is some warping around the hands in the image above. There are some trade-offs between hand tracking and passthrough that go into the image composition. Currently we do not expose this in our OpenXR plugin yet, but in our Wave SDK we expose a function that lets the developer prioritize either scale or position, which affects the artifacts pointed out above. https://hub.vive.com/storage/app/doc/en-us/UnityXR/UnityXRPassthrough.html#set-passthrough-image-focus Right now I'm unsure whether there is any way to do the same in OpenXR, but I'll inquire further. Thanks, Alex
  3. @vethetheth Well, that's interesting. I re-ran it this morning and saw this. Did a clean checkout and got the same result just now. Do you have Git LFS installed? It's available here https://git-lfs.com/ and will be necessary to get the texture assets.
  4. @vethetheth Can you try the repo provided as well? https://github.com/hardcoded2/viu_test
  5. @vethetheth Howdy! It looks like there may be an issue. I was able to reproduce it on my end and we'll see what we can do. @Lawrence may have some better suggestions as well. For starters - I'm not sure you should be seeing "pink". I was able to get the hands up and running, but they are invisible. So it looks like the materials in your project haven't undergone the standard upgrade process. To convert your materials to URP, go to Window > Rendering > Render Pipeline Converter. In the Render Pipeline Converter window, select Built-in to URP from the Convert From dropdown menu, check the Material Upgrade checkbox, and click the Initialize Converters button. Finally, click the Convert Assets button to convert your materials to URP. You should see something like this. If you zoom in a bit (or squint hard) you may see that there is no material converter for the hands. This should at least get you to a spot where the hands are invisible. Or, if you want, you can start from the project where I got to this point, for reference: https://github.com/hardcoded2/viu_test So at least we should be on the same page now. OK, for a workaround until the team can look at it: go to the slider next to the RenderModelHook component; there is a pre-configured version of this as a preset. You can then save that render model preset as a default, so when you create a new component it will be set up this way - you can do this by selecting the "add to RenderModelHook default" button. Note that if you want to change the model (since you indicated this was the desired behavior) instead of the override shader, it should work all the same. Thanks for reaching out! -Alex
  6. @Mike Ratcliffe Hi Mike! I'm not sure about the older versions of the SDK at the moment. Which bug are you looking at? I am currently looking at the Wolvic code and do builds for it - so I guess you got the right person here 🙂 If you haven't seen the 1.3.2 release, it is on the store here https://www.viveport.com/apps/142c54df-b271-4a23-900f-21bb045829db The current version of Wolvic that builds against a more recent version of Wave is at https://github.com/hardcoded2/wolvic/tree/asink/132mergemain - I went ahead in this branch and checked in the artifacts, since the previous scripts and structure are quite different. The next steps are to look into moving all of our code to OpenXR, which should be interesting. -Alex
  7. @MercJones I'm guessing you're talking about a PC-based device? It's worth noting that the plugin referred to hasn't been updated in 2 years, so that is likely part of the trouble. Make sure that you're using a VR rig, that the XR management subsystem is enabled for the target device, and that the project works with some known-good examples. We provide samples for both Wave and OpenXR for this reason, and other toolkits should have similar scenes. I also seem to remember that some older SDKs are incompatible with the newer XR plugin system introduced after Unity 2019, so that may also be a blocker here, but maybe my memory doesn't serve me. If Wave isn't to your liking, OpenXR is the standard cross-platform toolkit and we support it - try https://developer.vive.com/resources/openxr/ And we're always here to address questions or help with any issues if you choose to revisit using Wave as well!
  8. @Linkone Howdy! We recommend using Wave or OpenXR. Wave getting started: https://developer.vive.com/resources/vive-wave/tutorials/getting-started-wave-unity-developers/ OR OpenXR getting started: https://forum.vive.com/topic/12959-tutorial-focus3-openxr-open-beta-getting-started-in-unity/ Both work well for basic interaction. Install the samples and you'll see examples that show the controllers and how they can be used to interact with objects and/or UI 🙂 And don't hesitate to ask questions!
  9. @MercJones We do recommend using Wave (https://developer.vive.com/resources/vive-wave/tutorials/getting-started-wave-unity-developers/) or OpenXR. Which engine? Which device? Which versions of the firmware and engine? And what do you specifically mean by the SteamVR toolkit?
  10. @vanontip Howdy! At a quick glance, it looks like you have an old version of the Vive Console (the current one is 2.1.23.2 and the version in your screenshot is much lower at 2.0.23.2) - try updating the "Vive Console" software, either here https://www.vive.com/us/setup/pc-vr/ or through Steam here https://store.steampowered.com/app/1635730/VIVE_Console_for_SteamVR/ USB issues can be tricky to diagnose but can frequently be figured out with some trial and error. In the worst cases, adding an external PCI card for USB connectivity has solved everything outside of normal bugs - e.g. an old 'haunted' machine that I had once. Our official docs for this are at https://www.vive.com/us/support/vive/category_howto/headset-not-detected-due-to-usb-issue.html My shorthand for issues I've seen: make sure you're using the supplied cables, since there are different USB specs now that can be difficult to tell apart as a user and the provided cables avoid that problem; plug into different USB ports on your computer; and reboot and make sure all of the software components are up to date (including Windows and display drivers as well as SteamVR and Vive Console). In the worst case, there are some techniques described by users in threads like this: https://www.reddit.com/r/Vive/comments/767coj/my_computer_cant_seem_to_find_the_htc_vive_usb/ and articles like this: https://www.drivethelife.com/htc-vive-usb-errors-and-troubleshooting/ Thanks, Alex
  11. Howdy @Martina_H 1) It looks like your gaze position is correct. The gaze direction is a vector that points from the eye toward the point the user is looking at. When you transform this vector, you are converting it from eye coordinates to world coordinates, which is necessary in order to render the gaze point on the video. 2) To play a 3D side-by-side video (also referred to as stereoscopic video) on a GameObject, you can use the following steps: create a new GameObject and add a Video Player component to it; set the Video Player component's source to the 3D side-by-side video that you want to play; in the Video Player component's inspector, set the 3D layout to "Side by Side"; and set the GameObject's z-position to a negative value, which will cause the video to be rendered in front of the user. An example code snippet for stereo is:

      // Create a new GameObject and add a Video Player component to it.
      GameObject videoObject = new GameObject();
      videoObject.AddComponent<VideoPlayer>();

      // Set the Video Player component's source to the 3D side-by-side video that you want to play.
      videoObject.GetComponent<VideoPlayer>().source = VideoSource.Url;
      videoObject.GetComponent<VideoPlayer>().url = "https://www.youtube.com/watch?v=6ZfuNTqbHE8";
      // alternatively: "file://" + Application.streamingAssetsPath + "/3Dtest.mp4"

      // Enable the stereoscopic layout - side by side in this example.
      videoObject.GetComponent<VideoPlayer>().targetCamera3DLayout = Video3DLayout.SideBySide3D;

      // Set the GameObject's z-position to a negative value.
      videoObject.transform.position = new Vector3(0, 0, -1);

  Note that the path to the video will be relative to the build - so if you want to build and run the app and always have it include the video, you can put the video inside the StreamingAssets folder in your project and then use Application.streamingAssetsPath to reference it, as described here: https://docs.unity3d.com/ScriptReference/Application-streamingAssetsPath.html Interestingly enough, there is an example showing how to use the new Unity Video Player component at that link as well. Alternatively, you could use the standard Vive Media Decoder, split the material down the middle yourself, and show one half to each eye using layer masks: make a VR rig with two cameras, one for the left eye and one for the right, with each camera's culling mask excluding the other eye's layer (a sketch of this approach follows the list). Thanks, Alex
  12. @jiarong701 Howdy! This is a great question! We introduced APIs to let developers decide which is more important for their specific application - scale or alignment. https://hub.vive.com/storage/app/doc/en-us/UnityXR/UnityXRPassthrough.html#alignment While all of the modes will get better over time as we fine-tune things, we want you to be able to make decisions that work for your app. -Alex
  13. @Schreppy @JulienActiveMe The other approach I was referring to looks somewhat like what is in this video and/or this one.
  14. @Hitmobile You can always tune the settings in Unity's quality settings: https://docs.unity3d.com/Manual/class-QualitySettings.html (a small sketch of adjusting them at runtime follows the list). We also expose an API to control the quality of passthrough: https://hub.vive.com/storage/app/doc/en-us/UnityXR/UnityXRPassthrough.html#set-passthrough-image-quality
  15. @bebezzang as far as I am aware, anchors are not supported in direct preview mode, since they require a boundary to be set up on the headset
  16. @Schreppy Grabbing objects tends to be done at the application layer, by creating colliders on the finger joints provided by the device APIs - friends in the VR industry tend to want to customize this very heavily based on what's going on in the application, and they usually tell me they start with some of the assets off the Asset Store for this purpose. We provide the bone positions and the application uses those to determine if something should be "grabbed" (see the grab sketch after this list). Another approach is to use the interactables in the Unity XR Interaction Toolkit and use the 'select' gesture (pinch), which works at the system level with OpenXR, or use our Wave native XR SDK and our hand gesture recognizer.
  17. @JulienActiveMe On second thought, grab is typically implemented using physics - setting up colliders/rigidbodies on the hand joints themselves. The joints are exposed by the OpenXR API itself, and application-level logic dictates whether there is a collision and/or grab (see the joint-pose sketch after this list). The deeper specifics for this device are in the ViveHandInteraction.cs file in the standard OpenXR plugin - so make sure that the project compiles, as otherwise this profile may not show up for you.
  18. @JinZhou The first step is to set it up for use: https://www.vive.com/us/support/vive-pro2/category_howto/activating-the-dual-camera.html Then set up Unity: https://developer.vive.com/resources/vive-wave/tutorials/installing-wave-xr-plugin-unity/ Then import our samples from all 3 of our Wave packages (Essence, Native, and XR plugin). There should then be a scene called "Passthrough" that serves as a good guide.
  19. @theycallmefm We would suggest using Wave or OpenXR for this purpose. This sounds like a bug in the bindings, as we regularly use the Tracked Pose Driver in Unity with the other libraries. -Alex
  20. @Vector B These differences between various controllers are part of why we offer packages like VIU (https://github.com/ViveSoftware/ViveInputUtility-Unity or https://assetstore.unity.com/packages/tools/integration/vive-input-utility-64219), since the origin and other attributes of controllers are a little different. There are other approaches, but the raw offsets are available in that package, and there are a few other toolkits that better manage the differences in origin and so on across controllers for more specific use cases.
  21. @Shiv We allow access to the framebuffer on our PC headsets that have the camera, but the camera on our standalone headsets tends to be protected from general use. -Alex
  22. @JulienActiveMe I hear that these things can be a little complicated. Make sure to install the "Vive OpenXR ToolKit - Android" package, as that is what adds support for the "selected" feature; you should then see the "Vive XR hand interaction" profile. With this, it is my understanding that we support the "selected" event (by way of a pinch gesture) using the OpenXR runtime at the moment, and that additional hand gestures are in progress under the OpenXR hand gestures. I may be a bit behind on this, and maybe @Tony PH Lin knows of any additional gestures supported by OpenXR in the XR Interaction Toolkit. If you switch to our Wave native hand gestures, there are some additional hand gestures documented here: https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRHand.html
  23. @BarDev061 @samjaitw The use of the depth sensor requires a beta SDK that @Tony PH Lin can help provide. You will also need to be on the latest version of the headset ROM. -Alex
  24. @jayliu-deloitte There have been some stability updates in the latest 5.3 SDK release: https://hub.vive.com/storage/app/doc/en-us/UnityXR/UnityXRDirectPreview.html A site to help with the setup process: https://developer.vive.com/resources/vive-wave/tutorials/direct-preview-unity/?site=de That said, I know @Tony PH Lin always welcomes feedback!
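
For post 11, here is a minimal sketch of the layer-mask approach to side-by-side stereo: two quads each sample one half of the video frame, and two eye cameras each cull the other eye's layer. It assumes the project defines "LeftEye" and "RightEye" layers and that a RenderTexture is assigned to the Video Player; those layer names and the quad placement are illustrative assumptions, not a recommended setup.

      using UnityEngine;
      using UnityEngine.Video;

      // Sketch only: assumes "LeftEye" and "RightEye" layers exist in the project settings.
      public class SideBySideStereoRig : MonoBehaviour
      {
          public VideoPlayer videoPlayer;     // plays the side-by-side video
          public RenderTexture videoTexture;  // target texture shared by both eye quads
          public Camera leftEyeCamera;
          public Camera rightEyeCamera;

          void Start()
          {
              // Route the decoded video into a RenderTexture.
              videoPlayer.renderMode = VideoRenderMode.RenderTexture;
              videoPlayer.targetTexture = videoTexture;

              // One quad per eye, each sampling half of the frame horizontally.
              CreateEyeQuad("LeftEye", new Vector2(0f, 0f));
              CreateEyeQuad("RightEye", new Vector2(0.5f, 0f));

              // Each eye camera renders everything except the other eye's layer.
              leftEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("RightEye"));
              rightEyeCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("LeftEye"));
          }

          void CreateEyeQuad(string layerName, Vector2 offset)
          {
              GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
              quad.layer = LayerMask.NameToLayer(layerName);
              quad.transform.SetParent(transform, false);
              quad.transform.localPosition = new Vector3(0f, 0f, 1f);

              // Unlit material so the video is not affected by scene lighting.
              Material mat = new Material(Shader.Find("Unlit/Texture"));
              mat.mainTexture = videoTexture;
              mat.mainTextureScale = new Vector2(0.5f, 1f); // sample half the frame width
              mat.mainTextureOffset = offset;               // left half vs. right half
              quad.GetComponent<Renderer>().material = mat;
          }
      }

For the gaze question in the same post, the eye-to-world conversion is the usual Transform.TransformDirection call on the eye/head transform applied to the local gaze direction.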
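
For post 14, a minimal sketch of tuning Unity's quality settings at runtime; it only covers the Unity side, and the quality level index chosen here is arbitrary. The linked Wave page covers the passthrough-quality API.

      using UnityEngine;

      public class QualityTuner : MonoBehaviour
      {
          void Start()
          {
              // The available levels are the ones defined in Project Settings > Quality.
              foreach (string level in QualitySettings.names)
                  Debug.Log("Quality level: " + level);

              // Switch to the lowest level; the second argument also applies expensive changes.
              QualitySettings.SetQualityLevel(0, true);
          }
      }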
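
For posts 16 and 17, a minimal sketch of application-level grabbing with a trigger collider on a fingertip joint. IsPinching() is a placeholder for whatever gesture check your hand-tracking SDK provides, and the parenting-based grab is just one of many ways to attach the object.

      using UnityEngine;

      // Attach to a small object that follows a fingertip joint.
      // Requires a trigger collider (e.g. a small SphereCollider with isTrigger = true)
      // and a kinematic Rigidbody on this object so trigger events fire.
      public class FingertipGrabber : MonoBehaviour
      {
          Rigidbody held;

          void OnTriggerStay(Collider other)
          {
              if (held == null && other.attachedRigidbody != null && IsPinching())
              {
                  held = other.attachedRigidbody;
                  held.isKinematic = true;             // pause physics while held
                  held.transform.SetParent(transform); // follow the fingertip
              }
          }

          void Update()
          {
              if (held != null && !IsPinching())
              {
                  held.transform.SetParent(null);
                  held.isKinematic = false; // hand the object back to physics
                  held = null;
              }
          }

          bool IsPinching()
          {
              // Placeholder: query your hand-tracking SDK here (pinch strength, grab gesture, etc.).
              return false;
          }
      }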
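
Also for post 17, a sketch of driving such a fingertip follower from hand-joint poses, assuming the Unity XR Hands package (com.unity.xr.hands) is installed; that package choice is an assumption, since the joint data can also come from the Wave or VIU APIs instead.

      using System.Collections.Generic;
      using UnityEngine;
      using UnityEngine.XR.Hands;

      // Moves a follower transform (e.g. the FingertipGrabber above) to the right index tip each frame.
      public class IndexTipFollower : MonoBehaviour
      {
          public Transform follower;
          XRHandSubsystem hands;

          void Update()
          {
              if (hands == null)
              {
                  // Find the running hand subsystem, if any (assumes com.unity.xr.hands is installed).
                  var subsystems = new List<XRHandSubsystem>();
                  SubsystemManager.GetSubsystems(subsystems);
                  if (subsystems.Count == 0) return;
                  hands = subsystems[0];
              }

              XRHand hand = hands.rightHand;
              if (!hand.isTracked) return;

              XRHandJoint joint = hand.GetJoint(XRHandJointID.IndexTip);
              if (joint.TryGetPose(out Pose pose))
              {
                  // Joint poses are relative to the XR origin; parent this object under the
                  // XR Origin or convert the pose if the origin is not at the world origin.
                  follower.SetPositionAndRotation(pose.position, pose.rotation);
              }
          }
      }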