
Alex_HTC

Moderators · 291 posts
Everything posted by Alex_HTC

  1. @wekko Some suggestions are over in the other thread that you posted.
  2. @DragonDreams If I understand correctly, you're getting 0 for all values on the native side. To me this sounds like a configuration or hardware failure. When running the Unity example, does the face move at all? Or do you get that far? The native issue may be a red herring if it's compiling against older SRanipal DLLs, as there was a required update relatively recently. That's why I'm asking more about the Unity side, since that is in a known-good state. I'll inquire further internally.
  3. @RMFoley92 Howdy! At the moment, we whitelist individual developers for this use case, and we have no announcements about opening this up more broadly. I can say that as an individual I previously worked in the AR space, and I'm excited about the potential applications. Requests like this help us prioritize our work. That said, shoot me an email and we'll get you in the process 🙂 -Alex
  4. @eddylai Sorry to hear about the trouble! This appears to be an issue with the way that Unity looks for the keystore file. A user reported that deleting that file allowed it to work again. It can be resolved by deleting (or moving) the "D:\Android\Home\.android\debug.keystore" file, so Unity can create one that it recognizes. Answers adapted from: https://stackoverflow.com/questions/70917420/unity-error-on-m1-mac-failed-to-read-key-from-keystore-invalid-keystore-for https://stackoverflow.com/questions/54877439/failed-to-read-key-androiddebugkey-from-store https://stackoverflow.com/questions/72544504/unity-2021-3-4f1-gradle-build-failed
  5. @wekko > it plays it in 2d format in the headset
     So if I understand correctly, the goal is to play a 360 video and let the user look around. There are a few ways to do this:
     1) Use Unity's video player package - https://docs.unity3d.com/Manual/class-VideoPlayer.html (a tutorial is here):
        - Set up your XR rig (make sure it's not just a 2D Unity camera)
        - Put a 10 m inverted sphere surrounding the XR rig (I think there's a shader for this by default in the Unity sample)
        - Deploy the app to the device
     2) Use another plugin like AVPro; an example project is here: https://github.com/hardcoded2/AVProTest
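     To make approach 1 concrete, here's a minimal sketch, assuming you already have the inverted sphere with a material assigned (the component and field names here are illustrative, not from any SDK):

     ```csharp
     // Attach to the inverted sphere that surrounds the XR rig.
     // Plays a VideoClip onto the inside of the sphere so the user can look around.
     using UnityEngine;
     using UnityEngine.Video;

     public class Simple360Player : MonoBehaviour
     {
         // Assign the 360 video clip in the Inspector.
         public VideoClip clip;

         void Start()
         {
             var player = gameObject.AddComponent<VideoPlayer>();
             player.clip = clip;
             // Draw the video onto this object's material instead of a screen.
             player.renderMode = VideoRenderMode.MaterialOverride;
             player.targetMaterialRenderer = GetComponent<Renderer>();
             player.targetMaterialProperty = "_MainTex";
             player.isLooping = true;
             player.Play();
         }
     }
     ```

     The XR rig's camera stays at the sphere's center, so head rotation naturally pans around the video.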
  6. @Lacota > The problem goes away when restarting steamvr home.
     So if I understand correctly, this is running via Vive Streaming Hub with a PC app. Hrm. Will test and report back. Thanks!
  7. @jcm01 Wave SDK is definitely supported on the new devices! https://github.com/ViveDeveloperRelations/ScenePerceptionDemo is an example of a project that works out of the box. Make sure that in Project Settings -> XR Settings, "WaveXR" is enabled and that the other options are disabled. The error posted indicates that the WaveXR plugin didn't run its code to hook this up, which happens when that option is not enabled.
  8. @DragonDreams Before you launch the app, what status is the SRanipal runtime reporting? See https://dl.vive.com/Tracker/Guideline/Vive Face Tracker Developer Quick Start.pdf Can you try the Unity example and see if it reports the same issues? https://developer.vive.com/resources/openxr/openxr-mobile/tutorials/unity/getting-data-of-facial-tracking/ The goal of this is to further understand whether there are any remaining hardware issues (like a USB 2.0 cable causing problems) and to further isolate the issue.
  9. Hi @dynameis tw Thanks for the report! It does seem that those videos are missing the metadata that specifies they're 360 videos. The standard metadata tag is specified here: https://github.com/google/spatial-media/blob/master/docs/spherical-video-v2-rfc.md An example of a working video has the immersive/stereoscopic tag in the video itself, which ends up looking like:
     0000014A StereoMode (4 bytes)
     0000014A   Header (3 bytes)
     0000014A     Name: 5048 (0x13B8)
     0000014C   Size: 1 (0x01)
     0000014D   Data: 1 (0x01)
     I'm not sure what the timeframe would be on 'forcing' a video into 360 mode when the video itself does not specify it, but I'll look into it! Thanks, Alex
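     For reference, the spatial-media repo linked above also ships a metadata injector that can add that tag to an existing file. Assuming a checkout of that repo and Python installed, usage is roughly:

     ```
     # Inject spherical (360) metadata into a copy of the video
     python spatialmedia -i input.mp4 output_injected.mp4

     # For stereoscopic video, specify the frame layout as well
     python spatialmedia -i --stereo=top-bottom input.mp4 output_injected.mp4
     ```

     The injected copy should then be recognized as 360 by players that honor the spec.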
  10. @Chloe Chen Hi! There are two likely reasons for this: 1) Is the Vive OpenXR package installed? 2) Is the project compiling? The feature group will not be available in some circumstances when there are other compile errors.
  11. @eddylai Vive Wave is the XR SDK: https://developer.vive.com/resources/vive-wave/download/latest/
  12. @dynameis tw With the latest 1.3.2 version of Wolvic, I've been streaming by hitting "fullscreen". At this point, most videos will work. Some older videos do not have the metadata tag that identifies the type of immersive video they are. This can be fixed manually by hitting "fullscreen", selecting the "vr glasses" icon in the screenshot above, then selecting 180 in the options. The bottom three options are related to 180 video. If you have some videos that are not working, send me a link.
  13. @SuperGrover @dynameis tw @Wummi Great to hear that the new wolvic browser version is on the store and that it fixed the issue you were seeing. Enjoy your vr experience!
  14. @dynameis tw if you're still having trouble, try downloading the latest version of the wolvic browser on the store - it has fixes to a number of videos
  15. @DragonDreams Two suggestions for when we tend to see this: 1) Make sure you're using the latest version of the SRanipal runtime. 2) Make sure the OpenXR provider is set correctly -- afaik it is in the registry as Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Khronos\OpenXR\1\ActiveRuntime, but it's also exposed in the SteamVR settings.
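     To check the active runtime from a Windows command prompt (standard reg.exe query against the key above):

     ```
     reg query "HKLM\SOFTWARE\Khronos\OpenXR\1" /v ActiveRuntime
     ```

     It should print the path to the JSON manifest of the active OpenXR runtime (e.g. SteamVR's).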
  16. Howdy @exedos Look at our OpenXR plugin here: https://developer.vive.com/resources/openxr/openxr-mobile/download/latest/ This should help add the relevant XR entries. Here's some additional info from a developer who also uses Unreal with OpenXR:
  17. Hit the "fullscreen" button and if it is not detected correctly, then select 180 or stereo 180 left/right or top/bottom
  18. Sounds like you want a video player like avpro - here's an openxr example https://github.com/hardcoded2/AVProTest/tree/openxr And collecting data is a matter of saving the output of raycasts from the user's eyes in whatever format makes sense to you (json/xml/flatbuffers/whatever)
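     As a sketch of the data-collection side in JSON: the field names and data shapes here are illustrative, not from any SDK; plug in whatever your eye-tracking API actually returns.

     ```python
     # Collect eye-raycast samples in memory and save them as JSON.
     import json
     import time

     def record_sample(log, origin, direction, hit_object=None):
         """Append one gaze-raycast sample to the in-memory log."""
         log.append({
             "t": time.time(),              # capture timestamp
             "origin": list(origin),        # ray origin in world space
             "direction": list(direction),  # normalized gaze direction
             "hit": hit_object,             # name of the object hit, if any
         })

     def save_log(log, path):
         """Write the collected samples to a JSON file."""
         with open(path, "w") as f:
             json.dump(log, f, indent=2)

     if __name__ == "__main__":
         samples = []
         record_sample(samples, (0, 1.6, 0), (0, 0, 1), hit_object="Poster")
         save_log(samples, "gaze_log.json")
     ```

     Swapping JSON for XML or flatbuffers is just a change to `save_log`.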
  19. @Tiberiu5 or another way to put it -- It's very similar to the use case where a tracker is put on an accessory and you want to merge the two coordinate spaces -- some tech examples are here The "accessory" in this example would be your room 🙂
  20. Howdy @Tiberiu5 Great question! Adding a tracker in a stable position will help you get the two spaces in the same spot.
     My top-level thoughts are -- all things are relative! Well... ok, this isn't philosophy 101, so how do we unpack that? How do we coordinate two different coordinate spaces in real life? We create a shared reference point. A controller placed at a specific point/rotation at a specific time (or a tracker) will help align the room. Just like in real life -- if I see the front of the Statue of Liberty and you see the back, and our goal is to meet, we now know we need to walk towards each other.
     So the model needs to be aligned (position + rotation) with something in the virtual room that needs to be specified. You could use a tracker or a controller placed at the same point every time for this purpose. Then you match the model of the room on top of the room based on that. The tracker (or controller) will be the "landmark" by which both virtual spaces can merge, and if it's a tracker, you could even see the impact of the rotation in real time until it is stable.
     With more points of reference (i.e. trackers), the positioning can be even more stable. If you pasted 4 trackers on the walls (overkill, but this is just an example), you would know the alignment of the virtual room with a high degree of accuracy and stability - one tracker falling or slipping a little would have less of an impact. Applying more points will help; it just depends on the budget and the desired accuracy. A lot of casual experiences (like the ones we were just demoing on the Vive XR Elite) will merge two spaces using a single point, while a more serious simulation/enterprise solution might want more accuracy over time and more fault tolerance via more shared points of reference.
     Thanks for the question - we love these use cases, and I'm excited to see more content made this way myself. Let me know if you want to dig further into this! -Alex
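     The single-anchor alignment above can be sketched as a small math example (illustrative, not SDK code): if the same physical anchor has pose T_room in tracking space and pose T_model in the room model's space, the transform mapping model space into tracking space is M = T_room @ inv(T_model).

     ```python
     # Align a room model to tracking space via one shared anchor (a tracker).
     import numpy as np

     def pose(yaw_deg, translation):
         """Build a 4x4 pose from a yaw rotation (degrees) and a translation."""
         a = np.radians(yaw_deg)
         m = np.eye(4)
         m[:3, :3] = [[np.cos(a), 0, np.sin(a)],
                      [0,         1, 0        ],
                      [-np.sin(a), 0, np.cos(a)]]
         m[:3, 3] = translation
         return m

     def model_to_room(anchor_in_room, anchor_in_model):
         """Transform that maps room-model coordinates into tracking space."""
         return anchor_in_room @ np.linalg.inv(anchor_in_model)

     # Example: the tracker sits 2 m forward in tracking space, rotated 90 deg,
     # but at the origin of the room model.
     T_room = pose(90, [0.0, 0.0, 2.0])
     T_model = pose(0, [0.0, 0.0, 0.0])
     M = model_to_room(T_room, T_model)

     # A point at the anchor in model space lands at the anchor in tracking space.
     p_model = np.array([0.0, 0.0, 0.0, 1.0])
     print(M @ p_model)  # -> [0. 0. 2. 1.]
     ```

     With several anchors you would instead solve a least-squares fit over all of them, which is what gives the extra stability described above.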
  21. you can try copying the qr code with snip and sketch (or whatever screenshot tool) and then showing it on your phone or other device
  22. You can sideload 3rd-party apps using adb, like developers do: https://www.makeuseof.com/install-apps-via-adb-android/ Developers do this all the time 🙂
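     For example (apk name is illustrative; requires developer mode enabled on the device and adb installed on the PC):

     ```
     adb devices                # confirm the headset is listed
     adb install -r MyApp.apk   # -r replaces an existing install
     ```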
  23. @minugi2323 Can you try updating the system software? I'm seeing this on my headsets
  24. While we're compatible with older SDKs (like the one mentioned above, which uses the deprecated OpenVR), most recent Unity versions no longer support it - so we suggest using the Wave SDK https://developer.vive.com/resources/vive-wave/tutorials/installing-wave-xr-plugin-unity/
  25. @AltoK The team is off this week, but we will get some additional information soon. It is my understanding that there were a few last-minute things that needed to be addressed.