
Posts posted by Tony PH Lin

  1. Hi there,

We're pleased to share an update: the Depth Sensor can now be enabled via the Beta ROM, so developers can start blending the real and virtual worlds to create new mixed reality experiences. (How to join the Beta program.)

Check out how it works in the Julbee MR demo video we showed at MWC and GDC.

Note: In the latest Beta version, we have simplified the setup steps and integrated them into MR Room Setup in the VRS Launcher.

     

You may also be interested in the features and SDK APIs available for MR development, how to tune the performance of MR content, and how to manage alignment between virtual objects and passthrough.

Here is more information to help you dig deeper into implementation:

    Tutorials for Mixed Reality Development

    Unity Development

    Unreal Development

     

  2. Hi @jcm01,

    Scene Mesh can be broken down into two parts: Scanning and Reading.

The demo scene you are looking at can only read a scene mesh, so before it can read anything, you need to scan and construct a scene mesh first.

    As for scanning scene mesh, this can only be achieved through a Beta ROM on XR Elite which has a version of MR Room Setup that can scan scene mesh. If you have access to that Beta ROM, you should be able to scan a scene mesh using MR Room Setup and read the scanned mesh in the demo scene.

Also, the visual and collider meshes actually refer to scene mesh data at different levels of detail, which can be used for different purposes, as their names imply.

To apply for the Beta ROM, please refer to:


     

  3. Hi @vethetheth,

Sorry for the late reply; we're just back from GDC.
The newest Direct Preview, which supports both USB/Wi-Fi and hand tracking, is now available in Wave 5.2.0.
Please refer to the link below to download the new driver and follow the steps.
https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRDirectPreview.html

Let us know if you hit any issues, and we will follow up to resolve them.
Thanks.

  4. Hi @Dan Lauer,

Support for Unreal 5.1 is under development for our next update in March.

However, we still have some integration issues with UE 5.1, so we may release our OpenXR plugin support first.

For the WaveVR plugin, once we make more progress, we will let you know if we can provide a Beta version for you to trial. We will keep you posted.

  5. Hi @julie.helbecque, @vethetheth, @Austin Sullivan Unity,

    Sorry for the late response.

Currently we're working on refactoring Direct Preview to improve connection stability and user flow, and to add new features such as hand tracking support; we plan to ship the updated version with the next SDK release (March timeframe).

URP support is on our roadmap, but it requires more effort and investigation to make the whole path work. We will keep you posted once the schedule is confirmed.

    Thanks.

  6. Hi @bjarmenzeit, @Austin Sullivan Unity,

Currently we're working on refactoring Direct Preview to improve connection stability and user flow, and to add new features such as hand tracking support; we plan to ship the updated version with the next SDK release (March timeframe).

URP support is on our roadmap, but it requires more effort and investigation to make the whole path work. We will keep you posted once the schedule is confirmed.

    Thanks.

  7. Hi @Identical Josh,

Thanks for your inquiries. You're right that we disabled direct access to the raw camera image data due to our privacy policy. However, we understand that many applications require the camera image for further development, so we provide alternative ways for developers to achieve similar goals and results.

For example, we provide several passthrough methods in the latest Wave 5.1.0 and have also added the Scene SDK for processing planes, anchors, meshes, etc.
    https://hub.vive.com/storage/docs/en-us/ReleaseNote.html#release-5-1-0

If you have specific goals that are difficult to achieve with the latest SDK, we'd like to learn about your use cases and discuss how to support those scenarios.

    Thanks.

We did encounter a similar symptom when we maintained one project across both Oculus and Wave: sometimes, even after disabling Wave in the XR Management settings, the related entries still appeared in the logs. However, it does not happen in every Unity version, and we suspect it is also related to the order in which the plugins are imported.

     

Since it may be related to the Unity environment's import order and flow, we have no definitive answer at this moment.

The workaround we used internally was to remove one of the two plugins from the project and rebuild the target.

You might try this first; if the issue disappears, it is likely the same issue we met.

That is the experience and suggestion we can share.

  9. Hi @vethetheth,

Sorry for the late response.

The delay may be due to hand occlusion. We have released a FOTA 5.3 ROM update, so you could try again with the latest ROM.

Regarding the development question, we have run numerous builds and tests on the headset.

In terms of the 32-bit build, we see no functional differences; however, the 64-bit build should be less prone to out-of-memory issues (not that we have encountered memory issues on 32-bit so far). We would personally recommend using 64-bit anyway.

    Hope this is helpful.

    Thanks.

     

  10. Hi @focus3fan,

Thanks for bringing this question to us.

Currently, OpenXR does not define a re-center operation.

    For system-level re-center, Focus 3 provides “Reset View” in VIVE menu.

For in-app re-center, the app has to record the re-center pose transform itself and convert the runtime pose with this transform to obtain the target pose.
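To make the record-and-convert step concrete, here is a minimal Python sketch of the underlying math only (this is not a Wave or OpenXR API; `Pose`, `inverse`, and `compose` are hypothetical helpers, and the pose is reduced to floor position plus yaw for brevity): the app stores the inverse of the head pose at the moment of re-center, then composes it with every later runtime pose.

```python
# Sketch of in-app re-center math (hypothetical helpers, not an SDK API).
# Pose is simplified to floor position (x, z) plus yaw about the up axis.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    z: float
    yaw: float  # radians, rotation about the up (Y) axis

def inverse(p: Pose) -> Pose:
    # Inverse of a rigid transform: rotate by -yaw, negate the rotated translation.
    c, s = math.cos(-p.yaw), math.sin(-p.yaw)
    return Pose(-(c * p.x - s * p.z), -(s * p.x + c * p.z), -p.yaw)

def compose(a: Pose, b: Pose) -> Pose:
    # a * b: apply b first, then a.
    c, s = math.cos(a.yaw), math.sin(a.yaw)
    return Pose(a.x + c * b.x - s * b.z,
                a.z + s * b.x + c * b.z,
                a.yaw + b.yaw)

# At re-center time, record the inverse of the current head pose.
head_at_recenter = Pose(1.0, 2.0, math.pi / 2)
recenter = inverse(head_at_recenter)

# Afterwards, convert every runtime pose into the re-centered frame.
def to_recentered(runtime_pose: Pose) -> Pose:
    return compose(recenter, runtime_pose)

# The head pose captured at re-center maps back to the origin.
origin = to_recentered(head_at_recenter)
```

By construction, the pose captured at re-center time becomes the new origin, and all later runtime poses are expressed relative to it; a full implementation would use 3D positions and quaternions instead.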

     

Here is the relevant description from the OpenXR specification:

     

    When a user needs to recenter LOCAL space, a runtime may offer some system-level recentering interaction that is transparent to the application, but which causes the current leveled head space to become the new LOCAL space. When such a recentering occurs, the runtime must queue the XrEventDataReferenceSpaceChangePending event, with the recentered LOCAL space origin only taking effect for xrLocateSpace or xrLocateViews calls whose XrTime parameter is greater than or equal to the changeTime provided in that event.

     

    When the user redefines the origin or bounds of the current STAGE space, or the runtime otherwise switches to a new STAGE definition, the runtime must queue the XrEventDataReferenceSpaceChangePending event, with the new STAGE space origin only taking effect for xrLocateSpace or xrLocateViews calls whose XrTime parameter is greater than or equal to the changeTime provided in that event.

The XrEventDataReferenceSpaceChangePending event is provided by the OpenXR runtime for the app.

    Thanks.

  11. Hi @Knase,

Thanks for asking. This post is mainly for PCVR content developers who use OpenXR PC or the SRanipal SDK, so they can stream PC content via VBS to Focus 3.

We plan to roll out an official VIVE Wave SDK release next week to support eye expression, along with an OpenXR Mobile plugin.
You're welcome to get a pre-release version from our server; I'll PM you to help set up access so you can try it and share feedback.
