
Vivi Wu

Employee
  • Posts: 124
  • Joined
  • Last visited

Reputation

30 Excellent

3 Followers


  1. Hi @Lightshape & @Oeds, When the spinning VIVE logo icon is shown, the headset is still trying to locate the map. Please make sure you are standing at the place where you scanned the map, and look around. Also make sure the HMD collected enough features during scanning; this means scanning more area and looking around more while creating an LBE map. If it still gets stuck on the spinning logo and cannot retrieve a position, please contact your HTC sales representative (your contact window when you purchased the VB+ license) and they will guide you to the debug tool.
  2. Below is a tutorial on how to use the tracker with OpenXR. Hope this helps. Thanks.
  3. Hi @RMFoley92, Please help confirm the following info for us: 1. Which platform are you using, Android or Windows? 2. Which device and device version are you using? 3. Is there any error code or specific problem that you encounter? Thanks.
  4. Hi @Eswar, Please try the XR_HTCX_vive_tracker_interaction extension if you are using the Unity engine. We've updated the tutorial, and the new SDK support will be coming soon. Thanks.
  5. What is Ultimate Tracker: https://www.vive.com/us/accessory/vive-ultimate-tracker/

System requirements:
  • VIVE XR Elite: FOTA 6.6 (1.0.999.508) or later version
  • VIVE Focus 3: VIVE Ultimate Tracker support with VIVE LBSS (LBE Mode) is a gradual rollout. Please contact VIVE Support to request access.

Support table:
                           | Windows (PC) VR                          | Android (AIO) VR
  Devices                  | VIVE Focus 3 / XR Elite + VIVE Streaming | VIVE Focus 3 / XR Elite
  Engine                   | Unity / Unreal Engine                    | Unity / Unreal Engine
  Ultimate Tracker         | ⬤ / ⬤                                    | ⬤ / ⬤
  Body tracking extension  |                                          | ⬤ (Unity only)

Android (AIO) VR tutorial (Wave):
  • Prerequisite: Wave SDK 5.4.0 or later version
  • Use as tracker:
    Unity - https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRTracker.html
    Unreal Engine - https://hub.vive.com/storage/docs/en-us/UnrealPlugin/UnrealTracker.html
  • Body tracking:
    Unity - https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRBodyTracking.html
  • OpenXR: Coming soon.

Windows (PC) VR tutorial:
  • Prerequisite: VIVE Streaming 1.3.6 / VIVE Business Streaming 1.12.8 or later version
  • OpenXR:
    Unity - VIVE_Ultimate_Tracker_PC_OpenXR_v2.pdf
    Unreal Engine - https://forum.htc.com/topic/14370-tutorial-openxr-pc-vr-how-to-use-vive-tracker/

More details, support, and troubleshooting: https://business.vive.com/us/support/ultimate-tracker/
  6. Hi @FXI, If it's PC VR (Windows platform) on OpenXR, we have different OpenXR plugin versions. For MetaHuman, please import the OpenXR for PC VR (legacy) plugin (https://developer.vive.com/resources/openxr/openxr-pcvr/download/latest/). Then you will find the MetaHuman assets and can follow the tutorial you mentioned. Note: remove the Wave SDK first, and if you are developing for the Android platform please let me know; the tutorial here is not suitable for that case. The new plugin and tutorial for the 2-in-1 OpenXR will be coming soon.
  7. Hi @Ilija @lynXR, The change is on the content side: modify the alpha layer of the textures. You can then use the OpenVR or OpenXR SDK for your PC content on the XR Elite or Focus 3.
  8. Introduction

The VIVE Focus 3 and VIVE XR Elite provide camera passthrough features; the VIVE XR Elite in particular offers high-quality color passthrough. More and more content developers want to use camera passthrough to create immersive MR content, and VIVE Business Streaming (VBS) now brings this vivid passthrough to PC content with MR support.

In this document, we show that VBS can now enable a passthrough underlay on VIVE HMDs during streaming with just a click, letting content developers effortlessly turn their content into immersive mixed reality experiences in VBS MR mode. The content submits an alpha layer to the SteamVR runtime (something many applications already support, e.g. Blender), and VBS takes care of the rest: it streams the alpha layer and blends it with camera passthrough on the headset to create the MR experience. PC users can now enjoy mixed reality content with high-quality color passthrough on the VIVE XR Elite; it also works on the VIVE Focus 3.

Requirements
  • VIVE Streaming Hub v1.1.9 (developer only) / VIVE Business Streaming v1.11.3 or newer
  • Supported GPUs: NVIDIA and AMD
  • Supported SDKs: OpenVR and OpenXR

To enable MR with passthrough

VIVE Business Streaming:
  1. Open the VBS Console Settings.
  2. Switch to the "Graphics" page and enable "MR with passthrough".
  3. Launch MR content and enjoy. You can enable/disable MR mode at any time.

VIVE Streaming Hub:
  1. To enable developer mode, download the devmode.json file and copy it to the C:\Program Files\VIVE Hub\ folder.
  2. Start VIVE Streaming Hub; the "MR with passthrough" option should now appear in the VIVE Streaming settings.
  3. Enable the "MR with passthrough" option.

Develop your MR content for VBS

VBS extracts the alpha channel of the content's textures and uses that information to blend the VR texture with passthrough. If you want to mix the real environment with your content scene, you have to assign alpha values in the scene. Please read the steps below for more details.
  • The content texture format must include an alpha layer, with alpha values from 0 to 1.
  • To show passthrough on the HMD, set alpha values close to 0 in the target area of the scene.
  • To show the content texture on the HMD, set alpha values close to 1 in the target area of the scene.
(Illustrations: 1. Content textures; 2. Extracted alpha layer texture; 3. MR result from the HMD)

Camera setup in Unity

How to set a transparent camera background in Unity:
  1. Select the Main Camera in the Hierarchy panel.
  2. In the Inspector panel, find the Camera component.
  3. Change Clear Flags to Solid Color.
  4. Change the Background color to RGBA (0, 0, 0, 0). You can use Unity's Color.clear to set this in your code.

Troubleshooting

There are no passthrough textures while MR mode is enabled; how do I recover?
  • Toggle the option off and on again to check whether MR mode returns to normal.
  • Verify that the alpha layer of the textures is provided per the steps above and submitted to SteamVR.

Performance looks worse when MR mode is enabled.
  • Check Console settings -> Graphics settings -> Streaming graphics preferences. We suggest not using the Ultra preference when MR mode is enabled, due to the additional encoding involved.

How do I record the MR content result in the headset?
  • Make sure "Allow passthrough recording and casting" is enabled. You can find this setting in the HMD's Lobby: Settings -> Advanced -> Camera settings. After changing this item, please reboot your device.
  9. Hi @aboutblank, Here's the new video for the new Direct Preview. Please take a look, thanks.
  10. Hi @YangNoca, After the computer is connected to the headset, could you check whether the "adb devices" command works? If the installation succeeded, "HTC Device/VIVE RR Interface" should appear after the headset is connected. Thanks.
  11. VIVE XR Elite features a powerful full-color passthrough camera and depth sensor that let you interact with virtual objects in your real-world space. The following describes the scenarios and tutorials for the VIVE MR features, opening up new possibilities in mixed reality. The MR features are supported by the VIVE Wave SDK for Android development.

Passthrough

The most basic MR capability is passthrough, which enables an application to show the real surrounding environment from inside the VR headset. Please refer to these articles: Demonstration on how to use WAVE Passthrough Underlay/Overlay; Passthrough Underlay; Adjust the quality of passthrough image.

Scene perception

Scene perception is the system's ability to perceive environment objects such as the floor, ceiling, walls, desks, couches, doors, and windows. MR content that includes scene elements can recognize those objects and interact with them, immersing the player in the real world. Scene elements are defined as Plane 2D, Anchor, Scene mesh, and Object 3D.
  • Plane 2D: supports vertical/horizontal planes and plane meshes. Tutorial: Unity, Unreal Engine
  • Scene mesh: supports visual meshes and collider meshes for the room. Tutorial: Unity, Unreal Engine
  • Anchor: a persistent location in the real world. Tutorial: Unity, Unreal Engine
  12. Hi @xrSoftware, Are you using the default VRS Studio app on the XR Elite, or did you install it from Git? In our tests, you can press the VIVE button on the controller, or pinch with hand tracking, to bring up the VIVE Menu and exit the app. Here's the tutorial: https://www.vive.com/us/support/vive-xr/category_howto/vive-menu.html Let me know if you still have questions. Thanks.
  13. Unreal Engine does not expose the OpenVR IVRDebug interface in its SteamVRPlugin, so DriverDebugRequest() is not available to game projects by default. To use DriverDebugRequest() in Unreal Engine:
  1. Link your Epic Games account to your GitHub account and get authorized by Epic Games: https://www.unrealengine.com/en-US/ue-on-github
  2. Modify and rebuild Unreal Engine according to our GitHub repository: https://github.com/ViveSW/UnrealEngine/tree/4.27-DriverDebugRequest
  3. You should now be able to call DriverDebugRequest() from your Unreal project.
If you want to check our sample project for a Blueprint example:
  1. Download the sample project (UE 4.27): DDRTest_Proj.zip
  2. Switch the sample project's Unreal Engine version to your local build.
  3. Generate the project files for the sample project.
  4. Open the sample project and check the Level Blueprint for an example of USteamVRFunctionLibrary::DriverDebugRequest().
Note: for debugging, search for "DebugRequest" or "SendV2RParamToServer" in Steam/logs/vrserver.txt to check whether DriverDebugRequest() works properly.
  14. To determine whether a device has lost tracking, the developer needs to use the OpenVR API to get the device status. If you would also like to know whether your controller / wrist tracker is in 6DoF or 3DoF mode, please follow the guidelines below and refer to the OpenVR GitHub for more information.

Required versions:
  • VIVE Business Streaming: 1.10.5 or later
  • Focus 3 FOTA: 5.0.999.676 or later

Check the OpenVR APIs to get the status:
  • IVRSystem: IVRSystem::GetDeviceToAbsoluteTrackingPose, IVRSystem::GetControllerStateWithPose
  • IVRCompositor: IVRCompositor::GetLatestPoses (similar to WaitGetPoses)

All of these APIs return TrackedDevicePose_t structures for the devices, which include several fields that identify the tracking status: bPoseIsValid, bDeviceIsConnected, and eTrackingResult.
  • TrackingResult_Running_OK: 6DoF
  • TrackingResult_Fallback_RotationOnly: 3DoF
A device is tracked only when bPoseIsValid is true, bDeviceIsConnected is true, and eTrackingResult is TrackingResult_Running_OK or TrackingResult_Fallback_RotationOnly. Otherwise, the device should be considered to have lost tracking.

The index of each device: to use these APIs, pass the device index below as vr::TrackedDeviceIndex_t, or index the pose array with it, to get the target device's pose data.
  index | device
  0     | HMD
  1     | Right controller
  2     | Left controller
  3     | Right wrist tracker
  4     | Left wrist tracker

To retrieve a target device's status, allocate a large enough pose array, call the appropriate API, and use the index from the table above.
Sample code:

Sample 1:

  vr::EVRInitError eError = vr::VRInitError_None;
  vr::IVRSystem *pvr = vr::VR_Init(&eError, vr::VRApplication_Utility);
  vr::TrackedDevicePose_t result[5];
  pvr->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding, 0, result, 5);
  result[1].eTrackingResult; // right controller tracking result
  result[2].eTrackingResult; // left controller tracking result
  result[3].eTrackingResult; // right wrist tracker tracking result
  result[4].eTrackingResult; // left wrist tracker tracking result

Sample 2:

  vr::TrackedDevicePose_t result[5];
  vr::VRCompositor()->GetLatestPoses(result, 5, NULL, 0);
  result[0].eTrackingResult; // HMD tracking result
  result[1].eTrackingResult; // right controller tracking result
  result[2].eTrackingResult; // left controller tracking result
  result[3].eTrackingResult; // right wrist tracker tracking result
  result[4].eTrackingResult; // left wrist tracker tracking result
  15. Hi @Ryam, Could you provide your e-mail? We can send you an app built with UE 4.27 and OpenXR; you can use it to check whether eye gaze works on your device. (Use the Focus 3 with VBS.) Thanks.