Showing results for tags 'vivepro'.

  1. Hello. I am a university student working on my final project using an HTC Vive Pro. The goal of the project is to run a segmentation algorithm such as SAM on the headset, using the footage from the front-facing cameras. I would need to access the camera images, process them, and then display the processed result back on the headset's screens. I was hoping you could point me toward development tools, documentation, or projects that could help me achieve this. Thank you.
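
     For camera access on the Vive Pro, the usual route is HTC's SRWorks SDK, which exposes the headset's front stereo cameras to Unity (several posts further down this page use it). Below is a very rough sketch of the capture-segment-redisplay loop described above; CameraSource and its method are hypothetical placeholders for whatever image-capture API your SRWorks version provides, and the segmentation step is left as a stub:

     using UnityEngine;

     // Sketch of the capture -> segment -> redisplay loop described above.
     public class SegmentedPassthrough : MonoBehaviour
     {
         public Renderer leftEyeQuad;    // quads parented to the HMD camera rig
         public Renderer rightEyeQuad;

         void Update()
         {
             // Hypothetical call: fetch the latest undistorted stereo frames.
             if (!CameraSource.TryGetFrames(out Texture2D left, out Texture2D right)) return;

             // Run the segmentation model (SAM etc.). For real-time use this would
             // normally run asynchronously, e.g. via Unity Barracuda or a native
             // inference plugin, with reused textures rather than new allocations.
             leftEyeQuad.material.mainTexture = Segment(left);
             rightEyeQuad.material.mainTexture = Segment(right);
         }

         private Texture2D Segment(Texture2D src)
         {
             return src; // stub: replace with model inference + mask compositing
         }
     }

     // Stub standing in for the real camera API; wire up the SRWorks
     // image-capture calls of your SDK version here.
     internal static class CameraSource
     {
         public static bool TryGetFrames(out Texture2D left, out Texture2D right)
         {
             left = null; right = null;
             return false;
         }
     }
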
  2. The documentation states that the VIVE Pro series supports hand tracking on UE. However, I cannot get it to work with the sample project. The camera is enabled in SteamVR and the camera test passes. The sample project works well with a Focus 3, so I suspect it's more of a hardware or setup issue. Hand Tracking - Developer Resources (vive.com) Download: OpenXR for Unreal - Developer Resources (vive.com)
  3. I am wondering how I can calibrate the Vive Pro Eye for a single eye only, that is, with one eye closed. So far I am using the SRanipal_Eye_API.LaunchEyeCalibration() function to calibrate, but it fails if I have one eye closed.
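
     For reference, the standard calibration launch looks like the sketch below (names follow the SRanipal Unity samples; verify the exact signature against your SDK version). The built-in calibration expects both eyes open, so failing with one eye closed is expected behavior rather than a bug in the call; a common workaround is to calibrate with both eyes open and then read only the per-eye data (verbose_data.left or verbose_data.right) at runtime:

     using System;
     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class EyeCalibrationHelper : MonoBehaviour
     {
         // Launches SRanipal's built-in calibration UI and logs the result code.
         public void Calibrate()
         {
             int result = SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);
             if (result != (int)ViveSR.Error.WORK)
                 Debug.LogWarning("Eye calibration failed, error code: " + result);
         }
     }
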
  4. Hi, I've been trying to get hand detection going for my project. This is going to be a bit lengthy, because I've already tried so many things, so here goes.

     First I started with hand tracking in OpenXR, following the tutorial described here: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/how-integrate-hand-tracking-data-your-hand-model/ I had to fix a couple of problems in the tutorial code, notably adding a declaration for locations (private static XrHandJointLocationsEXT locations;) and a few other issues. The script now compiles and runs, but when I call res = feature.xrLocateHandJointsEXT(feature.m_leftHandle, locateInfo, ref locations);, locations.isActive is always 0. I've debugged as far as I could, but I just don't seem to be getting hand-tracking updates. Any suggestions would be very much appreciated, as I am not very familiar with OpenXR or Unity. I'm using Unity 2021.3.5f1. The camera is enabled in SteamVR settings and tested. It looks like the hands are either not detected or the camera input is not reaching OpenXR.

     I also tried creating a completely new project and just running the sample code, launching HandTrackingScene.unity from the samples folder that can be imported with VIVE OpenXR for Windows. That also didn't work; same thing, the hands don't even appear. And the same happens in Unity 2020.3.13f1. All I do is follow this tutorial: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/installing-vive-openxr-pc-vr-plugin/ and then the OpenXR setup part of this tutorial: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/how-integrate-hand-tracking-data-your-hand-model/?site=tw& Then I run the sample project and it does nothing. If I add a debug statement, it is obvious that line 46 in UpdateHandData.cs always evaluates to FALSE, so the branch of if (locations.isActive == 1) never executes. The source of public int xrLocateHandJointsEXT(ulong handTracker, XrHandJointsLocateInfoEXT locateInfo, ref XrHandJointLocationsEXT locations) inside HandTracking_OpenXR_API seems to be closed, so I don't know what happens beyond that point.

     One more useful thing to add: I can get hand tracking to work on Unity 2019.4.30f1 using the Hand Tracking SDK (https://hub.vive.com/storage/tracking/unity/setup.html), but that requires OpenVR, which is deprecated in later Unity versions. Unfortunately Unity 2019 is not an option for me, because I also need to run ROS-TCP-Endpoint, which only works on Unity 2020.2.x and above. I also tried importing the SteamVR plugin version 2.7.3 (SDK 1.14.15, February 23, 2021) into Unity 2021.3.5f1, which lets me use the old Hand Tracking SDK with OpenVR. But that gives me two new (albeit less severe) problems: 1) Unity crashes the moment I stop my app; 2) when I start my app, SteamVR displays a complaint in the headset about a missing actions.json file in the "Assets\StreamingAssets\SteamVR" folder. The error says the action manifest for [Testing]<my project name> was not found. I created an empty actions.json file there and it still can't find it, so I don't know what else to try.

     I desperately need help on this, please! I am working with a VIVE Pro headset on Windows 11. Thanks.
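
     A bare-bones probe of the same call path, using only the identifiers quoted above, can help narrow down where isActive stops being set. Everything not named in the post (how locateInfo gets its base space and time, how the joint buffer in locations is sized) is an assumption about the VIVE OpenXR plugin and must be checked against the tutorial code:

     using UnityEngine;
     using UnityEngine.XR.OpenXR;

     // Assumes HandTracking_OpenXR_API is the OpenXR feature class from the
     // tutorial and that the tutorial's setup code has already initialized the
     // static 'locations' buffer (joint count and joint array) and 'locateInfo'
     // (base space and predicted display time); if either is left at its
     // default, isActive == 0 is the expected result.
     public class HandJointProbe : MonoBehaviour
     {
         private HandTracking_OpenXR_API feature;
         private static XrHandJointLocationsEXT locations;
         private XrHandJointsLocateInfoEXT locateInfo;   // fill in per the tutorial

         void Start()
         {
             feature = OpenXRSettings.Instance.GetFeature<HandTracking_OpenXR_API>();
             if (feature == null || !feature.enabled || feature.m_leftHandle == 0)
                 Debug.LogError("Hand-tracking feature missing, disabled, or tracker handle not created.");
         }

         void Update()
         {
             if (feature == null) return;
             int res = feature.xrLocateHandJointsEXT(feature.m_leftHandle, locateInfo, ref locations);
             // Success with isActive == 0 means the runtime sees the tracker but no
             // hands; a non-success code points at setup rather than detection.
             Debug.Log($"xrLocateHandJointsEXT result={res}, isActive={locations.isActive}");
         }
     }
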
  5. Hi, I wanted to ask why the Scene Understanding example is not working for me. I am using Unity 2021.3.2f1 and have added the VIVE OpenXR plugin, but I still cannot play the demo sample. I then tried skipping the sample and building a custom scene like the tutorial, but the script keeps disabling itself because it cannot find any XRMeshSubsystem using SubsystemManager.GetInstances(). Is there anything I'm missing? My OpenXR Plug-in Management settings are attached as a screenshot. Here is my script for MeshingTeapotFeature:

     using System;
     using System.Collections;
     using System.Collections.Generic;
     using System.Runtime.InteropServices;
     using UnityEditor;
     using UnityEditor.XR.OpenXR.Features;
     using UnityEngine;
     using UnityEngine.XR;
     using VIVE.SceneUnderstanding;

     namespace UnityEngine.XR.OpenXR.Samples.MeshingFeature
     {
     #if UNITY_EDITOR
         [OpenXRFeature(UiName = "Meshing Subsystem",
             BuildTargetGroups = new[] { BuildTargetGroup.Standalone, BuildTargetGroup.WSA, BuildTargetGroup.Android },
             Company = "HTC",
             Desc = "Example extension showing how to supply a mesh from native code with OpenXR SceneUnderstanding functions.",
             DocumentationLink = "https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/interact-real-world-openxr-scene-understanding/",
             OpenxrExtensionStrings = "",
             Version = "0.0.1",
             FeatureId = featureId)]
     #endif
         public class MeshingTeapotFeature : SceneUnderstanding_OpenXR_API
         {
             public new const string featureId = "com.unity.openxr.feature.example.meshing";
             private static List<XRMeshSubsystemDescriptor> s_MeshDescriptors = new();

             protected override void OnSubsystemCreate()
             {
                 CreateSubsystem<XRMeshSubsystemDescriptor, XRMeshSubsystem>(s_MeshDescriptors, "Sample Meshing");
             }

             /// <inheritdoc />
             protected override void OnSubsystemStart() { StartSubsystem<XRMeshSubsystem>(); }

             /// <inheritdoc />
             protected override void OnSubsystemStop() { StopSubsystem<XRMeshSubsystem>(); }

             /// <inheritdoc />
             protected override void OnSubsystemDestroy() { DestroySubsystem<XRMeshSubsystem>(); }

             protected override void OnSessionCreate(ulong xrSession)
             {
                 m_XrSession = xrSession;
                 NativeApi.SetOpenXRVariables(m_XrInstance, m_XrSession,
                     Marshal.GetFunctionPointerForDelegate(m_XrEnumerateReferenceSpaces),
                     Marshal.GetFunctionPointerForDelegate(m_XrCreateReferenceSpace),
                     Marshal.GetFunctionPointerForDelegate(m_XrDestroySpace),
                     Marshal.GetFunctionPointerForDelegate(m_XrEnumerateSceneComputeFeaturesMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrCreateSceneObserverMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrDestroySceneObserverMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrCreateSceneMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrDestroySceneMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrComputeNewSceneMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrGetSceneComputeStateMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrGetSceneComponentsMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrLocateSceneComponentsMSFT),
                     Marshal.GetFunctionPointerForDelegate(m_XrGetSceneMeshBuffersMSFT));

                 systemProperties.type = XrStructureType.XR_TYPE_SYSTEM_PROPERTIES;
                 XrSystemPassThroughPropertiesHTC SystemPassThroughPropertiesHTC;
                 SystemPassThroughPropertiesHTC.type = XrStructureType.XR_TYPE_SYSTEM_PASS_THROUGH_PROPERTIES_HTC;
                 unsafe { systemProperties.next = (IntPtr)(&SystemPassThroughPropertiesHTC); }

                 int res = xrGetSystemProperties(ref systemProperties);
                 if (res != (int)XrResult.XR_SUCCESS)
                 {
                     Debug.Log("Failed to get system properties with error code: " + res);
                 }
             }

             public void SetSceneComputeOrientedBoxBound(Transform transform, Vector3 extent)
             {
                 Vector4 rotation;
                 Vector3 position;
                 ConvertTransform(transform, out rotation, out position);
                 NativeApi.SetSceneComputeOrientedBoxBound(rotation, position, extent);
             }

             private void ConvertTransform(Transform transform, out Vector4 rotation, out Vector3 position)
             {
                 // Get left-handed values.
                 position = transform.position;
                 float angle;
                 Vector3 axis;
                 transform.rotation.ToAngleAxis(out angle, out axis);
                 // Convert left-handed values to right-handed.
                 position.z *= -1;
                 angle *= -1;
                 axis.z *= -1;
                 var rotationQuaternion = Quaternion.AngleAxis(angle, axis);
                 rotation = Vector4.zero;
                 for (var i = 0; i < 4; ++i) rotation[i] = rotationQuaternion[i];
             }

             private class NativeApi
             {
                 const string dll_path = "MeshingFeaturePlugin";

                 [DllImport(dll_path)]
                 public static extern void SetOpenXRVariables(ulong instance, ulong session,
                     IntPtr PFN_XrEnumerateReferenceSpaces, IntPtr PFN_XrCreateReferenceSpace, IntPtr PFN_XrDestroySpace,
                     IntPtr PFN_XrEnumerateSceneComputeFeaturesMSFT, IntPtr PFN_XrCreateSceneObserverMSFT,
                     IntPtr PFN_XrDestroySceneObserverMSFT, IntPtr PFN_XrCreateSceneMSFT, IntPtr PFN_XrDestroySceneMSFT,
                     IntPtr PFN_XrComputeNewSceneMSFT, IntPtr PFN_XrGetSceneComputeStateMSFT,
                     IntPtr PFN_XrGetSceneComponentsMSFT, IntPtr PFN_XrLocateSceneComponentsMSFT,
                     IntPtr PFN_XrGetSceneMeshBuffersMSFT);

                 [DllImport(dll_path)]
                 public static extern void SetSceneComputeOrientedBoxBound(Vector4 rotation, Vector3 position, Vector3 extent);
             }
         }
     }

     And here is the script for MeshingBehaviour, which I named SceneUnderstanding_Sample:

     using VIVE.SceneUnderstanding;
     using System.Collections;
     using System.Collections.Generic;
     using UnityEngine;
     using UnityEngine.XR;
     using UnityEngine.XR.OpenXR;
     using UnityEngine.XR.OpenXR.Samples.MeshingFeature;
     using UnityEngine.SubsystemsImplementation;
     using UnityEngine.InputSystem;

     public class SceneUnderstanding_Sample : MonoBehaviour
     {
         public GameObject emptyMeshPrefab_Default;
         public Transform target;
         private XRMeshSubsystem s_MeshSubsystem;
         private List<MeshInfo> s_MeshInfos = new List<MeshInfo>();
         private Dictionary<MeshId, GameObject> m_MeshIdToGo = new Dictionary<MeshId, GameObject>();
         private MeshingTeapotFeature m_MeshingFeature;

         // Scene-compute bound variable: a default cube game object.
         public GameObject m_BoxBoundObject;

         // Start is called before the first frame update
         void Start()
         {
             m_MeshingFeature = OpenXRSettings.Instance.GetFeature<MeshingTeapotFeature>();
             if (m_MeshingFeature == null || m_MeshingFeature.enabled == false)
             {
                 enabled = false;
                 return;
             }
             Debug.Log(m_MeshingFeature.name);
             var meshSubsystems = new List<XRMeshSubsystem>();
             SubsystemManager.GetSubsystems(meshSubsystems);
             if (meshSubsystems.Count == 1)
             {
                 s_MeshSubsystem = meshSubsystems[0];
                 // textMesh.gameObject.SetActive(false);
             }
             else
             {
                 enabled = false;
             }
         }

         // Update is called once per frame
         void Update()
         {
             if (m_BoxBoundObject == null) return;
             // The width of a default cube is 1.0f.
             m_MeshingFeature.SetSceneComputeOrientedBoxBound(m_BoxBoundObject.transform, m_BoxBoundObject.transform.localScale);

             if (s_MeshSubsystem.running && s_MeshSubsystem.TryGetMeshInfos(s_MeshInfos))
             {
                 foreach (var meshInfo in s_MeshInfos)
                 {
                     switch (meshInfo.ChangeState)
                     {
                         case MeshChangeState.Added:
                         case MeshChangeState.Updated:
                             if (!m_MeshIdToGo.TryGetValue(meshInfo.MeshId, out var go))
                             {
                                 go = Instantiate(emptyMeshPrefab_Default, target, false);
                                 m_MeshIdToGo[meshInfo.MeshId] = go;
                             }
                             var mesh = go.GetComponent<MeshFilter>().mesh;
                             var col = go.GetComponent<MeshCollider>();
                             s_MeshSubsystem.GenerateMeshAsync(meshInfo.MeshId, mesh, col, MeshVertexAttributes.None,
                                 result => { result.Mesh.RecalculateNormals(); });
                             break;
                         case MeshChangeState.Removed:
                             if (m_MeshIdToGo.TryGetValue(meshInfo.MeshId, out var meshGo))
                             {
                                 Destroy(meshGo);
                                 m_MeshIdToGo.Remove(meshInfo.MeshId);
                             }
                             break;
                         case MeshChangeState.Unchanged:
                         default:
                             break;
                     }
                 }
             }
         }
     }

     P.S. I also tried the example for SRWorks and I can't get it to Play either.
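
     For what it's worth, when the sample disables itself like this it usually means the meshing feature's OnSubsystemCreate never ran, so there is no XRMeshSubsystem instance to find. A minimal probe along these lines (standard Unity XR calls; note that SubsystemManager.GetInstances() is deprecated in Unity 2021, GetSubsystems is the current API) can confirm whether the subsystem was created at all:

     using System.Collections.Generic;
     using UnityEngine;
     using UnityEngine.XR;

     // Diagnostic sketch: lists every registered XRMeshSubsystem descriptor and
     // live instance. If both lists are empty, the OpenXR feature never created
     // its subsystem (feature unchecked, wrong build target, or no XR session).
     public class MeshSubsystemProbe : MonoBehaviour
     {
         void Start()
         {
             var descriptors = new List<XRMeshSubsystemDescriptor>();
             SubsystemManager.GetSubsystemDescriptors(descriptors);
             foreach (var d in descriptors) Debug.Log("Mesh subsystem descriptor: " + d.id);

             var instances = new List<XRMeshSubsystem>();
             SubsystemManager.GetSubsystems(instances); // replaces the deprecated GetInstances()
             Debug.Log("Live XRMeshSubsystem instances: " + instances.Count);
         }
     }
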
  6. I downloaded Viveport and it says I need an app to open my Viveport link, and I cannot get onto Viveport without it.
  7. Hello everybody. I am writing to ask for your assistance. I want to develop AR applications for the Vive Pro in Unity. Although I have read the documentation on the HTC Vive website, I did not find any docs related to, say, superimposing a virtual object onto the real world. I would be grateful if you could help me with that.
  8. Hi, I work at a university and we are looking to deploy this software across some VR suites in one of our buildings. I have tried various methods but can't find a silent-install switch to use for deploying it across these machines. Is there a silent switch I can use, or is there an .msi installer, etc.? I'm looking for something that will make silent deployment of the linked software possible.
  9. Hi! I have been using the VIVE Pro facial tracker for a while. I want to get the name, the label, or at least the weight (the unique data) of the facial expression shown on screen in the Unity Engine when the avatar replicates the facial expression. I have edited the code at line 39 of the script named SRanipal_AvatarLipSample.cs. Here are some of the results in the console: the value 0 is shown when I make no facial expression, and other values are shown when I make other facial expressions. Am I doing this the right way, i.e. are these the values of the real-time facial expressions, or does it show something related to the Mouth_Smile_Right expression only? Looking forward to a kind reply. Thanks!
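
     If the goal is to know which expression a given number belongs to, rather than editing one line of the sample, it may be easier to dump every weighting together with its enum name. A minimal sketch, assuming the v2 lip API from the same SRanipal SDK (check the framework-status guard against your SDK version):

     using System.Collections.Generic;
     using UnityEngine;
     using ViveSR.anipal.Lip;

     // Logs every blend-shape weight by name, so you can see which expression
     // a given value belongs to instead of indexing a single shape.
     public class LipWeightLogger : MonoBehaviour
     {
         private Dictionary<LipShape_v2, float> weightings;

         void Update()
         {
             if (SRanipal_Lip_Framework.Status != SRanipal_Lip_Framework.FrameworkStatus.WORKING) return;
             if (SRanipal_Lip_v2.GetLipWeightings(out weightings))
             {
                 foreach (var pair in weightings)
                     if (pair.Value > 0.1f)                        // only log active shapes
                         Debug.Log($"{pair.Key}: {pair.Value:F2}"); // e.g. Mouth_Smile_Right: 0.63
             }
         }
     }
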
  10. Hello, I am trying to build a VR application using Unity 2019.4.1 and the Vive Pro Eye. Assume I have a red sphere in the scene; I want to detect when the user looks at the sphere and then change its color to green. Using the libraries ViveSR.anipal.Eye, ViveSR.anipal, and ViveSR, I can read and check the data in eyeData.verbose_data, including gaze direction, gaze origin, eye openness, and more. I would appreciate any guidance and support in implementing the example explained above. Thanks.
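
     A minimal sketch of that gaze-then-recolor loop, assuming the SRanipal_Eye.GetGazeRay helper that ships with the same SDK (the sphere needs a Collider, and the SRanipal eye framework must be running in the scene):

     using UnityEngine;
     using ViveSR.anipal.Eye;

     // Attach to the sphere. Casts the combined gaze ray into the scene each
     // frame and turns this object green while it is being looked at.
     public class GazeColorChanger : MonoBehaviour
     {
         private Renderer rend;

         void Start()
         {
             rend = GetComponent<Renderer>();
             rend.material.color = Color.red;
         }

         void Update()
         {
             // GetGazeRay returns the ray in head-local space, so transform it
             // into world space through the HMD camera before raycasting.
             if (!SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction)) return;
             Transform cam = Camera.main.transform;
             Vector3 worldOrigin = cam.TransformPoint(origin);
             Vector3 worldDir = cam.TransformDirection(direction);

             bool lookedAt = Physics.Raycast(worldOrigin, worldDir, out RaycastHit hit, 20f)
                             && hit.collider.gameObject == gameObject;
             rend.material.color = lookedAt ? Color.green : Color.red;
         }
     }
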
  11. Hello. I'm a Japanese student researching whether VR sickness can be reduced by adjusting the frame rate. I'm sorry for my poor English; I'm running this through a translator. Let's cut to the chase. Using Unity, I was able to successfully change the FPS on my PC screen with the following method:

     void Start()
     {
         Application.targetFrameRate = 30; // 30 fps
         QualitySettings.vSyncCount = 0;   // it works without this line too
     }

     The vSync setting in Unity has also been changed to "Don't Sync". However, when I connected the Vive Pro, Unity automatically set Application.targetFrameRate = -1. The value -1 seems to mean automatic adjustment, so the FPS becomes 90, which is the Vive Pro's refresh rate. But I want the FPS value to vary between 10 and 90. Is there any way to prevent Application.targetFrameRate = -1? I would appreciate your advice.
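
     Unity overrides Application.targetFrameRate while an XR device is active because the headset compositor drives frame timing, so writing the field back has no effect. A workaround used in frame-rate studies is to throttle the application yourself by blocking at the end of each frame; the compositor then re-shows the last submitted image. A minimal sketch (deliberately crude: the panel still scans at 90 Hz, only the app's submission rate drops, and Thread.Sleep on the main thread will trigger reprojection):

     using System.Collections;
     using System.Diagnostics;
     using System.Threading;
     using UnityEngine;

     // Caps the effective frame rate by sleeping at the end of every frame.
     public class FrameLimiter : MonoBehaviour
     {
         [Range(10, 90)] public int targetFps = 30;
         private readonly Stopwatch watch = new Stopwatch();

         IEnumerator Start()
         {
             watch.Start();
             var endOfFrame = new WaitForEndOfFrame();
             while (true)
             {
                 yield return endOfFrame;
                 long budgetMs = 1000L / targetFps;
                 long leftMs = budgetMs - watch.ElapsedMilliseconds;
                 if (leftMs > 0) Thread.Sleep((int)leftMs); // crude; busy-wait for finer control
                 watch.Restart();
             }
         }
     }
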
  12. Hello. Recently my Vive Pro cable started glitching and doing all sorts of strange things, making it impossible for me to play VR, so I decided to look for a new cable, as you usually do, and of course it's out of stock everywhere. So I opened a ticket with Vive support instead, since surely my warranty couldn't have run out after only just over a year of use. I described all my problems, which made it obvious that the cable was the issue, but no, I had to go through all kinds of checks for nothing until they eventually said the cable was the problem. I thought, thank god, now I can finally get a new cable. But no: apparently there is only a 1-year warranty on the cable, while the whole headset, if I'm not wrong, has a 2-year warranty. And how much do I have to pay? 106 euros for a cable that costs 51 euros in the shop? What is the markup for, you might wonder: "labour and logistics fees". I didn't even send the cable in, because I knew it was the problem and that I needed a new one. I didn't know I was getting gold-plated cables. Well, I probably won't buy more Vive gear in the future, that's for sure.
  13. Hi, so far I have been working only on the deployment of standalone applications, using the "Corporate Content" feature of the VIVE Business Device Management System. However, my company has a new client who wants the same applications on a PC-based system (Vive Pro 2). We have the application working, but we are not sure which would be the best method to deploy, update, and manage applications. The VIVE DMS only allows me to upload APKs, so it doesn't look like an option. The Driver Deployment System seems to work only with drivers and within a local network (not remotely). What do you use for device management and application deployment in a Windows environment? What would you recommend?
  14. I have an HTC VIVE Pro whose link box broke recently, so I ordered another one. When I connected everything, the base stations were recognized as Base Station 1.0 in the SteamVR app; their icons turn green, but they are the Base Station 1.0 icons, and because of this, room setup doesn't recognize the headset. It says the controllers are ready but the headset is not. I rescanned for 2.0 base stations and did a channel reset with a paper clip on the back of the bases; nothing changed. I connected the base stations with a micro-USB cable; nothing happened. I tried reinstalling Windows, Steam, and SteamVR; again nothing changed. What's the problem, and what can I do? Please help.
  15. Hello guys. I'm asking for your assistance in choosing a laptop for developing simple VR games in Unity for the HTC Vive Pro. For your information, the Vive Pro supports DisplayPort only. The laptop I am looking for must not only provide a DisplayPort 1.2+ output, but that port must also be wired directly to the discrete GPU. Thank you in advance.
  16. Hey folks, I've had the Vive Pro with the wireless adapter since wireless came out, with mostly infrequent issues. However, I just booted up the Vive for the first time in a while and I'm now getting the gray-screen disconnect within minutes of starting any graphically intensive game. Specifically, I have been testing with Alyx, because I know it used to work without issue; I played all the way through it when it came out. When I get the gray screen, the screen goes gray (duh), and then there is an animation like an old CRT monitor turning off, where everything constricts to black sequentially but quickly. When this happens I DO still have sound, and I DO still have tracking; it's only the video that stops working. The only trend I've spotted is that in Resource Monitor I see the graphics card usage spike to 100% immediately before the disconnect. My specs: MSI Z370 SLI + 8700K @ 5.0 GHz, EVGA 1080 Ti, 32 GB DDR4-3200 RAM CL16. Things I have already tried (in no particular order): 1. Resetting all OCs to stock settings. 2. Uninstalling and reinstalling the wireless driver. 3. Ensuring there is no power management for wireless or PCI slots. 4. Ensuring everything is set to PCIe 3 in the BIOS. 5. Ticking enable in the pcieconfig app that comes with the wireless kit. 6. Changing the wireless mode. 7. Changing PCI slots. 8. Making sure that the antenna is well above me and pointing down. 9. Installing a fan on the wireless adapter in case it was heat related (it wasn't). 10. Trying a new USB cable. 11. Trying different batteries. 12. Setting the SteamVR controller turn-off setting to never. 13. Checking the SteamVR logs for excessive reflections. 14. Ensuring the antenna is screwed in tightly. Any help would be greatly appreciated! SteamVR-2020-09-16-PM_11_01_17.txt HtcCU_20200916_222730_25584.txt
  17. Hi, I'm using the SRWorks SDK, version 0.9.0.3, and getting the following error in the demo scene of ViveSR_Experience in Unity 2019.2.5f1 when hitting Play: "Runtime Error! Program: This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information." Unity quits when I hit the OK button. I tried following the guide "VIVE Getting Started with SRWorks Experience in Unity", but the error occurs anyway. I'm using the Vive Pro Eye, if that matters. The camera has been activated and successfully tested in SteamVR 1.11.13, and the SRWorks runtime has been installed as well. Any help is much appreciated.
  18. Dear all, we have been using WebXR a lot, either through Google Chrome or Edge. Firefox currently does not support WebXR. At the moment both Edge and Chrome (83) return "VR not supported", although the HTC Vive is connected and SteamVR and "Viveport VR", when started manually, do run and show a scene. Does anyone else face the same issue? Example: https://threejs.org/examples/?q=webxr#webxr_vr_cubes
  19. Hello, I bought a Vive Pro Eye to do eye tracking in Unity. But when I start the eye-tracking calibration in SteamVR and adjust the headset up and down to fit the frame, the image does not change no matter how I move the headset, and I can't go to the next step; it just stops. In addition, the camera is not recognized in the SR runtime. (I connect my Vive to the same graphics card the monitor is connected to.) I have reinstalled several times after deleting all the programs (SteamVR, SR runtime, etc.). SteamVR also reports "Camera communication failed." I think it is a problem with headset or camera recognition. No matter how much I search, I can't find a solution. Please help me. I'm using the following desktop computer: GeForce Nvidia GTX 1080 Ti 11 GB VRAM, Intel Core i5-4670 CPU, 16 GB RAM, Windows 10 Pro (64-bit).
  20. Hello everyone! We are having issues trying to use Vive headsets in a multi-user environment. We have done this successfully in the past, but in our experience things get messed up whenever there are updates to SteamVR or firmware upgrades to the headsets, controllers, trackers, etc. The HTC documentation is conflicting and unclear. For example, the official website states that Vive/Vive Pro headsets can be used together in a multi-user environment, and the only requirement mentioned is 2 base stations for a 5x5 m area and 4 base stations for a 10x10 m area. Link: https://www.vive.com/us/support/wireless-adapter/category_howto/multiuser-environment-examples.html This conflicts with the Vive Pro documentation guide, which states that only Vive Pro headsets are recommended for a multi-user environment and that base station 2.0 is required. Link: https://dl4.htc.com/Web_materials/Manual/Vive_Pro(Enterprise)/UserGuide/VIVE_Pro_User_Guide.pdf In our case, we have been unable to use the HTC Vive headsets in a multi-user environment, as one headset keeps showing a green screen (no display output). So far we have tested the following scenarios using official HTC Vive products. PC1 and PC2 are both built to the same specs: Motherboard: ASUS TUF H370 Pro Gaming; Processor: Intel Core i5-9400F; RAM: Corsair Vengeance 8GB DDR4 2400MHz; GPU: Nvidia GeForce RTX 2060; Storage: WD Green 240GB SATA III 6Gb/s. Scenario 1: Base stations: 2x V1.0; PC1: HTC Vive with HTC Wireless kit; PC2: HTC Vive Pro with HTC Wireless kit. Result: Vive wireless = green screen, Vive Pro wireless = working. Scenario 2: Base stations: 2x V1.0; PC1: HTC Vive Pro with HTC Wireless kit; PC2: HTC Vive with HTC Wireless kit. Result: Vive wireless = green screen, Vive Pro wireless = working. Scenario 3: Base stations: 2x V1.0; PC1: HTC Vive wired; PC2: HTC Vive Pro with HTC Wireless kit. Result: Vive wired = green screen, Vive Pro wireless = working. Changing base stations to V2.0. Scenario 1: Base stations: 2x V2.0; PC1: HTC Vive Pro wired; PC2: HTC Vive Pro with HTC Wireless kit. Result: Vive Pro wired = green screen, Vive Pro wireless = working. The play area has sufficient space, approximately 6x4 m, and each headset has more than 2x2 m of spacing. Everything has been set up according to HTC's guidelines (we have been using Vive headsets for 2 years now, so we can be considered experienced users). We have spent thousands of dollars purchasing premium HTC Vive products; we have multiple headsets, base stations, controllers, trackers, etc., but our problem is not being resolved and HTC support is unable to understand it. They have no solution other than referring us to unhelpful documentation, which is not only conflicting but which we have already worked through many times, trying every scenario. Our activity is already running very late, we have had no help from official HTC support, and our issue remains unresolved. Can anyone suggest a solution to our problem? We would be really grateful for any help. Thanks!
  21. Hi, could you tell me the specs of the Vive Pro's camera? I would like to know the number of pixels.
  22. Hello, I own a Vive Pro with two 1.0 controllers and two 1.0 base stations. I would like to know whether it is possible to add a 2.0 base station to better track my HMD in my seated zone. I already know that 2.0 stations cannot track 1.0 controllers, but that doesn't matter for my setup; I only need the HMD to be tracked by the 2.0 base station. Thanks for the answers 🙂 Have a great day! Fabien.
  23. I just received my Vive Pro yesterday, and the setup was flawless and without problems. The controllers and the HMD were updated by SteamVR to their latest firmware, so all looked fine. But while playing around with the Chaperone, I noticed that my dual front cameras are not working, or at least cannot be enabled; I always get a "Camera Communication Error" in SteamVR when testing the camera. So here is the setup and a description of everything (the info is also provided in the attached screenshots). I am using Win10 64-bit (win10_1909_build_18363.535) and SteamVR version 1.9.16. Windows Device Manager shows the Vive Pro Multimedia Camera (driver version 10.0.18362.1, dated 21.06.2006); the device has VID/PID 0BB4&PID_030E and is connected to a USB 3 SuperSpeed port. I also disabled power management for this port (as instructed in the VIVE online documentation). The Windows USB drivers for the SuperSpeed USB hubs are up to date (version 10.0.18362.1, dated 18.03.2019). Also attached is the log file from the SteamVR developer module. I checked the Windows 10 privacy settings for cameras (everything is enabled, but no special entry for Vive or Steam can be found there), and to make sure, I tested whether the camera is visible/available to other Windows applications. And yes, the camera gets detected as a device in e.g. Corel VideoStudio, but I could not capture anything, though that could be normal. So, what can be done? Are there any tools or programs with which I can test whether the camera is just dead or whether this is only a software issue, or anything else I could try? Best regards, and thank you for any help; much appreciated. Steam_VR_LOG.txt