Everything posted by HackPerception

  1. @Reimus Unfortunately this sounds very much like a PC-specific error, as it survived the clean install. The HMD itself is basically a glorified display with sensors - they generally either work or they don't, and aside from perhaps USB/DP bandwidth issues, I can't think of anything that would cause this behavior on the HMD side. You can try using older Nvidia drivers, but there really aren't a ton of definitive steps beyond disabling motion smoothing and using legacy reprojection. If you Google the terms "motion smoothing, SteamVR, and lag" you'll see a lot of people reporting similar issues, but unfortunately not a lot of actionable steps to take. Motion smoothing is a Valve technology - they may have additional info if you reach out to their support or post on the SteamVR forums.
  2. @Dan1B1ue , No - unfortunately, due to the complexity of how OEMs build and integrate laptops, we are currently only able to offer the Wireless Adapter via a desktop PCIe card. That's how we achieve the greatest level of compatibility across the largest range of hardware, since PCIe is about as universal as you can get. Unfortunately, there is no way to hack or mod the current wireless adapter to work with a laptop.
  3. Hi @rheamw , You can use what's called a null HMD driver. Triad (the company Valve contracts to make the SteamVR sensors) released an updated guide to the recommended workflow within the last week - I haven't been able to test their new flow yet. In any case, the null driver is the way to achieve this. I'm not super sure about the Python side of things, but there's a rough sketch of polling tracker poses from Python below. Please keep in mind: each tracker will require a corresponding Vive Tracker dongle for RF, and the 2.0 basestations are only compatible with Vive Tracker (2018), which is easily identified by the blue Vive logo/power button. The first-gen trackers have a grey logo and won't work with 2.0 tracking.
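     A minimal sketch of the Python side, assuming the community pyopenvr bindings (the `openvr` package on PyPI) and a SteamVR session already running on the null driver. I haven't validated this against Triad's new workflow, and the pose-array call convention has shifted a bit between pyopenvr releases, so treat it as a starting point rather than a drop-in script:

     ```python
     # Assumes pyopenvr (pip install openvr) and SteamVR running with the null HMD driver.
     # Older pyopenvr releases fill the pose array in place; newer ones also return it.
     import time
     import openvr

     openvr.init(openvr.VRApplication_Other)      # background app, no rendering
     system = openvr.VRSystem()

     try:
         while True:
             poses = (openvr.TrackedDevicePose_t * openvr.k_unMaxTrackedDeviceCount)()
             system.getDeviceToAbsoluteTrackingPose(
                 openvr.TrackingUniverseStanding, 0, poses)

             for i in range(openvr.k_unMaxTrackedDeviceCount):
                 pose = poses[i]
                 if (pose.bPoseIsValid and
                         system.getTrackedDeviceClass(i) ==
                         openvr.TrackedDeviceClass_GenericTracker):
                     mat = pose.mDeviceToAbsoluteTracking.m
                     # Translation is the last column of the 3x4 row-major matrix.
                     print(f"tracker {i}: x={mat[0][3]:.3f} y={mat[1][3]:.3f} z={mat[2][3]:.3f}")
             time.sleep(0.05)
     finally:
         openvr.shutdown()
     ```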
  4. In practice, using the External Tracking faceplate is the only robust way to use knuckles with Cosmos. The ET faceplate is a requirement to use knuckles with Cosmos. There are tools like OpenVR-SpaceCalibrator which can theoretically be used to hybridize different tracking systems into SteamVR using a bunch of math (essentially the alignment sketched below). The unfortunate reality is that the tool is a single-person open source project and SteamVR has been updating like crazy. In the last year SteamVR alone pushed out over 300 beta releases and a number of major releases, whereas the calibrator tool hasn't been updated in 13 months. Release 1.7.x specifically broke that tool in some instances. It'd basically be a full-time job to keep up with SteamVR's update cycle, and there's always the chance that updates could completely nuke tools like this even if you stay on top of it. You'd also need to figure out how to get knuckles to talk to your PC - I'm not sure how you'd do that beyond hacking Vive Tracker/Steam Controller Bluetooth dongles. If you try to "hack" knuckles to work with Cosmos without the ET faceplate, it's likely going to be way more trouble than it's worth and will likely not be stable if you're able to get it working at all.
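     For context, the "bunch of math" is essentially a rigid alignment between matched position samples from the two tracking systems. Here's a generic numpy sketch of that idea (a Kabsch/Procrustes solve) - this is illustrative, not code from the calibrator tool itself:

     ```python
     # Rigid alignment of two tracking spaces from matched position samples.
     import numpy as np

     def align_tracking_spaces(pts_a, pts_b):
         """Find R, t such that R @ p + t maps system-A positions onto system B.

         pts_a, pts_b: (N, 3) arrays of simultaneous positions of the same
         physical device as reported by tracking system A and system B.
         """
         ca, cb = pts_a.mean(axis=0), pts_b.mean(axis=0)
         H = (pts_a - ca).T @ (pts_b - cb)            # 3x3 cross-covariance
         U, _, Vt = np.linalg.svd(H)
         d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
         R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
         t = cb - R @ ca
         return R, t
     ```

     You'd collect simultaneous samples of one device as seen by each system, then apply the returned R and t to everything tracked by system A - which is exactly the part that breaks whenever SteamVR changes underneath the tool.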
  5. @Yann Leurent Leap Motion is no longer operational and their technology assets were purchased by a competitor called Ultraleap. Before the sale, LM had specifically shifted towards providing OEM partnership solutions so you could have an HMD with their technology baked in - it sounds like Ultraleap is still working with some OEMs, but it's still pretty early into the acquisition. The USB port on Vive Pro/Pro Eye will work when paired with wireless, and those HMDs are more suitable for enterprise and heavy usage. The problem here is specifically that USB isn't supported on Wireless + Cosmos - that's the sole incompatibility right now, and USB does work on wired Cosmos. Pro Eye + a Leap Motion would probably be a really powerful combo in the hands of a skilled developer.
  6. @davide445 I'd test it if I had one on hand but I haven't seen a loose LM unit in at least a year. Love their tech. I have used other USB devices like USB-C -> 3.5mm with Cosmos - just not LM.
  7. @davide445 - You can use the onboard USB on the Cosmos if you're using wired - not on wireless. It's a USB-C header. If you're thinking of using knuckles, I'd err on the side of ordering before HL: Alyx decimates their stock.
  8. I was able to confirm with our hardware R&D team that the USB port will unfortunately not work when using Vive Cosmos with the Wireless adapter because the combined current draw of the HMD and external devices is higher than the design spec of the wireless adapter. At this time, it doesn't look like something we'll be able to resolve via a firmware update. Sorry it took some time to confirm the behavior - everything is a little weird right now due to COVID. @ATAT @Lionel
  9. All, The faceplate will be available as a standalone purchase from Vive.com upon release for those with existing Cosmos HMDs.
  10. @Adalast - restarting the PC and power cycling the linkbox can help with garbled audio issues.
  11. @BtheGoldenNinja - This isn't a very common error, so there isn't a ton of info about it out there. In the few times I've seen it pop up on message boards, people have basically said that clean installing Windows was a good strategy, while others said there were BIOS settings (USB-related specifically) that they were able to modify. Here's an example thread where they discuss both potential tweaks. This is really system-specific and I've never seen it firsthand - only on boards. It seems to have happened more with Asus gear, so Asus may have more info on this.
  12. @komnidan - Please see my responses here, and here. In short, our SDK doesn't offer analytics tools out of the box, but you may be able to roll your own solution using our APIs depending on the level of hardware access you need. Otherwise, you can license the Tobii XR SDK from Tobii, which will give you deeper API access to the hardware and comes with a suite of Tobii-built analytics tools. The SDK package has HTML documentation for Unreal. @MariosBikos_HTC is more familiar with the Unreal implementation than I am and may have additional resources for you.
  13. @TomCgcmfc We are no longer manufacturing the original SKUs of the Pro and are selling through remaining stock. The Pro has been succeeded by the Pro Eye as our enterprise HMD, and the Pro Eye is OLED. The OLED situation is tricky because consumer feedback overwhelmingly indicated that the subpixel arrangement and the resulting screen-door effect were among the most important things to consumers, since they ultimately affect perceived resolution. You can see this trend replicated across most current-gen HMDs. To boot, OLED panels are only manufactured by a handful of companies worldwide, so there are some supply constraints when designing a product around those panels.
  14. @Tedblobbloke Which HMD is this for? The support pathway can vary by which HMD you're referring to. HDCP errors indicate that your GPU is failing to establish a secure, encrypted connection to the HMD. Commonly it's due to a bad HMD tether, one of the cables connecting to the linkbox, or the GPU simply not detecting the HMD correctly (for instance, the HMD booting into extended mode rather than direct mode). If it's an original Vive, the best strategy is to partially bypass the linkbox and plug the 3-in-1 cable's USB and HDMI connectors directly into the PC. That said, if it's a new HMD from us, a week-and-a-half-old headset would certainly be under warranty and you could probably get a tether swap or even an HMD swap via RMA.
  15. Hello @Aymeric - I'm generating a support ticket with the email address associated with your forum account so I can connect you with our content operations team. You can always reach out directly via store@viveport.com for devconsole and submission inquiries - that's our primary devrel and support email for the Viveport platform.
  16. Every inside-out VR tracking system uses machine learning, and there's ML baked into SteamVR tracking as well. Beyond just the line-of-sight issue, there's ML for sensor fusion with the IMUs, and there's also some ML at play to try and keep the input delay as small as possible. I personally think ML is going to be one of the biggest drivers for VR tech in the coming years - it's so incredibly powerful when deployed correctly, but with that potential for huge gain comes the potential for abuse and the erosion of privacy.
  17. @The_Fly Linux support for Pro Eye is unfortunately not roadmapped at this point in time within the scope of the current product, and I'm not sure what Tobii's stance on this is within the context of our licensing agreement. Our current SRAnipal SDK and runtimes are wrapped versions of core Tobii technologies that we're licensing (and that end-users also license as part of the Pro Eye HMD purchase). That unfortunately rules out open sourcing. Barring a change in Tobii's corporate philosophy, a lot of that tech stack including the runtime engine is proprietary to Tobii - Vive couldn't open source it if we wanted to; we just license it. There are actually two SDKs that can access the Pro Eye - you're also able to license the Tobii XR SDK/runtime directly from Tobii. It provides a much deeper level of API access and comes with a suite of analytics tools. I can't find any info on whether that SDK/engine currently supports Linux - I'd recommend reaching out and seeing if their first-party solution will fit your use case. I've been using Linux for 15 years - I'm there with you on the benefits and I only use Windows because of VR. The unfortunate truth is that Pro Eye is already a hyper-niche product in a niche industry - Linux support is an additional layer of niche, meaning the userbase would be impractically small at the moment. I've only seen this question pop up once or twice - we really haven't seen many people ask for this. That said, we appreciate the feedback because it's a solid data point I can feed back to the HW product team.
  18. @The_Fly - We've had a running post where we've posted solutions to the calibration initialization issue. In other cases we've supported people via email or via our enterprise support team (support.enterprise@htc.com), as Pro Eye is only sold as an enterprise product (which has separate support from consumer). If you're still having issues after trying the instructions there, PM me and we can schedule a call with one of our senior devrel guys. The primary source of this issue has been Windows UAC. I'd recommend running everything under a full admin account rather than using a standard account with the UAC admin prompt.
  19. @GP - I am PM'ing you the support contact that handles vive.com orders as that's pretty specific... If anybody on our team would know, it'd probably be them.
  20. @sebastian_holocafe Please copy your message and email support.enterprise@htc.com, as this is certainly a complex case.
  21. @GP, Yes - the Vive Wireless adapter works with Pro Eye (as long as you have the Pro wireless attach kit).
  22. @Damon - One piece of early internal feedback I received is that this could be related to thermals within the basestation, as the trend your data shows is similar to trends we see around thermals. Positional data from stations can have more variance while a station is starting up and warming up. They're recommending that you test with the stations fully warmed up to rule that out as a potential source of the problem. Basically, they're suggesting that you leave the stations running for 3 hours (you may need to disable Bluetooth power management so the stations don't go into standby) and then test to see if your pose estimation is more accurate and if you still see the drift you're reporting above. There's a rough sketch below for one way to put numbers on that comparison.
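      If you want to quantify it, one simple approach is to log the position of a stationary tracked device in both conditions (cold vs. fully warmed up) and compare the spread. A hypothetical numpy sketch, with the logging itself left out and the thresholds up to you:

      ```python
      # Compare jitter and total drift of a *stationary* device between runs.
      import numpy as np

      def drift_stats(positions):
          """positions: (N, 3) array of x/y/z samples in meters from a static device."""
          jitter_mm = positions.std(axis=0) * 1000.0                      # per-axis noise
          drift_mm = np.linalg.norm(positions[-1] - positions[0]) * 1000.0  # start-to-end drift
          return {"per_axis_jitter_mm": jitter_mm, "total_drift_mm": drift_mm}
      ```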
  23. @TrevorGage There are a couple of separate layers going on here:
      From the engine perspective, it's just standard multiplayer and doesn't differ much from non-VR multiplayer: you'd choose a multiplayer SDK or solution to your liking and build from there. In many cases, developers use the platform SDK for each specific content store (Oculus, Steamworks, etc.), but when you do that, each store build becomes increasingly different from the others. You can use a service like Photon Engine to handle multiplayer agnostically.
      What you're describing is co-present VR (co-presence). Co-present VR is not well supported - it's kind of a happy accident more than a fully supported feature within SteamVR. When doing co-present VR, the worldspace must be anchored and identical for both players. In other words, you can't do things like teleporting, because if one player were to teleport, the avatars would end up in different places relative to the real-world distance between the players. Take a look at Hordez - they got around this with an "on rails" setup where the playspace itself slowly moves through the environment.
      SteamVR itself is sort of wonky nowadays about getting two HMDs to share an identical room setup config. Previously, you could just run room setup on one PC and then copy that chaperone file over to the other. Nowadays that doesn't really work the same way - Valve has added error-correction methodology, so the chaperones are dynamic and can change without user input. We have an open source project that tries to prevent SteamVR from altering the chaperone automatically - that project is hosted here. That said, SteamVR is constantly updating, so that system is always changing.
      Because of those changes, a lot of developers instead build a calibration step into their application: you place fixed reference points in the environment, the player walks around with a controller and calibrates to those fixed points, and that transforms the worldspace (see the sketch below for the basic idea). There unfortunately aren't a lot of instructions on how to do this out there - it's a niche thing and SteamVR has changed a lot over the last few years. The huge benefit of building your own calibration step is that it works across all runtimes and VR SDKs rather than limiting the solution to SteamVR.
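      As a rough illustration of that calibration step, here's a hypothetical two-point version in numpy: the player touches two fixed physical markers with a controller, and you solve for the yaw and translation that map tracking space onto the shared world space. It assumes Y-up and known marker coordinates, and none of it comes from a specific SDK:

      ```python
      # Two-point playspace calibration against fixed physical markers.
      import numpy as np

      def two_point_calibration(measured, reference):
          """measured, reference: (2, 3) arrays - controller positions in tracking
          space and the corresponding known marker positions in world space."""
          # Work in the horizontal (XZ) plane so the floor stays level.
          dm = measured[1] - measured[0]
          dr = reference[1] - reference[0]
          yaw = np.arctan2(dr[0], dr[2]) - np.arctan2(dm[0], dm[2])
          c, s = np.cos(yaw), np.sin(yaw)
          R = np.array([[c, 0.0, s],
                        [0.0, 1.0, 0.0],
                        [-s, 0.0, c]])            # rotation about Y by `yaw`
          t = reference[0] - R @ measured[0]       # pin the first marker exactly
          return R, t                              # world = R @ tracking + t
      ```

      Solving only yaw and pinning the first marker keeps the floor level, which is usually what you want for co-present play; with more markers you'd switch to a full least-squares fit.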
  24. @SanityGaming, full body tracking would technically be the most stable on 2.0 tracking with 3 or 4 stations, because the additional stations reduce the chance of tracking loss from occlusion. That said, it will work fine with BS 1.0 - you're just more susceptible to occlusion. Unless you're doing mo-cap with a tool like IKinema Orion, there's probably not a huge performance gain relative to the cost, but it depends on your use case. We recommend TrackStrap for mounting them. Note that there are two generations of trackers: the original (grey logo) and Vive Tracker 2018 (blue logo). The older one works only with 1.0 stations, whereas the newer one works with both generations.
  25. @mserdarozden - Please see my post here. In short, our SDK primarily offers feature data (gaze, pupil diameter, etc.). You can use Tobii's XR SDK (not VR Pro) to gain access to their analytics suite and the deeper level of hardware access their SDK allows. You could also theoretically use our SRAnipal SDK and roll your own solution within your engine (I've seen developers do custom heatmaps and session playback - see the sketch below for the basic idea), but it's definitely a resource commitment and there are limits on what type of hardware data you can pull from that SDK. The Tobii XR SDK is probably what you're looking for, as it has deeper hardware access and built-out analytics features. I'd recommend contacting Tobii and inquiring about the licensing situation you may face in your specific case.
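      As an example of what "rolling your own" can look like, here's a hypothetical numpy sketch that bins normalized 2D gaze points (e.g. gaze projected onto the viewport) into a simple fixation heatmap - the data layout is illustrative and not SRAnipal's actual API:

      ```python
      # Accumulate per-frame gaze samples into a crude fixation heatmap.
      import numpy as np

      def gaze_heatmap(gaze_xy, bins=64):
          """gaze_xy: (N, 2) array of normalized gaze coordinates in [0, 1] x [0, 1]."""
          heat, _, _ = np.histogram2d(
              gaze_xy[:, 0], gaze_xy[:, 1],
              bins=bins, range=[[0.0, 1.0], [0.0, 1.0]])
          return heat / max(heat.max(), 1e-9)      # normalize to [0, 1] for display
      ```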