Everything posted by HackPerception

  1. Every inside-out VR tracking system uses machine learning, and there's ML baked into SteamVR tracking as well. Beyond just the line-of-sight issue, there's ML for sensor fusion with the IMUs, and there's also some ML at play to try and keep the input delay as small as possible. I personally think ML is going to be one of the biggest drivers for VR tech in the coming years - it's so incredibly powerful when deployed correctly, but with that potential for huge gain comes the potential for abuse and the erosion of privacy.
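To make "sensor fusion" concrete: the optical system gives slow-but-absolute position fixes while the IMU gives fast-but-drifting ones, and a fusion layer blends the two. A toy 1-D complementary filter (a classic non-ML technique, purely illustrative - this is not Vive's actual pipeline) shows the idea:

```python
# Toy 1-D complementary filter: fuse a drifting, high-rate IMU estimate
# with slower absolute optical fixes. Purely illustrative - not Vive's
# actual tracking pipeline, just what "sensor fusion" buys you.

def fuse(optical_pos, imu_pos, alpha=0.98):
    """Blend the two estimates: trust the IMU short-term, the optical fix long-term."""
    return alpha * imu_pos + (1.0 - alpha) * optical_pos

# Simulate: true position is 1.0 m; the IMU dead-reckoned estimate has
# accumulated 0.2 m of drift, while optical tracking reports the truth.
estimate = 1.2
for _ in range(200):          # each optical update pulls a bit of drift out
    estimate = fuse(1.0, estimate)

print(round(estimate, 3))     # converges back toward the true position, 1.0
```

The same blend generalizes to full 6-DOF poses with Kalman-style filters; ML enters when the blend weights and bias models are learned rather than hand-tuned.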
  2. @The_Fly Linux support for Pro Eye is unfortunately not roadmapped at this point in time within the scope of the current product, and I'm not sure what Tobii's stance on this is within the context of our licensing agreement. Our current SRAnipal SDK and runtimes are wrapped versions of core Tobii technologies that we're licensing (and that end-users also license as part of the Pro Eye HMD purchase). That unfortunately rules out open sourcing. Barring a change in Tobii's corporate philosophy, a lot of that tech stack, including the runtime engine, is proprietary to Tobii. Vive couldn't open source it if we wanted to - we just license it. There are actually two SDKs that can access the Pro Eye - you're also able to license the Tobii XR SDK/runtime directly from Tobii. It provides a much deeper level of API access and comes with a suite of analytics tools. I can't find any info on whether that SDK/engine supports Linux currently - I'd recommend reaching out and seeing if their first-party solution will fit your use case. I've been using Linux for 15 years - I'm there with you on the benefits, and I only use Windows because of VR. The unfortunate truth is that Pro Eye is already a hyper-niche product in a niche industry - Linux support is an additional layer of niche, meaning the userbase would be impractically small at the moment. I've only seen this question pop up once or twice - we really haven't seen many people ask for this. That said, we appreciate the feedback because it's a solid data point I can feed back to the HW product team.
  3. @The_Fly - We've had a running post where we've posted solutions to the calibration initialization issue. In other cases we've supported people via email or via our enterprise support team (support.enterprise@htc.com), as it's only sold as an enterprise product (which has separate support from consumer). If you're still having issues after trying the instructions there, PM me and we can schedule a call with one of our senior devrel guys. The primary source of this issue has been Windows UAC. I'd recommend running everything under a full admin account rather than using a standard account with the UAC admin prompt.
  4. @GP - I am PM'ing you the support contact that handles vive.com orders as that's pretty specific... If anybody on our team would know, it'd probably be them.
  5. @sebastian_holocafe Please copy your message and email support.enterprise@htc.com as this is certainly a complex case
  6. @GP, Yes - the Vive Wireless adapter works with Pro Eye (as long as you have the Pro wireless attach kit).
  7. @Damon - One piece of early internal feedback I received is that this could be related to thermals within the basestation, as the trend your data shows is similar to trends we see around thermals. Positional data from stations can have more variance while the station is starting up and warming up. They're recommending that you test with the stations fully warmed up to rule that out as a potential source of the problem. Basically, they're suggesting that you leave the stations running for 3 hours (you may need to disable BT power management) and then test to see if your pose estimation is more accurate and whether you still see the drift you're reporting above.
  8. @TrevorGage There's a couple of separate layers going on here: From the engine perspective, it's just standard multiplayer, and it doesn't differ much from non-VR multiplayer - you'd choose a multiplayer SDK or solution that's to your liking and build from there. In many cases, developers will use the platform SDK for each specific content store (Oculus, Steamworks, etc.)... but when you do that, each store build becomes increasingly different from the others. You can use engines like PhotonEngine to do multiplayer agnostically. What you're describing is co-present VR (co-presence). Co-presence VR is not well supported - it's kind of a happy accident more than a fully supported feature within SteamVR. When doing co-present VR, the worldspace must be anchored and identical for both players. In other words, you can't do things like teleporting, because if one avatar were to teleport, the avatars' positions would no longer match the real-world distances between the players. Take a look at Hordez - they got around this by having an "on rails" situation where the playspace itself slowly moves through the environment. SteamVR itself is sort of wonky nowadays with how you can get 2 HMDs to share an identical room-setup config. Previously, you could just run room setup on one PC and then copy that chaperone file over to the other PC. Nowadays that doesn't really work the same way - Valve has added error-correction methodology that means the chaperones are dynamic and can change without user input. We have an open source project to try and prevent SteamVR from altering the chaperone automatically - that project is hosted here. That said, SteamVR is constantly updating, so that system is always changing.
Due to the changes in SteamVR, a lot of developers are instead building a calibration step into their application: you place fixed reference points in the environment, the player walks around and calibrates to those points with the controller, and the calibration transforms the worldspace. There unfortunately aren't a lot of instructions on how to do this out there - it's a niche thing, and SteamVR has changed a lot over the last few years. If you build out your own calibration step, the huge benefit is that you can make it work across all runtimes and VR SDKs - not just limit the solution to SteamVR.
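For illustration, that fixed-point calibration step boils down to solving for a rigid transform from measured anchor points. A minimal floor-plane (2-D) sketch in Python - the point values and workflow are hypothetical, not from any official SDK:

```python
import math

# Sketch of a shared-space calibration step (assumed workflow): each
# player touches two known physical anchor points with a controller,
# and we solve for the 2-D rotation + translation that maps their
# tracking space onto the shared world space.

def calibrate_2d(measured, reference):
    """measured/reference: two (x, z) floor-plane points each."""
    (m1, m2), (r1, r2) = measured, reference
    # Rotation: angle between the measured and reference segments.
    ang_m = math.atan2(m2[1] - m1[1], m2[0] - m1[0])
    ang_r = math.atan2(r2[1] - r1[1], r2[0] - r1[0])
    theta = ang_r - ang_m
    c, s = math.cos(theta), math.sin(theta)
    # Translation: after rotating m1, move it onto r1.
    tx = r1[0] - (c * m1[0] - s * m1[1])
    tz = r1[1] - (s * m1[0] + c * m1[1])
    def apply(p):
        return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + tz)
    return apply

# Both players touch the same two corners of a physical table
# (hypothetical coordinates):
to_world = calibrate_2d(measured=[(0.5, 0.0), (0.5, 1.0)],
                        reference=[(2.0, 3.0), (2.0, 4.0)])
print(to_world((0.5, 0.0)))   # lands on the first anchor, (2.0, 3.0)
```

A real implementation would do this in 3-D (yaw plus translation, floor height from the runtime) and average over more than two points to absorb controller-placement noise.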
  9. @SanityGaming, full body tracking would technically be the most stable on 2.0 tracking with 3 or 4 stations, because the additional stations reduce the chance of tracking loss from occlusion. That said, it will work fine with BS 1.0 - you're just more susceptible to occlusion. Unless you're doing mo-cap with a tool like IKinema Orion, there's probably not a huge performance gain relative to the cost, but it depends on your use-case. We recommend Trakstrap for mounting them. Note that there are two generations of trackers: the original (grey logo) and the Vive Tracker 2018 (blue logo). The older one works only with 1.0 stations, whereas the newer one works with both generations.
  10. @mserdarozden - Please see my post here. In short, our SDK primarily offers feature data (gaze, pupil diameter, etc.). You can use Tobii's XR SDK (not VR Pro) to gain access to their analytics suite and the deeper level of hardware access they allow via their SDK. You could also theoretically use our SRAnipal and roll your own solution within your engine (I've seen developers do custom heatmaps and session playbacks), but it's definitely a resource commitment, and there are limits on what type of hardware data you can pull from that SDK. The Tobii XR SDK is probably what you're looking for, as it has deeper levels of hardware access and built-out analytics features. I'd recommend contacting Tobii and inquiring about the licensing situation you may face in your specific situation.
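As a rough illustration of the roll-your-own route: a gaze heatmap is just per-frame gaze samples splatted into a grid. A minimal pure-Python sketch, assuming you can pull normalized [0, 1] gaze coordinates from the SDK (field names and coordinate conventions vary, so treat the inputs as hypothetical):

```python
import math

# Roll-your-own gaze heatmap sketch. Assumption: each frame you can
# sample a normalized (x, y) gaze point in [0, 1] from the eye-tracking
# SDK; here we just feed in fake samples.

W = H = 64                               # heatmap resolution
heat = [[0.0] * W for _ in range(H)]

def accumulate(heat, gx, gy, sigma=2.0):
    """Splat one normalized gaze sample into the grid as a small Gaussian."""
    cx, cy = gx * (W - 1), gy * (H - 1)
    for y in range(H):
        for x in range(W):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            heat[y][x] += math.exp(-d2 / (2 * sigma ** 2))

# Fake fixation cluster at screen center:
for _ in range(10):
    accumulate(heat, 0.5, 0.5)

# Find the hottest cell - it lands near the fixation point.
peak = max((v, x, y) for y, row in enumerate(heat)
           for x, v in enumerate(row))
print(peak[1], peak[2])
```

In an engine you'd accumulate into a texture on the GPU instead, and project the gaze ray onto scene geometry for 3-D heatmaps, but the bookkeeping is the same.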
  11. @SanityGaming Don't get me wrong, there are definitely huge benefits to 2.0, but mostly if you're doing advanced setups. For standard room-scale setups, there isn't a huge difference, except that 2.0 will give you more flexibility overall in where to mount the stations since you don't have the optical sync limitations. I believe that 1.0 tracking deals with reflections a little better because the algorithms are more mature and the solves are computationally simpler since you only have two stations, but that situation may have changed since the launch of Index proper. I don't have a great public answer for the recent question except that "it's complex" and that it relates to supply chain constraints. Since Valve is the only authorized manufacturer of 2.0 stations, that part of the supply chain is not directly within HTC's control (unlike 1.0 stations, which we're authorized to manufacture). Fun fact: I actually bought a house in Denver based on the fact that it had a room that was perfect for VR. It unfortunately only ended up serving me directly for about 8 months before I got called up to the major leagues to work for Vive directly, but it's super funny that I'm that big of a VR nerd that VR was influencing my living situation that deeply.
  12. @Victor_Kush This sounds like a different issue? The OP is talking about how the Vive will mute the mic in his application when the person hums a steady tone. My current understanding is that this is a hardware dependency - it's not in software. The audio chips used within Vive hardware have this noise suppression baked in because it prevents a feedback screech from occurring, by muting the system if it detects what it thinks is a feedback echo.
  13. @davide445 - Having worked closely with a lot of developers on optimization, I think across the board everybody finds VRWorks and VRS worthwhile. You can use VRS to drive foveated rendering, which can net you a ~60% perf win and extremely sharp center clarity - that can actually be a much higher cost/perf gain than going from, say, an RTX2080 to a 2080Ti. The VRSS stuff is really slick.
  14. @davide445 I think the most important generalized thing devs need to know about GPUs is that currently between 80-90% of VR users are sporting Nvidia cards, and thus it makes a great deal of sense to take a very serious look at incorporating Nvidia's VRWorks and VRS. Things like VRSS are major wins for both developers and consumers alike, and since it's mostly an Nvidia game, it's safe to assume that if you integrate those technologies into your project, your user-base will benefit.
  15. @SanityGaming - The primary differences are: Enhanced horizontal FOV (increased from 110 to 150 deg) means you can mount the station in a wider range of positions. With 1.0 stations, we recommend mounting in opposite corners of the playspace. With 2.0, you're not really limited to specific placements - just whatever provides the coverage you need for your specific playspace. 2.0 basestations do not have optical sync but rather integrate the sync into the laser sweep. This means the two basestations do not need to be in direct line of sight with one another like BS1.0, and it also eliminates the need for a sync cable. The lack of optical sync is the primary reason you can mount BS2.0 in a wider range of positions. You can only run 1 pair of 1.0 stations in a room without conflict, but you can run up to 16 2.0 basestations in a room without conflict. This is a hugely important feature for VR professionals. If you want to use more than 1 pair of 1.0 stations in a close environment, you physically have to separate the basestation pairs using things like curtains, walls, etc... This is because the 1.0 stations emit a bright optical sync flash that travels over 10m. You can array up to 4 2.0 stations together to expand out to play-spaces approaching 10x10m. 1.0 stations are only rated for a playspace that's about 5m across the diagonal, which is about a 4x4m usable play-space. There is generational compatibility between stations and devices. In a nutshell, older hardware with older sensors can only use 1.0 tracking, whereas newer hardware with newer sensors (Vive Pro, Vive Cosmos Elite) can work with both 1.0 and 2.0 stations. You cannot use 1.0 and 2.0 basestations at the same time - they conflict. Here is a detailed breakdown of how basestations work. This video will make the whole sync system make sense. There's no stated accuracy improvement for 2.0 tracking.
It's about the same, although if you use more than 2 stations, your risk of tracking loss from occlusion is lower. Overall, there isn't a tremendous advantage to 2.0 tracking over 1.0 tracking if you're doing simple roomscale in-home. The real advantage comes if you're doing any sort of commercial application (arcades, trade shows, etc...) or if you really want a larger playspace. For the most part, playspace size gets way more attention from in-home users than I think it warrants, because almost no VR content is optimized for playspaces above 4x4m outside of the arcade space. I can't think of any game off the top of my head that's specifically optimized for larger spaces - although some such games are on the horizon for release. If you presently have 1.0 tracking and are thinking of upgrading to 2.0, it's really not a crazy big difference unless you have a specific use case that requires 2.0's advantages. It may be a disappointing upgrade if you're doing simple room-scale at home.
  16. @AlBAO - You'll definitely need to use the dongle - it's not a generic Bluetooth radio, the dongle contains Valve's proprietary "watchman" technologies. Each tracker needs a corresponding dongle. You'd also want to use the little extender (cradle) that comes with the trackers so your dongles have as much physical separation from one another as possible, otherwise the radios will start interfering with one another since they're relatively high power and are transmitting loads of bandwidth for a Bluetooth device.
  17. @Lekreux - Is the SteamVR console spamming you about Bluetooth driver issues? That driver has historically been really hit-or-miss on a PC-by-PC basis. In these cases, updating via USB would usually be the backup plan. You'd need to take the basestation down, plug it into your PC via a USB data cable (it must be a data cable, not a charge-only cable), and then attempt the firmware update.
  18. I have the exact same issue in all 3 points: headphones are working, mic is not detected in the Device Manager... @stvnxu @C.T.
  19. @davide445 - I've confirmed with our BI team that Vive is very conservative with any user/usage data that is collected, and we have a strong blanket policy of refusing all requests from outside entities, with exceptions being rare. This is a question I've never actually gotten from another developer in my entire time running DevRel at Vive - most devs use the Steam survey or simply target the GTX1060 or 1070. This simply isn't widely requested information, and I think as a whole we see a lot of positive sentiment from both developers and consumers about being more privacy oriented than some of our key competitors. The only type of question I field in this vein is about which SDKs are supported on which GPUs in cases where there is a hardware dependency, like SRWorks. With 30% of the usage Oculus reports coming back as "other", I'm not sure this is a very helpful dashboard unless you're trying to parse out RTX from GTX for hardware-specific dependencies. For most general users, I'd recommend the RTX2060 or RTX2070 and would only push clients toward higher cards if your application specifically requires it post-optimization, as it's simply bad practice to ship unoptimized builds. I'd also steer all developers towards integrating Nvidia VRS if possible.
  20. @Bassermann - That's probably an alright setup - the render is neat. Since 2.0 tracking has a different sync system, the base-stations can be deployed with much higher degrees of flexibility. Mounting the stations diagonally was primarily a 1.0 limitation. One thing to bear in mind is that the 2.0 stations have a 150 degree horizontal FOV - if you mount them in right-angle corners, you're actually not taking full advantage of the enhanced FOV. For instance, if you moved the one in the corner out into the middle of the room, like the one on the bookshelf, you'd have technically maximized that basestation's coverage potential. Your setup will probably ensure 360 coverage - you may have issues though if you're doing seated experiences and your controllers get blocked under the table.
  21. @davide445 I'll ask our BI team about this. Vive is definitely way more on the privacy-oriented side of the VR spectrum than competing firms, so we don't really share a lot about our customers - via reporting or via API. It gets complex though, because various layers of the tech stack (software or hardware) may capture and report data separately from our layers (i.e. the Steam Hardware Survey).
  22. @DavidSW Depending on the product, it's either on a medium sized card that's specific to Viveport or like in the case of Vive-Pro it's found on the bottom corner of the large instruction sheet that's the first thing you see when you open the box. I am generating a Viveport support ticket for this post - if you reply to that support ticket with either the serial number of the HMD (usually printed on the side of the box) or your vive.com order number, they can retrieve the code on our side.
  23. @Cully - I sent you a PM with info on who to contact
  24. @davide445, No - we don't publish user's usage data anywhere. The closest thing on the SteamVR side would be the system survey but as you said, you can't isolate the GPU numbers to specifically VR users.
  25. @Tom17 Cosmos Elite bundle pre-orders are currently targeting mid-late March. I'm not on a hardware team - I'd have to ask them for confirmation as to what the public message is right now. The original plan was to make our announcements at Mobile World Congress - now, with MWC and GDC canceled, everything is super weird, and I'd need to defer to someone closer to the HW team. @C.T. @stvnxu "Soon"