Everything posted by HackPerception

  1. @TomCgcmfc - The quality is a combination of a lot of factors - but mainly, you're taking a lower resolution image and spreading it across a high resolution panel that's directly next to your eye. You can't exactly just go and stick high res cameras on the front and call it a day - supporting high resolution sensors requires adding a bunch of extra camera processing circuitry to the HMD (dramatically increasing raw costs), and after that you have to get the data back to the PC and rendered as quickly as possible. Consumer pass-through at this stage is mostly a convenience feature that keeps you from removing the headset and happens to help developers prototype MR ideas (on some HMDs) - it's simply not designed to be a crazy high fidelity experience on the consumer level quite yet. Professional HMDs that specifically target pass-through, like Varjo's (which run $5,000+), are able to support higher quality pass-through for a few reasons. What primarily drives it is that they're using eye tracking to only send back high resolution data for the part of the camera feed you're looking at, because otherwise it's simply too much data to pipe back raw. Even with foveation, they still need to tether the HMD using a custom fiber optic cable ($$$) in order to support the bandwidth levels needed to get the data back to the PC, processed, and then rendered back out to the headset without significant latency. We'll get there on the consumer side sooner rather than later, but there are some pretty solid reasons why pass-through is at its current quality level across consumer options.
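To put rough numbers on why raw high-res pass-through doesn't fit through consumer links - these figures are illustrative assumptions on my part, not the specs of any particular headset:

```latex
\underbrace{2}_{\text{cameras}} \times \underbrace{3840 \times 2160}_{\text{px per frame}} \times 90\,\mathrm{fps} \times 12\,\tfrac{\text{bits}}{\text{px}} \approx 17.9\ \mathrm{Gbit/s}
```

That's already on the order of what an entire DP 1.2 link (~17.28 Gbit/s effective) carries for the display itself - and it's flowing in the opposite direction, on top of the display traffic. Foveating the camera feed via eye tracking and moving to fiber are the two obvious escape hatches, which is exactly what the professional tier does.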
  2. @Huvar - When you go into pairing mode, does the LED on the wireless transmitter begin to flash quickly? When it starts flashing quickly is when you press the button on the wireless adapter to pair it to the PC. @Synthesis
  3. You'll need at least one base station to see any sort of output on the Vive. You cannot run a Vive/Vive Pro without at least one base station, and there is no way to use it as a 3DoF headset by tracking rotation only. Newer versions of SteamVR omit "quick calibrate" and drive you toward standard calibration, which requires controllers - it would appear that nowadays you also need controllers to complete setup, but you may be able to get around this by using the "reset seated position" option. @affan101
  4. @pataan - Just to be extra clear here - Viveport Infinity services are accessed and launched by the "Viveport" client on your PC which is completely separate from the Steam client. If you don't have Viveport installed, you can download it from the top of www.viveport.com. You'd launch and log into that client to access the Viveport Infinity Library. Viveport titles are powered by SteamVR. When you launch a Viveport title, SteamVR will also co-launch and drive the experience. In some cases, you may need to log into the Steam client to have SteamVR launch. Cheers,
  5. @Aurescalia - The 1.0 and 2.0 stations have different "sync" systems - that's ultimately the root of all of these limitations:
- You can't have more than a single pair of 1.0 stations operating within direct line of sight of one another without the sync systems clashing.
- You can't use 1.0 and 2.0 stations in the same room without physically isolating the two base station types. If you try to run them in the same area at the same time, tracking will break due to interference from the sync systems.
- You can't use 2.0 stations to track 1.0 sensor-enabled gear, because the old sensors predate the new sync system.
As Tom said, optimizing your positioning and adding a sync cable are the best ways to improve your existing tracking. 2.0 tracking primarily helps with very large or professional installations - for the playspace sizes most commonly found in-home, there's not a huge UX difference between 1.0 and 2.0 tracking unless you have environmental constraints that make it hard to optimize around 1.0's limitations.
  6. @Lionel - Valve published a guide to their binding system in late 2018. You can still access this method of binding via SteamVR's controller settings, but it's now considered an outdated method. They recently overhauled the UI and you can now edit bindings using a new UI methodology - it's relatively new and I don't think they've published any documentation for it yet. The old binding mapping system definitely had some bugs - if you try it and it doesn't work, the problem is probably the old binding menu itself.
  7. @fceron - HDMI and DisplayPort are not interchangeable in the context of VR headsets - to run the Cosmos or any current gen HMD, you'll need a port that can drive a native DisplayPort signal. You will not be able to use an HDMI port to power a Cosmos or any modern HMD (Rift S, Index, Pimax, etc.) using any type of adapter, unfortunately. The G731G seems to be a weird model - there isn't a ton of official info about that specific variant, only more generalized info for the family of laptops it belongs to (which looks to be the Strix Scar III). The official specs for that family are pretty vague about what standards are actually supported on that port. I'd recommend checking with Asus to confirm that the port a) can support DP 1.2+ signaling and b) is wired to the Nvidia GPU. You can usually double check the Nvidia wiring on your side by going into the PhysX page of the Nvidia control panel. I've copied an example below from the laptop I'm writing this from, which has the USB-C port wired only to the Intel graphics, making me unable to power an HMD via USB-C. We've had the most reliable experience on the USB-C side with the Club3D CAC-1507 USB-C -> DP 1.2 adapter. There are a ton of white label USB-C -> DP adapters from Chinese OEMs, and it quickly gets confusing to ID them by brand name. In general, when looking for an adapter you want it to support DP 1.2+ signaling at 4K @ 60Hz and as much bandwidth as possible, ideally over 20Gbps.
  8. @davide445 1) Cosmos is a SteamVR HMD - beyond making sure you create a controller map for the Cosmos-specific controllers, there's no SDK/API-specific work for Cosmos as opposed to Pro/OG Vive. There are a few optional SDK scenarios, and on Cosmos, hand-tracking is probably the best example of that. 2) Unigine is really niche - I doubt we have any Unigine specialists internally, and I'm not sure what the upgrade path is between engine releases of Unigine or how they handle external SDKs in that engine. As far as I'm aware, we haven't licensed the hand-tracking SDK to be redistributable by any other company. I'd defer this to Unigine. I'm not aware of anybody developing hand-tracking stuff with Unigine - you're likely in uncharted territory. 3) I'd imagine you could probably link the handful of gestures to trigger SteamVR actions. I'd defer to @zzy on this as he's a specialist for this SDK. I'd probably avoid trying to create a custom controller driver - that's pretty deep in the weeds. I'd also note that hand-tracking SDK quality is higher on the Pro because it's a more mature HW platform and the SDK has simply been more optimized for Pro, as it was initially architected on Pro.
  9. @sorefoot, I'm happy to pass along your feedback to our hardware product teams, but I'm afraid I don't follow everything here. The Vive Wireless system doesn't output frequencies anywhere remotely near 2.4/5GHz - that's actually the key to why it works. The transceivers on the Wireless system operate in the 58-63GHz range and fall under the 802.11ad standard. A primary benefit of using WiGig is that it sits in a quiet part of the spectrum, far isolated from 2.4/5GHz signals, which is required for the high-bandwidth, low-latency demands of VR. SteamVR tracked controllers use standard Bluetooth (~2.4GHz) to connect to the HMD. If we released a wireless adapter solution that interfered with 2.4GHz, it would break communication with the controllers. Our R&D team worked extensively with Intel and DisplayLink to develop the highest bandwidth solution that was scalable at the time - WiGig was the only backbone fast enough and low-latency enough to drive not only the Vive but also be forward compatible with the higher bandwidth requirements of Pro and Cosmos using a single core SKU. It still remains the fastest scalable solution currently (although 5G is opening up a huge array of potential new technology backbones). It's not enough to take a USB/HDMI/DP signal and convert it to a wireless format. Due to the requirements of VR, we're forced to do specialized computing via the ICs on the PCI-e card, leveraging the PCI-e bus to directly interface with the motherboard and its components to create the WiGig signal. Everything is dynamic - it's constantly working directly with your PC components to adjust quality and optimize. "Display standards" aren't as standardized as you might think - especially DisplayPort. Manufacturers have a huge amount of wiggle room in how they adopt DisplayPort into their hardware - each DP device is in essence unique, which makes the idea of an external box you just plug into your PC a nightmare scenario for compatibility, because support quickly becomes device by device. The exact same is true for USB - it's actually not universal at all at the hardware and driver layer. It's an open standard, and OEMs create all sorts of proprietary variants of USB controllers, busses, and hardware solutions which can impact or block VR usage (most notably ASMedia on Asus motherboards). Some motherboards, especially on laptops, actually lack a true xHCI USB port - my Razer laptop only has Razer's proprietary USB flavor. If display and USB standards were truly standard across the board, a breakout box solution would definitely have been more feasible for gen 1. Under the current OEM landscape, it's actually a compatibility hellscape that would exponentially dwarf the compatibility issues people have faced with the current PCI-e solution. PCI-e is as standard as it gets on the PC side, and even then we still saw issues with how AMD motherboard OEMs integrate the PCI-e standard. Which bitter rivals are you talking about? I'm not sure what you're saying here. This circles back to all of the points above: which wireless standard would these transceiver companies use? You can count on one hand the number of technology backbones that can meet all of the requirements for VR displays while staying out of the 2.4/5GHz range, and few solutions can meet global regulations. There's a reason why Vive Wireless is the only first party wireless VR adapter - the engineering challenges are still very complex in 2020.
TPCast is the only other example of a product I can point to in this space that actually shipped at scale. Since they're not using WiGig, you're forced not only to have a wireless display bridge, but you're also required to separately plug in a dedicated 2.4GHz router to handle just the USB layer, which isn't the most seamless UX. There's a huge engineering challenge with wireless VR - as the only company with a first party wireless solution, we definitely haven't backed down from the challenge, but it's important to understand that the design of the current wireless adapter was pretty much the only tech stack that would actually scale at the time. 5G will open up a lot of development in the type of high-bandwidth, low-latency radios you need to pull this type of stuff off, but that ecosystem is still not fully baked. While it may not seem that way from an end-user perspective, the current Vive Wireless product is the most universally compatible product we were able to release from an engineering perspective in Q3 2018. By having a product in market, we learned a ton about real-world use cases and challenges, and we'll be able to take your feedback and the feedback of our enterprise partners to build better products in the future.
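For a sense of scale on why a bit-for-bit wireless relay of the display signal was never on the table (illustrative arithmetic, not internal specs):

```latex
\underbrace{2880 \times 1600}_{\text{Vive Pro, both eyes}} \times 90\,\mathrm{fps} \times 24\,\tfrac{\text{bits}}{\text{px}} \approx 9.95\ \mathrm{Gbit/s}
```

802.11ad tops out at roughly 4.6-6.8 Gbit/s at the PHY layer, so even on the best available backbone the link has to do content-aware compression and dynamic quality scaling rather than simply forwarding the DP signal - which is the specialized computing on the PCI-e card described above.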
  10. Tagging @Daniel_Y and @zzy for feedback. @4GD - please note, their response will likely be delayed until next week due to Lunar New Year.
  11. @mtarallo, You're correct - you cannibalize their effective range when you mount them high up. 1.0 base stations are rated to ~4.5 meters of overall range and 2.0 stations are rated to about 5.5m. The range of base stations is partially limited by things like angular resolution - as you add distance, the laser beam's collimation gradually fans out and becomes diffuse. SteamVR tracking is timing-based - past a certain distance, the timing of the laser sweep across the headset's sensors can no longer be resolved accurately (see the rough arithmetic below). If you're using 1.0 tracking, the range limitation is more pronounced because of how SteamVR 1.0 uses optical sync. Since 2.0 eliminates optical sync, 2.0 stations are more flexible to deploy overall. We've already released an optically tracked desktop headset in Q4 2019 called the Vive Cosmos. No current consumer VR optical tracking system is as high precision/accuracy as base station tracking - there's a tradeoff between the ease of optical tracking and the fidelity of the tracking. Cosmos is a modular headset - in Q1 2020, we plan on releasing a modular faceplate for Cosmos that lets you switch from the Cosmos' default optical tracking over to base station tracking, which also lets you use other SteamVR tracked devices like Knuckles controllers with the Cosmos. This is really helpful if you do things like travel with HMDs, because you can use optical tracking when traveling and then switch to base station tracking when needed.
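To make the timing argument concrete - the sweep rate below lines up with the ~200,000 rotations an hour figure quoted elsewhere in these posts, but the jitter number is purely an illustrative assumption, not an official spec:

```latex
\omega = 2\pi \times 60\ \mathrm{Hz} \approx 377\ \tfrac{\mathrm{rad}}{\mathrm{s}},
\qquad
\epsilon \approx d \cdot \omega \cdot \Delta t

% e.g. assuming timing jitter \Delta t = 1\,\mu\mathrm{s} at d = 5\,\mathrm{m}:
\epsilon \approx 5 \times 377 \times 10^{-6}\,\mathrm{m} \approx 1.9\ \mathrm{mm}
```

The positional error grows linearly with distance, on top of the beam itself becoming more diffuse - which is why the rated range is a hard practical ceiling rather than a soft one.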
  12. @Chris Kobayashi Our ID team has not publicly released dimensionally accurate CAD files or official dimensions for any of our desktop HMDs to protect the headset's design. After the holiday, I can forward your request to the ID team to see if there is a private pathway to share this info.
  13. The cameras on Vive and Vive Pro are VGA (640 x 480). The FOV is approximately 95 degrees horizontal and 78 degrees vertical.
  14. @ZINTENTEN - Still looking into the OpenVR/SteamVR aspect of this, but I was able to confirm that you can still manually disable it by editing the steamvr.vrsettings file:
- Close the SteamVR compositor, then open the steamvr.vrsettings file in a text editor (default location is C:\ProgramData\Steam\config).
- Check if there is a "collisionBounds" section. If there is, add the following to it: "CollisionBoundsFadeDistance" : 0
- If there is not, then add the following to the file after a closed bracket set: "collisionBounds" : { "CollisionBoundsFadeDistance" : 0 },
- Save and close the file, then start SteamVR - the chaperone bounds should be gone. If they are still there, check your syntax and try again.
You just have to be really careful about your syntax and make sure your brackets are all closed off properly.
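Here's a sketch of what the final addition looks like, assuming a typical file layout - the "steamvr" section below is just a stand-in for whatever other sections your file already contains, which will likely look different:

```json
{
   "collisionBounds" : {
      "CollisionBoundsFadeDistance" : 0
   },
   "steamvr" : {
      "mirrorViewGeometry" : "0 0 1080 600"
   }
}
```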
  15. @atmcode - The answers you got don't make sense on numerous fronts. The application resolution option is an OpenVR option - in short, a front-end GUI option that in turn controls a few multiplier values which OpenVR then communicates to your game engine via API. The change happens within the game engine. The Cosmos runtime shouldn't affect any of this - we're way at the end of that rendering stack. I just ran a series of quick QA tests, and from what I can tell, the resolution render slider seems to be working on SteamVR 1.19.16 and Cosmos runtime beta 1.0.9.6. I'm not sure what Valve support is referring to here, as this setting is rooted within OpenVR/SteamVR and the resulting APIs that talk back to the game engine. Our support staff probably thought you were asking about the native resolution of the HMD - that can't be changed, which I guess is technically correct? They might not have understood what you meant by a "change the resolution" type question, as technically the target application is changing the resolution of certain rendering elements within the larger rendering job. I just tested supersampling on a variety of Cosmos setups and PCs, and it worked in all of my tests. The key to supersampling on a modern HMD is that it's not going to make a huge difference past 200%. If you dip down to 20%, you should see a noticeable difference, especially in text rendering. Anything past 100% is basically more nuanced detail, as 100% is already ~2469x2915, which is pretty high for 90fps. Past a certain value, you're going to be trying to sample a texture at a higher resolution than the texture source itself can provide. Anything past 200% is the realm of enthusiasts - most applications aren't going to deliver more texture quality past a certain point. It made a much bigger difference on gen 1 HMDs, IMO. Higher values can yield an increased sense of presence and 3D space because your eyes have more small details to base their stereopsis on. I took some example screenshots from the QA pass I just performed with Cosmos below. As you can see, there is a huge difference between a 20% sampling value and a 100% sampling value. Past 100%, it just gets sharper - it's not going to be as night and day, because developers are still limited by getting 4K texture sets to render at 90FPS with MSAA. Unity itself only supports 4K textures, and the entire VR rendering pipeline is hybridized and geared towards 90FPS playback. Test using The Lab and simply dip the slider down to 20% and look at some text - some differences should be visible in The Lab without a restart. Per VRSS, can you please screenshot your VRSS settings? Why do you think it's not working? What apps are you using? VRSS is a devil of a thing to actually visually "see". Just like eye tracking, it's the type of thing where, if it's working, you can't actually tell as an end user - it's one of those things that "just works". I tested VRSS pretty heavily, and unless you specifically have a project open where you can debug VRSS, you shouldn't be able to visually tell whether it's on or off, because Nvidia has done an expert job at blending the foveation. You really have to know what you're looking for to "see" it visually. In a nutshell, you kind of have to pick a high resolution texture, put it on the outside edge of a lens, and then slowly stare at that point while you move your head evenly to bring it into the center of the FOV - it should increase in clarity from the outside of the lens into the center if VRSS is active.
It's crazy subtle because it's so well done. Of the 20 or so VR professionals I've demoed it to, nobody has outright said they can visually pinpoint the foveation - it's just sharper in the middle, without the performance loss you would have previously seen to achieve the same quality in the center, and without the obvious foveation ring you get with less advanced solutions. That's the key here - Nvidia is basically making this a driver level feature because ultimately they want it to seamlessly work without users ever noticing or needing to take action. On that note, this is a brand new release and it's not something development teams are going out and optimizing for yet - it's just a new default driver feature. Here's The Lab in a Cosmos at a 20% value: Here's The Lab in a Cosmos at a 100% value: Here's The Lab at 400%:
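If you want to confirm on your own machine that the slider is actually reaching the application layer, here's a minimal sketch against the OpenVR API (assuming SteamVR is installed and you're linking openvr_api) - the recommended per-eye render target that OpenVR hands to engines already includes the slider multiplier:

```cpp
#include <openvr.h>
#include <cstdio>

int main() {
    // Attach to the running SteamVR instance as a background utility.
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* system = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None) return 1;

    // This is the per-eye render target OpenVR asks engines to allocate.
    // Re-running this after moving the resolution slider (and restarting
    // the app, per the usual guidance) should show the value change.
    uint32_t width = 0, height = 0;
    system->GetRecommendedRenderTargetSize(&width, &height);
    printf("Recommended per-eye render target: %u x %u\n", width, height);

    vr::VR_Shutdown();
    return 0;
}
```

At 200% of the pixel count, each dimension scales by roughly sqrt(2), which is part of why the visual return diminishes so quickly past 100%.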
  16. @ZINTENTEN - The chaperone behavior has definitely changed in the new SteamVR update, and it appears as if "developer" chaperone mode has been deprecated - I'm currently looking into this.
  17. @oxygen4004 Can you please screenshot the error so we can see it in full context? The console is spamming you with warnings about USB3 but everything is functioning correctly?
  18. @jackfrost2013, I'm sorry to hear you're encountering an issue with your base station. Unfortunately, the risk of mechanical issues is one of the potential trade-offs of the higher resolution tracking offered by external tracking over camera-based tracking. Here's a high level rundown of how 2.0 RMAs work within our ecosystem: HTC is not licensed to manufacture SteamVR 2.0 base stations - only Valve manufactures 2.0 stations. We simply resell Valve's product in this case and we're not directly involved in their manufacturing process. We purchase the units wholesale from Valve - there isn't any place in the manufacturing process where HTC is able to "save money" by cutting corners, as we're purchasing 2.0 stations as a fully assembled OEM product from Valve. All 2.0 stations are QA'ed by Valve before leaving their manufacturing line. They also receive a secondary QA by us before being boxed for resale in a kit. Replacing stations is a very expensive process; QA is much cheaper. In situations like the one you're describing, where a station is having issues out of the box, one potential source of the issue could be shocks/forces the kit received while in transit to you. Stations can definitely get damaged during shipping and handling - it's ultimately very hard to isolate this as the source of the problem by virtue of how complex modern shipping is. You're 100% going to want to RMA that base station and get a replacement unit from us. If it's making noises, that's a good indication that a lens has become detached or that a rotor has a fault. In any case, since the rotor spins around 200,000+ times an hour, it's going to deteriorate without repair or replacement. I'm sorry you're being asked to pay for the return shipping for the device. Reverse logistics and repair processing are by far the most expensive processes involved with selling hardware nowadays. In these types of scenarios, HTC actually pays quite a bit to process the swap/repair and re-dispatch a functioning device to you. As such, in many scenarios we only cover shipping of the repaired device back to you after a repair or swap has been completed. The device will also be under warranty - the warranty on base stations that ship with the Vive Pro full kit is 2 years in many regions, which in some cases actually exceeds the warranty you'd get from Valve by a full year.
  19. @dvp_dominic - Are you specifically referring to the trigger button on the Vive Focus' controller?
  20. @Hooflee, Dual GPUs are widely unsupported within VR and are more or less a dead end at this point in time. Only a tiny handful of titles have integrated multi-GPU support to date. In a nutshell, it's a bit of an SDK and rendering engine nightmare to get multi-GPU support working. To boot, since multi-GPU setups are a niche, there isn't a very clear return on investment for developers to architect their game's rendering pipelines around multi-GPU support. AMD and Nvidia both have their own proprietary SDKs - it's definitely possible, it's just not widely adopted by devs, and the hardware dependency further complicates the matter. To my knowledge, the apps with multi-GPU support that have actually been published to the public are: Serious Sam: The Last Hope, VR Funhouse, Trials on Tatooine, Raw Data, and EVE: Valkyrie. It's under 10 titles overall - I wouldn't hold my breath for devs to integrate support into existing titles. For the current generation of VR apps, it's definitely best to invest in a single powerful GPU.
  21. @imarin18, please refer to my post here explaining the different SDK options around Pro-Eye. This is a situation where we're licensing Tobii's technology, and Tobii has determined the level of API access to their hardware that's available in the SRanipal SDK. In order to gain deeper API access to the hardware, you need to license Tobii's first party SDK (the Tobii XR SDK), and you'll need to meet and agree to the terms and conditions specific to their XR SDK. Please bear in mind that once you're doing things like accessing retinal images, your legal obligation as a developer/studio to protect users' biometric data and privacy is dramatically increased under international regulatory frameworks, especially GDPR. It isn't a situation where you simply flip a switch in your project and voila - prior to your collection beginning, your organization also has to have the legal and technical framework to safely collect and protect biometric data and other PII. Your studio may face additional regulations and security audits around your use of biometrics, especially if you're capturing data on EU citizens, due to GDPR. It all really depends on where your studio is located and where your users will be - with GDPR currently being the international gold standard. Even though SRanipal doesn't allow access to retinal imagery, the data the SDK generates is definitely legally protected PII, and our SRanipal SDK requires developers to meet corresponding security requirements as laid out in the SDK's developer agreement. Biometrics are a pretty serious topic, as you can't alter/change your own biometric data if it's leaked in a breach. As such, you're probably going to see a similar setup across all major hardware platforms that integrate eye tracking, where the base SDK is geared towards "feature data" rather than raw data. @yayumura
  22. There is currently only one primary production model/SKU of Cosmos that's been released as of this point. Any variation you may currently see is limited to small things like the box color or region specific details. "Rev1" isn't coming from us.
  23. @tonton, The Cosmos / Lens "Origin" home space doesn't support custom models at this time. You could technically accomplish this in SteamVR Home, but in order to do so, you'd have to publish the model to the Steam Workshop and then access it through there, which could take some effort depending on what you're trying to bring in.
  24. @muella91, So ultimately there is some variation in the answer here, depending on how you're developing your project: If you're developing in Unity, the SteamVR plugin automatically adjusts for the position, so the developer doesn't have to retrieve the data. If you need to retrieve the data manually, you can do so via a camera object or via the API. The Z-axis direction depends on your pipeline: in Unity, +Z is forward facing (away from the user) because Unity uses a left handed coordinate system; in OpenVR, the forward facing direction (away from the user) is -Z because it's based on a right handed system. The X-axis origin is directly between the centers of the two lenses. The Y-axis origin also passes through the center points of the two lenses. The Z-axis origin is offset from the plane along the center axes of the lenses. The offset is towards the user and has a specific value for each HMD:
- The offset is 16mm towards the user for Vive
- The offset is 14mm towards the user for Vive Pro
So overall, the image you posted turns out to actually be somewhat in the right location if you're querying via OpenVR. A minimal sketch of the manual route is below.
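Here's a minimal sketch of querying the HMD pose directly from OpenVR (assuming SteamVR is installed and you're linking openvr_api; error handling trimmed) - note the -Z forward convention described above:

```cpp
#include <openvr.h>
#include <cstdio>

int main() {
    // Connect to a running SteamVR instance without claiming the displays.
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* system = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None) return 1;

    // Fetch current device poses in the standing (room) tracking universe.
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    system->GetDeviceToAbsoluteTrackingPose(
        vr::TrackingUniverseStanding, 0.0f, poses, vr::k_unMaxTrackedDeviceCount);

    const vr::TrackedDevicePose_t& hmd = poses[vr::k_unTrackedDeviceIndex_Hmd];
    if (hmd.bPoseIsValid) {
        const vr::HmdMatrix34_t& m = hmd.mDeviceToAbsoluteTracking;
        // Position is the translation column of the 3x4 row-major matrix.
        printf("HMD position: (%.3f, %.3f, %.3f)\n",
               m.m[0][3], m.m[1][3], m.m[2][3]);
        // OpenVR is right handed with -Z forward, so the gaze direction is
        // the negated third basis column - this is the sign flip the SteamVR
        // Unity plugin undoes for you in Unity's left handed +Z convention.
        printf("HMD forward:  (%.3f, %.3f, %.3f)\n",
               -m.m[0][2], -m.m[1][2], -m.m[2][2]);
    }

    vr::VR_Shutdown();
    return 0;
}
```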
  25. In most cases, it's best to restart the application after adjusting the render target multiplier. A handful of apps let you get away without restarting, but generally speaking, it's going to be far more stable to restart the target app, and in some cases it's mandatory to see a change. SteamVR's old UI used to have text recommending this; the newer, cleaner UI dropped that note.