
DanMeeks

Verified Members
  • Posts: 5
  • Joined
  • Last visited

Reputation: 1 Neutral


  1. Using Beta Passthrough Underlay! Works great! Thanks for coming through on this!
  2. Unsolicited feedback, but I don't know where else to place it. Something seems a bit odd to me: while I'm searching the room for my controllers to turn them on, my surroundings are reduced to half opacity and greyed out. I suggest removing that so you can look around more easily! Dan
  3. That's excellent to hear @dario! I'm not the strongest coder on the team, but I do have time to enquire via forum posts!

     We once had this capability with our product, but we have since deprecated it, because Stereolabs has pivoted to robotic computer vision and has not further developed their product for XR. What is troubling is that, until your product came along, we had no viable replacement. In addition to our tabletop capability, we can extend everything to a one-to-one modality.

     Here is a video of a flyover on July 4th over Tacoma, Washington. I used an HP Reverb G2 with a Stereolabs ZED Mini. It was a mild day, close to office conditions, and the equipment was stored under an umbrella. I know, I know: the XR Elite isn't designed to be soldier-rugged; we would only use it in office-equivalent weather with proper precautions.

     Here I am trying to get a very underpowered A-10 to form up on the visiting aircraft. You can see that we used the depth sensing out to about 5 m. Past that, we have a custom shader for occlusion of buildings and terrain.

     Looking forward to doing this with the XR Elite! Eager to be rid of the cables! Dan
  4. We've used the latest fix to ALXR, so we have something. We can't deliver this to our customers, however! I shouldn't have to use this middleware to do this. Furthermore, there is no depth information. How should we proceed? If there is a way ahead that I'm not finding, please point me in the right direction! Dan
  5. I'm really enjoying my XR Elite personally, but currently we can't use this device as we had hoped. Our use case requires the horsepower of a PC running Unity; building an AIO APK isn't an option, because our software does some pretty heavy simulation. I've investigated the documentation and the forum a bit before writing here, but I'm still unsure. Please clarify for me and for others with a similar use case.

     I'm looking at this page: https://developer.vive.com/resources/openxr/openxr-pcvr/overview/ . I'm assuming Scene Understanding means segmentation and occlusion via depth, and I see that it is unavailable in Unity for the XR Elite. I also read that, due to privacy concerns, the camera stream isn't available for development. That's noble.

     We've had success using the chroma-key function in ALVR, but as awesome as that project is, we really can't deliver it to our customers as a solution. We made our scene black and chroma-keyed it out using ALVR's button combo to enable passthrough. Furthermore, @korejan has noted what we are seeing: our scene is oversaturated.

     When we purchased our headsets we were eager to use the depth camera as well as the MR capability, but it seems that's just for AIO apps? If that's the case, we'd be grateful to settle for an ALVR-like solution that doesn't require access to the camera data on the PC: just colour-key it out. Disappointingly, that's all Varjo does beyond your fingers. Nobody seems to match Stereolabs in occlusion, I guess.

     So please, let us know what the plan is for PCs. Right now this XR Elite is a VR Elite for us. Enclosed is an example of what our use case is like, from a Varjo XR-3: Dan
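The black-scene chroma-key workaround described in the last post (render the virtual scene over pure black, then key those pixels out to the camera feed) can be sketched per pixel. This is a hypothetical illustration, not ALVR's actual implementation; the key colour, tolerance, and function names are assumptions.

```python
KEY_COLOR = (0.0, 0.0, 0.0)  # the scene background is rendered pure black
TOLERANCE = 0.05             # hypothetical per-channel match threshold

def key_pixel(rendered, passthrough):
    """Return the passthrough (camera) pixel when the rendered pixel
    matches the key colour; otherwise keep the virtual pixel."""
    matches = all(abs(c - k) <= TOLERANCE for c, k in zip(rendered, KEY_COLOR))
    return passthrough if matches else rendered

def key_to_passthrough(rendered_frame, passthrough_frame):
    """Per-pixel chroma key over two equally sized frames of RGB tuples."""
    return [key_pixel(r, p) for r, p in zip(rendered_frame, passthrough_frame)]
```

One drawback of keying on black is that dark pixels of the virtual scene also match and get replaced, which is part of why a proper compositor with depth information beats colour keying.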
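The occlusion approach from the Tacoma flyover post (sensor depth out to about 5 m, a custom shader using a static building/terrain model past that) amounts to a per-pixel depth comparison. A minimal sketch, assuming hypothetical names and a 5 m cutoff taken from the post; this is not the team's actual shader code.

```python
MAX_SENSOR_DEPTH = 5.0  # metres of reliable stereo depth (figure from the post)

def composite_pixel(virtual_rgb, virtual_depth, camera_rgb, sensor_depth, model_depth):
    """Pick the camera pixel when real geometry is nearer than the
    virtual geometry, otherwise the virtual pixel.

    sensor_depth -- live depth from the stereo camera, trusted only
                    up to MAX_SENSOR_DEPTH
    model_depth  -- fallback depth from a static terrain/building model,
                    used where the sensor range runs out
    """
    real_depth = sensor_depth if sensor_depth < MAX_SENSOR_DEPTH else model_depth
    return camera_rgb if real_depth < virtual_depth else virtual_rgb
```

For example, a real object sensed at 1 m occludes a virtual aircraft at 2 m, while terrain the model places at 3 km stays behind it.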