tpathak Posted October 29, 2018 Hi, my concern is about the HTC Vive Focus. When I turn around in my application, positional tracking is lost. Also, if I remove the headset while my app is running and put it back on facing a different direction, I am no longer in the same place in my app. Can you please tell me why this is happening? Thanks
HackPerception Posted October 29, 2018 While I'm not on the Focus team, I have a few comments that may help them assist you better.

On the tracking loss: what are the lighting conditions like in your environment? More often than not, this type of tracking loss is linked to environmental lighting conditions. Does it happen at all speeds of movement, or only when you turn quickly?

On the second behavior: that is the intended design. The HMD has a proximity sensor between the eyes; if it fails to detect a user for 4 seconds, it enters standby mode to avoid wasting battery life by needlessly rendering environments. The desktop Vive has a similar standby system. The orientation of the Focus' 2x2 m safe space is linked to the location and direction the HMD is facing when that proximity sensor is first triggered, so if you need the orientation a certain way, you just have to initialize it in a consistent fashion to get consistent results. I personally use a piece of opaque tape over the sensor as a quick and easily removable solution to avoid playspace issues when giving demos. The Focus team can comment on whether there is a way to disable this feature in software.
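To make that standby/origin behavior concrete, here is a minimal sketch in Python. It is not Wave SDK or Focus firmware code; the class and parameter names (FocusStandbyModel, proximity_triggered, etc.) are hypothetical and only model the behavior described above: standby after roughly 4 seconds without a detected user, and the play space origin fixed to the HMD pose at the first proximity trigger.

```python
# Illustrative sketch only -- not actual Wave SDK / Focus code.
import time

STANDBY_TIMEOUT_S = 4.0  # assumed value, per the description above

class FocusStandbyModel:
    def __init__(self):
        self.play_space_origin = None   # set once, on first proximity trigger
        self.last_user_seen = time.monotonic()
        self.standby = False

    def update(self, proximity_triggered, current_hmd_pose):
        now = time.monotonic()
        if proximity_triggered:
            self.last_user_seen = now
            self.standby = False
            if self.play_space_origin is None:
                # The 2x2 m safe space orientation is locked to the direction
                # the HMD faces the first time a user is detected.
                self.play_space_origin = current_hmd_pose
        elif now - self.last_user_seen > STANDBY_TIMEOUT_S:
            # No user detected for ~4 s: stop rendering to save battery.
            self.standby = True
```

This is also why initializing the headset while it faces a consistent direction (or taping over the sensor during demos) gives a repeatable play space orientation.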
tpathak Author Posted October 30, 2018 First of all, thanks for your reply. We are testing it under white lighting, in a conference hall with white walls, and this is happening only when there is quick movement.
HackPerception Posted October 30, 2018 I strongly suspect you're bumping up against a limitation of SLAM (simultaneous localization and mapping) itself. The Focus' computer vision uses two monochrome cameras to perform SLAM; they need contrasting visual data in order to generate "tie points" in the environment to track and measure motion against, and they do so by comparing highlights and lowlights (areas of high contrast, since the cameras are monochromatic). SLAM doesn't work well with plain white surfaces because there are no surface features to draw comparison points from. SLAM can also struggle with repeating patterns, but IMU sensor fusion helps greatly with that.

When you're moving slowly, the IMU can provide a rough pose estimate via sensor fusion even when visual data is lacking; when you move too quickly, the IMU drift goes out of tolerance, and without visual data to compare and correct itself against, you get a tracking-lost message.

The quick way to fix this is to add some color or patterns to your environment. Stickers, logos, words, tracking markers, anything that has some contrast will alleviate this situation. In some cases, I've solved this by simply writing a handful of words on a whiteboard.
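As a rough illustration of why a featureless white wall starves SLAM of tie points, here is a small Python sketch. It is not the Focus' actual tracking code; the function name and threshold are made up for this example. It just counts high-contrast pixel pairs in a monochrome frame with NumPy, showing that a uniform wall yields essentially no candidate features while a wall with some writing or stickers yields many.

```python
# Illustrative sketch only -- not the Focus' tracking pipeline.
import numpy as np

def count_tie_point_candidates(gray_frame, contrast_threshold=25):
    """gray_frame: 2-D uint8 array from a monochrome camera."""
    frame = gray_frame.astype(np.int16)
    # Local contrast: absolute differences against right and lower neighbours.
    dx = np.abs(np.diff(frame, axis=1))
    dy = np.abs(np.diff(frame, axis=0))
    return int((dx > contrast_threshold).sum() + (dy > contrast_threshold).sum())

# A uniform white wall produces no contrast, hence no tie-point candidates...
white_wall = np.full((480, 640), 230, dtype=np.uint8)
# ...while the same wall with some dark marker strokes produces plenty.
textured = white_wall.copy()
textured[200:220, 100:400] = 40
print(count_tie_point_candidates(white_wall))  # -> 0
print(count_tie_point_candidates(textured))    # -> well above 0
```

With slow motion the IMU can bridge the gap between sparse visual updates, but fast motion accumulates drift faster than the (feature-starved) cameras can correct it, which matches the tracking loss you're seeing.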
tpathak Author Posted October 31, 2018 Thanks, I will definitely try it. I had thought earlier that this could be a possible solution, but I was looking for another one because I wasn't sure about it. Anyway, please let me know if you learn anything more about the HTC Vive Focus.