
Syncing/matching virtual to real environments


Tiberiu5


So I have modelled an exact replica of my room.

I used a Leica laser scanner to get a point cloud and imported this into Blender. Because the mesh was poor quality and the textures didn't look great, I created a clean model by overlaying objects in Blender that aligned with the point cloud surfaces.

I have imported my room from Blender into Unity and adjusted the transform of the room to align virtual with real. The result is quite amazing; it's really something to be able to reach out in the virtual space and have the walls and door frames align across both worlds.

My question is: rather than the time-consuming "test and adjust" method of tweaking the room's transform (which I'm afraid will go out of sync if I need to run SteamVR Room Setup again), is there a smarter way to align the Unity coordinate system with the real-world coordinate system, using either the base station locations or a VIVE tracker puck or something?

My setup:
VIVE Pro Eye w/ wireless adaptor
4x SteamVR Base Station 2.0
Unity


Howdy @Tiberiu5

Great question! Adding a tracker in a stable position will help you get the two spaces in the same spot.

My top-level thought is -- all things are relative!

Well... ok, this isn't philosophy 101, so how do we unpack that?

How do we coordinate two different coordinate spaces in real life? We create a shared reference point. A controller (or a tracker) placed at a specific position and rotation at a specific time will help align the room. Just like in real life -- if I see the front of the Statue of Liberty and you see the back, and our goal is to meet, we now know that we need to walk towards each other.

So the model needs to be aligned (position + rotation) with something in the virtual room that can be specified. You could use a tracker or a controller placed at the same point every time for this purpose, and then match the model of the room on top of the real room based on that point. The tracker (or controller) becomes the "landmark" by which both spaces can merge, and if it's a tracker, you can even watch the effect of its rotation in real time until it is stable. With more points of reference (i.e. more trackers) the positioning can be even more stable. If you pasted 4 trackers on the walls (overkill, but just as an example), you would know the alignment of the virtual room with a high degree of accuracy and stability, and one tracker falling or slipping a little would have less of an impact.
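To make that concrete, here's a rough sketch of the single-point version in Unity C#. It's not an official snippet, just one way to do it: "trackerTransform" is assumed to be a Transform already driven by SteamVR (e.g. a VIVE tracker with a pose/tracked-object component on it), and "roomAnchor" is an empty child of your room model placed exactly where the physical tracker sits in the real room. The script snaps the room so the anchor lands on the live tracker pose.

```csharp
using UnityEngine;

// Hypothetical sketch: attach to the root of the imported room model.
// Assumes trackerTransform is driven by SteamVR tracking and roomAnchor
// marks the same physical spot inside the modelled room.
public class RoomAligner : MonoBehaviour
{
    public Transform trackerTransform; // live pose of the VIVE tracker/controller
    public Transform roomAnchor;       // matching reference point inside the room model

    // Call once the tracker is sitting still in its known spot.
    public void Align()
    {
        // Rotation that carries the anchor's orientation onto the tracker's.
        Quaternion deltaRotation = trackerTransform.rotation * Quaternion.Inverse(roomAnchor.rotation);

        // Rotate the whole room, then translate so the anchor lands on the tracker.
        transform.rotation = deltaRotation * transform.rotation;
        transform.position += trackerTransform.position - roomAnchor.position;
    }

    void Update()
    {
        // Example trigger; in practice you'd align on startup or via your own UI.
        if (Input.GetKeyDown(KeyCode.A))
            Align();
    }
}
```

Once Align() has run, the room model and the physical room share that landmark, and it will survive a fresh SteamVR Room Setup as long as the tracker goes back to the same physical spot.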

Applying more points will help, and how far you go just depends on budget and the accuracy you need. A lot of casual experiences (like the ones we were just demoing on the VIVE XR Elite) merge two spaces using a single point, while a more serious simulation/enterprise solution might want more accuracy over time and more fault tolerance, which you get from more shared points of reference.
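As a hedged example of the multi-point idea, here's how two trackers could be used: one fixes position, and the direction between the two fixes yaw (pitch and roll are left alone since SteamVR's tracking space is already gravity-aligned). Again, names are illustrative and the transforms are assumed to be driven by your tracking setup.

```csharp
using UnityEngine;

// Hypothetical two-tracker variant: anchorA/anchorB are points in the room
// model matching the physical spots where trackerA/trackerB are mounted.
public class TwoPointRoomAligner : MonoBehaviour
{
    public Transform trackerA, trackerB;   // live tracker poses
    public Transform anchorA, anchorB;     // matching points in the room model

    public void Align()
    {
        // Yaw difference between the modelled and real tracker-to-tracker directions.
        Vector3 realDir  = Vector3.ProjectOnPlane(trackerB.position - trackerA.position, Vector3.up);
        Vector3 modelDir = Vector3.ProjectOnPlane(anchorB.position - anchorA.position, Vector3.up);
        float yaw = Vector3.SignedAngle(modelDir, realDir, Vector3.up);

        // Rotate the room about anchorA, then drop anchorA onto trackerA.
        transform.RotateAround(anchorA.position, Vector3.up, yaw);
        transform.position += trackerA.position - anchorA.position;
    }
}
```

With three or more points you could go further and average the result (or do a proper best-fit), so a single tracker slipping has even less impact.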

Thanks for the question! We love these use cases, and I'm personally excited to see more content made this way. Let me know if you want to dig further into this!

-Alex
 

