Positioning the headset in roomscale


Damir


We have just started making some applications for the Vive Focus.

We have a question about the starting position: we need to position our player within the roomscale play area.

Is there any way to set the position of the headset using reference points? What I mean is: we have a clean room with four walls. Can we put some reference points on those walls, and if so, how could we recognize them using the Wave SDK? Or maybe there are other ways...

It would also be better to continuously correct the player's position while the application is running, using different reference points on different walls.

 


Hi guys, I've done some digging based on an answer to another question on this forum, which led me to believe you can't do absolute positioning with the Focus' inside-out tracking. This does indeed seem to be the case: markerless inside-out tracking cannot do absolute positioning. We would need markers for that.

 

But it seems to me that the Focus already uses natural markers to do the relative positional tracking. If that's the case, then it would just be a matter of storing that information for a specific room, and then comparing what the headset is seeing at any given moment to the stored information, in order to get the absolute position in the room. I have no idea how difficult this would be, or how much processing power it would take.
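Just to make the idea concrete, here is a rough sketch (in Python/numpy, purely my own assumption of how it could work, not anything the Wave SDK exposes) of aligning the points the headset currently sees to a stored room map with a nearest-neighbour + rigid-fit (ICP-style) loop. The function names, and the assumption that we could even read the raw point cloud, are hypothetical:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rotation R and translation t with R @ src_i + t ~= dst_i (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def align_to_stored_map(current_points, stored_map, iterations=10):
    """Estimate where the current scan sits inside a previously stored room
    pointmap, ICP-style: match each point to its nearest stored neighbour,
    fit a rigid transform, repeat."""
    R_total, t_total = np.eye(3), np.zeros(3)
    pts = np.asarray(current_points, dtype=float).copy()
    stored = np.asarray(stored_map, dtype=float)
    for _ in range(iterations):
        # data association: nearest stored-map point for every current point
        d = np.linalg.norm(pts[:, None, :] - stored[None, :, :], axis=2)
        matches = stored[np.argmin(d, axis=1)]
        R, t = rigid_fit(pts, matches)
        pts = pts @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total   # pose of the current scan in room coordinates
```

The result would be the absolute pose of the headset's scan in room coordinates, but of course this only works if we can actually get at the point data, which we currently can't.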

 

I do own a ZED camera from Stereolabs, and they have told me that it is possible to access a stored map of previously scanned surroundings, which you could then use to do absolute positional tracking. But I never figured out how to do that, and the ZED is connected to a powerful computer. I gave up on the whole ZED thing because I couldn't get it to do positional tracking of large spaces well enough, and I was very happy when the Focus came out, which just worked.

 

So, I'm going to have a look at the Wave SDK code to see where and how it uses a pointmap of its surroundings. Though I expect it to just store a pointmap in the first frame, compare it to the map it stores in the second frame to calculate the positional change, then delete the first frame's map, and so on. Well, if that is the case, why not just compare to a stored map every frame, right? Though the calculations for that would be rather different from the ones used for the frame-to-frame tracking I just described. And what would happen in a multiplayer game with other people moving around the room, whose points would of course not be in the stored map?
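On the multiplayer point: if we could compare against a stored map every frame, points belonging to moving people could probably be thrown away before the fit, simply because they have no nearby counterpart in the stored map. A tiny sketch of what I mean (again numpy, hypothetical function, made-up threshold):

```python
import numpy as np

def filter_dynamic_points(current_points, stored_map, max_dist=0.10):
    """Keep only points that lie close to the stored room map, so that moving
    people (or new furniture) don't pull the alignment off. The 10 cm threshold
    is made up for illustration."""
    d = np.linalg.norm(current_points[:, None, :] - stored_map[None, :, :], axis=2)
    static_mask = d.min(axis=1) < max_dist   # point has a nearby counterpart in the map
    return current_points[static_mask]
```

Comparing against the stored map every frame would also mean drift can't accumulate the way it does when you chain frame-to-frame deltas.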

 

I will keep you updated on what I find. Other people, please respond and help us with this, because absolute positional tracking would be very welcome to many of us.


But how would the headset relate the pointmap captured in a frame to the stored pointmap? It would need some sense of where in the room it is to start with, remember that, and then compare the relative position change against the stored pointmap. It might be simpler to just use markers and calculate the absolute position from those every frame, because relating to their known absolute positions is easier than relating to a pointmap.
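For illustration, the marker calculation I have in mind could be as simple as this (hypothetical sketch with made-up coordinates, and assuming we trust the headset's rotational tracking for orientation). With uniquely identifiable markers at known room positions, each frame gives you the correspondences for free, so the absolute position becomes a single closed-form calculation:

```python
import numpy as np

# Known room coordinates of the wall markers (made-up values).
MARKERS_WORLD = {"wall_A": np.array([0.0, 1.5, 4.0]),
                 "wall_B": np.array([4.0, 1.5, 0.0])}

def headset_position(marker_id, marker_in_headset, R_headset_to_world):
    """Absolute headset position from one identified marker.
    world_marker = headset_pos + R @ marker_in_headset  =>  solve for headset_pos."""
    return MARKERS_WORLD[marker_id] - R_headset_to_world @ marker_in_headset
```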


All right, I looked at the code; it seems that we get the pose data from a DLL, at least in the Unity package. This was to be expected, of course. So we would need new code that does positioning differently to have any chance of getting an absolute position. That can be done, of course; I just don't know how to do it.


By the way, the simplest solution, which requires no programming, would be to position the headset in a set place to start with. Make some sort of stand for the headset at a specific position, corresponding to the starting position in your application, and put the headset on the user at that exact spot. The X and Z coordinates are tracked quite reliably, so the user should be able to traverse your room safely. The Y position does sometimes drift, but that can be helped by taking the headset off and putting it back on again. Do this without moving in the X and Z directions, and the user should stay in the correct position.
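For what it's worth, the maths behind the stand approach is just a constant offset: measure the tracked position once while the headset sits on the stand, then add the difference to every pose afterwards. A sketch of the bookkeeping (hypothetical names, not Wave SDK API):

```python
import numpy as np

class StartPointCalibration:
    """Pin the tracked origin to a known physical starting spot (e.g. a headset stand)."""

    def __init__(self, stand_position_in_room):
        self.stand = np.asarray(stand_position_in_room, dtype=float)
        self.offset = np.zeros(3)

    def calibrate(self, tracked_position_at_start):
        # Call once while the headset is still sitting on the stand.
        self.offset = self.stand - np.asarray(tracked_position_at_start, dtype=float)

    def to_room(self, tracked_position):
        # Apply every frame to move the relative pose into room coordinates.
        return np.asarray(tracked_position, dtype=float) + self.offset
```

Note that this only fixes the starting offset; it does nothing against drift later in the session.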


Thank you for the reply, Peter! We are already using the method you described: we have some markers on the floor where the users have to put the headsets on. Of course it's not very convenient or comfortable, and this method doesn't have 100% accuracy.


All right! Good to know that it is somewhat feasible. Thanks! Hopefully someone will write something using the same method the ZED does, so the Focus can also use absolute positional tracking. I checked, by the way: the ZED code for using a reference pointmap is also in a DLL :)


No, I don't think we'll be able to access the DLLs. And accessing the Focus' DLLs wouldn't help, because they don't have any code for doing what we want. Someone would have to access the ZED DLLs, because those already have the code for doing absolute tracking based on a reference pointmap. But accessing those DLLs is of course also not going to happen. So someone with knowledge of how to do it would have to write a system for absolute positioning using a reference pointmap for the Focus from scratch. Or find code from someone who has already done it in a more or less general way, so it can be rewritten for the Focus. But I think there's a lot of complicated math involved here, so I doubt it would be easy to find. But who knows!


Archived

This topic is now archived and is closed to further replies.
