
VIVE TRACKERS (2018) REFERENCE FRAME AND TRACKING IN UNREAL ENGINE 4



Hello everybody, 

I am developing a simple VR scenario in UNREAL ENGINE 4 that extracts the orientation data from the VIVE TRACKER (2018), and I am trying to understand how the device (and the tracking process in general) actually works.

My first question is purely technical: does the VIVE TRACKER use an inertial sensor to obtain the orientation of the device? If so, is there any specification of the measurement error or drift?

The second question is for those of you who work in UE4 and concerns the reference system of the TRACKER. While the orientation of the tracker's coordinate system is clear (right-handed, as displayed in the developer GUIDELINES), it is not clear how this is transformed and used by the engine to obtain the Roll, Pitch and Yaw angles. The UE4 world reference is a left-handed system which is defined (I suppose) during room setup in STEAMVR, and by moving the TRACKER in the environment I obtain values which do not match the GUIDELINES specification. My guess is that the engine applies a fixed transformation to the TRACKER reference frame, but I would like some help from those with more expertise in the field.


@ReheLab

  • Per your first question, SteamVR devices employ sensor fusion between the basestations and an integrated IMU. The IMU has a much faster refresh rate than the basestations can deliver, and it lets the system produce approximate pose estimates for frames where there aren't enough optical samples for a proper pose estimate. I recommend anybody interested in learning how SteamVR tracking works watch this video.
  • Per the second part, I'd tag in @MariosBikos_HTC
