Showing results for tags 'vive tracker'.

Found 5 results

  1. Hi everyone! I have a question about the Vive Tracker. I would like to detect the user's posture while they run wearing an HMD, using Vive Trackers to capture their movements. Since the HTC Vive Pro Eye is too heavy for the user, we tried the Oculus Quest 2 and the Vive Focus Plus instead. Is there any way to use the Oculus Quest 2 and the Vive Tracker at the same time? I can't seem to use the Vive Tracker with the XR Interaction Toolkit. Also, is it possible to use the Vive Focus Plus and the Vive Tracker at the same time? I'd like to avoid running VR content for long periods while wearing the Vive Pro Eye.
  2. Hello all, I've been developing an application that uses HTC Vive Trackers mounted on the user's feet. When the scene starts, the first thing I have the user do is a calibration, after which they confirm that the trackers are in the right places. In the next step, the user selects a scene, enters it, and plays. However, when the user returns to the main menu, I lose the calibration state of the trackers, and the user has to calibrate again before selecting the next scene. What am I probably missing? Can you please help me? Kind regards, ukfdr.
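Without seeing the project, a common cause of this symptom is keeping the calibration in an object that is destroyed when the scene unloads. A minimal, engine-agnostic sketch of the alternative, assuming the calibration reduces to a per-tracker positional offset (all names here are hypothetical, not part of any Vive SDK; in Unity the equivalent store would be a `DontDestroyOnLoad` object or a static class):

```python
# Keep calibration in a module-level store that outlives individual
# scenes, keyed by tracker serial number. Loading or unloading a scene
# never touches this dictionary, so the offsets survive a trip back to
# the main menu.

# tracker serial -> positional offset captured at calibration time
_calibration: dict[str, tuple[float, float, float]] = {}

def calibrate(serial: str, raw_pos, expected_pos):
    """Record the offset between where the tracker reports itself
    and where it should be (e.g. on the user's foot)."""
    _calibration[serial] = tuple(e - r for e, r in zip(expected_pos, raw_pos))

def apply_calibration(serial: str, raw_pos):
    """Apply the stored offset; identity if this tracker was never calibrated."""
    off = _calibration.get(serial, (0.0, 0.0, 0.0))
    return tuple(r + o for r, o in zip(raw_pos, off))
```

As long as the store lives outside any scene, switching to the menu and back does not erase it; the user only recalibrates when a tracker is physically moved.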
  3. Hello everybody, I am developing a simple VR scenario in Unreal Engine 4 that extracts orientation data from the Vive Tracker (2018), and I am trying to understand how the device (and the tracking process in general) actually works. My first question is purely technical: does the Vive Tracker use an inertial sensor to obtain the device's orientation? If so, is there any specification of the margin of error or the drift effect? The second question is for those of you who work in UE4 and concerns the tracker's reference system. While the orientation of the coordinate system is clear (right-handed, as shown in the developer guidelines), it is not clear how it is transformed and used in the engine to extract the roll, pitch, and yaw angles. The UE4 world reference is a left-handed system defined (I suppose) when the room is set up in SteamVR, and when I move the tracker in the environment I obtain values that do not match the guidelines' specification. My guess is that the engine applies a fixed transformation to the tracker's reference frame, but I would like some help from those with more expertise in the field.
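There is indeed a fixed transformation involved. The mapping below is the commonly cited OpenVR-to-Unreal axis remap, stated here as an assumption to verify against your own setup rather than an official specification: OpenVR is right-handed (+X right, +Y up, -Z forward, meters), while UE4 is left-handed (+X forward, +Y right, +Z up, centimeters).

```python
# Sketch of the commonly cited SteamVR/OpenVR -> Unreal Engine axis
# remap for positions (an assumption to check against your rig, not
# taken from HTC or Epic documentation).

def openvr_to_unreal_position(x: float, y: float, z: float):
    """Map an OpenVR position (meters) to UE4 coordinates (cm)."""
    return (-z * 100.0,  # UE X (forward) = -OpenVR Z
             x * 100.0,  # UE Y (right)   =  OpenVR X
             y * 100.0)  # UE Z (up)      =  OpenVR Y
```

A rotation needs the same axis remap plus a handedness sign flip, which would explain why roll, pitch, and yaw read directly off the tracker do not match the guidelines' right-handed convention.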
  4. Hello! I wanted to know whether a tracker can work properly without its dongle, when connected directly with the micro-USB cable. It was actually hard to find related posts; the ones I found were fairly old, but people said it was not possible (they did say a firmware update that might make it work was on the roadmap, though). So, since I didn't find a proper answer while searching, I wanted to know if it is doable now, and whether a "mix" is doable as well: for example, two wireless trackers with two dongles and a third tracker connected via USB (no dongle). Thank you.