Posts posted by dagillespie

  1. @Marios Bikos - Thanks for this - your code change fixed things for us. Apologies if we missed this; however, it would be helpful if these kinds of changes could be clearly flagged for users downloading the SDK, as otherwise they get missed. We're now back to one project codebase - thanks 🙂


  2. Just a polite bump for a reply on this from HTC. 

    We are in a situation where the Vive Pro eye-tracking plugin only works with 4.23, while we have other hardware dependencies that need 4.24, making it hard to develop with a single project. 

    Any indication on whether there will be an update or even pointers on how we can update it ourselves would be most appreciated. 

    @Corvus @Cotta

  3. Hi,

    Is there an ETA for an update to the plugin for Unreal 4.24 at all? We have a project that needs to be on 4.24; however, we have just discovered that the drivers have not been updated to support it yet. Similarly, is HTC looking to support subsequent releases of Unreal in step with Epic's releases?



    @Daniel_Y @Corvus

  4. Hi - we are also looking at tracking across large spaces with Vive and at how to achieve this. Is it possible to create a setup with more than 4 lighthouses successfully yet? We would like to understand the maximum number we could use, as we are looking to track as large a space as possible (think warehouse scale, so ideally 16).


  5. Answering my own question for everyone's reference

    • On BeginPlay a ViveSRDualCameraImagePlane is spawned and effectively mapped to the player camera transforms - HOWEVER it is not attached to the player camera component, so when the player moves or rotates, the image plane's orientation breaks.
    • Therefore, at Event BeginPlay, get the spawned actor and attach it to the player's camera; this ensures the image plane always keeps proper alignment.

    (screenshot attached: 18-11-2019 10-09-38.jpg)
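    The attach step described above can be sketched roughly as follows in Unreal C++ (in Blueprint, Get All Actors Of Class → AttachToComponent achieves the same). This is a minimal sketch, not the plugin's documented usage: the pawn class `AMyPawn` is a placeholder, and it assumes the SRWorks plugin's image plane actor class is exposed as `AViveSRDualCameraImagePlane` as named in the post.

    ```cpp
    // Hedged sketch (Unreal C++): attach the spawned SRWorks image plane to the
    // player camera at BeginPlay so it follows later pawn moves/rotations.
    // AMyPawn is a placeholder class; only engine APIs are used otherwise.
    #include "EngineUtils.h"              // TActorIterator
    #include "Camera/CameraComponent.h"

    void AMyPawn::BeginPlay()
    {
        Super::BeginPlay();

        UCameraComponent* Camera = FindComponentByClass<UCameraComponent>();
        if (!Camera)
        {
            return;
        }

        // Find the image plane the plugin spawns at run time and attach it,
        // keeping its current world transform (it is already aligned to the
        // camera at this point).
        for (TActorIterator<AViveSRDualCameraImagePlane> It(GetWorld()); It; ++It)
        {
            It->AttachToComponent(Camera,
                                  FAttachmentTransformRules::KeepWorldTransform);
        }
    }
    ```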

  6. See the screenshot of what happens when you rotate the player pawn by 90 degrees. The SR actors don't keep up, and even when you update their rotations accordingly, the transforms still don't produce a properly aligned view.

    Similarly, if your player faces any direction other than along the X axis, there is a mismatch in the attachment/transform between the image and the player camera.

    (screenshot attached: 15-11-2019 15-51-22.jpg)

    @Daniel_Y @reneeclchen

  7. It appears that when you change the player's location or rotation in world space in Unreal, the camera image plane generated in front of the user doesn't update to match. Is there a way to ensure that when the player pawn translates, the ViveSR actors track and follow it? We've tried manually mapping the transforms onto the image plane that is generated at run time; however, this doesn't appear to work.


  8. Is there a more detailed explanation of the eye tracking accuracy than what is stated on Tobii's site for the Vive Pro Eye, please?


    It says 0.1deg - 1.1deg; however, it's unclear whether this means

    • +/- 0.1 degree min to +/- 1.1 degree max (making the full range 0.2deg - 2.2deg)
    • a total error of 0.1 degree min to 1.1 degree max (making the full range 0.1deg - 1.1deg)
    • something else

    We are looking at understanding accuracy over distance, so we would like to know exactly what this means.
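    For reference, the reason the distinction matters at distance: for an angular accuracy of θ, the lateral gaze error at distance d is d·tan(θ), so the two bounds diverge quickly. A small self-contained sketch (the function name is ours, not from any SDK):

    ```cpp
    // Lateral gaze error implied by an angular accuracy bound at a distance.
    #include <cmath>
    #include <cstdio>

    // Returns the lateral error in millimetres: distance * tan(accuracy).
    double gaze_error_mm(double distance_m, double accuracy_deg)
    {
        const double pi = std::acos(-1.0);
        return distance_m * std::tan(accuracy_deg * pi / 180.0) * 1000.0;
    }

    int main()
    {
        // At 1 m, the two quoted bounds give very different positional errors:
        std::printf("%.1f mm\n", gaze_error_mm(1.0, 0.1));  // ~1.7 mm
        std::printf("%.1f mm\n", gaze_error_mm(1.0, 1.1));  // ~19.2 mm
        return 0;
    }
    ```

    So whether 1.1deg is a +/- bound or a total spread roughly doubles the worst-case positional error on a target.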


    @Corvus @zzy @Daniel_Y

  9. Simply use the example experience files provided for Unity or Unreal. Show mixed reality and you'll see the mismatch.


    I did a simple Unreal test based on the files provided. I added motion controllers to the provided Pawn, and again you'll see the mismatch.


    Also, if you compare the pass-through view from inside the headset against what you see when pulling the headset off, you'll see that reality doesn't match the projection in the headset.
