dagillespie

Verified Members
  • Content Count: 33
  • Joined
  • Last visited
Everything posted by dagillespie

  1. @Marios Bikos - Thanks for this - your code change fixed things for us. Apologies if we missed this; however, it would be helpful if these kinds of changes could be clearly flagged for users downloading the SDK, as otherwise they get missed. We're now back to a single project codebase - thanks 🙂
  2. @Corvus @MariosBikos_HTC @Cotta Any possibility of an update on UE4 support? It would help us to understand if/how HTC are going to support eye tracking with UE4, as this will shape both our current project development and our hardware investment. Many thanks!
  3. Is there any chance of an update please? As mentioned, we have had to split our development efforts right now. Even an update to say "we're on it" without committing to a date would be most appreciated! 😉
  4. Just a polite bump for a reply on this from HTC. We are in a situation where the Vive Pro eye-tracking plugin only works with 4.23, while other hardware dependencies require 4.24, making it hard to develop with a single project. Any indication of whether there will be an update, or even pointers on how we could update it ourselves, would be most appreciated. @Corvus @Cotta
  5. Hi, is there an ETA for an update to the plugin for Unreal 4.24? We have a project that needs to be on 4.24, but we have just discovered that the drivers have not been updated to support it yet. Similarly, is HTC planning to support subsequent Unreal releases in step with Epic's release schedule? Thanks @Daniel_Y @Corvus
  6. Hi - we are also looking at tracking across large spaces with Vive and at how to achieve this. Is it possible to set up more than 4 lighthouses successfully yet? We would like to understand the maximum number we could use, as we are looking to track as large a space as possible (think warehouse scale, so ideally 16). @VibrantNebula
  7. Hi - are there any updates on this? We need a solution for spatial mapping with an inside-out tracking headset, where Cosmos + SRWorks may be the answer. Is this now working? @Jad @Daniel_Y @Andy.YC_Wang
  8. @Daniel_Y - awesome, many thanks for confirming. It's helpful in determining the likely vector error over distance; of course, as you said, that also depends on your own eyesight.
  9. @Andy.YC_Wang Many thanks - that would be most appreciated
  10. Try adding a debug text node and seeing whether it sends any text to the console. That way you can double-check that the object's Event BeginPlay is executing. @Jawani
  11. Thanks @Andy.YC_Wang - this implementation works for something already in the level; however, it isn't exposed, so you can't set it programmatically in Blueprint - i.e. you can't have a spawned actor set this value. Is that correct?
  12. Answering my own question for everyone's reference: on BeginPlay, a ViveSRDualCameraImagePlane is spawned and is effectively mapped to the player camera transforms. HOWEVER, this actor doesn't appear to be attached to the player camera component, so when the player moves or rotates, the image plane orientation breaks. Therefore, at Event BeginPlay, simply get the spawned actor and attach it to the player's camera; this ensures the image plane always keeps proper alignment.
  13. See the screenshot of what happens when you rotate the player pawn by 90 degrees. The SR actors don't keep up, and even when you update their rotations accordingly, the transforms still don't produce a properly aligned view. Similarly, if your player faces any direction other than along the X axis, there is a mismatch in the attachment/transform between the image and the player camera. @Daniel_Y @reneeclchen
  14. It appears that when you change the player's location or rotation in world space in Unreal, the camera image plane generated in front of the user doesn't update to match. Is there a way to ensure that when the player pawn translates, the ViveSR actors keep track and follow? We've tried manually mapping the transforms to the image plane generated at run time, but it doesn't appear to work. Thanks!
  15. If anyone from Vive with experience in this area could chip in it would be most appreciated. We're trying to do some work where understanding this is very important in working with the data from the Vive Pro Eye. Thanks! @Daniel_Y @zzy @Corvus
  16. Exactly what we've just found. It would be useful to have some kind of confirmation that the tracking calibration has been successful from the calibration app itself too!
  17. Update on this - we're finding that the calibration test screen (as per the screenshot in the first post) isn't responding; however, in our Unreal applications tracking is still working.
  18. We have just experienced the exact same issue after the latest steam update to 1.8.19. A fix for this would be most appreciated!
  19. Is there a more detailed explanation of the eye tracking accuracy than what is stated on Tobii's site for the Vive Pro Eye, please? https://vr.tobii.com/products/htc-vive-pro-eye/ It says 0.1deg - 1.1deg, however it's unclear whether this means:
      • +/- 0.1 degree min to +/- 1.1 degree max (making the full range 0.2deg - 2.2deg)
      • +/- 0.1 degree min to +/- 1.1 degree max (making the full range 0.1deg - 1.1deg)
      • something else
      We are looking at understanding accuracy over distance, so would like to understand what this means please. @Corvus @zzy @Daniel_Y
  20. Awesome thanks - we've updated our code accordingly.
  21. Is this the same as v2 in "Enable eye version" that is implemented in the Unreal SDK?
  22. Any answer to this would be most appreciated! @Corvus @zzy @Daniel_Y
  23. Hi, I've just updated to 1.1.2.0 and found that Unreal now has duplicate functions and an "Enable Eye Version" setting for v1 and v2. What is the difference between the two versions, and what do I need to consider when choosing which to enable? Thanks! @Corvus @zzy @Daniel_Y
  24. The prebuilt experience app has the same issue. I think it arises from the difference between your IPD and the spacing of the front-facing cameras. Some transform needs to be applied to the camera feeds to better match the headset IPD.
  25. Simply use the example experience files provided for Unity or Unreal. Show mixed reality and you'll see the mismatch. I did a simple Unreal test based on the files provided: I added motion controllers to the provided pawn and, again, you'll see the mismatch. Also, if you compare the pass-through view from inside the headset against what you see after pulling the headset off, you'll see that reality doesn't match the projection in the headset.
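The accuracy question in post 19 can be made concrete with a little trigonometry: whatever the correct reading of Tobii's figure, an angular error maps to a positional offset that grows linearly with viewing distance. A minimal sketch (plain Python, assuming the figure is a +/- angular deviation from the true gaze ray, which is exactly the interpretation being asked about):

```python
import math

def positional_error_mm(angular_error_deg, distance_m):
    """Positional offset of a gaze point at a given viewing distance,
    for a stated angular accuracy: error = distance * tan(theta)."""
    return distance_m * math.tan(math.radians(angular_error_deg)) * 1000.0

# Using the upper end of Tobii's stated 0.1-1.1 deg range.
for d in (1.0, 2.0, 5.0):
    print(f"{d} m: {positional_error_mm(1.1, d):.1f} mm at 1.1 deg")
```

At 1 m this gives roughly a 19 mm offset for 1.1 deg, and about double that if the figure turns out to mean a +/- deviation (post 19's first interpretation), which is why pinning down the definition matters for work over distance.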
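The fix described in posts 12-14 (attach the spawned image plane to the player camera) comes down to whether the plane's pose is frozen in world space at spawn time or recomputed from the camera transform each frame. A minimal sketch in plain Python (yaw-only 2D rotation for brevity; not Unreal code, just the transform logic):

```python
import math

def rotate_z(point, deg):
    """Rotate a 2D point about the origin (yaw only, for brevity)."""
    r = math.radians(deg)
    x, y = point
    return (x * math.cos(r) - y * math.sin(r),
            x * math.sin(r) + y * math.cos(r))

# The image plane is spawned 1 unit in front of the camera (+X).
plane_offset_local = (1.0, 0.0)

# Unattached: the plane's world pose is frozen at spawn time.
unattached_plane = plane_offset_local

# Player rotates the pawn by 90 degrees.
camera_yaw = 90.0

# Attached: the plane's pose is recomputed from the camera transform,
# so it stays 1 unit in front of wherever the camera now faces.
attached_plane = rotate_z(plane_offset_local, camera_yaw)

print(unattached_plane)  # still (1, 0): no longer in front of the camera
print(attached_plane)    # ~(0, 1): follows the camera's new facing
```

Attaching the actor to the camera component makes the engine perform this local-to-world composition automatically every frame, which is why the manual transform-copying attempt in post 14 lagged behind.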
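The IPD mismatch described in posts 24-25 can also be quantified: if the front-facing cameras are spaced differently from the user's eyes, a point at a given depth appears at a slightly wrong angle in each eye, and the error grows as objects get closer. A rough sketch (the 100 mm camera spacing is a made-up illustrative number, not a measured Vive figure):

```python
import math

def alignment_error_deg(camera_baseline_mm, user_ipd_mm, depth_m):
    """Per-eye angular offset between where a point at depth_m appears
    in the pass-through feed and where the user's eye expects it,
    when the camera baseline differs from the IPD."""
    half_gap_m = (camera_baseline_mm - user_ipd_mm) / 2.0 / 1000.0
    return math.degrees(math.atan2(half_gap_m, depth_m))

# Hypothetical numbers: ~100 mm camera spacing vs a 63 mm IPD.
for d in (0.5, 1.0, 2.0):
    print(f"{d} m: {alignment_error_deg(100.0, 63.0, d):.2f} deg per eye")
```

This is consistent with what post 25 observes: near objects (hands, motion controllers) show the mismatch most, and a depth-dependent reprojection of the camera feeds, rather than a fixed offset, would be needed to fully correct it.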