dagillespie

Verified Members
  • Content Count: 33
  • Joined
  • Last visited

Community Reputation: 2 Neutral

About dagillespie
  • Rank: Explorer


  1. @Marios Bikos - Thanks for this - your code change fixed things for us. Apologies if we missed this, but it would be helpful if these kinds of changes could be clearly flagged for users downloading the SDK, as otherwise they get missed. We're now back to one project codebase - thanks 🙂
  2. @Corvus @MariosBikos_HTC @Cotta Any possibility of an update on UE4 support? It would be helpful from our perspective to understand if/how HTC are going to support eye tracking with UE4 as it will help us both understand what we will do with our current project development work and will influence our hardware investment. Many thanks!
  3. Is there any chance of an update please? As mentioned, we have had to split our development efforts right now. Even an update to say "we're on it" without committing to a date would be most appreciated! 😉
  4. Just a polite bump for a reply on this from HTC. We are in a situation where the Vive Pro eye tracking plugin only works with 4.23, while we have other hardware dependencies that need 4.24, making it hard to develop with a single project. Any indication of whether there will be an update, or even pointers on how we can update it ourselves, would be most appreciated. @Corvus @Cotta
  5. Hi, is there an ETA for an update to the plugin for Unreal 4.24? We have a project that needs to be on 4.24, but we have just discovered that the drivers have not been updated to support it yet. Similarly, is HTC looking to support subsequent releases of Unreal in step with Epic's releases? Thanks @Daniel_Y @Corvus
  6. Hi - we are also looking at tracking across large spaces with Vive and investigating how to achieve this. Is it possible to create a setup with more than 4 lighthouses successfully yet? We would like to understand the maximum number we could use, as we would be looking to track as large a space as possible (think warehouse scale, so ideally 16). @VibrantNebula
  7. Hi - are there any updates on this? We need a solution for spatial mapping with an inside-out tracking headset, where Cosmos + SRWorks may be the answer. Is this working now? @Jad @Daniel_Y @Andy.YC_Wang
  8. @Daniel_Y - awesome, many thanks for confirming. It's helpful in determining the likely vector error over distance; of course, as you said, that also depends on the user's own eyesight.
  9. @Andy.YC_Wang Many thanks - that would be most appreciated
  10. Try adding a debug text node and seeing if it sends any text to the console. That way you can double-check whether the object's Event BeginPlay is executing. @Jawani
  11. Thanks @Andy.YC_Wang - this implementation works for something already in the level, however the value isn't exposed, so you can't set it programmatically in Blueprint, i.e. you can't have a spawned actor set it. Is this correct?
  12. Answering my own question for everyone's reference: on BeginPlay, a ViveSRDualCameraImagePlane is spawned and effectively mapped to the player camera transforms. HOWEVER, it doesn't appear to be attached to the player camera component, so when the player moves or transforms their location, the image plane orientation breaks. Therefore, at Event BeginPlay, if you get the spawned actor and attach it to the player's camera, the image plane will always keep proper alignment.
  13. See the screenshot of what happens when you rotate the player pawn by 90 degrees. The SR actors don't keep up, and even when you update their rotations accordingly, the transforms still don't create a properly aligned view. Similarly, if your player faces any direction other than along the X axis, there is a mismatch in the attachment/transform between the image and the player camera. @Daniel_Y @reneeclchen
  14. It appears that when you change the player's location or rotation in world space in Unreal, the camera image plane generated in front of the user doesn't update to match. Is there a way to ensure that if the player pawn translates, the ViveSR actors track and follow it? We've tried manually mapping the transforms to the image plane generated at run time, however this doesn't appear to work. Thanks!
  15. If anyone from Vive with experience in this area could chip in it would be most appreciated. We're trying to do some work where understanding this is very important in working with the data from the Vive Pro Eye. Thanks! @Daniel_Y @zzy @Corvus
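The BeginPlay attachment fix described in the posts above can be sketched in Unreal C++. This is a minimal sketch only, assuming the SRWorks plugin spawns an actor whose name contains "DualCameraImagePlane" (the exact class and naming are assumptions; check the actual types shipped with your SDK version before relying on this):

```cpp
// Sketch: in your pawn's BeginPlay, find the image plane actor the
// SRWorks plugin spawned and attach it to the player camera so it
// follows every translation/rotation of the pawn.
#include "EngineUtils.h"          // TActorIterator
#include "Camera/CameraComponent.h"

void AMyPawn::BeginPlay()
{
    Super::BeginPlay();

    // Iterate all actors in the world looking for the spawned image plane.
    // The name substring is an assumption; adjust to your plugin version.
    for (TActorIterator<AActor> It(GetWorld()); It; ++It)
    {
        if (It->GetName().Contains(TEXT("DualCameraImagePlane")))
        {
            // KeepWorldTransform preserves the plane's current placement
            // while parenting it to the camera, so alignment is maintained
            // as the pawn moves or rotates.
            It->AttachToComponent(
                FirstPersonCameraComponent,
                FAttachmentTransformRules::KeepWorldTransformRules);
            break;
        }
    }
}
```

`FirstPersonCameraComponent` here stands in for whatever `UCameraComponent` your pawn exposes; the equivalent Blueprint graph is simply Get All Actors of Class → AttachToComponent on Event BeginPlay.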