Showing topics in Vive News and Announcements, Vive Community Guidelines and FAQs, General Vive Discussion, Vive Technical Support, Developer Blog, Developer Discussion, Developer Support, Viveport SDKs and Downloads, Vive Wave SDK, Vive SRWorks SDK, Vive Audio SDKs, Vive Input Utility, Vive Hand Tracking SDK, Vive Cosmos Developer FAQs and Vive Eye Tracking SDK posted in for the last 365 days.

  1. Today
  2. Regarding question 4: what is the format of the eye-blinking data? For example, do I get binary open/closed data, or the percentage the eye is closed (e.g. 0.5 if it is half closed)? Thank you.
  3. It worked yesterday, but today it doesn't. I'm using SteamVR v1.12.5, Unity v2019.3.10f1, and SRWorks v0.9.3.0. It was working yesterday. Thanks.
  4. Yesterday
  5. I've purchased a number of Vive Focus Pluses and I'm starting to set them up to be distributed. The batch config utility works great for getting my built application onto the devices. However, I would also like to update the devices' OS to the latest version before shipping. Is there a way to do that using the batch config utility? I'm currently manually updating every headset patch by patch, and it's a huge pain. Also, is there a way to have a device automatically sign into a Viveport account with the batch config utility? @Cotta @JustinVive
  6. @Daniel_Y, the link from the previous post no longer exists. Here is a similar question about gaze origin: https://community.viveport.com/t5/Vive-SRanipal-SDK/Vive-Pro-Eye-Finding-a-single-eye-origin-in-world-space/gpm-p/31592#M20. The diagram is updated; the gaze origin is with respect to the system origin. @iuilab, did you get your answer? If so, please let me know about gaze_origin_mm in terms of the left and right eye. If I take the average of these two points, does it give the position of the object in the scene the user looked at? Thanks
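To illustrate the averaging question above, here is a minimal Unity C# sketch, assuming the SRanipal Unity SDK's `EyeData` layout (`verbose_data.left.gaze_origin_mm` etc.) and `SRanipal_Eye_API.GetEyeData`. Note that averaging the two origins gives only a single "cyclopean" point between the eyes, not the looked-at object; finding the object requires a raycast along the combined gaze direction.

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

public class GazeOriginExample : MonoBehaviour
{
    void Update()
    {
        EyeData eyeData = new EyeData();
        if (SRanipal_Eye_API.GetEyeData(ref eyeData) != ViveSR.Error.WORK) return;

        // gaze_origin_mm is reported per eye, in millimetres, relative to the
        // headset's system origin (see the diagram referenced in the linked post).
        Vector3 leftOrigin  = eyeData.verbose_data.left.gaze_origin_mm;
        Vector3 rightOrigin = eyeData.verbose_data.right.gaze_origin_mm;

        // Averaging the two gives one origin between the eyes.
        // It is NOT the point the user is looking at.
        Vector3 midOrigin = (leftOrigin + rightOrigin) * 0.5f * 0.001f; // mm -> m

        // To find the looked-at object, raycast from that origin along the
        // combined gaze direction, transformed into world space via the HMD camera.
        Vector3 dir = eyeData.verbose_data.combined.eye_data.gaze_direction_normalized;
        Vector3 worldOrigin = Camera.main.transform.TransformPoint(midOrigin);
        Vector3 worldDir    = Camera.main.transform.TransformDirection(dir);
        if (Physics.Raycast(worldOrigin, worldDir, out RaycastHit hit))
            Debug.Log($"User is looking at: {hit.collider.name}");
    }
}
```

(The SDK reports gaze in its own coordinate convention, so an axis flip may be needed depending on the SDK version; check the SRanipal documentation for your release.)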
  7. @Lhannan Interesting, thanks for sharing your test results! Also, make sure you have calibrated recently (I find it helps produce more accurate, less twitchy facial animation).
  8. 3. Yes, it is possible to show the fixation point on the PC monitor and record it. Recording functionality is not built into the SDK but can be implemented within the engine or via third-party tools. 4. No, access to the raw eye camera video is not enabled.
  9. Hi @AtomicOtaku, Could you provide logs in $(Driver)\Users\$(UserName)\AppData\LocalLow\HTC Corporation\SR_Logs? Thanks.
  10. 3. Can I get the video that I am looking at? I mean the video with the fixation point, something like the one below. 4. Additionally, can I get the eye video? Thank you for your answer.
  11. @Corvus For me it seems to work better in 4.22 than 4.23! In 4.22 I got the v2 to work, it tracks eyebrows well in the beginning but then seems to lose tracking a bit. Gonna try to darken my eyebrows a bit to see if that helps (they're pretty light naturally, so figure it's worth a shot). In 4.23 the face disappearing was also an ongoing bug in both tests, but with v1 I just needed to restart to get it to pop back in. Haven't had any similar trouble in 4.22 yet so very promising! Just thought I'd update you, thanks for your help so far!
  12. Last week
  13. The Cosmos Elite is more advanced; go for the Pro Eye if you want a Pro, because it has eye tracking. The Cosmos Elite isn't on Amazon yet, and I'm still waiting..
  14. @sdfalter @OCH This issue has been identified and solutions are being investigated. There are reports of Unity 2019.3.6 working without the memory leak if downgrading to that version is an available option for your project. Currently Unity 2018.4 LTS + SDK3.1.94 are recommended for best performance and stability.
  15. @Lhannan I believe it works in 4.22 & 4.23 but haven't confirmed. I know that 4.24 & 4.25 require some code changes to work with the eye tracking SDK. Feel free to test 4.22. Make sure you're using the v2 test scene and v2 framework.
  16. Hello @BlueLee, 1. Number of blinks: the VIVE Eye Tracking SDK can easily measure blinking with the "eye_openness" function. 2. Gaze position data (which you need for gaze velocity) is available via the "gaze_direction_normalized" function. 3. Regarding decoding video frames to compare the video's velocity with the gaze velocity, can you elaborate on the information you require? Please feel free to download the Eye Tracking SDK (SRanipal) and review the documentation for further information. https://developer.vive.com/resources/knowledgebase/vive-sranipal-sdk/
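The two functions named above can be combined into a small Unity C# sketch. This assumes the SRanipal Unity SDK's `SRanipal_Eye.GetEyeOpenness` and `SRanipal_Eye.GetGazeRay` helpers; the blink threshold of 0.1 is an assumption, not an SDK constant.

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

public class BlinkAndGazeVelocity : MonoBehaviour
{
    const float BlinkThreshold = 0.1f; // assumption: openness below this counts as "closed"
    int blinkCount;
    bool eyeClosed;
    Vector3 lastGazeDir;

    void Update()
    {
        // 1. Blink counting: eye_openness is a 0..1 value (1 = fully open),
        //    so a blink is a dip below, then a rise back above, the threshold.
        if (SRanipal_Eye.GetEyeOpenness(EyeIndex.LEFT, out float openness))
        {
            if (!eyeClosed && openness < BlinkThreshold) { eyeClosed = true; blinkCount++; }
            else if (eyeClosed && openness > BlinkThreshold) eyeClosed = false;
        }

        // 2. Gaze velocity: angular change of the combined gaze direction per second.
        if (SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Ray gazeRay))
        {
            float degPerSec = Vector3.Angle(lastGazeDir, gazeRay.direction) / Time.deltaTime;
            lastGazeDir = gazeRay.direction;
            Debug.Log($"Blinks: {blinkCount}, gaze velocity: {degPerSec:F1} deg/s");
        }
    }
}
```

Note that sampling in `Update()` ties the measurement to the Unity frame rate; for velocity at the tracker's native rate, use the eye-data callback instead (see the discussion of sampling frequency further down this page).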
  17. @Corvus Do you know if it's certain to work in a different engine version? I can easily switch and do a test as well 🙂
  18. @Lhannan Okay thanks for reporting this. We will look into Unreal 4.23 + v2 support and follow up with you if we can reproduce the issue.
  19. @Corvus I was told about this and would love to try it, but I've never gotten the v2 test to work in my build. When I VR preview there's just nothing where there should have been a head. I've tried both test scenes, as well as swapping the v1 scene (where the v1 head works) to v2. Still got nothing...using 4.23. I'm used to blueprints so it hasn't been very easy for me to debug this stuff... Do you have any advice? Thank you for responding!
  20. @Lhannan Are you testing with the v2 framework version? Support for extra blendshapes was added in v2.
  21. Good to hear that! It is supposed to work the way you did it, as described in VIU's wiki, but that didn't work for me when I used 1.12.5. I will try again to see if it actually works. UPDATE: Just double-checked; it also works for me when using SteamVR 1.12.5. I tried with 2 controllers + 2 Vive trackers.
  22. I found a solution. I changed the tracker role to a different one for each tracker (left foot, right foot, etc.), and I'm getting the input for the menu button even when the 2 controllers are connected! I first tried it with the SteamVR beta version, but it also seems to work on the normal 1.12.5 one.
  23. Owning a Cosmos, I'm happy with how it behaves, but for a commercial project I will probably need an externally tracked system anyway. So I want to ask if there is any specific reason to choose between the Cosmos Elite and the Vive Pro, apart from the fact that one is LCD and the other OLED (and so probably better for dark virtual environments).
  24. Hello developers! I hope you are safe from the coronavirus. I found a difference between the older SRWorks API and the newest, and I wonder if you have the same issue. I put additional objects as children of each camera so that they stay fixed to the camera. But when I put them on "Dual Camera", there is some delay between turning the head and the linked objects, while there isn't on "Render Camera". In the older API, I remember that the cameras were not separated into two as in the picture above. Is this intended, or something strange? Do you have the same issue? @Daniel_Y @Jad @Corvus
  25. Hello, I have questions before I buy the VIVE Pro Eye. For my research, I need the data below, and I am wondering if I can get it: 1. Number of blinks. 2. Gaze position data, to compute the gaze velocity. 3. Decoded video frames, to calculate the video's velocity (for comparison with the gaze velocity). Thank you.
  26. @Jad: We have a similar question. We are working in a consortium on tele-operation of lifelike robotics, and it would be very useful to be able to capture the operator's face (including lips) for this purpose. I would appreciate hearing about the availability of the Lip Tracking Module.
  27. Dear @Asish and @shomik, as @Corvus mentioned, we need to look at the Unity FPS and the eye sampling rate separately. These two parameters, the frame rate and the sampling frequency of the eye tracker, are independent of each other. For example, while the Unity FPS changes depending on the VR design and the computer specification (e.g. sometimes it is 50 FPS and it changes to 100 FPS at another time), the sampling frequency of the eye tracker is basically consistent at a specific value. If you use the callback function, the sampling frequency should be more or less around 120 Hz, whereas the Unity FPS may fluctuate. Specifically, while Unity's Update() is running on one thread, the eye callback function is running on another thread. However, as I wrote in another thread ( ), it seems that the current SRanipal SDK does not properly output the timestamp data. Thus, it may be difficult to evaluate the sampling interval or frequency of the eye tracker correctly using the timestamp data. I instead used the DateTime.Now.Ticks property in C# and observed that the sampling frequency was more or less around 120 Hz. I hope this helps. Best regards, imarin18
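The callback-plus-Ticks approach described above can be sketched as follows. This is a minimal Unity C# example assuming the SRanipal Unity SDK's `WrapperRegisterEyeDataCallback` and `CallbackBasic` delegate (as used in the SDK samples), and it assumes the eye-data callback has been enabled in the SRanipal framework settings.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;
using ViveSR.anipal.Eye;

public class EyeSamplingRate : MonoBehaviour
{
    static long lastTicks;
    static float lastIntervalMs;

    void Start()
    {
        // Register the eye-data callback; it runs on the SDK's own thread,
        // independently of Unity's Update() loop.
        SRanipal_Eye.WrapperRegisterEyeDataCallback(
            Marshal.GetFunctionPointerForDelegate((SRanipal_Eye.CallbackBasic)EyeCallback));
    }

    static void EyeCallback(ref EyeData eyeData)
    {
        // Since the SDK's own timestamps are unreliable (see the post above),
        // measure the interval between callbacks with DateTime.Now.Ticks
        // (one tick = 100 ns).
        long now = DateTime.Now.Ticks;
        if (lastTicks != 0)
            lastIntervalMs = (now - lastTicks) / (float)TimeSpan.TicksPerMillisecond;
        lastTicks = now;
    }

    void Update()
    {
        // At ~120 Hz the interval should hover around 8.3 ms,
        // regardless of the Unity frame rate.
        Debug.Log($"Eye sample interval: {lastIntervalMs:F2} ms");
    }
}
```

Averaging the interval over many samples (rather than logging single intervals) gives a more stable estimate, since `DateTime.Now` itself has limited resolution.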