
About Corvus


  1. @Djangi For the Tobii XR SDK questions you need to reach out to Tobii directly, or maybe @Brand can assist you. We do not have official latency specs to share. Some independent results were shared on the forum, which you can find here, but the latency and filters have improved with the v2 version available in the "SR_runtime v1.1.2.0" release. https://forum.vive.com/topic/5970-pro-eye-unsuitable-for-research-data-super-heavily-filtered-during-fixation/?do=findComment&comment=27437
  2. @Djangi I'm happy to help you out and answer these questions. Let me know if you need any more information or details.
     What is the difference between the Vive Eye Tracking SDK (SRanipal) and the Tobii XR SDK? While Tobii publishes a clear API reference, I can't find something similar for SRanipal. - Feel free to download the SRanipal SDK; the API documentation is included with it. The main differences have to do with pricing, access to raw camera images, and licensing terms, where the Tobii SDK requires a special license for analytics use cases. https://developer.vive.com/resources/knowledgebase/vive-sranipal-sdk/
     Are there events that indicate the start and end of saccadic eye movements, or is there prefabricated code available to measure this? - No, the SRanipal SDK does not provide functions for detecting eye saccades.
     How does the data flow from the integrated eye tracker into Unity? Is all raw data available in Unity, or do I get data that is filtered for in-game performance? Also, is the data fed directly into the Unity plugin, or does it run through external software as with other professional Tobii products? - You will get filtered data in Unity via the SRanipal Runtime software.
     If I want to record data for offline analysis, do I have to build some sort of logger, or is there a function in the SDK? Also, is the eye tracking data only available in the plugin, or can I also access it externally in Windows? - There is no function in the SDK, but there are samples available on the forums. Please check this link for an example of recording the data at 120 Hz: https://forum.vive.com/topic/5897-getting-verbosedata-at-the-fastest-rate-possible/?tab=comments#comment-26428
     I've recently read an article on slippage: https://link.springer.com/article/10.3758/s13428-019-01307-0 The Tobii glasses seem to perform better than the competition in this matter. I'm wondering how much this issue affects the HTC Vive Pro Eye. - I will review this article, thanks for sharing it. FYI, we have partnered with Tobii for the eye tracking and SDK, so you should expect similar performance.
     This question might seem a bit naive: is there an active HTC Vive Pro Eye and/or VR eye tracking community? So far I've only encountered the Pupil Labs Discord (which is not very active), the Tobii XR support forum, and this forum. I hope there are more sources out there. - Eye tracking and VR eye tracking are still relatively new technologies with niche communities. If you find any more useful communities or resources, please feel free to share them here. 😃
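     The 120 Hz recording approach from the linked thread can be sketched roughly as below. This is a hedged sketch, not official sample code: it assumes the SRanipal Unity SDK's `SRanipal_Eye_v2.WrapperRegisterEyeDataCallback` registration call, the `CallbackBasic` delegate, and the `EyeData_v2` struct; check the names against the documentation bundled with the SDK.

     ```csharp
     using System.Runtime.InteropServices;
     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class EyeDataLogger : MonoBehaviour
     {
         private static EyeData_v2 eyeData = new EyeData_v2();
         private static bool callbackRegistered = false;

         void Update()
         {
             // Register a native callback once the framework is running, so
             // samples arrive at the tracker's full rate (~120 Hz) instead of
             // being polled at Unity's frame rate.
             if (SRanipal_Eye_Framework.Status == SRanipal_Eye_Framework.FrameworkStatus.WORKING
                 && !callbackRegistered)
             {
                 SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
                     Marshal.GetFunctionPointerForDelegate(
                         (SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                 callbackRegistered = true;
             }
         }

         // Invoked from the SRanipal runtime thread, not the Unity main thread:
         // avoid Unity scene APIs here and just copy or buffer the sample for
         // your logger (e.g. a thread-safe queue flushed to disk elsewhere).
         private static void EyeCallback(ref EyeData_v2 eye_data)
         {
             eyeData = eye_data;
         }
     }
     ```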
  3. @GestureLab You can start the calibration via code with the "LaunchEyeCalibration" function, or you can start it manually by running "EyeCalibration.exe" in C:\Program Files\VIVE\SRanipal\tools\eye_calibration
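     A minimal sketch of the code route, assuming the SRanipal Unity SDK's `SRanipal_Eye_API.LaunchEyeCalibration` entry point and the `ViveSR.Error` result enum (verify both against the SDK documentation):

     ```csharp
     using System;
     using ViveSR.anipal.Eye;

     public static class CalibrationHelper
     {
         // Launches the SRanipal eye calibration UI from code.
         // Returns true if the runtime reports success.
         public static bool StartCalibration()
         {
             int result = SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);
             return result == (int)ViveSR.Error.WORK;
         }
     }
     ```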
  4. @drsect0r The only way currently is to enable "Display VR View" from SteamVR Settings.
  5. @traclabsar There is no system parameter or API that does this, but it's possible to implement yourself. Here is a suggested solution: "The usual solution to this problem is to render the scene to a low-res (like 4x4 or 8x8) texture/render target. Then you have the CPU loop over all the pixels in that texture and compute the average brightness." Source: https://answers.unity.com/questions/640447/detect-brightnessdarkness-of-a-scene.html Note: For performance reasons you probably don't want to measure all of the full-resolution pixel values; it should suffice to render a low-res texture.
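     The quoted suggestion can be sketched in Unity like this. It uses only standard Unity APIs (`RenderTexture`, `Texture2D.ReadPixels`, `Color.grayscale`); names like `brightnessCamera` are placeholders for illustration, not SDK symbols.

     ```csharp
     using UnityEngine;

     // Renders the camera into a tiny RenderTexture, reads it back on the
     // CPU, and averages the pixel luminance to estimate scene brightness.
     public class SceneBrightnessProbe : MonoBehaviour
     {
         public Camera brightnessCamera;      // a camera that sees the scene
         private RenderTexture lowResTarget;
         private Texture2D readback;

         void Start()
         {
             lowResTarget = new RenderTexture(8, 8, 16);
             readback = new Texture2D(8, 8, TextureFormat.RGB24, false);
             brightnessCamera.targetTexture = lowResTarget;
         }

         public float AverageBrightness()
         {
             brightnessCamera.Render();
             RenderTexture.active = lowResTarget;
             readback.ReadPixels(new Rect(0, 0, 8, 8), 0, 0);
             readback.Apply();
             RenderTexture.active = null;

             float sum = 0f;
             foreach (Color pixel in readback.GetPixels())
                 sum += pixel.grayscale;      // perceptual luminance per pixel
             return sum / (8 * 8);            // 0 = black, 1 = white
         }
     }
     ```

     Reading back only an 8x8 target keeps the CPU cost of the pixel loop negligible, which is why the answer recommends a low-res render target rather than sampling the full frame.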
  6. @ScottHerz Have you tried the troubleshooting steps here: https://forum.vive.com/topic/6482-vive-pro-eye-calibration-initialization-error-troubleshooting/
  7. @Nermeen I've responded to your original post.
  8. @Nermeen Are you using an account with Admin privileges? Most of the calibration initialization errors are due to running the SRanipal runtime without admin.
  9. @saidyy I suggest checking out and using one of the Unity file browser plugins available on the Asset Store or GitHub and adding VR controller support.
  10. @Teresa Liu If you use the latest SteamVR plugin, it will generate Cosmos controller binding files by default and the project will support the Cosmos. If you are developing cross-platform for Focus and PC, it may be helpful to use the Vive Input Utility (VIU). For multiplayer, you can develop a cross-platform multiplayer game with plugins like the Unity Multiplayer SDK or the Photon SDK. https://github.com/ViveSoftware/ViveInputUtility-Unity
  11. @sttu Have you downloaded the eye tracking (SRanipal) SDK and checked the documentation?
  12. @Minh_26700 You will need to use the base stations and tracking to let a user look at a 360 image. Using the Pro Eye without base stations or SteamVR is not a currently supported use case.
  13. @mrk88 I sent a PM requesting a SteamVR report and SRanipal log files.
  14. @atom You can get the timestamp from this member: ViveSR.anipal.Eye.EyeData_v2.timestamp You can find more info in the documentation. If you want an example of recording data and timestamps at 120 Hz, please check this post: https://forum.vive.com/topic/5897-getting-verbosedata-at-the-fastest-rate-possible/?tab=comments#comment-26428
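     A minimal polling sketch, assuming the SRanipal Unity SDK's `SRanipal_Eye_API.GetEyeData_v2` call (verify the signature in the bundled documentation). Note that polling from Update() is capped at the render frame rate; for the full 120 Hz, use the callback registration approach from the linked post instead.

     ```csharp
     using ViveSR.anipal.Eye;

     public static class TimestampExample
     {
         // Fetches one eye data sample and returns its device timestamp.
         public static int GetLatestTimestamp()
         {
             EyeData_v2 data = new EyeData_v2();
             SRanipal_Eye_API.GetEyeData_v2(ref data);
             return data.timestamp;
         }
     }
     ```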
  15. @vpfrimmer_sxb Here is the link to the thread where you can request beta access: https://forum.vive.com/topic/5747-list-of-supported-gpus/