Eye-tracking research with VR


Sincrono

What model would you recommend for doing eye-tracker research? 

 

The Pro Eye has a lower-quality display than the Pro 2, and a Pupil Labs eye tracker can be added to other models, but I don't know whether it is also integrated in the software. Where can I find detailed info on the Vive's eye tracker?

So, between the Pro Eye, the Pro 2, or a Cosmos + Pupil Labs, which of these hardware alternatives offers the best quality?

 

I need access to raw gaze data, and ideally a wireless option.

What is the sampling rate of the Pro Eye's tracker?

 

Can I run applications with at least two headsets, both accessing gaze data in real time? For example, to see (in VR) where the other user is looking? Just as we see each other's gaze in real life, my plan is to bring that into the VR world with two interacting players.
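For the two-player gaze idea, the core of it is sharing each player's gaze ray over the network and testing whether it points at the other player's head. A minimal sketch of that test, assuming each headset's SDK exposes a gaze origin and direction in world space (the exact field names and formats vary per SDK, so treat this data layout as an assumption):

```python
from math import sqrt

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_looking_at(gaze_origin, gaze_dir, target_center, target_radius):
    """Ray-sphere test: does the gaze ray from one player intersect a
    sphere around the other player's head? All arguments are 3D vectors
    in world space except target_radius (meters). The gaze origin and
    direction would come from the eye-tracking SDK's per-frame data."""
    n = sqrt(_dot(gaze_dir, gaze_dir))
    d = [c / n for c in gaze_dir]                      # normalize direction
    oc = [c - o for c, o in zip(target_center, gaze_origin)]
    t = _dot(oc, d)                                    # projection onto the ray
    if t < 0:                                          # target is behind the viewer
        return False
    closest_sq = _dot(oc, oc) - t * t                  # squared distance, ray to center
    return closest_sq <= target_radius ** 2
```

Each client would send its own (origin, direction) pair to the other over the network every frame and run this check against the remote player's head position, so both sides can render a "they are looking at me" cue.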

 

Thanks in advance


@Sincrono There isn't much, if any, end-to-end third-party documentation about eye tracking, since the headset is something of a niche enterprise product used by enterprise developers rather than the broader game development community.

The SDK has HTML documentation that's hard to find.

See

/SDK-v1.3.3.0/SDK/03_Unreal/Document/Eye/html/index.html

and SDK-v1.3.3.0/SDK/03_Unreal/Document/Eye/Getting Started with SRanipal in Unreal Eye-v1.3.3.0.pdf


 


@Sincrono - The Vive Pro Eye can be used with two SDKs:

  • SRanipal - a simplified SDK that licenses Tobii's eye-tracking product for basic functionality (gaze, foveated rendering, etc.).
  • Tobii XR SDK - a paid SDK that you can license from Tobii for more advanced analytics. This includes a variety of prebuilt, toolkit-style tools.

In other words, with the basic SRanipal SDK there are some limitations on the richness of the data that Tobii allows you to access.

You can use the Vive Pro Eye to detect people's gaze and use that to drive things like avatar animations, or to build your own tracking analytics (e.g., playback of a user's gaze). But you can't use that SDK to access the actual images of the iris, for instance.

Ovation is an example of an app that is built with SRAnipal. The developer built their own custom tools to derive the analytics from the basic gaze position data. Some of those tools can be purchased prebuilt from Tobii within their SDK.
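As a rough illustration of the kind of custom analytics an app like that derives from basic gaze position data (not Ovation's actual implementation, and the sample format here is made up for the sketch): dwell time per object can be aggregated from timestamped gaze samples, where each sample records which object the gaze ray hit that frame.

```python
def dwell_times(samples):
    """Aggregate total dwell time (seconds) per gazed-at object.

    `samples` is a list of (timestamp_s, object_id) tuples, one per gaze
    frame, where object_id is whatever the gaze ray hit (or None for no
    hit). This format is hypothetical; a real pipeline would fill it from
    the SDK's per-frame gaze data plus a raycast into the scene.
    """
    totals = {}
    # Attribute the interval between consecutive samples to the object
    # the user was looking at when the interval started.
    for (t0, obj), (t1, _next_obj) in zip(samples, samples[1:]):
        if obj is not None:
            totals[obj] = totals.get(obj, 0.0) + (t1 - t0)
    return totals
```

From a log like this you can then rank objects by attention, detect fixations above a dwell threshold, or feed a heatmap, which is essentially the class of tools Tobii also sells prebuilt in its SDK.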

 

