performlabrit

Pro Eye unsuitable for research - data SUPER heavily filtered during fixation.


To see this for yourself, fixate a point while rotating your head back and forth (as if indicating "no" in American culture). This is a standard test of the relative latency between the head tracker and the eye tracker. If latency were low, you would see very little wiggle in the fixation point, because your eye and head rotations cancel one another out. Unfortunately, the Vive Pro Eye demonstrates TONS of wiggle. Worse yet, it does this in a way that seems uncorrelated with the rotation of your head. Normally, this would suggest very high eye tracker latency. However, the tracker seems to perform well during pursuit and saccades, so my guess is that this is evidence of some VERY aggressive dynamic filters that corrupt the data signal during fixation to reduce jitter or slippage. Can someone at Vive comment on this? Can the filters be turned off?
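If you want to put a number on the wiggle rather than just eyeball it, here is a rough sketch of the computation I have in mind (the Sample struct is only a placeholder for whatever head-yaw and eye-in-head-yaw values your own logging produces; it is not an SRanipal type):

#include <algorithm>
#include <vector>

// Placeholder per-frame log recorded while fixating a world point and shaking
// the head "no": head yaw and eye-in-head yaw, both in degrees.
struct Sample { double headYawDeg; double eyeInHeadYawDeg; };

// With low, well-matched latency the two rotations cancel, so the world-space
// gaze azimuth barely moves; a large peak-to-peak value is the "wiggle".
double gazeWigglePeakToPeakDeg(const std::vector<Sample>& samples) {
    double lo = 1e9, hi = -1e9;
    for (const Sample& s : samples) {
        const double worldGazeYawDeg = s.headYawDeg + s.eyeInHeadYawDeg;
        lo = std::min(lo, worldGazeYawDeg);
        hi = std::max(hi, worldGazeYawDeg);
    }
    return hi - lo;
}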

 

Note that a similar discussion was held here, on Reddit:

https://www.reddit.com/r/vive_vr/comments/brxbb5/very_high_htc_vive_eye_latency_well_above_100_ms/

 

The more data transparency the better.  Researchers would love to use your equipment, but cannot do so if you obscure the raw signal with dynamic filters.  We can deal with jitter / slippage.

 


You can use SetEyeParameter() as defined in SRanipal_Eye.h to tune the filter strength.
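In C++ it looks roughly like the sketch below (struct and field names as I recall them from the SRanipal headers; please verify against your SDK version):

#include "SRanipal_Eye.h"

// Raise sensitive_factor toward 1.0 for a more responsive (less filtered)
// gaze ray; lower values apply heavier smoothing. Check the exact field
// names against your SRanipal SDK version.
int SetGazeSensitivity(double sensitiveFactor) {
    ViveSR::anipal::Eye::EyeParameter param{};
    param.gaze_ray_parameter.sensitive_factor = sensitiveFactor;
    return ViveSR::anipal::Eye::SetEyeParameter(param);  // 0 (ViveSR::Error::WORK) on success
}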


Great news. Thanks for the input! I'll let you know how that goes. I won't be able to test this for a few weeks, but I welcome feedback from anyone else.

 

- gD


@Daniel_Y, SetEyeParameter() is not documented and is poorly described in the code.  Can you please provide any additional information?  What about the focus and accuracy parameters?  Thanks.  

Edited by performlabrit


Ok, I've created two movies that show 1) I have successfully changed the sensitivity, and 2) this does not resolve the original problem described by PerformLabRIT in the first post of this thread. I've attached the videos and shared them via YouTube.

That's a 1-meter-wide board, and at a distance of 3 meters half the angular subtense is atan(0.5/3)*(180/pi) ≈ 9.5 degrees (the radius of the target). So, the full angular subtense is about 19 degrees.

Notice that the gaze sphere oscillates almost the full radius of the target - 9.5 degrees - during the head movement. Also notice that the change in sensitivity does not affect this.

This makes me think that something is *seriously* wrong with the Tobii/HTC tracking algorithm.

 

 

 

Attached videos: eyeParam1.flv, eyeParam015.flv

Edited by gazeBehavior


To be clear, I am fixating the center of the dart board in both videos.  Slippage is primarily from the tracker, and does not reflect true gaze behavior.


From your video, the major factor contributing to the oscillation would be the latency between the head movement and the eye tracker. The other factor may be gaze falling beyond the 20° field of view.


I agree that it is likely latency - a lot of it. It's a bit of a guess, but I have enough experience in this area to estimate 100+ ms of latency. This is surprising given that you market this headset for foveated rendering; that is far too slow for foveated rendering.

Are there any other filters to worry about or turn off?

Edited by gazeBehavior


"desti hosting" seems to be a spam account of some sort. To recap:  using SetEyeParameter() to change the sensitivity does not address the issue, which seems to be high eye tracker latency.  As @performlabrit pointed out, this issue was previously raised in the attached reddit forum post.  The videos I posted above demonstrate what the post was referencing. 

Can anyone at HTC suggest ways to lower the latency?  It's clearly too high to do anything related to foveated rendering.  

 

 


Sorry, but that sentence isn't very clear.  Do you mean that you are currently working to fix this issue?


😁 Great, and thank you for responding. I look forward to the updates. My laboratory is about to measure eye-tracker latency relative to head tracking, and we will let you know what we learn.


Any news on this topic? It's been one week, which is a long time in the VR world!

I'd love to buy a Vive Pro Eye today, but this makes me hesitate. In my research as a neuroscientist/neuroengineer, the most exciting applications are real-time ones (foveated rendering, prosthetics, gaze contingency), which depend on low latency. These applications tolerate low accuracy/precision better than they tolerate high latency.

So, does it look like there is a delay in registering eye movements relative to head movements? If the delay is constant, this is workable. If the delay is variable/unpredictable then the situation becomes more difficult.

 


 

Here are the latency measurements taken by one of our students. The images show the azimuthal components of the head and eye during the vestibulo-ocular reflex (while fixating a stable object and rotating one's head about the vertical axis). The first figure shows the original signals (deg/s over time), the second shows the results of the cross-correlation, and the third shows a zoomed-in look at the signals before/after adjustment for the measured latency.

The two signals were maximally correlated at a shift of 83 ms, meaning that there was 83 ms of latency in the eye tracker relative to the head tracker. Vive head tracking is typically 22-33 ms of absolute latency, so that puts the eye tracker above 100 ms of absolute latency.
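For anyone who wants to reproduce the number, the shift estimate is just the argmax of a cross-correlation between the two velocity traces. A bare-bones sketch (assuming both signals are resampled to a common rate and oriented so they are positively correlated at zero lag, e.g. by negating the eye-in-head velocity, which counter-rotates during the VOR) looks like this:

#include <cstddef>
#include <vector>

// Returns the lag (in samples) at which `eye` best matches `head`.
// A positive result means the eye trace lags the head trace;
// latency_ms = bestLag * 1000.0 / sample_rate_hz.
int bestLagSamples(const std::vector<double>& head,
                   const std::vector<double>& eye,
                   int maxLag) {
    int bestLag = 0;
    double bestCorr = -1e300;
    for (int lag = -maxLag; lag <= maxLag; ++lag) {
        double corr = 0.0;
        std::size_t n = 0;
        for (std::size_t i = 0; i < head.size(); ++i) {
            const std::ptrdiff_t j = static_cast<std::ptrdiff_t>(i) + lag;
            if (j < 0 || j >= static_cast<std::ptrdiff_t>(eye.size())) continue;
            corr += head[i] * eye[static_cast<std::size_t>(j)];
            ++n;
        }
        if (n == 0) continue;
        corr /= static_cast<double>(n);  // normalize by overlap length
        if (corr > bestCorr) { bestCorr = corr; bestLag = lag; }
    }
    return bestLag;
}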
 
We have only measured this once. Really, we should measure a few more times, with different people. We could also learn more by testing for possible effects of rotational velocity and GPU load on the amount of latency.
 
An effect of the former (head velocity) would indicate that this latency results from a dynamic filter. This seems unlikely, because if it were a filter, HTC would just have us turn it off.
 
It's more likely attributable to the core functions of the eye tracking pipeline, such as pupil segmentation in the 2D imagery or gaze mapping (which converts 2D pupil centroids into 3D gaze vectors). If so, I would expect the latency to increase with GPU load.
 
Something to test, anyhow. Or, HTC could jump in and say a bit more...

Attached images: image.png, image (1).png, image (2).png


Thanks gazeBehavior, this information is essential!

It sounds like the gaze delay is comparable to other current consumer VR eye tracking solutions around the same price point. It also sounds like the delay is due to software (gaze position calculation) rather than hardware.

To say that the Pro Eye eye tracker is unsuitable for research is too strong, as the eye data can be recorded and corrected after-the-fact. Additionally, task/gameplay demands could enforce longer fixations for real-time user control applications.

With a latency of 83ms relative to head tracking, the Pro Eye eye tracker is on the cusp of tolerable (50-70ms) although not optimal (20-40ms) latency for foveated rendering (https://dl.acm.org/citation.cfm?id=3127589).

This latency seems better than expected (although not as good as hoped). Gaze position calculation could be optimized for speed; however, optimization may require alterations to Tobii's proprietary (and encrypted) code, and therefore might be outside of HTC's and developers' control.

Edited by VrHacker


The SMI eye tracker was far below this latency before Apple took it off the market, but what can we do about that?  It also cost a LOT more when it was available. 

Quote

To say that the Pro Eye eye tracker is unsuitable for research is too strong.

  I agree.  I wrote that when I was led to believe (by an HTC dev's comment) that this latency was actually the result of a dynamic filter.  It wasn't until I turned the filter off that I settled on the interpretation that the latency was due to core algorithms, and not a dynamic filter.

Quote

...eye data can be recorded and corrected after-the-fact

I also agree that, if the latency is stable within a session and across sessions, we might be able to adjust for it in post-hoc, offline data analysis. However, we do not yet know how stable the latency is within a single session, or across multiple sessions. One could also measure latency using the VOR trick - it is fairly quick to record.

Quote

With a latency of 83ms relative to head tracking, the Pro Eye eye tracker is on the cusp of tolerable (50-70ms) although not optimal (20-40ms) latency for foveated rendering (https://dl.acm.org/citation.cfm?id=3127589).

I'll add that it's possible that the ~83 ms I saw on my machine could be brought down on a newer machine (mine was great when I bought it, but is now 3 years old). I'm abroad, but can post the specs when I get back to the USA.

@Daniel_Y @Cory_HTC @Jad

