
Strabismus calibration and individual eye tracking

Hi
 
Before I purchase the Vive Pro Eye (which is pretty expensive for me), I was hoping someone could answer a few questions.
 
I'm interested in developing an application that provides real-time feedback of individual eye gaze location for people who have strabismus (primarily esotropia). I currently use an Oculus Rift CV1 and Unity, and my current projects use a two-camera setup to display different target objects exclusively to one eye or the other (with all other objects visible to both eyes). I'd like to be able to see what the individual eyes are doing in the environment and provide real-time visual feedback to the user. This obviously requires eye tracking such as the Vive Pro Eye.
 
For people who have esotropia, the "good eye" focuses on the target while the other eye looks x degrees to the left of the target (if the right eye is affected) or to the right of the target (if the left eye is affected).
[attached image]
 
 
How does the calibration work for someone whose eyes converge at a point that is not on the target? I've only seen a screenshot of the calibration, but it looks like you have to converge your eyes on a dot to make it light up. Someone with esotropia would only be able to do this by looking to the left or right of the target (and slightly beyond it). This would produce invalid data. I need to be able to show the user what the left eye is looking at while the right eye is looking at the target. In my application, someone with esotropia would see two gaze objects side by side, representing where each eye is looking, instead of one merged object representing the convergence point of both eyes (which is what someone with normal vision would see).
 
[attached image]
 
The left-eye data can't simply be an offset of the right-eye data, because I need to show independent eye movement. Someone with esotropia can move the yellow circle in the example above closer to or further from the blue circle with their eye muscles, much the same way a normally sighted person can cross their eyes and create double vision.
 
Can the calibration be done with only one eye? If the answer is "no", can I modify the calibration data? If the answer is still "no", can I skip the calibration? If the answer is "yes", is the data produced by each eye then assumed to be relative to the sensors, with the assumption that the HMD and sensors are centered perfectly over the eyes? (Obviously never the case.)
 
Thanks for your help!
 
 
 
 
 
 
Edited by Davey: image didn't post
  • 1 month later...

Hi Davey, I see what you are trying to achieve. To answer your first question: in the API there is a method, "ViveSR::anipal::Eye::TrackingImprovements", to modify calibration data; however, I am not sure how helpful it would be. Using the raw gaze data for each eye, one can write a custom calibrator; there is no built-in method to calibrate each eye individually.
The default calibration process is a six-point calibration on a 2D plane. In my opinion, even if the gaze is misaligned, it could still lead to a somewhat proper calibration for some of this population. The user would still see a double image, as they do naturally.

[attached image]

 

  


@Davey @nbhatia

Calibration is a 3-step process that requires both eyes.
Step 1 is HMD placement, 
Step 2 is IPD adjustment, 
Step 3 is the gaze dots.


Can the calibration be done with only one eye?
    No, the calibration will not complete without both eyes.

can I modify the calibration data?
    No, currently not possible, but a number of devs have requested this feature, so we can re-examine it.

can I skip the calibration?
    For accurate eye-tracking, the calibration is performed for each user when they use the HMD. Every face and HMD placement is different, so calibration optimizes for this variance. If calibration is skipped, the eye-tracking usually functions but is less accurate.

is the data produced by each eye then assumed to be relative to the sensors, with the assumption that the HMD and sensors are centered perfectly over the eyes? (obviously never the case)
    The calibration and runtime report the eye position relative to the HMD (usually not perfectly centered but ideally close).

Feel free to check the documentation included with the SDK or the webinars/guides for information about the calibration specifics.


@Davey I will try to look into the research to see if there have been tests of eye tracking with strabismus/esotropia conditions; it may also be helpful for you to contact Tobii about this inquiry. The SDK does report gaze direction for each eye independently.

  • 2 weeks later...

@Corvus Thank you for the reply! Since my original post I have purchased an old used Tobii EyeX tracker. Using an old, deprecated SDK, I'm able to run my own calibration (Tobii no longer exposes these functions in the newer SDKs unless you buy the Pro SDK). What I have been able to do with this SDK is run the calibration twice: I cover my left eye and look at my calibration points, then, before committing the temporary buffer of points, I cover my right eye and repeat the process. I then commit all the points. This works flawlessly, because when someone with strabismus uses one eye at a time they are able to look at the target directly. I have a hunch this could be implemented with the Vive Pro Eye. Could you please confirm whether this is possible? I don't need to be able to run my own custom calibration routine; I just need the option to run the calibration twice (once per eye) and then commit all the point data.

Here are the two calls I use from the old Tobii SDK. I'm really hoping something like this can be incorporated into the Vive Pro Eye calibration routine.

/**
 * Adds data to the temporary calibration buffer for the specified calibration point which the user is assumed to be looking at.
 * @param eye_tracker   An eye tracker instance.
 * @param point         A two dimensional point specified in the ADCS coordinate system (screen size percentage) where the user's gaze is expected to be looking.
 * @param callback      A callback function that will be called on command completion.
 * @param user_data     Optional user supplied data that will be passed unmodified to the callback function. Can be NULL.
 */
TOBIIGAZE_API void TOBIIGAZE_CALL tobiigaze_calibration_add_point_async(tobiigaze_eye_tracker *eye_tracker, const struct tobiigaze_point_2d *point, tobiigaze_async_callback callback, void *user_data);

/**
 * Computes a calibration based on data in the temporary calibration buffer. If this operation succeeds the temporary calibration buffer will be copied to the active calibration buffer.
 * If there is insufficient data to compute a calibration, TOBIIGAZE_FW_ERROR_OPERATION_FAILED will be returned via the callback.
 * @param eye_tracker   An eye tracker instance.
 * @param callback      A callback function that will be called on command completion.
 * @param user_data     Optional user supplied data that will be passed unmodified to the callback function. Can be NULL.
 */
TOBIIGAZE_API void TOBIIGAZE_CALL tobiigaze_calibration_compute_and_set_async(tobiigaze_eye_tracker *eye_tracker, tobiigaze_async_callback callback, void *user_data);

 

Thanks

 

 

 

