Hi,

Before I purchase the Vive Pro Eye (which is pretty expensive for me), I was hoping someone could answer a few questions. I'm interested in developing an application that provides real-time feedback on individual eye gaze location for people who have strabismus (primarily esotropia). I currently use an Oculus Rift CV1 and Unity, and my current projects use a two-camera setup to display different target objects exclusively to one eye or the other (with all other objects visible to both eyes). I'd like to see what the individual eyes are doing in the environment and provide real-time visual feedback to the user. This obviously requires eye tracking such as the Vive Pro Eye provides.

For people who have esotropia, the "good" eye focuses on the target while the other eye looks x degrees to the left of the target (if the right eye is affected) or to the right of it (if the left eye is affected). How does the calibration work for someone whose eyes converge at a point that is not on the target? I've only seen a screenshot of the calibration, but it looks like you have to converge your eyes on a dot to make it light up. Someone with esotropia could only do this by looking to the left or right of the target (and slightly beyond it), which would produce invalid data.

I need to show the user what the left eye is looking at while the right eye is looking at the target. In my application, someone with esotropia would see two gaze objects side by side, each representing where one eye is looking, instead of the single merged object representing the convergence point of both eyes that someone with normal vision would see. The left-eye data can't simply be an offset of the right-eye data, because I need to show independent eye movement.
Someone with esotropia can move that yellow circle in the example above closer to or further away from the blue circle with their eye muscles, much the same way a normally sighted person can cross their eyes and create double vision. My questions:

  • Can the calibration be done with only one eye?
  • If the answer is "no", can I modify the calibration data?
  • If the answer is still "no", can I skip the calibration?
  • If I can skip it, is the data produced by each eye then assumed to be relative to the sensors, with the assumption that the HMD and sensors are centered perfectly over the eyes (obviously never the case)?

Thanks for your help!
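The two side-by-side gaze markers described above can be sketched as two independent ray-plane intersections, one per eye. This is a minimal illustration only, not the Vive/SRanipal API: the eye origins, the 5-degree inward deviation, and the feedback-plane distance are all assumed example values, and in the real application each eye's gaze origin and direction would come from the eye tracker.

```python
import math

def gaze_hit_on_plane(origin, direction, plane_z):
    """Intersect a gaze ray with the plane z = plane_z.

    origin and direction are (x, y, z) tuples in metres (direction need
    not be normalized for a ray-plane test). Returns the hit point, or
    None if the ray is parallel to the plane or the plane is behind it.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dz) < 1e-9:          # ray parallel to the plane
        return None
    t = (plane_z - oz) / dz
    if t < 0:                   # plane is behind the eye
        return None
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# Hypothetical per-eye samples: eyes 64 mm apart, feedback plane 2 m ahead.
PLANE_Z = 2.0
right_eye = (0.032, 0.0, 0.0)
left_eye = (-0.032, 0.0, 0.0)

# Right ("good") eye fixates the target at (0, 0, 2).
right_dir = (-0.016, 0.0, 1.0)

# Left eye deviated 5 degrees inward (esotropia): it misses the target.
dev = math.radians(5.0)
left_dir = (math.sin(dev), 0.0, math.cos(dev))

right_marker = gaze_hit_on_plane(right_eye, right_dir, PLANE_Z)
left_marker = gaze_hit_on_plane(left_eye, left_dir, PLANE_Z)
# right_marker lands on the target; left_marker lands roughly 14 cm to
# its right, so the user would see two gaze objects side by side rather
# than one merged convergence point.
```

Because each marker is computed from its own eye's ray, independent eye movement shows up directly, which is exactly why a fixed left-from-right offset would not work.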