iuilab

A couple of questions about Focus() and Accuracy


1. What are the accuracy and the unit of the eye position obtained from eyeData.verbose_data.left.gaze_origin_mm? I printed the value and found five or six decimal places. Do those digits really reflect the accuracy of the eye position, or are they just noise?

 

2. What does SetEyeParameter() actually do? What does the sensitive factor of the gaze ray in [0,1] mean, and how does it affect the user?

 

3. The Focus sample in the SDK presents three dartboards that show the eye focus point in different areas. I am curious whether the trembling that occurs on the border between areas is caused by a real physiological eye reflex such as nystagmus, or simply because the device is not accurate enough?

 

4. I am using EyeFocusSample to get the eye focus point in the Sci-Fi style environment from the Unity Asset Store. However, after adding Focus(), performance drops and the frame latency becomes very noticeable. Is this caused by the large number of objects, or by the Raycast() used inside Focus()? If so, what are the limits or guidelines for using Focus()?

 

5. I need to get eye focus points in different realistic VR environments, but because of Q4 the latency has a serious negative effect on my experiment. Is it possible to use the two gaze directions from both eyes to calculate the eye focus point? Or would the result be very inaccurate and very different from the FocusInfo.point returned by Focus()?

 

Thanks for helping me to solve my problems.

(  - tag added by moderator) 


1. What are the accuracy and the unit of the eye position obtained from eyeData.verbose_data.left.gaze_origin_mm? I printed the value and found five or six decimal places. Do those digits really reflect the accuracy of the eye position, or are they just noise?

Units for eye position are reported in millimeters (mm). I will follow up with more info about accuracy.
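As an illustrative sketch (assuming the Unity wrapper exposes gaze_origin_mm as a Vector3, as in the SRanipal samples), converting the origin to Unity world units (meters) is a simple scale:

```csharp
using UnityEngine;

public static class GazeUnits
{
    // gaze_origin_mm is reported in millimeters; Unity world units are meters,
    // so scale by 0.001 before combining it with scene geometry.
    public static Vector3 MillimetersToMeters(Vector3 originMm)
    {
        return originMm * 0.001f;
    }
}
```

Note that, as far as I understand, the origin is also expressed in the eye tracker's own coordinate frame, so it still needs to be transformed into world space (e.g. via the HMD camera transform) before it can be compared against scene objects.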


2. What does SetEyeParameter() actually do? What does the sensitive factor of the gaze ray in [0,1] mean, and how does it affect the user?

I will follow up with more info about SetEyeParameter and how it works with GazeRayParameter to set the "sensitive_factor".

 

3. The Focus sample in the SDK presents three dartboards that show the eye focus point in different areas. I am curious whether the trembling that occurs on the border between areas is caused by a real physiological eye reflex such as nystagmus, or simply because the device is not accurate enough?

The 3 dartboards are placed at different distances to highlight the ease/difficulty of focusing on small points from different distances.

 

4. I am using EyeFocusSample to get the eye focus point in the Sci-Fi style environment from the Unity Asset Store. However, after adding Focus(), performance drops and the frame latency becomes very noticeable. Is this caused by the large number of objects, or by the Raycast() used inside Focus()? If so, what are the limits or guidelines for using Focus()?

Can you share a code snippet showing how you're using Focus()? And how many objects with colliders are in the scene? I haven't heard of any developers running into major performance problems from using Focus.

 

5. I need to get eye focus points in different realistic VR environments, but because of Q4 the latency has a serious negative effect on my experiment. Is it possible to use the two gaze directions from both eyes to calculate the eye focus point? Or would the result be very inaccurate and very different from the FocusInfo.point returned by Focus()?

You can use the gaze ray or a custom solution to calculate the focus point, but the Focus function should provide good results.


3. I know the three dartboards are presented at different distances, but I am wondering whether the trembling or shaking of the focus point that occurs when I try to focus on one area (the highlighted focus area seems to jump quickly between that area and the adjacent area on the same dartboard) is caused by the sensitivity and accuracy of the Vive Pro Eye device, or whether it reflects real eye movement. Does the eye really have nystagmus?

 

4. The environment I use is shown in the figure. (I'm sorry, I don't know how to count how many objects with colliders are in the scene; may I ask how to do so?)

 

and the frame rate indicator in SteamVR often turns red (latency occurs).

 

The main part of my code in the FocusCube script:

 

private void Update()
{
    if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING &&
        SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.NOT_SUPPORT) return;

    EyeData eyeData = new EyeData();
    SRanipal_Eye.GetEyeData(ref eyeData);

    foreach (GazeIndex index in GazePriority)
    {
        Ray GazeRay;
        if (SRanipal_Eye.Focus(index, out GazeRay, out FocusInfo, MaxDistance))
        {
            EyeText.text = "EyePos: " + FocusInfo.point.ToString("F3") + "\n dis: " + FocusInfo.distance
                + "\nL gaze origin: " + eyeData.verbose_data.left.gaze_origin_mm
                + "\nR gaze origin: " + eyeData.verbose_data.right.gaze_origin_mm;
            // use a struct to save the data, then write the file in OnApplicationQuit()
            break;
        }
    }
}

When I run my code, I do not actually display the text on screen.

 

5. Does that mean I can use just these two vectors, eyeData.verbose_data.left.gaze_direction_normalized and eyeData.verbose_data.right.gaze_direction_normalized, to calculate their intersection point as the focus point? Or must I use the gaze ray to achieve this?
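For reference, two gaze rays almost never intersect exactly, so a common approach is to take the midpoint of the shortest segment between them. A minimal sketch (not SDK code; it assumes both rays have already been converted from millimeters to meters and transformed into Unity world space):

```csharp
using UnityEngine;

public static class GazeIntersection
{
    // Estimates a focus point as the midpoint of the shortest segment
    // between the left and right gaze rays (closest points on two skew lines).
    public static Vector3 EstimateFocusPoint(Vector3 leftOrigin, Vector3 leftDir,
                                             Vector3 rightOrigin, Vector3 rightDir)
    {
        // Minimize |(leftOrigin + s*leftDir) - (rightOrigin + t*rightDir)|.
        Vector3 w = leftOrigin - rightOrigin;
        float a = Vector3.Dot(leftDir, leftDir);
        float b = Vector3.Dot(leftDir, rightDir);
        float c = Vector3.Dot(rightDir, rightDir);
        float d = Vector3.Dot(leftDir, w);
        float e = Vector3.Dot(rightDir, w);

        float denom = a * c - b * b; // approaches 0 when the rays are parallel
        if (Mathf.Abs(denom) < 1e-6f)
            return leftOrigin + leftDir; // fallback: 1 m along the left ray

        float s = (b * e - c * d) / denom;
        float t = (a * e - b * d) / denom;
        Vector3 p1 = leftOrigin + s * leftDir;
        Vector3 p2 = rightOrigin + t * rightDir;
        return (p1 + p2) * 0.5f;
    }
}
```

This avoids any physics raycast, but it only gives a point in space, not the object being looked at, and vergence noise makes the depth estimate unreliable at larger distances.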

 

Thank you very much again.

 


This situation might be caused by putting too many colliders into the scene.
There is a way to resolve it.
Actually, you can find that the method Focus() wraps the Unity API Physics.Raycast().

According to the Unity user manual, you can ignore colliders with a layer mask when casting a ray.

Hence, you can manually create a layer, cast the ray only against that layer, and ignore all other colliders.
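As a sketch of that layer-mask approach (assuming you have created a layer named "Focusable" yourself and assigned it to the objects you care about; the class and field names here are illustrative):

```csharp
using UnityEngine;

public class GazeRaycaster : MonoBehaviour
{
    public float maxDistance = 100f;

    void CastGaze(Ray gazeRay)
    {
        // Only test colliders on the "Focusable" layer (created in
        // Edit > Project Settings > Tags and Layers); all other colliders
        // in the scene are skipped by the physics query.
        int mask = LayerMask.GetMask("Focusable");
        RaycastHit hit;
        if (Physics.Raycast(gazeRay, out hit, maxDistance, mask))
        {
            Debug.Log("Gaze hit: " + hit.collider.name);
        }
    }
}
```

This keeps the per-frame raycast cost proportional to the number of focusable objects rather than every collider in the scene.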

https://docs.unity3d.com/ScriptReference/Physics.Raycast.html

https://docs.unity3d.com/Manual/Layers.html

Thank you.


To follow up on the question regarding SetEyeParameter():

A filter is applied to the gaze vector to prevent gaze tremble, e.g. when looking at a far distance in some use cases. The sensitive factor is contrast to the filter's strength.

We would suggest using ViveSR::anipal::Eye::VerboseData.combined for the gaze raycast.


This follow-up does not address many of the previous poster's questions for which a follow-up was promised. I'm also confused. What does "The sensitive factor is contrast to the filter's strength." mean?

@Cory_HTC ?


Refer to the VIVE Pro Eye specs for accuracy at https://www.vive.com/eu/product/vive-pro-eye/.

The gaze_ray_parameter set by SetEyeParameter() defines the sensitive factor of the gaze ray in [0,1],

where 1 means the raw gaze vector without any filter post-processing for gaze tremble, and 0 means it is post-processed with the strongest filter to remove gaze tremble.


I'm also interested in this response. @Daniel_Y, can you please provide an example line of code here? Your answer lacks clarity, the SDK documentation does not touch upon using SetEyeParameter(), and I cannot find any examples in Unity.

Thanks.


Ok, I've made some progress in changing the parameter, but I see no change in the behavior of the eye tracker.

 

Inside SRanipal_Eye_Framework.cs, at the end of StartFramework(), I've added the following code, lifted from SRanipal_EyeSettingSample.cs, with the intention of setting parameter.gaze_ray_parameter.sensitive_factor to 1:


EyeParameter parameter = new EyeParameter
{
gaze_ray_parameter = new GazeRayParameter(),
};

Error error = SRanipal_Eye.GetEyeParameter(ref parameter);
Debug.Log("GetEyeParameter: " + error + "\n" +
"sensitive_factor: " + parameter.gaze_ray_parameter.sensitive_factor);

// This is the only change I've made to the code lifted from SRanipal_EyeSettingSample.cs
//parameter.gaze_ray_parameter.sensitive_factor = parameter.gaze_ray_parameter.sensitive_factor == 1 ? 0.015f : 1;
parameter.gaze_ray_parameter.sensitive_factor = 1;

error = SRanipal_Eye.SetEyeParameter(parameter);
Debug.Log("SetEyeParameter: " + error + "\n" +
"sensitive_factor: " + parameter.gaze_ray_parameter.sensitive_factor);

 

I've attached the file here, in case that helps.

Any feedback? @Daniel_Y @Cory_HTC?

Thanks in advance.

SRanipal_Eye_Framework.cs

