Hi,
I have some questions about eye tracking. If you have time, could you help me find answers or confirm what I have done?
1-About the « SRanipal_Eye_v2.Focus » method: I saw that it uses a sphere cast, but since I want to use it to find objects inside an attention area, I need a sphere whose radius varies with the distance between my combined eye and the gaze target, not a fixed size. Is there a way to cast a ray with this mechanism?
Currently I just use a raycast and draw a sphere whose radius depends on the distance between my combined eye and the raycast hit point, using a fixed angle to compute the radius of the attention area.
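To make the geometry concrete, here is a sketch of what I mean (a Unity C# example; `attentionAngleDeg` is my own example field, and the radius follows r = d · tan(θ)):

```csharp
using UnityEngine;

public class AttentionArea : MonoBehaviour
{
    // Half-angle of the attention cone, in degrees (example value).
    public float attentionAngleDeg = 5f;

    // Radius of the attention disc at the raycast hit point,
    // growing linearly with gaze distance: r = d * tan(theta).
    public float AttentionRadius(Vector3 eyeOrigin, Vector3 gazeDirection)
    {
        if (Physics.Raycast(eyeOrigin, gazeDirection, out RaycastHit hit))
        {
            float distance = Vector3.Distance(eyeOrigin, hit.point);
            return distance * Mathf.Tan(attentionAngleDeg * Mathf.Deg2Rad);
        }
        return 0f;
    }
}
```

With this radius I can then call Physics.OverlapSphere(hit.point, radius) to collect the objects inside the attention area, instead of relying on the fixed-size sphere cast of Focus.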
2-About IPD: I found how to get this value by taking the difference between the positions of my two eyes. I also thought IPD was linked to the virtual cameras in Unity, but apparently it isn't: when I adjust the physical IPD of the two screens in the headset, the virtual cameras don't move in Unity. Is that normal, or is it also possible to modify the distance between the virtual cameras?
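For reference, this is a sketch of how I measure the IPD (assuming the per-eye gaze origins returned by GetGazeRay are expressed in metres in the eye/lens frame):

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

public static class IpdUtil
{
    // Measures IPD as the distance between the two per-eye gaze origins.
    // Returns true only if both eyes currently report valid data.
    public static bool TryGetIpd(out float ipdMetres)
    {
        ipdMetres = 0f;
        if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out Vector3 leftOrigin, out Vector3 leftDir) &&
            SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out Vector3 rightOrigin, out Vector3 rightDir))
        {
            ipdMetres = Vector3.Distance(leftOrigin, rightOrigin);
            return true;
        }
        return false;
    }
}
```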
3-About the callback with eye data enabled: the callback function can be called up to 120 times per second (the tracker's rate), whereas Update is only called once per rendered frame. Is that correct?
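For context, this is how I register the callback, following the pattern from the SRanipal SDK sample as far as I understand it:

```csharp
using System.Runtime.InteropServices;
using UnityEngine;
using ViveSR.anipal.Eye;

public class EyeDataCallback : MonoBehaviour
{
    private static EyeData_v2 eyeData = new EyeData_v2();
    private static bool registered = false;

    private void Update()
    {
        // Register once; afterwards EyeCallback runs on the SRanipal
        // thread at the tracker's rate (~120 Hz), independently of the
        // render frame rate.
        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && !registered)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
                Marshal.GetFunctionPointerForDelegate(
                    (SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            registered = true;
        }
    }

    private static void EyeCallback(ref EyeData_v2 data)
    {
        eyeData = data; // runs off the main thread: no Unity API calls here
    }
}
```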
4-In my various tests the « convergence_distance_mm » attribute still doesn't work, because combined.convergence_distance_validity is always false. Is that normal?
5-A few months ago I saw that the "timestamp" attribute didn't work correctly and that it was better to use frameCount. Does "timestamp" work correctly now?
6-When I display a line renderer to show the trajectory of my combined gaze, the result is often wrong because the start position changes very frequently. Do you get the same issue?
I have this issue with « localPosition » in SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out localPosition, ….)
So for now I just take the midpoint of the two pupil positions, which gives a very stable result. Is that a good alternative?
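Concretely, the stable origin I use is the midpoint of the two per-eye origins, something like this (a sketch; here I take the midpoint of the two gaze-ray origins rather than the COMBINE origin):

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

public static class GazeOriginUtil
{
    // Uses the midpoint of the two per-eye gaze origins as a stable
    // start point for the combined gaze ray.
    public static bool TryGetStableOrigin(out Vector3 origin)
    {
        origin = Vector3.zero;
        if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out Vector3 leftOrigin, out Vector3 leftDir) &&
            SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out Vector3 rightOrigin, out Vector3 rightDir))
        {
            origin = (leftOrigin + rightOrigin) * 0.5f;
            return true;
        }
        return false;
    }
}
```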
7-Is it possible to get an image of our eyes with this eye tracker, or is it necessary to request a licence or something else to use it for research?
8-Do you know how to use parameter.gaze_ray_parameter.sensitive_factor? It seems to be an interesting parameter to set before calibration, but I am not sure I use it correctly.
Currently I run this each time before doing a calibration:
EyeParameter parameter = new EyeParameter
{
    gaze_ray_parameter = new GazeRayParameter(),
};
Error error = SRanipal_Eye_API.GetEyeParameter(ref parameter);
Debug.Log("GetEyeParameter: " + error + "\n" +
          "sensitive_factor: " + parameter.gaze_ray_parameter.sensitive_factor);
// Toggle between the default (1) and a smoother value (0.015).
parameter.gaze_ray_parameter.sensitive_factor = parameter.gaze_ray_parameter.sensitive_factor == 1 ? 0.015f : 1;
// Without this call the new value is never passed back to the runtime:
error = SRanipal_Eye_API.SetEyeParameter(parameter);
Debug.Log("SetEyeParameter: " + error);
9-The eye tracker uses a right-handed coordinate system but Unity uses a left-handed one, so is it correct to do:
leftCoordLocalPosition = new Vector3(-1 * rightCoordLocalPosition.x, rightCoordLocalPosition.y, rightCoordLocalPosition.z);
Without this conversion, my left eye appears on the right and my right eye on the left in world space in the Unity 3D scene.
10-With eye tracking we can get the pupil position but not the eye centre position, so is it valid to do:
centerEye = pupilPosition - eyeDirection * 0.0125f, since the average diameter of an eye is about 0.025 m (2.5 cm), so the radius is 0.025 / 2 = 0.0125 m?
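As a sketch of that calculation (assuming positions are in metres and the gaze direction may not already be normalised):

```csharp
using UnityEngine;

public static class EyeCenterUtil
{
    // Average human eyeball radius in metres (~25 mm diameter / 2).
    private const float EyeRadius = 0.0125f;

    // Estimates the eyeball centre by stepping back from the pupil
    // along the normalised gaze direction.
    public static Vector3 EstimateCenter(Vector3 pupilPosition, Vector3 gazeDirection)
    {
        return pupilPosition - gazeDirection.normalized * EyeRadius;
    }
}
```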