Vive Pro Eye: Finding a single eye origin in world space



The sample itself does not match my use case. It uses Focus() to check whether the gaze hits a collider. I need the origin and direction of the gaze even if the user is not looking at an object. I log that data and show it to the user afterwards to give them feedback on where they were looking.

And I traced the method calls and saw that Focus() calls GetGazeRay(index, out ray), which in turn calls GetGazeRay(index, out origin, out direction), and that last one is the same overload I am using.
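
Roughly, what I am doing is something like this (just a sketch; the class name is illustrative, and GazeIndex.COMBINE gives the combined gaze ray in HMD-local space):

using UnityEngine;
using ViveSR.anipal.Eye;

// Sketch: log the local-space gaze origin and direction every frame,
// independent of whether the gaze hits a collider.
public class GazeLogger : MonoBehaviour
{
    void Update()
    {
        if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
        {
            Debug.Log($"{Time.time:F3}s gaze origin {origin} direction {direction}");
        }
    }
}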

But I did learn in the meantime that I can use the LineRenderer with local space. If I do that and set the points to:
0: gazeOriginLocal
1: gazeOriginLocal + gazeDirectionLocal * lengthOfRay
I get a quite accurate gaze pointer.
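
The local-space setup, roughly (a sketch; it assumes the object carrying the LineRenderer is a child of the HMD camera, and the names are illustrative):

using UnityEngine;
using ViveSR.anipal.Eye;

// Sketch: draw the gaze ray with a LineRenderer in local space.
// The object carrying this component should be a child of the HMD camera.
public class LocalGazePointer : MonoBehaviour
{
    public float lengthOfRay = 10f;
    private LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.useWorldSpace = false;   // positions are interpreted in local space
        line.positionCount = 2;
    }

    void Update()
    {
        if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out Vector3 gazeOriginLocal, out Vector3 gazeDirectionLocal))
        {
            line.SetPosition(0, gazeOriginLocal);
            line.SetPosition(1, gazeOriginLocal + gazeDirectionLocal * lengthOfRay);
        }
    }
}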

It would still be great to get world coordinates as well, since that requires converting gazeDirectionLocal to a world vector using the head rotation, so if someone has a good method for that I would be very interested. (Vector3.Cross is not the right one, I think, as that gives a vector perpendicular to two other vectors, and we need something more like applying the rotation to the direction.)

And to get back to my original question, it looks like the answer is yes: the system origin in the diagram is camera.transform.position in Unity.


  • 5 months later...

@Daniel_Y

In my case I also used camera.transform.position for the "system origin". It would be better if this were stated explicitly in the documentation, assuming it is correct.

It would also help to mention that the eye origins are local positions relative to the "system origin", because before reading the previous posts I thought they were relative to the left and right lens origins.

Also, since the SDK uses a right-handed coordinate system whereas Unity uses a left-handed one, you need to do new Vector3(-1 * gazeEyeOrigin.x, gazeEyeOrigin.y, gazeEyeOrigin.z) to get the position of an eye; without that, the left and right eye positions are swapped.
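
For example (the helper name is just for illustration; gazeEyeOrigin is the local eye origin returned by the SDK):

using UnityEngine;

public static class GazeCoordinates
{
    // Convert a right-handed SRanipal position to Unity's left-handed space
    // by negating x; without this, left and right eye positions are swapped.
    public static Vector3 ToUnitySpace(Vector3 gazeEyeOrigin)
    {
        return new Vector3(-gazeEyeOrigin.x, gazeEyeOrigin.y, gazeEyeOrigin.z);
    }
}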

 

@jboss About "It still would be great to get world coordinates...": do you mean something like this?

Vector3 GazeDirectionEyeLeft = Camera.main.transform.TransformDirection(GazeDirectionEyeLeftLocal);
visualLineLeftEye.SetPosition(0, leftEyeSystemOrigin.transform.position);
visualLineLeftEye.SetPosition(1, leftEyeSystemOrigin.transform.position + GazeDirectionEyeLeft * lengthEyeLeftLine);


@monetl To be honest, I don't really remember what I was looking for at that time 😃 I think I solved it by using an avatar head at the camera position and using the line renderer as a component of that, with local coordinates. It may not be 100% accurate, but it looks good enough for now.

My issue right now is getting the 120 Hz output of the eye tracker, to see if that makes the eye tracking more accurate.

 


@Daniel_Y @jboss I don't know if you have already seen this article, but you should find answers there: https://www.frontiersin.org/articles/10.3389/fpsyt.2020.572938/full

The app they created does 120 Hz eye tracking using the eye data callback (EyeCallback), while loops, and coroutines: "We particularly visualized the sampling interval of the first 3600 samples that we recorded; we assumed 30 s of recording time at the sampling frequency of 120 Hz".

And here is the GitHub repository of the project: https://github.com/MotorControlLearning/SaccadeVR-mobile and here is the script that uses 120 Hz:

https://github.com/MotorControlLearning/SaccadeVR-mobile/blob/master/SaccadeVR_ViveProEye/Assets/Scripts/Saccade_measure_rev1.cs

"Measure eye movements at the frequency of 120Hz until framecount reaches the maxframe count set"

 


  • 2 weeks later...

Did anybody find a suitable transformation of the gaze ray origin and direction provided by SRanipal_Eye_v2.GetGazeRay() or one of the other SRanipal mechanisms? I really can't find a sane mapping from the gaze origin to any Unity 2020 XRNode or to the Main Camera in the SRanipal example scenes.

Without knowing how the "System Origin" posted by @Daniel_Y relates to a SteamVR Camera Rig, Unity XR nodes, the ViveCameraRig, the Main Camera, or some other known entity in a Unity scene, it is a tedious guessing game to use the gaze origin reliably in the scene. I am also wondering why in SRanipal_Eye_v2.cs lines 293 and 304 the x component of the direction gets inverted, but the x component of the origin does not - is this a bug?

This is my Unity 2020 script to match the SRanipal eye gaze to the Unity Head XR node. Maybe somebody has done that successfully already? I want to map the gaze origin and direction for both eyes to world space in the scene. I am using the SRanipal example scenes for testing.

using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class XRHeadTest : MonoBehaviour
{
    private LineRenderer m_line_renderer_left;
    private LineRenderer m_line_renderer_right;

    private GameObject left, right;
    private GameObject node;
    public XRNode tracked_node = XRNode.Head;

    void Start()
    {
        // Helper GameObjects: two carry the line renderers, one mirrors the tracked XR node pose.
        left = new GameObject();
        right = new GameObject();
        node = new GameObject();
        node.name = "TrackedXRNode";

        m_line_renderer_left = left.AddComponent<LineRenderer>();
        m_line_renderer_left.name = "LeftLine";
        m_line_renderer_left.startWidth = 0.001f;
        m_line_renderer_left.endWidth = 0.001f;

        m_line_renderer_right = right.AddComponent<LineRenderer>();
        m_line_renderer_right.name = "RightLine";
        m_line_renderer_right.startWidth = 0.001f;
        m_line_renderer_right.endWidth = 0.001f;
    }

    // Update is called once per frame
    void Update()
    {
        List<XRNodeState> nodes = new List<XRNodeState>();
        InputTracking.GetNodeStates(nodes);

        var xrnode = nodes.Find((XRNodeState state) => state.nodeType == tracked_node);
        Vector3 xrnode_pos;
        Quaternion xrnode_rot;

        if(xrnode.TryGetPosition(out xrnode_pos) && xrnode.TryGetRotation(out xrnode_rot)) {
            // Mirror the tracked XR node pose on the helper object.
            node.transform.position = xrnode_pos;
            node.transform.rotation = xrnode_rot;
            // Poll the latest eye data (the returned error code is not checked here).
            ViveSR.anipal.Eye.EyeData_v2 eyedata = new ViveSR.anipal.Eye.EyeData_v2();
            var error = ViveSR.anipal.Eye.SRanipal_Eye_API.GetEyeData_v2(ref eyedata);
            Vector3 origin, direction;

            // Transform the local-space gaze ray into world space using the node pose
            // and draw it with the given line renderer.
            Action<ViveSR.anipal.Eye.GazeIndex, LineRenderer, ViveSR.anipal.Eye.EyeData_v2> make_line = 
            (ViveSR.anipal.Eye.GazeIndex index, LineRenderer line, ViveSR.anipal.Eye.EyeData_v2 eye) =>  {
                if(ViveSR.anipal.Eye.SRanipal_Eye_v2.GetGazeRay(index, out origin, out direction, eye)) {
                    Vector3 pos = node.transform.TransformPoint(origin);
                    Vector3 dir = node.transform.TransformDirection(direction);
                    line.SetPosition(0, pos);
                    line.SetPosition(1, pos + (dir * 10.0f));
                }
            };

            make_line(ViveSR.anipal.Eye.GazeIndex.LEFT, m_line_renderer_left, eyedata);
            make_line(ViveSR.anipal.Eye.GazeIndex.RIGHT, m_line_renderer_right, eyedata);
        }
    }
}

 

