Everything posted by geringsj

  1. Did anybody find a suitable transformation of the gaze ray origin and direction provided by SRanipal_Eye_v2.GetGazeRay() or one of the other SRanipal mechanisms? I really can't find a sane mapping from the gaze origin to any Unity 2020 XRNode or to the Main Camera in the SRanipal example scenes. Without knowing how the "System Origin" posted by @Daniel_Y relates to a SteamVR Camera Rig, the Unity XR nodes, the ViveCameraRig, the Main Camera, or some other known entity in a Unity scene, it is a tedious guessing game to reliably use the gaze origin in the scene. I am also wondering why in SRanipal_Eye_v2.cs, lines 293 and 304, the x component of the direction gets inverted but the x component of the origin does not. Is this a bug? Below is my Unity 2020 script that tries to match the SRanipal eye gaze to the Unity Head XRNode. Maybe somebody has done that successfully already? I want to transform the gaze origin and direction for both eyes into world space in the scene. I am using the SRanipal example scenes for testing.
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class XRHeadTest : MonoBehaviour
    {
        private LineRenderer m_line_renderer_left;
        private LineRenderer m_line_renderer_right;
        private GameObject left, right;
        private GameObject node;
        public XRNode tracked_node = XRNode.Head;

        void Start()
        {
            left = new GameObject();
            right = new GameObject();
            node = new GameObject();
            node.name = "TrackedXRNode";

            m_line_renderer_left = left.AddComponent<LineRenderer>();
            m_line_renderer_left.name = "LeftLine";
            m_line_renderer_left.startWidth = 0.001f;
            m_line_renderer_left.endWidth = 0.001f;

            m_line_renderer_right = right.AddComponent<LineRenderer>();
            m_line_renderer_right.name = "RightLine";
            m_line_renderer_right.startWidth = 0.001f;
            m_line_renderer_right.endWidth = 0.001f;
        }

        // Update is called once per frame
        void Update()
        {
            // Find the pose of the tracked XR node (the HMD by default).
            List<XRNodeState> nodes = new List<XRNodeState>();
            InputTracking.GetNodeStates(nodes);
            var xrnode = nodes.Find((XRNodeState state) => state.nodeType == tracked_node);

            Vector3 xrnode_pos;
            Quaternion xrnode_rot;
            if (xrnode.TryGetPosition(out xrnode_pos) && xrnode.TryGetRotation(out xrnode_rot))
            {
                node.transform.position = xrnode_pos;
                node.transform.rotation = xrnode_rot;

                ViveSR.anipal.Eye.EyeData_v2 eyedata = new ViveSR.anipal.Eye.EyeData_v2();
                var error = ViveSR.anipal.Eye.SRanipal_Eye_API.GetEyeData_v2(ref eyedata);

                Vector3 origin, direction;
                // Transform the local-space gaze ray into world space using the
                // node pose and draw it as a 10 m line.
                Action<ViveSR.anipal.Eye.GazeIndex, LineRenderer, ViveSR.anipal.Eye.EyeData_v2> make_line =
                    (ViveSR.anipal.Eye.GazeIndex index, LineRenderer line, ViveSR.anipal.Eye.EyeData_v2 eye) =>
                {
                    if (ViveSR.anipal.Eye.SRanipal_Eye_v2.GetGazeRay(index, out origin, out direction, eye))
                    {
                        Vector3 pos = node.transform.TransformPoint(origin);
                        Vector3 dir = node.transform.TransformDirection(direction);
                        line.SetPosition(0, pos);
                        line.SetPosition(1, pos + (dir * 10.0f));
                    }
                };

                make_line(ViveSR.anipal.Eye.GazeIndex.LEFT, m_line_renderer_left, eyedata);
                make_line(ViveSR.anipal.Eye.GazeIndex.RIGHT, m_line_renderer_right, eyedata);
            }
        }
    }
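  As a follow-up thought on the x-inversion question: if SRanipal reports the gaze in a right-handed coordinate system and SRanipal_Eye_v2.cs mirrors direction.x to convert it into Unity's left-handed space, then consistency would suggest the origin needs the same mirroring. This is only my assumption, not something confirmed by the SDK documentation; FixGazeOrigin below is a hypothetical helper I would try before the TransformPoint call:

    // Assumption (unconfirmed): SRanipal delivers origin and direction in a
    // right-handed eye-tracker space. SRanipal_Eye_v2.cs flips direction.x but
    // not origin.x; if that is indeed a bug, the origin should get the same
    // flip before being transformed into world space via the head pose.
    using UnityEngine;

    public static class GazeCoordinateFix
    {
        public static Vector3 FixGazeOrigin(Vector3 origin)
        {
            // Mirror x so the origin lives in the same (left-handed) local
            // space as the already-flipped direction; y and z stay unchanged.
            return new Vector3(-origin.x, origin.y, origin.z);
        }
    }

  In the script above this would mean calling node.transform.TransformPoint(GazeCoordinateFix.FixGazeOrigin(origin)) instead of passing origin directly, so that a gaze origin reported for the left eye actually ends up on the left side of the head node. Whether that matches the hardware convention is exactly what I am hoping someone here can confirm.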