iuilab

Verified Members
  • Content Count

    3
  • Joined

  • Last visited

Community Reputation

0 Neutral

1 Follower

About iuilab

  • Rank
    Contributor


  1. 3. I know the three dartboards are placed at different distances, but I am wondering whether the trembling of the focus point when I try to fixate on one area (the highlighted focus area seems to jump rapidly between that area and the adjacent area on the same dartboard) is caused by the sensitivity and accuracy limits of the Vive Pro Eye, or whether it reflects real eye movement. Do the eyes really exhibit nystagmus?

     4. The environment I use is shown in the figure. (I'm sorry, I don't know how to count how many objects with colliders are involved in the scene; may I ask how to do so?) The frame-timing graph in SteamVR often turns red (latency occurs). The main part of my FocusCube script is:

         private void Update()
         {
             if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING &&
                 SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.NOT_SUPPORT)
                 return;

             EyeData eyeData = new EyeData();
             SRanipal_Eye.GetEyeData(ref eyeData);

             foreach (GazeIndex index in GazePriority)
             {
                 Ray GazeRay;
                 if (SRanipal_Eye.Focus(index, out GazeRay, out FocusInfo, MaxDistance))
                 {
                     EyeText.text = "EyePos: " + FocusInfo.point.ToString("F3")
                         + "\ndis: " + FocusInfo.distance
                         + "\nL gaze origin: " + eyeData.verbose_data.left.gaze_origin_mm
                         + "\nR gaze origin: " + eyeData.verbose_data.right.gaze_origin_mm;
                     // Save the data in a struct, then write it to a file in OnApplicationQuit().
                     break;
                 }
             }
         }

     When I run this code, I do not actually display the text on screen.

     5. Does that mean I can use just these two vectors, eyeData.verbose_data.left.gaze_direction_normalized and eyeData.verbose_data.right.gaze_direction_normalized, to calculate their intersection point as the focus point? Or must I use the gaze ray to achieve this purpose? Thank you very much again.
  2. 1. What are the accuracy and the units of the eye position obtained from eyeData.verbose_data.left.gaze_origin_mm? I printed the value and found five or six decimal places. Do these digits really reflect the accuracy of the eye position, or are they just noise?

     2. What does SetEyeParameter() actually do? What does the gaze-ray sensitivity factor in [0, 1] mean, and how does it affect the user?

     3. The Focus sample in the SDK presents three dartboards that show the eye focus point in different areas, but I am curious whether the trembling that occurs on the border between areas is caused by a real physiological eye reflex such as nystagmus, or simply because the device is not accurate enough.

     4. I am using EyeFocusSample to get the eye focus point in a sci-fi-style environment from the Unity Asset Store; however, after adding Focus(), performance drops so much that the frame latency becomes very obvious. Is this caused by the large number of objects, or by the Raycast() used inside Focus()? If so, what is the practical limit or baseline for using Focus()?

     5. I need to obtain eye focus points in various realistic VR environments, but because of Q4 the latency has a serious negative effect on my experiment. Is it possible to use the two gaze directions from both eyes to calculate the eye focus point instead? Or would the result be very inaccurate and differ greatly from the FocusInfo.point returned by Focus()? Thanks for helping me solve my problems. ( - tag added by moderator)
  3. Hi there, I am trying to obtain accurate eye positions for my study. According to the SDK documentation, two parameters might suit my needs: the gaze_origin_mm attribute of the SingleEyeData structure and the Vector2 postion returned by GetPupilPosition(). However, I have some questions about these two parameters.

     First, the definition of gaze_origin_mm is "The point in the eye from which the gaze ray originates in meter miles. (right-handed coordinate system)". I believe the unit of measurement is actually millimeters, not "meter miles". After retrieving the point from the Vive Pro Eye, I concluded that it is the eye position relative to the lens center, with both eyes in the same coordinate system (x: pointing toward me, y: pointing up, z: pointing to the left). But the picture above that text confuses me: according to the figure, there are two gaze-origin vectors pointing right and left, from the right eye and the left eye respectively. So what is the real coordinate system for these parameters, and is each lens center the origin (0, 0, 0) for its respective eye?

     Second, the definition of the Vector2 postion from GetPupilPosition() is "The 2D position of a selected pupil clamped between -1 and 1. Position (0, 0) indicates that the pupil is looking forward; position (1, 1) up-rightward; and position (-1, -1) left-downward.". I am not sure whether the spelling "postion" instead of "position" is intentional. I am also curious about the actual range, i.e. the borders (1, 1) and (-1, -1), because even when I move my eyes as far as I can, the values never reach 0.5 or go below -0.5.

     Thanks for helping me figure out these problems, and I would appreciate a suggestion for the best strategy to get the raw eye-position data.
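Regarding the recurring question of computing a focus point directly from the two normalized gaze directions (instead of the raycast inside Focus()): geometrically this is a vergence estimate. The two gaze rays almost never intersect exactly, so the usual approach is to take the midpoint of the shortest segment between them. A minimal sketch of that computation, written here in Python/NumPy rather than Unity C# so it is self-contained; the origins and directions are hypothetical stand-ins for gaze_origin_mm (converted to meters) and gaze_direction_normalized expressed in one common coordinate system:

```python
import numpy as np

def vergence_point(o_left, d_left, o_right, d_right):
    """Midpoint of the shortest segment between two gaze rays.

    Each ray is o + t * d. We solve for the parameters t_l, t_r that
    minimize |(o_left + t_l*d_left) - (o_right + t_r*d_right)|, then
    return the midpoint of that closest-approach segment.
    """
    d_left = d_left / np.linalg.norm(d_left)
    d_right = d_right / np.linalg.norm(d_right)
    w0 = o_left - o_right
    b = d_left @ d_right        # cosine of the angle between the rays
    d = d_left @ w0
    e = d_right @ w0
    denom = 1.0 - b * b         # zero when the rays are parallel
    if abs(denom) < 1e-9:
        return None             # (near-)parallel gaze: no vergence point
    t_l = (b * e - d) / denom
    t_r = (e - b * d) / denom
    p_l = o_left + t_l * d_left
    p_r = o_right + t_r * d_right
    return (p_l + p_r) / 2.0

# Hypothetical example: eyes ~64 mm apart, both fixating a point 1 m ahead.
o_l = np.array([-0.032, 0.0, 0.0])
o_r = np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
p = vergence_point(o_l, target - o_l, o_r, target - o_r)  # ≈ target
```

Note that unlike Focus(), this gives no scene object, and because the angle between the two rays shrinks rapidly with distance, the depth estimate becomes very noise-sensitive for far targets, which may explain why the SDK's sample relies on a raycast instead.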