Posts posted by eugie

  1. Hello.

    I'm trying to use raycasting to extract data from the mesh collider it hits.

    I tried using 

    private void Update()
    {
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING &&
            SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.NOT_SUPPORT) return;

        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = true;
        }
        else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }

        Vector3 GazeOriginCombinedLocal, GazeDirectionCombinedLocal;

        if (eye_callback_registered)
        {
            if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal, eyeData)) { }
            else return;
        }
        else
        {
            if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.LEFT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.RIGHT, out GazeOriginCombinedLocal, out GazeDirectionCombinedLocal)) { }
            else return;
        }

        Vector3 GazeDirectionCombined = Camera.main.transform.TransformDirection(GazeDirectionCombinedLocal);
        GazeRayRenderer.SetPosition(0, Camera.main.transform.position - Camera.main.transform.up * 0.05f);
        GazeRayRenderer.SetPosition(1, Camera.main.transform.position + GazeDirectionCombined * LengthOfRay);

        RaycastHit hit;
        if (!Physics.Raycast(Camera.main.transform.position, GazeDirectionCombined, out hit))
            return;
        Debug.Log(hit.collider);
    }

    but the Unity editor just freezes when I run this code.

     

    I also tried 

     

    private void Update()
    {
        gaze_direct_combine = eyeData.verbose_data.combined.eye_data.gaze_direction_normalized;

        RaycastHit hit;
        if (!Physics.Raycast(Camera.main.transform.position, gaze_direct_combine, out hit))
            return;
        Debug.Log(hit.collider);
    }

    but it does not seem to work. 

     

    The rest of the code runs fine without this snippet, so I omitted it for clarity.

     

    Maybe I'm passing the wrong values into Physics.Raycast()?
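
    To clarify what I mean, here is a sketch of the variant I'm considering (untested; it assumes the direction from the callback is HMD-local and right-handed, so it would need flipping and transforming into world space before raycasting):

    // Sketch (untested). Assumes eyeData is filled by the SRanipal callback
    // and that gaze_direction_normalized is HMD-local and right-handed.
    private void Update()
    {
        Vector3 gazeLocal = eyeData.verbose_data.combined.eye_data.gaze_direction_normalized;
        gazeLocal.x = -gazeLocal.x;   // right-handed (SRanipal) -> left-handed (Unity)

        // Transform from HMD-local space into world space before raycasting.
        Vector3 gazeWorld = Camera.main.transform.TransformDirection(gazeLocal);

        RaycastHit hit;
        if (Physics.Raycast(Camera.main.transform.position, gazeWorld, out hit))
            Debug.Log(hit.collider);
    }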

  2. @luiluks

    No need to thank me 🙂 It is mostly from the script written by Yu Imaoka and Andri Flury at D-HEST, ETH Zurich (https://doi.org/10.1016/j.visres.2013.02.007). Be sure to reference them if you use the script!

    Did you update the file name every time? If the script detects a file with the same name, it shuts itself off.

    Do you have the SRanipal Eye Framework prefab in your scene, with Enable Eye Data Callback checked and Version 2 selected?

    Make an empty GameObject in your scene and add the script to that empty object!

    [two attached screenshots of the plotted gaze data]

     

    Okay, now I understand that the xyz values in the data are not xyz in Unity space but rather xzy.

    I fixed that and now the z value shows the spherical projection.

    However, I still do not understand why it only shows a one-sided projection of the data.

    I understand that the data is normalized to values between -1 and 1.

    Does that mean it normalizes the data based on the view frame, where the right-hand edge is 1 and the left-hand edge is -1?

    Then, if I want to see it in a 360 environment, should I calculate it based on the head position?
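
    To make the question concrete, this is roughly the calculation I have in mind (a sketch, untested; it assumes the logged direction is relative to the HMD, so the head rotation has to be applied to cover the full 360 sphere):

    // Sketch (untested): apply the head rotation to the HMD-relative gaze
    // direction so the result spans the full 360 sphere in world space.
    Vector3 gazeLocal = eyeData.verbose_data.combined.eye_data.gaze_direction_normalized;
    gazeLocal.x = -gazeLocal.x;   // right-handed (SRanipal) -> left-handed (Unity)

    // Camera.main tracks the HMD, so its rotation is the head rotation.
    Vector3 gazeWorld = Camera.main.transform.rotation * gazeLocal;

    // Spherical angles of the world-space gaze, for plotting over the 360 video.
    float yaw   = Mathf.Atan2(gazeWorld.x, gazeWorld.z) * Mathf.Rad2Deg;          // -180..180
    float pitch = Mathf.Asin(Mathf.Clamp(gazeWorld.y, -1f, 1f)) * Mathf.Rad2Deg;  // -90..90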

    Hello, I'm working on a project where I need to extract eye-tracking data while watching 360 videos.

    The project I have right now features a sphere in Unity that projects the video on its inside, with the camera placed inside that sphere.

    I managed to print out the raw eye-tracking data, but I'm quite lost on how to interpret it.

    When I plot the gaze direction (xyz) data in 3D space, I get this:

    [attached image: 3D plot of the gaze direction data]

    which is strange, as I turned around several times while wearing the HMD, so the data should trace out a 'sphere'.

    I don't quite understand the 'normalized data' for gaze direction. 

    Does it mean its values are relative to my current head position and reset whenever I move?

    For instance, if I turn my head in the virtual space and look at a point on my right, then turn back and look at a point on my right, will the data coordinates be the same?

    Or should I calculate the gaze angle based on the gaze origin and gaze direction?

    I would appreciate the help.

    Thank you. 

  5. Hello,

    I wrote a data-logging script referencing the script written by Yu Imaoka and Andri Flury at D-HEST, ETH Zurich (https://doi.org/10.1016/j.visres.2013.02.007).

    However, the data does not get logged; only the header string from Data_txt() is written to the desired .txt file.

    I've been working on this for the last few weeks, but I am lost as to what the problem is.

    The Unity project is a simple 360 video projection on a sphere, which runs smoothly, so I do not think the project itself is the problem.

    I would really appreciate the help. Thank you. 

     

    Data output : 

     

    [attached image: screenshot of the data output]

     

    Code : 

    using System.Collections;
    using System.Runtime.InteropServices;
    using UnityEngine;
    using System;
    using System.IO;
    using ViveSR.anipal.Eye;
    using ViveSR.anipal;
    using ViveSR;
    
    
    
    
    public class Eyetracking : MonoBehaviour
    {
        // ********************************************************************************************************************
        //
        //  Define user ID information.
        //  - The developers can define the user ID format such as "ABC_001". The ID is used for the name of text file 
        //    that records the measured eye movement data.
        // 
        // ********************************************************************************************************************
    public static string UserID = "02";       // Define ID number such as 001, ABC001, etc.
        public static string Path = Directory.GetCurrentDirectory();
        string File_Path = Directory.GetCurrentDirectory() + "\\video_" + UserID + ".txt";
    
    
        // ********************************************************************************************************************
        //
        //  Parameters for time-related information.
        //
        // ********************************************************************************************************************
        public static int cnt_callback = 0;
        public int cnt_saccade = 0, Endbuffer = 3, SaccadeTimer = 30;
        float Timeout = 1.0f, InitialTimer = 0.0f;
        private static long SaccadeEndTime = 0;
        private static long MeasureTime, CurrentTime, MeasureEndTime = 0;
        private static float time_stamp;
        private static int frame;
    
        // ********************************************************************************************************************
        //
        //  Parameters for eye data.
        //
        // ********************************************************************************************************************
        private static EyeData_v2 eyeData = new EyeData_v2();
        public EyeParameter eye_parameter = new EyeParameter();
        public GazeRayParameter gaze = new GazeRayParameter();
        private static bool eye_callback_registered = false;
        private static UInt64 eye_valid_L, eye_valid_R;                 // The bits explaining the validity of eye data.
        private static float openness_L, openness_R;                    // The level of eye openness.
        private static float pupil_diameter_L, pupil_diameter_R;        // Diameter of pupil dilation.
        private static Vector2 pos_sensor_L, pos_sensor_R;              // Positions of pupils.
        private static Vector3 gaze_origin_L, gaze_origin_R;            // Position of gaze origin.
        private static Vector3 gaze_direct_L, gaze_direct_R;            // Direction of gaze ray.
        private static float frown_L, frown_R;                          // The level of user's frown.
        private static float squeeze_L, squeeze_R;                      // The level to show how the eye is closed tightly.
        private static float wide_L, wide_R;                            // The level to show how the eye is open widely.
        private static double gaze_sensitive;                           // The sensitive factor of gaze ray.
        private static float distance_C;                                // Distance from the central point of right and left eyes.
        private static bool distance_valid_C;                           // Validity of combined data of right and left eyes.
        public bool cal_need;                                           // Calibration judge.
        public bool result_cal;                                         // Result of calibration.
        private static int track_imp_cnt = 0;
        private static TrackingImprovement[] track_imp_item;
    
    
    
        // ********************************************************************************************************************
        //
        //  Start is called before the first frame update. The Start() function is performed only one time.
        //
        // ********************************************************************************************************************
        void Start()
        {
            InputUserID();                              // Check if the file with the same ID exists.
            Invoke("SystemCheck", 0.5f);                // System check.
            //SRanipal_Eye_v2.LaunchEyeCalibration();     // Perform calibration for eye tracking.
            //Calibration();
            //TargetPosition();                           // Implement the targets on the VR view.
            Invoke("Measurement", 0.5f);                // Start the measurement of ocular movements in a separate callback function.  
        }
    
    
    
        // ********************************************************************************************************************
        //
        //  Checks if the filename with the same user ID already exists. If so, you need to change the name of UserID.
        //
        // ********************************************************************************************************************
        void InputUserID()
        {
            Debug.Log(File_Path);
    
            if (File.Exists(File_Path))
            {
                Debug.Log("File with the same UserID already exists. Please change the UserID in the C# code.");
    
                //  When the same file name is found, we stop playing Unity.
    
                if (UnityEditor.EditorApplication.isPlaying)
                {
                    UnityEditor.EditorApplication.isPlaying = false;
                }
            }
        }
    
    
    
        // ********************************************************************************************************************
        //
        //  Check if the system works properly.
        //
        // ********************************************************************************************************************
        void SystemCheck()
        {
            if (SRanipal_Eye_API.GetEyeData_v2(ref eyeData) == ViveSR.Error.WORK)
            {
                Debug.Log("Device is working properly.");
            }
    
            if (SRanipal_Eye_API.GetEyeParameter(ref eye_parameter) == ViveSR.Error.WORK)
            {
                Debug.Log("Eye parameters are measured.");
            }
    
            //  Check again if the initialisation of eye tracking functions successfully. If not, we stop playing Unity.
            Error result_eye_init = SRanipal_API.Initial(SRanipal_Eye_v2.ANIPAL_TYPE_EYE_V2, IntPtr.Zero);
    
            if (result_eye_init == Error.WORK)
            {
                Debug.Log("[SRanipal] Initial Eye v2: " + result_eye_init);
            }
            else
            {
                Debug.LogError("[SRanipal] Initial Eye v2: " + result_eye_init);
    
                if (UnityEditor.EditorApplication.isPlaying)
                {
                    UnityEditor.EditorApplication.isPlaying = false;    // Stops Unity editor.
                }
            }
        }
    
    
    
        // ********************************************************************************************************************
        //
        //  Calibration is performed if the calibration is necessary.
        //
        // ********************************************************************************************************************
        void Calibration()
        {
            SRanipal_Eye_API.IsUserNeedCalibration(ref cal_need);           // Check the calibration status. If needed, we perform the calibration.
    
            if (cal_need == true)
            {
                result_cal = SRanipal_Eye_v2.LaunchEyeCalibration();
    
                if (result_cal == true)
                {
                    Debug.Log("Calibration is done successfully.");
                }
    
                else
                {
                Debug.Log("Calibration failed.");
                    if (UnityEditor.EditorApplication.isPlaying)
                    {
                    UnityEditor.EditorApplication.isPlaying = false;    // Stops the Unity editor if the calibration fails.
                    }
                }
            }
    
            if (cal_need == false)
            {
                Debug.Log("Calibration is not necessary");
            }
        }
    
    
    
    
    
        // ********************************************************************************************************************
        //
        //  Create a text file and header names of each column to store the measured data of eye movements.
        //
        // ********************************************************************************************************************
        void Data_txt()
        {
            string variable =
            "time(100ns)" + "," +
            "time_stamp(ms)" + "," +
            "frame" + "," +
            "eye_valid_L" + "," +
            "eye_valid_R" + "," +
            "openness_L" + "," +
            "openness_R" + "," +
            "pupil_diameter_L(mm)" + "," +
            "pupil_diameter_R(mm)" + "," +
            "pos_sensor_L.x" + "," +
            "pos_sensor_L.y" + "," +
            "pos_sensor_R.x" + "," +
            "pos_sensor_R.y" + "," +
            "gaze_origin_L.x(mm)" + "," +
            "gaze_origin_L.y(mm)" + "," +
            "gaze_origin_L.z(mm)" + "," +
            "gaze_origin_R.x(mm)" + "," +
            "gaze_origin_R.y(mm)" + "," +
            "gaze_origin_R.z(mm)" + "," +
            "gaze_direct_L.x" + "," +
            "gaze_direct_L.y" + "," +
            "gaze_direct_L.z" + "," +
            "gaze_direct_R.x" + "," +
            "gaze_direct_R.y" + "," +
            "gaze_direct_R.z" + "," +
            "gaze_sensitive" + "," +
            "frown_L" + "," +
            "frown_R" + "," +
            "squeeze_L" + "," +
            "squeeze_R" + "," +
            "wide_L" + "," +
            "wide_R" + "," +
            "distance_valid_C" + "," +
            "distance_C(mm)" + "," +
            "track_imp_cnt" +
            Environment.NewLine;
    
            File.AppendAllText("video_" + UserID + ".txt", variable);
        }
    
    
    
        // ********************************************************************************************************************
        //
        //  Measure eye movements in a callback function that HTC SRanipal provides.
        //
        // ********************************************************************************************************************
        void Measurement()
        {
            EyeParameter eye_parameter = new EyeParameter();
            SRanipal_Eye_API.GetEyeParameter(ref eye_parameter);
            Data_txt();
    
            if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
            {
                SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                eye_callback_registered = true;
            }
    
            else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
            {
                SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                eye_callback_registered = false;
            }
        }
    
    
    
        // ********************************************************************************************************************
        //
        //  Callback function to record the eye movement data.
        //  Note that SRanipal_Eye_v2 does not work in the function below. It only works under UnityEngine.
        //
        // ********************************************************************************************************************
        private static void EyeCallback(ref EyeData_v2 eye_data)
        {
            EyeParameter eye_parameter = new EyeParameter();
            SRanipal_Eye_API.GetEyeParameter(ref eye_parameter);
            eyeData = eye_data;
    
            // ----------------------------------------------------------------------------------------------------------------
            //  Measure eye movements at the frequency of 120Hz until framecount reaches the maxframe count set.
            // ----------------------------------------------------------------------------------------------------------------
            ViveSR.Error error = SRanipal_Eye_API.GetEyeData_v2(ref eyeData);
    
            if (error == ViveSR.Error.WORK)
            {
                // --------------------------------------------------------------------------------------------------------
                //  Measure each parameter of eye data that are specified in the guideline of SRanipal SDK.
                // --------------------------------------------------------------------------------------------------------
                MeasureTime = DateTime.Now.Ticks;
                time_stamp = eyeData.timestamp;
                frame = eyeData.frame_sequence;
                eye_valid_L = eyeData.verbose_data.left.eye_data_validata_bit_mask;
                eye_valid_R = eyeData.verbose_data.right.eye_data_validata_bit_mask;
                openness_L = eyeData.verbose_data.left.eye_openness;
                openness_R = eyeData.verbose_data.right.eye_openness;
                pupil_diameter_L = eyeData.verbose_data.left.pupil_diameter_mm;
                pupil_diameter_R = eyeData.verbose_data.right.pupil_diameter_mm;
                pos_sensor_L = eyeData.verbose_data.left.pupil_position_in_sensor_area;
                pos_sensor_R = eyeData.verbose_data.right.pupil_position_in_sensor_area;
                gaze_origin_L = eyeData.verbose_data.left.gaze_origin_mm;
                gaze_origin_R = eyeData.verbose_data.right.gaze_origin_mm;
                gaze_direct_L = eyeData.verbose_data.left.gaze_direction_normalized;
                gaze_direct_R = eyeData.verbose_data.right.gaze_direction_normalized;
                gaze_sensitive = eye_parameter.gaze_ray_parameter.sensitive_factor;
                frown_L = eyeData.expression_data.left.eye_frown;
                frown_R = eyeData.expression_data.right.eye_frown;
                squeeze_L = eyeData.expression_data.left.eye_squeeze;
                squeeze_R = eyeData.expression_data.right.eye_squeeze;
                wide_L = eyeData.expression_data.left.eye_wide;
                wide_R = eyeData.expression_data.right.eye_wide;
                distance_valid_C = eyeData.verbose_data.combined.convergence_distance_validity;
                distance_C = eyeData.verbose_data.combined.convergence_distance_mm;
                track_imp_cnt = eyeData.verbose_data.tracking_improvements.count;
                ////track_imp_item = eyeData.verbose_data.tracking_improvements.items;
    
                //  Convert the measured data to string data to write in a text file.
                string value =
                    MeasureTime.ToString() + "," +
                    time_stamp.ToString() + "," +
                    frame.ToString() + "," +
                    eye_valid_L.ToString() + "," +
                    eye_valid_R.ToString() + "," +
                    openness_L.ToString() + "," +
                    openness_R.ToString() + "," +
                    pupil_diameter_L.ToString() + "," +
                    pupil_diameter_R.ToString() + "," +
                    pos_sensor_L.x.ToString() + "," +
                    pos_sensor_L.y.ToString() + "," +
                    pos_sensor_R.x.ToString() + "," +
                    pos_sensor_R.y.ToString() + "," +
                    gaze_origin_L.x.ToString() + "," +
                    gaze_origin_L.y.ToString() + "," +
                    gaze_origin_L.z.ToString() + "," +
                    gaze_origin_R.x.ToString() + "," +
                    gaze_origin_R.y.ToString() + "," +
                    gaze_origin_R.z.ToString() + "," +
                    gaze_direct_L.x.ToString() + "," +
                    gaze_direct_L.y.ToString() + "," +
                    gaze_direct_L.z.ToString() + "," +
                    gaze_direct_R.x.ToString() + "," +
                    gaze_direct_R.y.ToString() + "," +
                    gaze_direct_R.z.ToString() + "," +
                    gaze_sensitive.ToString() + "," +
                    frown_L.ToString() + "," +
                    frown_R.ToString() + "," +
                    squeeze_L.ToString() + "," +
                    squeeze_R.ToString() + "," +
                    wide_L.ToString() + "," +
                    wide_R.ToString() + "," +
                    distance_valid_C.ToString() + "," +
                    distance_C.ToString() + "," +
                    track_imp_cnt.ToString() +
                    //track_imp_item.ToString() +
                    Environment.NewLine;
    
                File.AppendAllText("video_" + UserID + ".txt", value);
    
                cnt_callback++;
            }
        }
    }
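
    One thing I'm unsure about is that EyeCallback writes to the file from the SRanipal thread while Unity runs on its own main thread. A sketch of a buffered alternative I'm considering (untested; buffer, bufferLock, and the Update() method are additions, not part of the script above):

    // Sketch (untested): collect rows on the callback thread and flush them
    // from Unity's main thread, so the callback never touches the file.
    private static readonly object bufferLock = new object();
    private static readonly System.Text.StringBuilder buffer = new System.Text.StringBuilder();

    private static void EyeCallback(ref EyeData_v2 eye_data)
    {
        eyeData = eye_data;
        string row = eyeData.timestamp + "," + eyeData.frame_sequence;   // plus the other fields, built as above
        lock (bufferLock) { buffer.AppendLine(row); }
    }

    void Update()
    {
        lock (bufferLock)
        {
            if (buffer.Length > 0)
            {
                File.AppendAllText("video_" + UserID + ".txt", buffer.ToString());
                buffer.Length = 0;   // clear the buffer after flushing
            }
        }
    }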

     

    @Corvus Thank you very much! I managed to build the 360 video player in Unity, but the eye-tracker data is coming out null.

    When I printed out each step, the problem seemed to be in the callback process.

    EnableEyeDataCallback keeps coming back 'false'.

    I attached my code below. Thank you. 

     

    using UnityEngine;
    using ViveSR.anipal.Eye;
    using System.Runtime.InteropServices;
    using UnityEngine.UI;
    
    /// <summary>
    /// Example usage for eye tracking callback
    /// Note: Callback runs on a separate thread to report at ~120hz.
    /// Unity is not threadsafe and cannot call any UnityEngine api from within callback thread.
    /// </summary>
    public class CallbackExample : MonoBehaviour
    {
        private static EyeData eyeData = new EyeData();
        private static bool eye_callback_registered = false;
    
        public Text uiText;
        private float updateSpeed = 0;
        private static float lastTime, currentTime;
    
    
        private void Update()
        {
    /*        print("-----");
            print(SRanipal_Eye_Framework.Status);
            print(SRanipal_Eye_Framework.FrameworkStatus.WORKING);
            print("-----");*/
            if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING) return;
    
            print("------");
            print(SRanipal_Eye_Framework.Instance.EnableEyeDataCallback);
            print(eye_callback_registered);
            print("-----");
            if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
            {
                print("1111");
                SRanipal_Eye.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye.CallbackBasic)EyeCallback));
                eye_callback_registered = true;
            }
            else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
            {
                print("2222");
                SRanipal_Eye.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye.CallbackBasic)EyeCallback));
                eye_callback_registered = false;
            }
    
            updateSpeed = currentTime - lastTime;
            uiText.text = updateSpeed.ToString() + " ms";
        }
    
        private void OnDisable()
        {
            Release();
        }
    
        void OnApplicationQuit()
        {
            Release();
        }
    
        /// <summary>
        /// Release callback thread when disabled or quit
        /// </summary>
        private static void Release()
        {
            if (eye_callback_registered == true)
            {
                SRanipal_Eye.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye.CallbackBasic)EyeCallback));
                eye_callback_registered = false;
            }
        }
    
        /// <summary>
        /// Required class for IL2CPP scripting backend support
        /// </summary>
        internal class MonoPInvokeCallbackAttribute : System.Attribute
        {
            public MonoPInvokeCallbackAttribute() { }
        }
    
        /// <summary>
        /// Eye tracking data callback thread.
        /// Reports data at ~120hz
        /// MonoPInvokeCallback attribute required for IL2CPP scripting backend
        /// </summary>
        /// <param name="eye_data">Reference to latest eye_data</param>
        [MonoPInvokeCallback]
        private static void EyeCallback(ref EyeData eye_data)
        {
            Debug.Log("callback started");
            // Gets data from anipal's Eye module
            eyeData = eye_data;
            lastTime = currentTime;
            currentTime = eyeData.timestamp;
    
    
            /*        // The time when the frame was capturing. in millisecond.
                    timeStamp = eyeData.timestamp;
    
                // The point in the eye from which the gaze ray originates, in millimeters (right-handed coordinate system)
                    gazeOriginLeft = eyeData.verbose_data.left.gaze_origin_mm;
                    gazeOriginRight = eyeData.verbose_data.right.gaze_origin_mm;
                    Debug.Log("gazeOriginLeft: " + gazeOriginLeft);
    
                // The normalized gaze direction of the eye, with components in [-1,1] (right-handed coordinate system)
                    gazeDirectionLeft = eyeData.verbose_data.left.gaze_direction_normalized;
                    gazeDirectionRight = eyeData.verbose_data.right.gaze_direction_normalized;
                    gazeDirectionCombined = eyeData.verbose_data.combined.eye_data.gaze_direction_normalized;
                    Debug.Log("gaze_direction_left: " + gazeDirectionLeft);
    
                    // The diameter of the pupil in milli meter
                    pupilDiameterLeft = eyeData.verbose_data.left.pupil_diameter_mm;
                    pupilDiameterRight = eyeData.verbose_data.right.pupil_diameter_mm;
                    pupilDiameterCombined = eyeData.verbose_data.combined.eye_data.pupil_diameter_mm;
                    Debug.Log("pupilDiameterLeft: " + pupilDiameterLeft);
    
                    // A value representing how open the eye is in [0,1]
                    eyeOpenLeft = eyeData.verbose_data.left.eye_openness;
                    eyeOpenRight = eyeData.verbose_data.right.eye_openness;
                    eyeOpenCombined = eyeData.verbose_data.combined.eye_data.eye_openness;
                    Debug.Log("eyeOpenLeft: " + eyeOpenLeft);
    
                    // The normalized position of a pupil in [0,1]
                    pupilPositionLeft = eyeData.verbose_data.left.pupil_position_in_sensor_area;
                    pupilPositionRight = eyeData.verbose_data.right.pupil_position_in_sensor_area;
                    pupilPositionCombined = eyeData.verbose_data.combined.eye_data.pupil_position_in_sensor_area;
                    Debug.Log("pupilPositionLeft: " + pupilPositionLeft);
    
                    lock (DebugWriter)
                    {
                        CSVWriter.Write();
                    }*/
    
        }
    }

     
