
Justine

Verified Members
  • Posts

    27
  • Joined

  • Last visited

Reputation

0 Neutral


  1. Hi there! I need to buy a laptop to run VR using the HTC Vive Pro Eye for my job. I also intend to connect the headset wirelessly to the laptop using the wireless adapter kit. However, unlike a desktop PC, laptops do not provide the option to insert the required PCIe card for wireless connection. Can someone confirm if the HTC Vive Pro Eye can only be connected wirelessly to a desktop PC? If not, is there any way to connect the VR headset wirelessly with a laptop? And what should I look out for before buying a laptop? I hope somebody can help... Thanks.
  2. Hi! I am working on a research project using the HTC Vive Pro Eye and SRanipal to collect eye data. My objective with the eye data is to know what the participant is looking at in world-space coordinates (for every frame). I managed to collect raw eye data and cast eye-gaze rays in the scene. However, I am struggling with how to use RaycastHit to obtain the world-space coordinates of the objects the person is looking at (e.g. buildings, cars, signs, lamp posts, etc.). Based on my conversations with other researchers, we believe that RaycastHit is highly useful for my case. The RaycastHit properties I would like to use in my code are mainly RaycastHit.point, RaycastHit.collider, and RaycastHit.distance. If someone can show me how to use RaycastHit with the eye-gaze rays in my script, it would help me a lot! 🙂 The code I used to obtain raw eye data and cast the gaze rays is attached (GazeRays1.cs). In addition, I have attached an image of what my scene looks like with the gaze rays enabled, for reference. Thank you! GazeRays1.cs
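A minimal sketch of the pattern being asked about above, assuming the SDK's SRanipal_Eye_v2.GetGazeRay overload that reports a local-space combined gaze ray (the class name, the maxDistance value, and the use of Camera.main are illustrative choices, not from the original post):

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

// Illustrative sketch: cast the combined eye-gaze ray into the scene every
// frame and read world-space hit information back through RaycastHit.
public class GazeRaycaster : MonoBehaviour
{
    public float maxDistance = 100f;    // illustrative cap on the gaze ray length

    void Update()
    {
        // GetGazeRay reports the gaze ray in the HMD's local space.
        if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
        {
            // Transform the ray into world space before raycasting.
            Vector3 worldOrigin = Camera.main.transform.TransformPoint(origin);
            Vector3 worldDirection = Camera.main.transform.TransformDirection(direction);

            if (Physics.Raycast(worldOrigin, worldDirection, out RaycastHit hit, maxDistance))
            {
                // RaycastHit.point    -> world-space coordinates of the gazed point
                // RaycastHit.collider -> the object hit (building, car, sign, lamp post, ...)
                // RaycastHit.distance -> distance from the gaze origin to the hit point
                Debug.Log(hit.collider.name + " at " + hit.point + ", " + hit.distance + " m away");
            }
        }
    }
}
```

Note that each gazed object would need a Collider component for Physics.Raycast to register it.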
  3. Hello! I am a researcher working on a VR experiment in which a pedestrian must cross a crosswalk safely while several vehicles are moving in both driving directions. I can collect the position and rotation of the HMD, and I understand how to analyze those data. However, the collected raw eye data looks a bit confusing to me (please find the attached data for reference). I want to know how I can analyze the raw eye data. If anyone can help me with eye data analysis, it would be highly appreciated! 🙂 eyedata_P1_S2.txt
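Since the column layout of these .txt files comes from the Data_txt() header in the logging scripts shown later on this page, a rough first-pass sketch of an offline analysis might look like the following. The column indices and the all-zero-bitmask validity filter are assumptions about that layout; fixation/saccade labeling would build on the velocity values:

```csharp
using System;
using System.Globalization;
using System.IO;
using System.Linq;

// Rough sketch: parse the logged eye data, skip samples whose left-eye
// validity bitmask is all zero, and estimate angular gaze velocity between
// consecutive samples -- a common starting point for fixation/saccade analysis.
class EyeDataAnalysis
{
    static void Main()
    {
        double prevT = double.NaN;
        double[] prevDir = null;

        foreach (var line in File.ReadLines("eyedata_P1_S2.txt").Skip(1))   // skip header row
        {
            var c = line.Split(',');

            // Assumed column layout (from the Data_txt() header later on this page):
            // 1 = time_stamp(ms), 3 = eye_valid_L, 19..21 = gaze_direct_L.xyz
            if (ulong.Parse(c[3]) == 0) continue;   // no validity bits set -> skip sample

            double t = double.Parse(c[1], CultureInfo.InvariantCulture) / 1000.0;   // seconds
            double[] dir =
            {
                double.Parse(c[19], CultureInfo.InvariantCulture),
                double.Parse(c[20], CultureInfo.InvariantCulture),
                double.Parse(c[21], CultureInfo.InvariantCulture),
            };

            if (prevDir != null && t > prevT)
            {
                // Angle between consecutive normalized gaze directions, per second.
                double dot = dir.Zip(prevDir, (a, b) => a * b).Sum();
                dot = Math.Max(-1.0, Math.Min(1.0, dot));
                double degPerSec = Math.Acos(dot) * 180.0 / Math.PI / (t - prevT);
                Console.WriteLine($"t={t:F3}s  gaze velocity={degPerSec:F1} deg/s");
            }
            prevT = t;
            prevDir = dir;
        }
    }
}
```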
  4. Hi @chengnay, So I added a new variable under "Parameters for time-related information", and then added it to the Data_txt() and EyeCallback(ref EyeData_v2 eye_data) methods. However, no data is being extracted. Please find my code below for your reference:

```csharp
using System.Collections;
using System.Runtime.InteropServices;
using UnityEngine;
using System;
using System.IO;
using ViveSR.anipal.Eye;
using ViveSR.anipal;
using ViveSR;

/// <summary>
/// Example usage for eye tracking callback
/// Note: Callback runs on a separate thread to report at ~120hz.
/// Unity is not threadsafe and cannot call any UnityEngine api from within callback thread.
/// </summary>
public class EyeTrackingRecordData : MonoBehaviour
{
    // ********************************************************************
    // Define user ID information.
    // - The developers can define the user ID format such as "ABC_001". The ID is used
    //   for the name of the text file that records the measured eye movement data.
    // ********************************************************************
    public static int UserID = 3;       // Always change Participant Number for every participant
    public static int scenario = 0;     // Always change Scenario Number for every scenario
    //public string UserID;             // Define ID number such as 001, ABC001, etc.
    public string Path = Directory.GetCurrentDirectory();
    //string File_Path = Directory.GetCurrentDirectory() + "\\video_" + UserID + ".txt";
    public string File_Path = Directory.GetCurrentDirectory() + "P" + UserID + "_" + "S" + scenario + ".txt";

    // ********************************************************************
    // Parameters for time-related information.
    // ********************************************************************
    public static int cnt_callback = 0;
    public int cnt_saccade = 0, Endbuffer = 3, SaccadeTimer = 30;
    float Timeout = 1.0f, InitialTimer = 0.0f;
    private static long SaccadeEndTime = 0;
    private static long MeasureTime, CurrentTime, MeasureEndTime = 0;
    private static float time_stamp;
    private static int frame;
    private static int frameCount;

    // ********************************************************************
    // Parameters for eye data.
    // ********************************************************************
    private static EyeData_v2 eyeData = new EyeData_v2();
    public EyeParameter eye_parameter = new EyeParameter();
    public GazeRayParameter gaze = new GazeRayParameter();
    private static bool eye_callback_registered = false;
    private static UInt64 eye_valid_L, eye_valid_R;           // The bits explaining the validity of eye data.
    private static float openness_L, openness_R;              // The level of eye openness.
    private static float pupil_diameter_L, pupil_diameter_R;  // Diameter of pupil dilation.
    private static Vector2 pos_sensor_L, pos_sensor_R;        // Positions of pupils.
    private static Vector3 gaze_origin_L, gaze_origin_R;      // Position of gaze origin.
    private static Vector3 gaze_direct_L, gaze_direct_R;      // Direction of gaze ray.
    private static float frown_L, frown_R;                    // The level of user's frown.
    private static float squeeze_L, squeeze_R;                // The level to show how the eye is closed tightly.
    private static float wide_L, wide_R;                      // The level to show how the eye is open widely.
    private static double gaze_sensitive;                     // The sensitive factor of gaze ray.
    private static float distance_C;                          // Distance from the central point of right and left eyes.
    private static bool distance_valid_C;                     // Validity of combined data of right and left eyes.
    public bool cal_need;                                     // Calibration judge.
    public bool result_cal;                                   // Result of calibration.
    private static int track_imp_cnt = 0;
    private static TrackingImprovement[] track_imp_item;
    //private static EyeData eyeData = new EyeData();
    //private static bool eye_callback_registered = false;
    //public Text uiText;
    private float updateSpeed = 0;
    private static float lastTime, currentTime;

    // ********************************************************************
    // Start is called before the first frame update. The Start() function is performed only one time.
    // ********************************************************************
    void Start()
    {
        //File_Path = Directory.GetCurrentDirectory() + "\\Assets" + UserID + ".txt";
        InputUserID();                              // Check if the file with the same ID exists.
        //Invoke("SystemCheck", 0.5f);              // System check.
        //SRanipal_Eye_v2.LaunchEyeCalibration();   // Perform calibration for eye tracking.
        //Calibration();
        //TargetPosition();                         // Implement the targets on the VR view.
        //Invoke("Measurement", 0.5f);              // Start the measurement of ocular movements in a separate callback function.
    }

    // ********************************************************************
    // Checks if a file with the same user ID already exists. If so, you need to change the UserID.
    // ********************************************************************
    void InputUserID()
    {
        Debug.Log(File_Path);
        if (File.Exists(File_Path))
        {
            Debug.Log("File with the same UserID already exists. Please change the UserID in the C# code.");
            // When the same file name is found, we stop playing Unity.
            if (UnityEditor.EditorApplication.isPlaying)
            {
                UnityEditor.EditorApplication.isPlaying = false;
            }
        }
        else
        {
            Data_txt();
        }
    }

    // ********************************************************************
    // Check if the system works properly.
    // ********************************************************************
    void SystemCheck()
    {
        if (SRanipal_Eye_API.GetEyeData_v2(ref eyeData) == ViveSR.Error.WORK)
        {
            Debug.Log("Device is working properly.");
        }
        if (SRanipal_Eye_API.GetEyeParameter(ref eye_parameter) == ViveSR.Error.WORK)
        {
            Debug.Log("Eye parameters are measured.");
        }
        // Check again if the initialisation of eye tracking functions successfully. If not, we stop playing Unity.
        Error result_eye_init = SRanipal_API.Initial(SRanipal_Eye_v2.ANIPAL_TYPE_EYE_V2, IntPtr.Zero);
        if (result_eye_init == Error.WORK)
        {
            Debug.Log("[SRanipal] Initial Eye v2: " + result_eye_init);
        }
        else
        {
            Debug.LogError("[SRanipal] Initial Eye v2: " + result_eye_init);
            if (UnityEditor.EditorApplication.isPlaying)
            {
                UnityEditor.EditorApplication.isPlaying = false;    // Stops Unity editor.
            }
        }
    }

    // ********************************************************************
    // Calibration is performed if the calibration is necessary.
    // ********************************************************************
    void Calibration()
    {
        SRanipal_Eye_API.IsUserNeedCalibration(ref cal_need);   // Check the calibration status. If needed, we perform the calibration.
        if (cal_need == true)
        {
            result_cal = SRanipal_Eye_v2.LaunchEyeCalibration();
            if (result_cal == true)
            {
                Debug.Log("Calibration is done successfully.");
            }
            else
            {
                Debug.Log("Calibration failed.");
                if (UnityEditor.EditorApplication.isPlaying)
                {
                    UnityEditor.EditorApplication.isPlaying = false;    // Stops Unity editor if the calibration failed.
                }
            }
        }
        if (cal_need == false)
        {
            Debug.Log("Calibration is not necessary");
        }
    }

    // ********************************************************************
    // Create a text file and header names of each column to store the measured data of eye movements.
    // ********************************************************************
    void Data_txt()
    {
        string variable =
            "time(100ns)" + "," + "time_stamp(ms)" + "," + "frame" + "," + "frame_Count" + "," +
            "eye_valid_L" + "," + "eye_valid_R" + "," +
            "openness_L" + "," + "openness_R" + "," +
            "pupil_diameter_L(mm)" + "," + "pupil_diameter_R(mm)" + "," +
            "pos_sensor_L.x" + "," + "pos_sensor_L.y" + "," + "pos_sensor_R.x" + "," + "pos_sensor_R.y" + "," +
            "gaze_origin_L.x(mm)" + "," + "gaze_origin_L.y(mm)" + "," + "gaze_origin_L.z(mm)" + "," +
            "gaze_origin_R.x(mm)" + "," + "gaze_origin_R.y(mm)" + "," + "gaze_origin_R.z(mm)" + "," +
            "gaze_direct_L.x" + "," + "gaze_direct_L.y" + "," + "gaze_direct_L.z" + "," +
            "gaze_direct_R.x" + "," + "gaze_direct_R.y" + "," + "gaze_direct_R.z" + "," +
            "gaze_sensitive" + "," +
            "frown_L" + "," + "frown_R" + "," +
            "squeeze_L" + "," + "squeeze_R" + "," +
            "wide_L" + "," + "wide_R" + "," +
            "distance_valid_C" + "," + "distance_C(mm)" + "," +
            "track_imp_cnt" + Environment.NewLine;
        File.AppendAllText("eyedata_" + "P" + UserID + "_" + "S" + scenario + ".txt", variable);
    }

    void Update()
    {
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING) return;

        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = true;
        }
        else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }
        float timeNow = Time.realtimeSinceStartup;
    }

    private void OnDisable()
    {
        Release();
    }

    void OnApplicationQuit()
    {
        Release();
    }

    /// <summary>
    /// Release callback thread when disabled or quit
    /// </summary>
    private static void Release()
    {
        if (eye_callback_registered == true)
        {
            SRanipal_Eye.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }
    }

    /// <summary>
    /// Required class for IL2CPP scripting backend support
    /// </summary>
    internal class MonoPInvokeCallbackAttribute : System.Attribute
    {
        public MonoPInvokeCallbackAttribute() { }
    }

    /// <summary>
    /// Eye tracking data callback thread.
    /// Reports data at ~120hz
    /// MonoPInvokeCallback attribute required for IL2CPP scripting backend
    /// </summary>
    /// <param name="eye_data">Reference to latest eye_data</param>
    [MonoPInvokeCallback]
    private static void EyeCallback(ref EyeData_v2 eye_data)
    {
        EyeParameter eye_parameter = new EyeParameter();
        SRanipal_Eye_API.GetEyeParameter(ref eye_parameter);
        eyeData = eye_data;
        // do stuff with eyeData...
        //lastTime = currentTime;
        //currentTime = eyeData.timestamp;
        MeasureTime = DateTime.Now.Ticks;
        time_stamp = eyeData.timestamp;
        frame = eyeData.frame_sequence;
        frameCount = Time.frameCount;
        eye_valid_L = eyeData.verbose_data.left.eye_data_validata_bit_mask;
        eye_valid_R = eyeData.verbose_data.right.eye_data_validata_bit_mask;
        openness_L = eyeData.verbose_data.left.eye_openness;
        openness_R = eyeData.verbose_data.right.eye_openness;
        pupil_diameter_L = eyeData.verbose_data.left.pupil_diameter_mm;
        pupil_diameter_R = eyeData.verbose_data.right.pupil_diameter_mm;
        pos_sensor_L = eyeData.verbose_data.left.pupil_position_in_sensor_area;
        pos_sensor_R = eyeData.verbose_data.right.pupil_position_in_sensor_area;
        gaze_origin_L = eyeData.verbose_data.left.gaze_origin_mm;
        gaze_origin_R = eyeData.verbose_data.right.gaze_origin_mm;
        gaze_direct_L = eyeData.verbose_data.left.gaze_direction_normalized;
        gaze_direct_R = eyeData.verbose_data.right.gaze_direction_normalized;
        gaze_sensitive = eye_parameter.gaze_ray_parameter.sensitive_factor;
        frown_L = eyeData.expression_data.left.eye_frown;
        frown_R = eyeData.expression_data.right.eye_frown;
        squeeze_L = eyeData.expression_data.left.eye_squeeze;
        squeeze_R = eyeData.expression_data.right.eye_squeeze;
        wide_L = eyeData.expression_data.left.eye_wide;
        wide_R = eyeData.expression_data.right.eye_wide;
        distance_valid_C = eyeData.verbose_data.combined.convergence_distance_validity;
        distance_C = eyeData.verbose_data.combined.convergence_distance_mm;
        track_imp_cnt = eyeData.verbose_data.tracking_improvements.count;
        //track_imp_item = eyeData.verbose_data.tracking_improvements.items;

        // Convert the measured data to string data to write in a text file.
        string value =
            MeasureTime.ToString() + "," + time_stamp.ToString() + "," + frame.ToString() + "," + frameCount.ToString() + "," +
            eye_valid_L.ToString() + "," + eye_valid_R.ToString() + "," +
            openness_L.ToString() + "," + openness_R.ToString() + "," +
            pupil_diameter_L.ToString() + "," + pupil_diameter_R.ToString() + "," +
            pos_sensor_L.x.ToString() + "," + pos_sensor_L.y.ToString() + "," + pos_sensor_R.x.ToString() + "," + pos_sensor_R.y.ToString() + "," +
            gaze_origin_L.x.ToString() + "," + gaze_origin_L.y.ToString() + "," + gaze_origin_L.z.ToString() + "," +
            gaze_origin_R.x.ToString() + "," + gaze_origin_R.y.ToString() + "," + gaze_origin_R.z.ToString() + "," +
            gaze_direct_L.x.ToString() + "," + gaze_direct_L.y.ToString() + "," + gaze_direct_L.z.ToString() + "," +
            gaze_direct_R.x.ToString() + "," + gaze_direct_R.y.ToString() + "," + gaze_direct_R.z.ToString() + "," +
            gaze_sensitive.ToString() + "," +
            frown_L.ToString() + "," + frown_R.ToString() + "," +
            squeeze_L.ToString() + "," + squeeze_R.ToString() + "," +
            wide_L.ToString() + "," + wide_R.ToString() + "," +
            distance_valid_C.ToString() + "," + distance_C.ToString() + "," +
            track_imp_cnt.ToString() +
            //track_imp_item.ToString() +
            Environment.NewLine;
        File.AppendAllText("eyedata_" + "P" + UserID + "_" + "S" + scenario + ".txt", value);
    }
}
```

Can you please let me know what I did wrong? Thank you. Hope to hear from you soon.
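The script's own header comment points at the likely culprit: the callback runs on a separate thread and "cannot call any UnityEngine api", yet Time.frameCount is a UnityEngine API and is read inside EyeCallback(). A possible workaround (a sketch, not tested against this project; FrameCountCache is a hypothetical helper name, not part of the SRanipal SDK) is to mirror the frame count into a plain static field on the main thread and let the callback read the copy:

```csharp
using UnityEngine;

// Sketch: cache Time.frameCount on the main thread so the SRanipal callback
// thread can read it without touching any UnityEngine API (the restriction
// stated in the script's own header comment).
public class FrameCountCache : MonoBehaviour
{
    public static volatile int LatestFrameCount;   // written on the main thread only

    void Update()
    {
        LatestFrameCount = Time.frameCount;        // safe: Update() runs on the main thread
    }
}
```

Inside EyeCallback(), `frameCount = FrameCountCache.LatestFrameCount;` would then replace the direct `Time.frameCount` call.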
  5. Hi @chengnay, Thank you for your reply! So then I should add it inside this method? private static void EyeCallback(ref EyeData_v2 eye_data)
  6. Hi @paulm, Can you please share a code sample showing how you managed to collect eye data this way? I am also having trouble collecting eye data with system timestamps. Looking forward to hearing from you! 🙂 Thanks.
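A minimal sketch of what such logging might look like, assuming the same callback registration pattern used in the scripts elsewhere on this page (the class name and output file name are illustrative, and the IL2CPP MonoPInvokeCallback attribute from the full scripts is omitted for brevity):

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using ViveSR.anipal.Eye;

// Illustrative sketch: pair each SRanipal sample with an OS clock reading.
// System.DateTimeOffset is plain .NET, not a UnityEngine API, so it is safe
// to call on the callback thread.
public static class TimestampedEyeLogger
{
    private const string LogPath = "eyedata_timestamped.txt";   // hypothetical output file

    public static void Register()
    {
        SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
            Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
    }

    private static void EyeCallback(ref EyeData_v2 eye_data)
    {
        long systemMs = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();   // system timestamp
        // eye_data.timestamp is the SDK's own millisecond timestamp.
        File.AppendAllText(LogPath, systemMs + "," + eye_data.timestamp + Environment.NewLine);
    }
}
```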
  7. Hi @chengnay, To which variable should I assign Time.frameCount? I assigned it to MeasureTime, time_stamp, and frame in three separate tests, but the editor crashes and no eye data is collected. Hope to hear from you soon. Thank you.
  8. Hello! 🙂 I am trying to extract eye data using the following code:

```csharp
using System.Collections;
using System.Runtime.InteropServices;
using UnityEngine;
using System;
using System.IO;
using ViveSR.anipal.Eye;
using ViveSR.anipal;
using ViveSR;

/// <summary>
/// Example usage for eye tracking callback
/// Note: Callback runs on a separate thread to report at ~120hz.
/// Unity is not threadsafe and cannot call any UnityEngine api from within callback thread.
/// </summary>
public class EyeTrackingRecordData : MonoBehaviour
{
    // ********************************************************************
    // Define user ID information.
    // - The developers can define the user ID format such as "ABC_001". The ID is used
    //   for the name of the text file that records the measured eye movement data.
    // ********************************************************************
    public static int UserID = 2;       // Always change Participant Number for every participant
    public static int scenario = 00;    // Always change Scenario Number for every scenario
    //public string UserID;             // Define ID number such as 001, ABC001, etc.
    public string Path = Directory.GetCurrentDirectory();
    //string File_Path = Directory.GetCurrentDirectory() + "\\video_" + UserID + ".txt";
    public string File_Path = Directory.GetCurrentDirectory() + "P" + UserID + "_" + "S" + scenario + ".txt";

    // ********************************************************************
    // Parameters for time-related information.
    // ********************************************************************
    public static int cnt_callback = 0;
    public int cnt_saccade = 0, Endbuffer = 3, SaccadeTimer = 30;
    float Timeout = 1.0f, InitialTimer = 0.0f;
    private static long SaccadeEndTime = 0;
    private static long MeasureTime, CurrentTime, MeasureEndTime = 0;
    private static float time_stamp;
    private static int frame;

    // ********************************************************************
    // Parameters for eye data.
    // ********************************************************************
    private static EyeData_v2 eyeData = new EyeData_v2();
    public EyeParameter eye_parameter = new EyeParameter();
    public GazeRayParameter gaze = new GazeRayParameter();
    private static bool eye_callback_registered = false;
    private static UInt64 eye_valid_L, eye_valid_R;           // The bits explaining the validity of eye data.
    private static float openness_L, openness_R;              // The level of eye openness.
    private static float pupil_diameter_L, pupil_diameter_R;  // Diameter of pupil dilation.
    private static Vector2 pos_sensor_L, pos_sensor_R;        // Positions of pupils.
    private static Vector3 gaze_origin_L, gaze_origin_R;      // Position of gaze origin.
    private static Vector3 gaze_direct_L, gaze_direct_R;      // Direction of gaze ray.
    private static float frown_L, frown_R;                    // The level of user's frown.
    private static float squeeze_L, squeeze_R;                // The level to show how the eye is closed tightly.
    private static float wide_L, wide_R;                      // The level to show how the eye is open widely.
    private static double gaze_sensitive;                     // The sensitive factor of gaze ray.
    private static float distance_C;                          // Distance from the central point of right and left eyes.
    private static bool distance_valid_C;                     // Validity of combined data of right and left eyes.
    public bool cal_need;                                     // Calibration judge.
    public bool result_cal;                                   // Result of calibration.
    private static int track_imp_cnt = 0;
    private static TrackingImprovement[] track_imp_item;
    //private static EyeData eyeData = new EyeData();
    //private static bool eye_callback_registered = false;
    //public Text uiText;
    private float updateSpeed = 0;
    private static float lastTime, currentTime;

    // ********************************************************************
    // Start is called before the first frame update. The Start() function is performed only one time.
    // ********************************************************************
    void Start()
    {
        //File_Path = Directory.GetCurrentDirectory() + "\\Assets" + UserID + ".txt";
        InputUserID();                              // Check if the file with the same ID exists.
        //Invoke("SystemCheck", 0.5f);              // System check.
        //SRanipal_Eye_v2.LaunchEyeCalibration();   // Perform calibration for eye tracking.
        //Calibration();
        //TargetPosition();                         // Implement the targets on the VR view.
        //Invoke("Measurement", 0.5f);              // Start the measurement of ocular movements in a separate callback function.
    }

    // ********************************************************************
    // Checks if a file with the same user ID already exists. If so, you need to change the UserID.
    // ********************************************************************
    void InputUserID()
    {
        Debug.Log(File_Path);
        if (File.Exists(File_Path))
        {
            Debug.Log("File with the same UserID already exists. Please change the UserID in the C# code.");
            // When the same file name is found, we stop playing Unity.
            if (UnityEditor.EditorApplication.isPlaying)
            {
                UnityEditor.EditorApplication.isPlaying = false;
            }
        }
        else
        {
            Data_txt();
        }
    }

    // ********************************************************************
    // Check if the system works properly.
    // ********************************************************************
    void SystemCheck()
    {
        if (SRanipal_Eye_API.GetEyeData_v2(ref eyeData) == ViveSR.Error.WORK)
        {
            Debug.Log("Device is working properly.");
        }
        if (SRanipal_Eye_API.GetEyeParameter(ref eye_parameter) == ViveSR.Error.WORK)
        {
            Debug.Log("Eye parameters are measured.");
        }
        // Check again if the initialisation of eye tracking functions successfully. If not, we stop playing Unity.
        Error result_eye_init = SRanipal_API.Initial(SRanipal_Eye_v2.ANIPAL_TYPE_EYE_V2, IntPtr.Zero);
        if (result_eye_init == Error.WORK)
        {
            Debug.Log("[SRanipal] Initial Eye v2: " + result_eye_init);
        }
        else
        {
            Debug.LogError("[SRanipal] Initial Eye v2: " + result_eye_init);
            if (UnityEditor.EditorApplication.isPlaying)
            {
                UnityEditor.EditorApplication.isPlaying = false;    // Stops Unity editor.
            }
        }
    }

    // ********************************************************************
    // Calibration is performed if the calibration is necessary.
    // ********************************************************************
    void Calibration()
    {
        SRanipal_Eye_API.IsUserNeedCalibration(ref cal_need);   // Check the calibration status. If needed, we perform the calibration.
        if (cal_need == true)
        {
            result_cal = SRanipal_Eye_v2.LaunchEyeCalibration();
            if (result_cal == true)
            {
                Debug.Log("Calibration is done successfully.");
            }
            else
            {
                Debug.Log("Calibration failed.");
                if (UnityEditor.EditorApplication.isPlaying)
                {
                    UnityEditor.EditorApplication.isPlaying = false;    // Stops Unity editor if the calibration failed.
                }
            }
        }
        if (cal_need == false)
        {
            Debug.Log("Calibration is not necessary");
        }
    }

    // ********************************************************************
    // Create a text file and header names of each column to store the measured data of eye movements.
    // ********************************************************************
    void Data_txt()
    {
        string variable =
            "time(100ns)" + "," + "time_stamp(ms)" + "," + "frame" + "," +
            "eye_valid_L" + "," + "eye_valid_R" + "," +
            "openness_L" + "," + "openness_R" + "," +
            "pupil_diameter_L(mm)" + "," + "pupil_diameter_R(mm)" + "," +
            "pos_sensor_L.x" + "," + "pos_sensor_L.y" + "," + "pos_sensor_R.x" + "," + "pos_sensor_R.y" + "," +
            "gaze_origin_L.x(mm)" + "," + "gaze_origin_L.y(mm)" + "," + "gaze_origin_L.z(mm)" + "," +
            "gaze_origin_R.x(mm)" + "," + "gaze_origin_R.y(mm)" + "," + "gaze_origin_R.z(mm)" + "," +
            "gaze_direct_L.x" + "," + "gaze_direct_L.y" + "," + "gaze_direct_L.z" + "," +
            "gaze_direct_R.x" + "," + "gaze_direct_R.y" + "," + "gaze_direct_R.z" + "," +
            "gaze_sensitive" + "," +
            "frown_L" + "," + "frown_R" + "," +
            "squeeze_L" + "," + "squeeze_R" + "," +
            "wide_L" + "," + "wide_R" + "," +
            "distance_valid_C" + "," + "distance_C(mm)" + "," +
            "track_imp_cnt" + Environment.NewLine;
        File.AppendAllText("eyedata_" + "P" + UserID + "_" + "S" + scenario + ".txt", variable);
    }

    void Update()
    {
        if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING) return;

        if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == true && eye_callback_registered == false)
        {
            SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = true;
        }
        else if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback == false && eye_callback_registered == true)
        {
            SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }
        float timeNow = Time.realtimeSinceStartup;
    }

    private void OnDisable()
    {
        Release();
    }

    void OnApplicationQuit()
    {
        Release();
    }

    /// <summary>
    /// Release callback thread when disabled or quit
    /// </summary>
    private static void Release()
    {
        if (eye_callback_registered == true)
        {
            SRanipal_Eye.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
            eye_callback_registered = false;
        }
    }

    /// <summary>
    /// Required class for IL2CPP scripting backend support
    /// </summary>
    internal class MonoPInvokeCallbackAttribute : System.Attribute
    {
        public MonoPInvokeCallbackAttribute() { }
    }

    /// <summary>
    /// Eye tracking data callback thread.
    /// Reports data at ~120hz
    /// MonoPInvokeCallback attribute required for IL2CPP scripting backend
    /// </summary>
    /// <param name="eye_data">Reference to latest eye_data</param>
    [MonoPInvokeCallback]
    private static void EyeCallback(ref EyeData_v2 eye_data)
    {
        EyeParameter eye_parameter = new EyeParameter();
        SRanipal_Eye_API.GetEyeParameter(ref eye_parameter);
        eyeData = eye_data;
        // do stuff with eyeData...
        //lastTime = currentTime;
        //currentTime = eyeData.timestamp;
        MeasureTime = DateTime.Now.Ticks;
        time_stamp = eyeData.timestamp;
        frame = eyeData.frame_sequence;
        eye_valid_L = eyeData.verbose_data.left.eye_data_validata_bit_mask;
        eye_valid_R = eyeData.verbose_data.right.eye_data_validata_bit_mask;
        openness_L = eyeData.verbose_data.left.eye_openness;
        openness_R = eyeData.verbose_data.right.eye_openness;
        pupil_diameter_L = eyeData.verbose_data.left.pupil_diameter_mm;
        pupil_diameter_R = eyeData.verbose_data.right.pupil_diameter_mm;
        pos_sensor_L = eyeData.verbose_data.left.pupil_position_in_sensor_area;
        pos_sensor_R = eyeData.verbose_data.right.pupil_position_in_sensor_area;
        gaze_origin_L = eyeData.verbose_data.left.gaze_origin_mm;
        gaze_origin_R = eyeData.verbose_data.right.gaze_origin_mm;
        gaze_direct_L = eyeData.verbose_data.left.gaze_direction_normalized;
        gaze_direct_R = eyeData.verbose_data.right.gaze_direction_normalized;
        gaze_sensitive = eye_parameter.gaze_ray_parameter.sensitive_factor;
        frown_L = eyeData.expression_data.left.eye_frown;
        frown_R = eyeData.expression_data.right.eye_frown;
        squeeze_L = eyeData.expression_data.left.eye_squeeze;
        squeeze_R = eyeData.expression_data.right.eye_squeeze;
        wide_L = eyeData.expression_data.left.eye_wide;
        wide_R = eyeData.expression_data.right.eye_wide;
        distance_valid_C = eyeData.verbose_data.combined.convergence_distance_validity;
        distance_C = eyeData.verbose_data.combined.convergence_distance_mm;
        track_imp_cnt = eyeData.verbose_data.tracking_improvements.count;
        //track_imp_item = eyeData.verbose_data.tracking_improvements.items;

        // Convert the measured data to string data to write in a text file.
        string value =
            MeasureTime.ToString() + "," + time_stamp.ToString() + "," + frame.ToString() + "," +
            eye_valid_L.ToString() + "," + eye_valid_R.ToString() + "," +
            openness_L.ToString() + "," + openness_R.ToString() + "," +
            pupil_diameter_L.ToString() + "," + pupil_diameter_R.ToString() + "," +
            pos_sensor_L.x.ToString() + "," + pos_sensor_L.y.ToString() + "," + pos_sensor_R.x.ToString() + "," + pos_sensor_R.y.ToString() + "," +
            gaze_origin_L.x.ToString() + "," + gaze_origin_L.y.ToString() + "," + gaze_origin_L.z.ToString() + "," +
            gaze_origin_R.x.ToString() + "," + gaze_origin_R.y.ToString() + "," + gaze_origin_R.z.ToString() + "," +
            gaze_direct_L.x.ToString() + "," + gaze_direct_L.y.ToString() + "," + gaze_direct_L.z.ToString() + "," +
            gaze_direct_R.x.ToString() + "," + gaze_direct_R.y.ToString() + "," + gaze_direct_R.z.ToString() + "," +
            gaze_sensitive.ToString() + "," +
            frown_L.ToString() + "," + frown_R.ToString() + "," +
            squeeze_L.ToString() + "," + squeeze_R.ToString() + "," +
            wide_L.ToString() + "," + wide_R.ToString() + "," +
            distance_valid_C.ToString() + "," + distance_C.ToString() + "," +
            track_imp_cnt.ToString() +
            //track_imp_item.ToString() +
            Environment.NewLine;
        File.AppendAllText("eyedata_" + "P" + UserID + "_" + "S" + scenario + ".txt", value);
    }
}
```

An example of the extracted data is also attached: eyedata_P2_S0.txt. The data looks great; however, it is important for my experiment to collect the eye data with respect to Unix time. The main reason is that I have another script that collects the HMD's motion (position and rotation), and the data collected by that script starts from frame 1, 2, 3 and so on. The only way for me to compare and analyze the HMD's motion against the eye data is to align the two datasets with respect to time. Therefore, I want to collect eye data starting from frame 1, 2, 3, 4 and so on as soon as I enter play mode. If anyone can help me with this, I would be highly grateful! Thank you. 🙂 Note: I am using SRanipal Runtime version 1.3.2.0
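A minimal sketch of one way to get a shared time axis, assuming both loggers can call into the same helper (UnixClock is a hypothetical name; DateTimeOffset is plain .NET and, unlike UnityEngine APIs, is safe to call from the eye-data callback thread):

```csharp
using System;

// Hypothetical shared-clock helper: if both the eye-data logger and the HMD
// motion logger stamp each row with the same Unix-epoch clock, the two files
// can be joined on time afterwards.
public static class UnixClock
{
    // Milliseconds since 1970-01-01 UTC. Plain .NET, so it is safe to call
    // from the SRanipal callback thread as well as from Update().
    public static long NowMs() => DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
}
```

Under these assumptions, replacing `MeasureTime = DateTime.Now.Ticks;` in EyeCallback() with `MeasureTime = UnixClock.NowMs();` (and relabeling the `time(100ns)` header column to something like `unix_time(ms)`) would let the eye data file and the HMD motion file be merged on a common Unix-time axis.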
  9. Hello everyone, I am also encountering the same problem as @ryanpl89. @eugie and @bfalandays, do you have any suggestions on how to resolve it? I hope you can help... Thank you.
  10. Hi @chengnay, No specific reason, just curiosity. Also, now that the Eye Tracking SDK works for me, I would like to know how to extract eye data in my project. I saw a sample script on one of the Vive forums on how to extract eye data, but it is not working for me. Can you please advise me on how I can extract the eye data? Please find the link to the forum thread:
  11. Hi @chengnay, So, I reached out to someone with Unity experience to visit my office and check the issue, and she managed to resolve the DLL error. She mentioned that the issue might come from a bug in the XR plugin when using OpenXR, so instead I installed the SteamVR plugin in my project. If you have any suggestions on how to enable OpenVR (without using the SteamVR plugin) instead of OpenXR, please let me know. Please find below the summary of our troubleshooting session:

     Problem: A compatibility issue between OpenXR (Unity) and the SRanipal SDK that causes the error "DllNotFoundException: Assets/ViveSR/Plugins/SRanipal.dll" to appear.

     Solution: Uninstall XR from Unity and install SteamVR to use OpenVR instead of OpenXR. But at this stage of your project the above steps caused issues and errors, so as a second plan I did the following:
     1. Created a new project that uses URP.
     2. Installed SteamVR in this project.
     3. Imported the SRanipal SDK into this project.
     4. Exported your project as a package, with all the XR- and SRanipal-SDK-related folders unchecked.
     5. Imported that package into the new project.
     6. Because you were using a player provided by XR Interactions, I had to delete it too and replace it with the "Player" prefab from SteamVR, which has many more functionalities (for example, hand movements).
     7. Tested it and made sure everything was working as intended.
  12. Hi @chengnay, Please see below:

     Alienware Aurora Ryzen Edition
       • Device name: DESKTOP-CSTO532
       • Processor: AMD Ryzen 9 5900 12-Core Processor, 3.00 GHz
       • Installed RAM: 64.0 GB
       • System type: 64-bit operating system, x64-based processor
       • Pen and touch: No pen or touch input is available for this display

     Please let me know if you require any further information. Thanks.
  13. Hi @chengnay, My External Script Editor was not set to Visual Studio 2019 at first, so I changed it and made my settings identical to yours. Unfortunately, the error did not go away... Do you have any other advice on how to resolve my issue? Thanks.
  14. Hi @chengnay, Which Visual Studio version are you using? Maybe it has something to do with Visual Studio?
  15. Hi @chengnay, I tried uninstalling the SRanipal runtime, rebooting, and reinstalling it, but the issue still remains. It is very strange that the error did not show up on your colleague's computer... Do you have any other suggestions?