
Hank_Li

Employee
  • Posts: 15

Everything posted by Hank_Li

  1. Hi @VGagliano Yes, it was. Please follow the sample code below to get system timestamps:

     ```cpp
     int result = ViveSR::anipal::Eye::SRanipal_UpdateTimeSync();
     if (result == ViveSR::Error::WORK) {
         int64_t time;
         ViveSR::anipal::Eye::SRanipal_GetSystemTime(&time);
         printf("%lld\n", (long long)time);  // int64_t printed via long long
     }
     ```
  2. Hi @patrickabroad: Version 2 has two more blendshapes than Version 1: eye_wide and eye_squeeze. eye_wide is a value representing how wide the eye is open; eye_squeeze is a value representing how tightly the eye is squeezed shut. Thank you.
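     To illustrate, here is a minimal, self-contained sketch of how those two values could be interpreted. It assumes (like the SDK's other blendshapes) each value is normalized to [0, 1]; the `describe_eyelid` helper and its 0.5 thresholds are hypothetical illustration, not part of the SRanipal API:

     ```cpp
     #include <cstdio>

     // Hypothetical helper: classify eyelid state from the two Version 2
     // blendshapes. The 0.5 thresholds are arbitrary illustration values.
     const char* describe_eyelid(float eye_wide, float eye_squeeze) {
         if (eye_squeeze > 0.5f) return "squeezed shut"; // eye closed tightly
         if (eye_wide > 0.5f)    return "wide open";     // eye opened wide
         return "neutral";
     }

     int main() {
         printf("%s\n", describe_eyelid(0.9f, 0.0f));
         printf("%s\n", describe_eyelid(0.0f, 0.8f));
         printf("%s\n", describe_eyelid(0.1f, 0.1f));
         return 0;
     }
     ```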
  3. Hi @cte What do you mean by the local address? And what data do you want to get? Thank you.
  4. Hi @goose_r_s We will fix it in the next version. In the meantime, maybe you can refer to this article. Thank you.
  5. Hi @qxxxb: We will fix it in the next version. Thank you.
  6. Hi Asish, We do not provide an API for calculating the number of blinks per minute. Maybe you can calculate it yourself from the value of the blink blendshape. Thanks
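     That calculation could be sketched as follows. This is a self-contained illustration, not SDK code: it assumes you have already sampled the blink blendshape into a stream of values normalized to [0, 1]; `count_blinks`, `blinks_per_minute`, and the 0.5 threshold are hypothetical names and values:

     ```cpp
     #include <cstdio>
     #include <vector>

     // Count blinks as rising edges: each time the blink blendshape crosses
     // the threshold from below, count one blink.
     int count_blinks(const std::vector<float>& blink_values, float threshold = 0.5f) {
         int blinks = 0;
         bool closed = false;
         for (float v : blink_values) {
             if (!closed && v >= threshold)      { blinks++; closed = true; }
             else if (closed && v < threshold)   { closed = false; }
         }
         return blinks;
     }

     // Scale a blink count over a sampling window to blinks per minute.
     double blinks_per_minute(int blinks, double window_seconds) {
         return blinks * 60.0 / window_seconds;
     }

     int main() {
         // Two blinks observed over a 10-second window -> 12 blinks/minute.
         std::vector<float> samples = {0.0f, 0.9f, 0.1f, 0.0f, 0.8f, 0.2f};
         int blinks = count_blinks(samples);
         printf("%d blinks -> %.1f per minute\n", blinks, blinks_per_minute(blinks, 10.0));
         return 0;
     }
     ```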
  7. Hi @Av We suggest you run the calibration process first and then try again. Thanks
  8. Hi @banila2 Please send me your log file, located at C:\Users\user_name\AppData\LocalLow\HTC Corporation\SR_Logs. Thanks
  9. Hi @nbhatia In the current version, there is no way yet to get the system timestamps. We will implement that in the next version. Thanks
  10. Hi @DustProductions According to the log message you provided, it appears the device could not collect your eye data during the last part. Please make sure you can clearly see the point. Thanks
  11. Hi @Asish Could you take a picture or record the process of your calibration? Thanks
  12. Hi @Asish According to the log file you provided, it appears the device could not collect your eye data during the last part. Please make sure you can clearly see the point. Thanks
  13. Hi @hollandera Please refer to this Unity code; with it you can get the nose tracking you want in Unity:

      ```csharp
      int dart_board_layer_id = LayerMask.NameToLayer("NoReflection");
      Vector3 direction = Vector3.forward;
      Vector3 origin = Vector3.zero;
      Ray rayGlobal = new Ray(Camera.main.transform.position,
                              Camera.main.transform.TransformDirection(direction));
      RaycastHit hit;
      Physics.Raycast(rayGlobal, out hit, 20, (1 << dart_board_layer_id));
      ```

      You can download the Unity sample here: https://developer.vive.com/resources/knowledgebase/vive-sranipal-sdk/ Thank you.
  14. Hi @hollandera Do you want to use the SRanipal functions in Unity code on a Vive Pro without eye-tracking hardware? Thanks.
  15. Hi Nikki: If you want to initialize the Eye V2 engine:

      1. Please do not modify any header file we provide. E.g. in SRanipal_Eye.h, the following line should remain:

         ```cpp
         const int ANIPAL_TYPE_EYE_V2 = 2;
         ```

      2. You need a separate flag to check whether Eye V2 is enabled, e.g.:

         ```cpp
         int error = ViveSR::anipal::Initial(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE_V2, NULL);
         if (error == ViveSR::Error::WORK) {
             EnableEyeV2 = true;
         }
         ```

         Then, in the function void streaming(), you need separate code to handle Eye V2 data, e.g.:

         ```cpp
         if (EnableEyeV2) {
             int result = ViveSR::anipal::Eye::GetEyeData_v2(&eye_data_v2);
             if (result == ViveSR::Error::WORK) {
                 float *gaze = eye_data_v2.verbose_data.left.gaze_direction_normalized.elem_;
                 printf("[Eye v2] Wide: %.2f %.2f \n",
                        eye_data_v2.expression_data.left.eye_wide,
                        eye_data_v2.expression_data.right.eye_wide);
             }
         }
         ```