mrk88

Members
  • Content Count: 22
  • Joined
  • Last visited

Community Reputation: 1 Neutral

About mrk88
  • Rank: Explorer

  1. Thanks, but my question is how to calculate gaze position from the gaze direction vector. For the velocity calculation I need start and end gaze positions.
  2. The problem with denormalizing is that I only have the height and width of the display for each eye, but the gaze direction vector is 3D. How do I map the Z coordinate to the display resolution? (See the plane-projection sketch after this list.)
  3. Hello, I am trying to calculate instantaneous gaze velocity using gaze position. I found out that pupil_position_in_sensor_area does not indicate the true gaze position, so I have to use the gaze direction vector, which is normalized. How can I calculate instantaneous gaze velocity from the normalized gaze direction vector? Do I need to denormalize it, and if so, how? (See the angular-velocity sketch after this list.) Thank you so much for any ideas/help! @Corvus @Daniel_Y
  4. Hi Somnath, I am also interested in knowing how to denormalize the gaze direction vector. Have you found out anything?
  5. So pupil position in the sensor area cannot be used for gaze velocity calculation? What gaze-position data can we use to calculate gaze speed? I am interested in building a saccade-detection algorithm, which is based on gaze speed (for which I need gaze position); see the angular-velocity sketch after this list. @Daniel_Y
  6. Thanks, I'm using Unity Recorder to record the HMD (with Capture set to "GameView" and output resolution set to "Match Window Size").
  7. Also, why is the image resolution in the post above (536 x 1076) lower than the Vive Pro HMD resolution? Is that a Unity setting?
  8. Dear all, does the gaze position data (pupil_position_in_sensor_area) correspond to the exact gaze position on the image being viewed? Even when I look at the rightmost position in the HMD, the gaze position values (R and L) don't go above 0.7XXX. And when I map the points onto the image, they don't match; take a look at the attached image. The red crosses show the gaze position during that frame, but I was looking at the yellow sphere the whole time. What does pupil position in sensor area actually mean? Is it different from the corresponding position on the image being viewed? (See the gaze-ray sketch after this list.)
  9. Hello! I am trying to measure the eye tracker's latency (the difference between when an eye sample is captured on the sensor and the current time on the computer). Does anybody know what this device timestamp actually represents? I know it is a number in milliseconds, but what clock is it relative to? It should be comparable with the PC time in ms. Would the headset's clock reset on every restart of the headset? Is the timing model similar to Tobii's; does this document from Tobii apply to the Vive Pro as well? Is there any more information on latency (not forum posts, but official, citable manuals/documents released by the Vive Pro Eye team)? (See the clock-comparison sketch after this list.) Thanks. Many of us are using this headset for research, and things like timing matter a great deal (I am doing gaze-contingent work), but there is nobody to clarify these details about the device. The manual is not very helpful (and has many typos), and the last place to look for more information is this forum, where I mostly find people with the same questions and few useful answers. @Daniel_Y @Corvus
  10. ViveProEyeProducerThread.cs Thanks. I attach this script to a GameObject in my scene, but I am not able to collect any verbose data from the thread; I can only collect the verbose data in Update(), and that is where I'm confused. If we can only collect verbose data in Update(), then we are collecting it at 90 Hz, not 120. (See the callback sketch after this list.)
  11. Also, when collecting gaze data (such as below), should we collect it in a thread (same as above) or in Unity's Update function?
     gazeOriginLeft = eyeData.verbose_data.left.gaze_origin_mm;                    // gaze origin
     gazeOriginRight = eyeData.verbose_data.right.gaze_origin_mm;
     gazeDirectionLeft = eyeData.verbose_data.left.gaze_direction_normalized;      // gaze direction
     gazeDirectionRight = eyeData.verbose_data.right.gaze_direction_normalized;
     pupilDiameterLeft = eyeData.verbose_data.left.pupil_diameter_mm;              // pupil size
     pupilDiameterRight = eyeData.verbose_data.right.pupil_diameter_mm;
     pupilPositionLeft = eyeData.verbose_data.left.pupil_position_in_sensor_area;  // pupil positions
     pupilPositionRight = eyeData.verbose_data.right.pupil_position_in_sensor_area;
     eyeOpenLeft = eyeData.verbose_data.left.eye_openness;                         // eye openness
     eyeOpenRight = eyeData.verbose_data.right.eye_openness;
  12. And this is the code I collect the data with:
     void QueryEyeData()
     {
         while (Abort == false)
         {
             ViveSR.Error error = SRanipal_Eye_API.GetEyeData(ref eyeData);
             if (error == ViveSR.Error.WORK)
             {
                 logResults(frameCount);
                 logResults(eyeData.timestamp);
                 logResults(eyeData.frame_sequence);
                 frameCount++;
                 logFile.WriteLine(" ");
                 if (frameCount % 120 == 0) frameCount = 0;
             }
             Thread.Sleep(FrequencyControl);
         }
     }
  13. I am trying to poll eyeData in a thread, and I am recording eyeData.timestamp and eyeData.frame_sequence. The timestamps are not consecutive (there is an 8-9 ms difference between consecutive timestamps, with larger jumps in places), and frame_sequence has missing frames too: for example, in the samples below, frames 3506, 3511, 3518, etc. are missing. (See the frame-gap sketch after this list.)
     Frame #   eyeData.timestamp   eyeData.frame_sequence
     0         599227              3500
     1         599235              3501
     2         599243              3502
     3         599251              3503
     4         599260              3504
     5         599268              3505
     6         599285              3507
     7         599293              3508
     8         599301              3509
     9         599310              3510
     10        599326              3512
     11        599335              3513
     12        599343              3514
     13        599351              3515
     14        599360              3516
     15        599368              3517
     16        599385              3519
     17        599393              3520
     18        599401              3521
     19        599410              3522
     20        599418              3523
     21        599435              3525
     22        599443              3526
     23        599451              3527
     24        599460              3528
     25        599468              3529
     26        599476              3530
     27        599493              3532
     28        599501              3533
     29        599510              3534
     30        599518              3535
     31        599535              3537
     32        599535              3537
     33        599551              3539
     34        599560              3540
     35        599568              3541
     36        599576              3542
     37        599593              3544
     38        599601              3545
     39        599610              3546
     40        599618              3547
  14. Can you elaborate a bit more on your calculations? Are you using the difference in gaze positions to compute velocity? How did you measure the change in angle over 15 seconds?
  15. Hello! Thanks for sharing this. Can I ask how you actually measured these? Thanks
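
Angular-velocity sketch (re posts 1, 3 and 5): a normalized gaze direction does not need to be "denormalized" to yield a velocity. It is a unit vector, so the angle between two consecutive samples is the angular displacement, and dividing by the sample interval gives angular velocity in deg/s, which is the quantity most saccade detectors (e.g., a simple I-VT threshold) operate on. A minimal sketch; the 30 deg/s threshold is a common starting value from the literature, not an SDK constant.

    using UnityEngine;

    public static class GazeVelocity
    {
        // Angular velocity in deg/s between two consecutive normalized gaze
        // direction vectors sampled dtSeconds apart. Vector3.Angle returns the
        // unsigned angle in degrees and normalizes its inputs internally.
        public static float AngularVelocityDeg(Vector3 prevDir, Vector3 currDir, float dtSeconds)
        {
            return Vector3.Angle(prevDir, currDir) / dtSeconds;
        }

        // Simple I-VT style classification: an above-threshold sample is
        // treated as part of a saccade.
        public static bool IsSaccadeSample(float velocityDeg, float thresholdDeg = 30f)
        {
            return velocityDeg > thresholdDeg;
        }
    }

At the nominal 120 Hz tracker rate, dtSeconds is about 1/120 s (8.3 ms), but using the actual difference between consecutive eyeData.timestamp values is more robust, since samples can be dropped (see post 13).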
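
Plane-projection sketch (re posts 1 and 2): a direction vector has no depth, so there is no Z coordinate to map to the display resolution; instead, intersect the gaze ray with a virtual image plane and convert the intersection point to normalized screen coordinates. A minimal sketch assuming eye-space coordinates with +Z forward; planeDistance, halfWidth and halfHeight describe an assumed viewing plane and are not SRanipal parameters (for a horizontal field of view fov, halfWidth = planeDistance * tan(fov/2), and likewise vertically). Also note that SRanipal's gaze coordinate system may be mirrored relative to Unity's, so verify the sign of X against known targets.

    using UnityEngine;

    public static class GazeProjection
    {
        // Intersect a normalized gaze direction with the plane z = planeDistance
        // (eye space, +Z forward) and return [0,1] normalized coordinates, which
        // can then be multiplied by the per-eye pixel resolution.
        public static Vector2 DirectionToScreen01(Vector3 gazeDir, float planeDistance,
                                                  float halfWidth, float halfHeight)
        {
            float t = planeDistance / gazeDir.z;   // ray parameter reaching the plane
            float x = gazeDir.x * t;               // horizontal offset on the plane
            float y = gazeDir.y * t;               // vertical offset on the plane
            return new Vector2((x + halfWidth) / (2f * halfWidth),
                               (y + halfHeight) / (2f * halfHeight));
        }
    }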
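
Gaze-ray sketch (re post 8): pupil_position_in_sensor_area is the pupil's location on the eye camera's sensor, normalized to the sensor area, not a point on the image being viewed, which is why the red crosses don't land on the sphere. To find what is actually being looked at, cast the gaze ray into the scene. A minimal sketch assuming the SRanipal Unity SDK's SRanipal_Eye.GetGazeRay helper (verify that your SDK version exposes this overload) and that the targets have colliders; the 50 m ray length is an arbitrary choice.

    using UnityEngine;
    using ViveSR.anipal.Eye;

    public class GazePointSampler : MonoBehaviour
    {
        void Update()
        {
            // GetGazeRay returns a combined-eye ray in local (camera) space when valid.
            if (SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
            {
                Camera cam = Camera.main;
                Vector3 worldOrigin = cam.transform.TransformPoint(origin);
                Vector3 worldDir = cam.transform.TransformDirection(direction);

                // hit.point is the gazed-at position on the object (e.g., the yellow sphere).
                if (Physics.Raycast(worldOrigin, worldDir, out RaycastHit hit, 50f))
                    Debug.Log($"Gaze hit {hit.collider.name} at {hit.point}");
            }
        }
    }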
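
Clock-comparison sketch (re post 9): as far as the public documentation goes, eyeData.timestamp is a relative millisecond counter from the runtime, not wall-clock time in any time zone, so it cannot be compared to PC time directly, and absolute sensor-to-application latency cannot be recovered in software alone (that requires an external reference such as an artificial-pupil or photodiode rig). What software can measure is whether the tracker clock ticks at the same rate as a local monotonic clock, i.e. drift and jitter. A minimal sketch under that assumption:

    using System.Diagnostics;

    public class ClockComparison
    {
        private readonly Stopwatch local = Stopwatch.StartNew();
        private long firstDelta;
        private bool initialized;

        // Call on every received sample with its eyeData.timestamp. Returns how
        // far (local - tracker) has moved since the first sample: stable near 0
        // means the clocks tick at the same rate; a steady trend means drift;
        // scatter means arrival jitter.
        public long DriftMs(long trackerTimestampMs)
        {
            long delta = local.ElapsedMilliseconds - trackerTimestampMs;
            if (!initialized) { firstDelta = delta; initialized = true; }
            return delta - firstDelta;
        }
    }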
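
Callback sketch (re posts 10-12): polling GetEyeData from Update caps collection at the render rate (90 Hz), and polling from your own thread with Thread.Sleep drifts against the 120 Hz producer, which produces exactly the duplicated and skipped frames shown in post 13. The SDK also supports a callback that the SRanipal runtime invokes at the tracker rate. A minimal sketch assuming the SRanipal Unity SDK's WrapperRegisterEyeDataCallback and CallbackBasic (verify the names against your SDK version); the callback runs on a non-Unity thread, so only copy or enqueue data there and never touch Unity objects.

    using System.Runtime.InteropServices;
    using UnityEngine;
    using ViveSR.anipal.Eye;

    public class EyeDataCollector : MonoBehaviour
    {
        // Keep a reference to the delegate so it is not garbage-collected
        // while the native runtime still holds the function pointer.
        private static readonly SRanipal_Eye.CallbackBasic callback = EyeCallback;
        private static EyeData eyeData = new EyeData();
        private static bool registered = false;

        void Update()
        {
            if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING)
                return;
            // EnableEyeDataCallback must be ticked on the SRanipal_Eye_Framework component.
            if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && !registered)
            {
                SRanipal_Eye.WrapperRegisterEyeDataCallback(
                    Marshal.GetFunctionPointerForDelegate(callback));
                registered = true;
            }
        }

        // Invoked by the SRanipal runtime at the tracker rate (~120 Hz) on its own
        // thread: copy the struct out and queue it for logging; no Unity API calls here.
        private static void EyeCallback(ref EyeData eye_data)
        {
            eyeData = eye_data;
            // e.g. push (eye_data.timestamp, eye_data.frame_sequence) into a
            // System.Collections.Concurrent.ConcurrentQueue consumed elsewhere.
        }
    }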
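
Frame-gap sketch (re post 13): the 8-9 ms timestamp steps match the 120 Hz sample interval (1000/120 is about 8.3 ms), the 16-17 ms steps occur exactly where a frame_sequence value is missing, and rows 31-32 of the table are the same frame polled twice, all symptoms of a poll loop racing the producer. Dropped and duplicated samples can be counted directly from frame_sequence. A minimal sketch; the names are illustrative.

    public class FrameGapDetector
    {
        private int lastSequence = -1;
        public int Dropped { get; private set; }     // frames skipped entirely
        public int Duplicates { get; private set; }  // same frame delivered twice

        // Feed each sample's eyeData.frame_sequence in arrival order.
        public void Observe(int frameSequence)
        {
            if (lastSequence >= 0)
            {
                int gap = frameSequence - lastSequence;
                if (gap > 1) Dropped += gap - 1;
                else if (gap == 0) Duplicates++;
            }
            lastSequence = frameSequence;
        }
    }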