mrk88

Posts posted by mrk88

  1. 1 hour ago, Corvus said:

    @mrk88 Can you elaborate on what values you're getting that are wrong?

     

    I am using the standard Euclidean distance formula from physics to calculate the distance between two gaze position vectors:

    Mathf.Sqrt(Mathf.Pow((curPos.x - previous.x), 2) + Mathf.Pow((curPos.y - previous.y), 2) + Mathf.Pow((curPos.z - previous.z), 2));

    But this is not correct for displacement: because the vectors are normalized, the distance between them does not reflect the actual values and is not at the right scale (it is much smaller).
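    What I think I actually need, instead of the Euclidean distance, is the angular separation between the two unit vectors (a sketch, assuming both directions are normalized):

        using UnityEngine;

        public static class GazeMath
        {
            // Angular displacement (degrees) between two normalized gaze
            // directions. Vector3.Angle takes the arccosine of the clamped
            // dot product, so the result does not depend on vector magnitude.
            public static float AngularDisplacementDeg(Vector3 previousDir, Vector3 currentDir)
            {
                return Vector3.Angle(previousDir, currentDir);
            }
        }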

    I expect to get a figure like the one below for hundreds of eye samples (the top panel):

    https://www.researchgate.net/publication/235788020/figure/fig5/AS:667218010456066@1536088578181/Example-of-eye-movement-data-Vertical-eye-position-upper-panel-and-eye-velocity-lower.ppm

    But the plot I am getting looks like the attached figure. The vertical axis values should change with respect to eye fixations and saccades; when I fixate on a point for several seconds, for example, the value should remain constant there (not quickly drop to zero).

    fig1.png

  2. I would like to measure the displacement of eye movements between two eye samples.

    I was using the GazeDirection vector for that. However, since these values are normalized to [-1, 1], the values and the plot I am getting are wrong.

    Can you please explain how these values have been normalized, and how to calculate the displacement between two consecutive eye samples?

    @Corvus @Daniel_Y

  3. 1 hour ago, VibrantNebula said:

    @mrk88 - There's not a clean answer here - the Pro has "eye relief" so you can adjust the distance until it comes into focus. The bigger problem though is that everybody has different facial morphology and depending on how the HMD sits on their face, the type of facial interface you're using, and how "sunk" into their face their eyes are, there can be a pretty significant range of values across a sample pool even when you control for the eye relief.

    Okay, thank you very much for the info.

    Is there a way I can measure it and then keep it at a fixed distance? I mean, do you know of any references or suggestions about that?

    I am trying to measure the angular magnitude of an object being displaced from one position to another, and for that I need to know the distance from the subject (because it makes a big difference in the calculations). The relation I plan to use is sketched below.
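    For reference, the standard visual-angle relation (a sketch; the parameter names here are hypothetical):

        using UnityEngine;

        public static class VisualAngle
        {
            // Visual angle (degrees) subtended by a lateral displacement `d`
            // viewed from distance `D` (same units for both). Assumes the
            // displacement is roughly perpendicular to the line of sight.
            public static float OfDisplacementDeg(float d, float D)
            {
                return 2f * Mathf.Atan(d / (2f * D)) * Mathf.Rad2Deg;
            }
        }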

    • Like 1
  4. Hello

    I am trying to calculate instantaneous gaze velocity using gaze position.

    I found out that pupil_position_on_sensor does not indicate the true gaze position. Therefore I have to use the gaze direction vector, which is normalized.

    - How can I calculate instantaneous gaze velocity based on the normalized Gaze Direction Vector? (My current idea is sketched below.)

    - Do I need to denormalize it? How?
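    For reference, the calculation I have in mind (a sketch, assuming both samples are unit vectors and the sample interval is known, e.g. roughly 1/120 s at 120 Hz):

        using UnityEngine;

        public static class GazeVelocity
        {
            // Instantaneous angular gaze velocity (deg/s) from two consecutive
            // normalized gaze directions and the sample interval in seconds.
            public static float DegPerSec(Vector3 previousDir, Vector3 currentDir, float dtSeconds)
            {
                return Vector3.Angle(previousDir, currentDir) / dtSeconds;
            }
        }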

    Thank you so much for any ideas/help!

    @Corvus @Daniel_Y

  5. On 3/17/2020 at 2:11 AM, Daniel_Y said:

    There is no direct angle mapping between pupil position in the sensor and the gaze vector.

    Pupil position in the sensor is mainly used for positioning the HMD up/down/left/right and for IPD adjustment during the calibration process, to put the user's eyes in the correct position, i.e., to center the pupil on the sensor.

    This has been discussed in another thread.

    So the pupil position in the sensor area cannot be used for gaze velocity calculation?

    What gaze position-related data can we use to calculate gaze speed?

    I am interested in implementing a saccade detection algorithm, which is based on gaze speed (for which I need gaze position). The kind of detector I have in mind is sketched below.

    @Daniel_Y
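    For context, a simple velocity-threshold classifier (I-VT) is what I mean (a sketch; the 30 deg/s threshold is just a common starting value, not a calibrated constant):

        public static class SaccadeDetector
        {
            // I-VT sketch: classify each sample as saccade (true) or fixation
            // (false) by comparing its angular velocity against a threshold.
            public static bool[] Classify(float[] velocitiesDegPerSec, float thresholdDegPerSec = 30f)
            {
                var isSaccade = new bool[velocitiesDegPerSec.Length];
                for (int i = 0; i < velocitiesDegPerSec.Length; i++)
                    isSaccade[i] = velocitiesDegPerSec[i] > thresholdDegPerSec;
                return isSaccade;
            }
        }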

     

  6. Dear all,

    Does the gaze position data (pupil_position_in_sensor_area) correspond to the exact gaze position on the image being viewed?

    - It appears that even when I look at the rightmost position of the HMD, the values for gaze position (R and L) don't go above 0.7XXX.

    - When I map the points onto the image, they don't match! Take a look at the attached image. The red crosses show the gaze position during that frame, but I was looking at the yellow sphere the whole time.

    - What does pupil position in sensor area mean? Is it different from the corresponding gaze position on the image being viewed? The mapping I expected to work is sketched below.
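    For comparison, the mapping I expected (a sketch, assuming the normalized gaze direction is expressed in the camera's local frame and can be projected into the scene with a raycast):

        using UnityEngine;

        public class GazePointFinder : MonoBehaviour
        {
            public Camera hmdCamera;

            // Sketch: transform a local-space gaze direction into world space
            // and raycast into the scene; the hit point is the gazed-at position.
            public bool TryGetGazePoint(Vector3 localGazeDir, out Vector3 worldPoint)
            {
                Vector3 worldDir = hmdCamera.transform.TransformDirection(localGazeDir);
                if (Physics.Raycast(hmdCamera.transform.position, worldDir, out RaycastHit hit))
                {
                    worldPoint = hit.point;
                    return true;
                }
                worldPoint = Vector3.zero;
                return false;
            }
        }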


    frame.jpg

  7. Hello!

    I am trying to measure the eye tracker latency (the difference between when an eye sample is captured by the system sensor and the current time of the computer). Does anybody know what this device timestamp actually represents? I know it is a number in milliseconds, but what is its reference point? It needs to be comparable with the PC time in ms. The side-by-side logging I have in mind is sketched after my questions below.

    - Would the headset's clock reset on every restart of the headset?

    - Is timing handled similarly to Tobii? Does this document from Tobii apply to the timing in the Vive Pro as well?

    - Is there any more information on latency? (not forum posts, but official, citable manuals/documents released by the Vive Pro Eye team)
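    For reference, the logging sketch I mean (the epoch of the device timestamp is undocumented as far as I can tell, so only the drift of the difference over time would be meaningful):

        using System.Diagnostics;

        public static class LatencyLogger
        {
            static readonly Stopwatch clock = Stopwatch.StartNew();

            // Pair the device timestamp with a local monotonic clock (ms).
            // Plotting (local ms - device ms) over time shows the drift.
            public static string Line(int deviceTimestampMs)
            {
                return clock.ElapsedMilliseconds + "," + deviceTimestampMs;
            }
        }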

    Thanks

    Many others and I are using this headset for research, and details like timing are quite important to us (I am doing gaze-contingent work), yet unfortunately there is nobody to clarify these things about the device!

    The manual is not very helpful (and has many typos!), and the last place I can look for more information is this forum, where I only find more people with questions like mine and not many useful responses!

     

    @Daniel_Y @Corvus

    • Like 1
  8. 10 minutes ago, alwyuyang said:

    Could you upload the script showing how you create the thread and read and write the eye data?

    ViveProEyeProducerThread.cs

     

    Thanks. I attach this script to a GameObject in my scene.

    I am not able to collect any verbose data while using a thread; I can only collect verbose data in the Update function, and that is where I'm confused. If we can only collect verbose data in Update(), then we will be collecting it at 90 Hz, not 120 Hz. The handoff I would expect to work is sketched below.
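    For reference, the thread-to-Update handoff I have in mind is just a copy under a lock (a sketch, assuming EyeData is the value-type struct from the SRanipal SDK, so assignment copies it):

        using ViveSR.anipal.Eye;

        public class EyeDataBuffer
        {
            private readonly object gate = new object();
            private EyeData latest;

            // Called from the 120 Hz polling thread.
            public void Write(EyeData data)
            {
                lock (gate) { latest = data; }
            }

            // Called from Unity's Update (frame rate, e.g. 90 Hz); returns the
            // newest sample. Samples between frames are still seen by the thread.
            public EyeData Read()
            {
                lock (gate) { return latest; }
            }
        }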

  9. Also, when collecting gaze data (such as below), should we collect it in a thread (same as above) or in Unity's Update function?

     

        gazeOriginLeft = eyeData.verbose_data.left.gaze_origin_mm;                   // gaze origin
        gazeOriginRight = eyeData.verbose_data.right.gaze_origin_mm;
        gazeDirectionLeft = eyeData.verbose_data.left.gaze_direction_normalized;     // gaze direction
        gazeDirectionRight = eyeData.verbose_data.right.gaze_direction_normalized;
        pupilDiameterLeft = eyeData.verbose_data.left.pupil_diameter_mm;             // pupil size
        pupilDiameterRight = eyeData.verbose_data.right.pupil_diameter_mm;
        pupilPositionLeft = eyeData.verbose_data.left.pupil_position_in_sensor_area; // pupil position
        pupilPositionRight = eyeData.verbose_data.right.pupil_position_in_sensor_area;
        eyeOpenLeft = eyeData.verbose_data.left.eye_openness;                        // eye openness
        eyeOpenRight = eyeData.verbose_data.right.eye_openness;

     

  10. And this is the code I use to collect the data:

     

        void QueryEyeData()
        {
            // Poll eye data on a background thread until aborted.
            while (Abort == false)
            {
                ViveSR.Error error = SRanipal_Eye_API.GetEyeData(ref eyeData);
                if (error == ViveSR.Error.WORK)
                {
                    // Log the sample index, device timestamp, and frame sequence.
                    logResults(frameCount);
                    logResults(eyeData.timestamp);
                    logResults(eyeData.frame_sequence);
                    frameCount++;
                    logFile.WriteLine(" ");
                    if (frameCount % 120 == 0)
                        frameCount = 0;
                }
                Thread.Sleep(FrequencyControl);
            }
        }
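    For completeness, this is how the polling thread is started and stopped (a sketch; Abort and FrequencyControl are the same fields used above):

        using System.Threading;
        using UnityEngine;

        public class EyeDataPoller : MonoBehaviour
        {
            private Thread dataThread;
            private volatile bool Abort = false;

            void Start()
            {
                // Start polling on a background thread so sampling is not
                // tied to Unity's frame rate.
                dataThread = new Thread(QueryEyeData);
                dataThread.Start();
            }

            void OnDestroy()
            {
                // Signal the loop to exit and wait for the thread to finish.
                Abort = true;
                dataThread.Join();
            }

            void QueryEyeData() { /* polling loop as above */ }
        }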

     

  11. I am trying to poll eyeData in a thread, and I am recording eyeData.timestamp and eyeData.frame_sequence.

    - It seems that the timestamps are not consecutive (there is a difference of 7 or 8 between consecutive timestamps). If the unit is milliseconds, that spacing would at least be consistent with the roughly 8.33 ms period of a 120 Hz tracker.

    - Also, the frame counts have missing frames. For example, in the samples below, frame numbers 3506, 3511, 3518, etc. are missing (a small helper for counting such drops is sketched after the data):

    Frame #, eyeData.timestamp, eyeData.frame_sequence

    0 599227 3500  
    1 599235 3501  
    2 599243 3502  
    3 599251 3503  
    4 599260 3504  
    5 599268 3505  
    6 599285 3507  
    7 599293 3508  
    8 599301 3509  
    9 599310 3510  
    10 599326 3512  
    11 599335 3513  
    12 599343 3514  
    13 599351 3515  
    14 599360 3516  
    15 599368 3517  
    16 599385 3519  
    17 599393 3520  
    18 599401 3521  
    19 599410 3522  
    20 599418 3523  
    21 599435 3525  
    22 599443 3526  
    23 599451 3527  
    24 599460 3528  
    25 599468 3529  
    26 599476 3530  
    27 599493 3532  
    28 599501 3533  
    29 599510 3534  
    30 599518 3535  
    31 599535 3537  
    32 599535 3537  
    33 599551 3539  
    34 599560 3540  
    35 599568 3541  
    36 599576 3542  
    37 599593 3544  
    38 599601 3545  
    39 599610 3546  
    40 599618 3547   
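    The helper I mean for quantifying drops from the frame_sequence column (a sketch; exact duplicates, like frame 3537 above, are not counted as drops):

        public static class FrameDropCounter
        {
            // Count frames missing between consecutive frame_sequence values.
            // A gap of 1 is normal; a gap of 2 means one frame was dropped.
            public static int CountDropped(long[] frameSequence)
            {
                int dropped = 0;
                for (int i = 1; i < frameSequence.Length; i++)
                {
                    long gap = frameSequence[i] - frameSequence[i - 1];
                    if (gap > 1)
                        dropped += (int)(gap - 1);
                }
                return dropped;
            }
        }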

     

  12. On 8/14/2019 at 4:45 AM, gazeBehavior said:

     

    Here are the latency measurements taken by one of our students.  Images reflect the azimuthal component of the head and eye during vestibulo-ocular reflex (while fixating a stable object and rotating one's head about the vertical axis).  The first figure shows the original signals - deg/s over time.  The second shows the results of the cross correlation, and the third shows a zoomed in look at the signals before/after adjustment for the measured latency.  

    The two signals were maximally correlated with 83 ms of shift, meaning that there was 83 ms of latency in the eye tracker relative to the head tracker. Vive head tracking is typically 22-33 ms absolute latency, so that puts the ET above 100 ms of absolute latency.  
     
    We have only measured this once.  Really, we should measure a few more times, with different people.  We could also learn more by testing for possible effects of rotational velocity, and GPU load on the amount of latency.  
     
    An effect of the former (head velocity) would indicate that this latency results from a dynamic filter. This seems unlikely, because if it were, they would presumably just let us turn it off.
     
    It's more likely attributed to the core functions of the eye tracking pipeline, such as pupil segmentation in the 2D imagery, or gaze mapping (which converts from 2D pupil centroids to 3D gaze vectors). If so, I would expect the latency to increase with GPU load.
     
    Something to test, anyhow. Or, HTC could jump in and say a bit more...
     

    image (1).png

    image (2).png

    image.png

    Hello!

    Thanks for sharing this. Can I ask how you actually measured these?

     

    Thanks
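    (My reading of the cross-correlation step described above, as a sketch: assuming both velocity signals are resampled to one common rate, the best lag in samples converts to latency via the sample interval.)

        public static class LagFinder
        {
            // Shift the eye signal against the head signal and return the lag
            // (in samples) that maximizes their raw cross-correlation.
            public static int BestLagSamples(float[] eyeVel, float[] headVel, int maxLag)
            {
                int bestLag = 0;
                double bestCorr = double.NegativeInfinity;
                for (int lag = -maxLag; lag <= maxLag; lag++)
                {
                    double corr = 0;
                    for (int i = 0; i < eyeVel.Length; i++)
                    {
                        int j = i + lag;
                        if (j >= 0 && j < headVel.Length)
                            corr += eyeVel[i] * headVel[j];
                    }
                    if (corr > bestCorr) { bestCorr = corr; bestLag = lag; }
                }
                return bestLag; // latency = bestLag * sample interval
            }
        }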

  13. Hi,

    I have been using the Vive Pro on my current PC (Win 7, Intel Core i7 CPU 3.4 GHz, 16 GB RAM, and an NVIDIA GeForce GTX TITAN X graphics card).

    Everything worked well until August, and I did not use it again until now. Now I just keep getting the "Initialization Failed" error on my calibration screen.

    I had a look at the solutions here and tried them all, but none of them seems to help.

    I suspect that all these updates resulted in this failure. Have there been any other updates in the past 3-4 months that I have missed? This is my school project and I really need to speed up my work; I can't afford to be stuck on just a calibration failure for long.

     

    Thanks for any help/suggestions.

    @Corvus @Daniel_Y

  14. On 9/6/2019 at 2:08 PM, Corvus said:

    Vive Pro Eye Calibration Initialization Error Troubleshooting

    Problem:
    Launching calibration shows error: "Initialization Failed"

    Solutions:

    - Power off/on PC & Link Box.
    - Run 'sr_runtime.exe' as Admin. Default install path: 'C:\Program Files (x86)\VIVE\SRanipal'.
    - Update SteamVR Runtime.
    - Update graphics card drivers.
    - Possible issue with some models of MSI laptops, fixed by rolling back to an earlier NVIDIA driver: do a fresh install of NVIDIA driver 417.71 (the GPU BIOS is not updated and does not support the latest NVIDIA driver).
    - Uninstall 'VIVE_SRanipalInstaller' & 'Tobii VRU02 Runtime'. Restart and install the latest 'VIVE_SRanipalInstaller_1.1.0.1.msi'. Plug in the HMD and wait for any updates to complete.
    - Update the motherboard's integrated Intel graphics driver (fixes the system referencing the incorrect OpenCL.dll (Intel) instead of NVIDIA).
    - Disable the integrated graphics card.
    - Possible issue with the wireless adapter; try running calibration wired.
    - Possible issues with early dev kits.

    Check System Requirements:
    - Use DisplayPort, mini-DisplayPort, or USB-C output from the dedicated GPU.
    - Windows 8.1 or later (64-bit).

     

    I have Win 7. I was able to do the calibration in July without any problems, but when I try it now, I get the Initialization Error. Could it be because I need to upgrade to Win 8 or later?

    But it worked fine before with my current Windows!

    @Corvus @Daniel_Y
