Posts posted by Daniel_Y

  1. 2 hours ago, nini613 said:

    @Corvus @Daniel_Y

    And I've noticed that the document describes the timestamp as "The time when the frame was captured, in milliseconds."

    6. Is the timestamp comparable with local time? I've converted the timestamp to local time, but it doesn't match.

    7. Every time I restart the headset I get a different value. Does the headset's clock reset on every restart?

    Looking forward to your reply. Thank you.

    The timestamp is generated on the HMD, and it is reset when you power the headset off and on, so it will not match your local (wall-clock) time. It is best used for measuring intervals between frames, as in the sketch below.
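
    For example, here is a minimal sketch of measuring the time between two eye frames inside the data callback. It assumes the timestamp field of EyeData is an integer millisecond counter as described in the document, and that the header name matches the SDK's include folder; please verify both against the C sample enclosed in the SDK.

    #include <cstdio>
    #include "SRanipal_Eye.h"  // header name as in the SDK's include folder

    // Sketch: print the interval between consecutive eye frames using the
    // HMD-generated timestamp. The absolute value is only meaningful
    // relative to the headset's own clock, not to local time.
    void eye_callback(ViveSR::anipal::Eye::EyeData const &eye_data)
    {
        static int last_timestamp = -1;
        if (last_timestamp >= 0) {
            int delta_ms = eye_data.timestamp - last_timestamp;
            printf("[Eye] %d ms since the previous frame\n", delta_ms);
        }
        last_timestamp = eye_data.timestamp;
    }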

  2. On 8/4/2020 at 3:46 PM, nini613 said:

    @Corvus Thank you very much for the reply!!

    But I still have a few questions:

    1. I've noticed this figure (VIVE Pro Eye uses the "right-handed coordinate system"), and I guess it means that if the user looks to the right, the x component of the right/left gaze direction should be positive. I tested it as in the figure: without moving my head I looked to the right, but the x component of gaze_direction was negative. What do the (0,1) values and the negative/positive values mean? How should I interpret them?

    2. Does the coordinate system in the figure describe a position relative to the HMD? How should I interpret the coordinates of gaze_direction if the user turns their head left/right?

    3. Although I know that Gaze_Origin means the starting point of Gaze_Direction, I still don't know how to interpret the value. What do the (0,1) values and the negative/positive values mean?

    4. I know that the CombinedEyeData struct is not supported yet. Is the combined origin equal to "eye_data.verbose_data.combined.eye_data.gaze_origin_mm.elem_"? What do the (0,1) values and the negative/positive values mean?

    Thank you very much for helping me, I am looking forward to your reply.

    @Corvus @Daniel_Y

    [Attached figures]

    • The eye gaze goes outward from the eye, so the +Z direction points outward from the eye. You therefore need to rotate the coordinate system shown in your figure to match the right-handed convention.
    • (0,1) means the gaze vector is normalized to a unit vector, so each component lies between -1 and 1 (see the sketch below).
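
    As an illustration only (this is not SDK code), the sketch below turns the normalized gaze direction into horizontal/vertical angles. It assumes +Y is up in the same right-handed frame with +Z pointing outward from the eye; under that assumption +X points to the user's left, which is consistent with the negative x you observed when looking to the right.

    #include <cmath>
    #include <cstdio>

    // gaze: normalized gaze direction from the SDK (right-handed, +Z outward
    // from the eye, +Y assumed up). With this handedness +X points to the
    // user's left, so looking right gives a negative x and a negative yaw.
    void print_gaze_angles(const float gaze[3])
    {
        const float rad_to_deg = 57.29578f;
        float yaw   = std::atan2(gaze[0], gaze[2]) * rad_to_deg; // left(+) / right(-)
        float pitch = std::asin(gaze[1]) * rad_to_deg;           // up(+) / down(-)
        printf("yaw %.1f deg, pitch %.1f deg\n", yaw, pitch);
    }
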
  3. @summerlemon

    1. It means the value is normalized to a unit vector, as in the figure, in the right-handed coordinate system; it represents your eye gaze direction, not the object's global position.

    2. Yes, Pupil_Position is 2D information representing the pupil position on the eye-tracking camera sensor. Its main purpose is to help you adjust the HMD into the right wearing position.

    3. I'm not sure exactly how you tested and logged it. Head movement does not imply eye gaze movement.

    4. Use DecodeBitMask() to check data validity, as shown below.

    // Check whether the left eye's gaze origin is valid before using it.
    if (DecodeBitMask(eye_data.verbose_data.left.eye_data_validata_bit_mask, ViveSR::anipal::Eye::SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY))
        printf("SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY == true\n");
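
    In case your sample does not already define DecodeBitMask, it is just a bit test on the validity mask; a hypothetical equivalent is sketched below (check the sample code in the SDK for the exact helper).

    #include <cstdint>

    // Hypothetical equivalent of the sample's DecodeBitMask helper: test
    // whether the bit at 'position' is set in the validity bit mask.
    bool DecodeBitMask(const uint64_t mask, const int position)
    {
        return (mask & (uint64_t(1) << position)) != 0;
    }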

     

    [Attached figure: Gaze.png]

     

  4. 12 hours ago, Dana said:

    I get the same crash as above, both when running the sample application and when playing from Unity.

    SteamVR version: 1.12.5 (I'm not sure whether this is a beta or an official release). Is there a way to downgrade to 1.11?

    Unity: 2019.3.15

    HTC Vive Pro

    Latest SRWorks SDK


    @Daniel_Y

    I have tested SRWorks v0.9.3.0 with SteamVR 1.12.5 and it works.

    If running the sample application fails, could you zip the logs under C:\Users\$(user_name)\AppData\LocalLow\HTC Corporation and send them to us?

  5. It may look like the code snippet below.

     

    void callback(ViveSR::anipal::Eye::EyeData const &eye_data)
    {
        float const *gaze = eye_data.verbose_data.left.gaze_direction_normalized.elem_;
        printf("[Eye] Gaze: %.2f %.2f %.2f\n", gaze[0], gaze[1], gaze[2]);
    }

    int main()
    {
        // ... initialization ...

        int ret = RegisterEyeDataCallback(callback);

        while (looping)
            dosomething();

        // ... cleanup ...
    }
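
    For completeness, a rough sketch of what the elided setup might look like. The initialization and release calls below (ViveSR::anipal::Initial / Release with ANIPAL_TYPE_EYE) and the header names are written from memory of the SRanipal C headers, so please verify them against the C sample enclosed in the SDK.

    #include <cstdio>
    #include "SRanipal.h"
    #include "SRanipal_Eye.h"
    #include "SRanipal_Enums.h"

    void callback(ViveSR::anipal::Eye::EyeData const &eye_data);  // as defined above

    int main()
    {
        // Start the eye-tracking engine before registering the callback.
        int error = ViveSR::anipal::Initial(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE, NULL);
        if (error != ViveSR::Error::WORK) {
            printf("Failed to initialize the eye engine, error %d\n", error);
            return 1;
        }

        ViveSR::anipal::Eye::RegisterEyeDataCallback(callback);

        // ... your application loop ...

        ViveSR::anipal::Eye::UnregisterEyeDataCallback(callback);
        ViveSR::anipal::Release(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE);
        return 0;
    }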

     

  6. Which Unity version are you using?

    Just to double-check: did you follow the steps below?

    I did the following steps with Unity 2019.3.1f1 / SRWorks 0.9.3.0 and it works.

    Step 1: Import Vive-SRWorks-Unity-Plugin.unitypackage

    Step 2: Restart Unity

    Step 3: Import SteamVR from the Unity Asset Store

    Step 4: Import Vive-SRWorks-Unity-Experience.unitypackage

    Step 5: Open the Demo.unity scene

    Step 6: Play

  7. 57 minutes ago, zardosch said:

    @Corvus

    Thank you very much for your reply!

    I come from a non-computer-science background; could you please send me a code snippet or give me a hint on how I can extract the gaze_direction_normalized value?

    Thanks in advance!

     

    There is sample code for C/Unity/Unreal enclosed in the SRanipal SDK. Are you able to access it?

  8. 9 minutes ago, im_psk said:

    @uantwerpenFTI @Daniel_Y @mbaat Did anyone manage to get it working? I tried with three different Vive Pro Eye headsets, two similar-spec PCs, and different Windows 10 installations. It has to be an SR runtime or SteamVR issue.

    Please try the latest version, v0.9.3.0, here: https://developer.vive.com/resources/knowledgebase/vive-srworks-sdk/ . It addresses the incompatibility issue with SteamVR 1.11.

  9. On 5/16/2020 at 6:56 AM, watswat5 said:

    Hi,

    I'm trying to develop some AR apps for my Cosmos Elite. The SRWorks Unity example isn't working correctly (black screen with a white rectangle), but it also isn't throwing any errors.

    I also cannot enable my camera in the Vive Console; the "Camera" option is greyed out. Does this mean that I can't access any of the 4 cameras on my Cosmos Elite?

    SRWorks v0.9.0.3

    SteamVR v1.12.3 beta

     

    Unfortunately, SRWorks does not support the Vive Cosmos Elite.

  10. 13 hours ago, mrk88 said:

     

    Thanks, but my question is how to calculate the gaze position from the gaze direction vector.

    For the velocity calculation I need the start and end gaze positions.

    You could refer to the Focus sample enclosed in the SDK to find out which object you are looking at in the VR 3D world from the gaze vector; a rough sketch of the idea follows below.
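
    As a rough illustration only (the Focus sample in the SDK is the reference), if you approximate the gaze position as a point a fixed distance along the gaze ray, you can also estimate the angular gaze velocity from two consecutive samples. The fixed-depth assumption and the helper names below are mine, not part of the SDK; in practice the Focus sample intersects the gaze ray with the scene geometry instead.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Approximate gaze position: a point at 'depth_m' metres along the gaze
    // ray. origin_mm is in millimetres (as in gaze_origin_mm), dir is the
    // normalized gaze direction.
    Vec3 gaze_point(const float origin_mm[3], const float dir[3], float depth_m)
    {
        return { origin_mm[0] / 1000.0f + dir[0] * depth_m,
                 origin_mm[1] / 1000.0f + dir[1] * depth_m,
                 origin_mm[2] / 1000.0f + dir[2] * depth_m };
    }

    // Angular gaze velocity (degrees per second) between two normalized gaze
    // directions sampled dt_ms milliseconds apart.
    float gaze_velocity_deg_per_s(const float d0[3], const float d1[3], float dt_ms)
    {
        if (dt_ms <= 0.0f) return 0.0f;
        float dot = d0[0] * d1[0] + d0[1] * d1[1] + d0[2] * d1[2];
        if (dot > 1.0f) dot = 1.0f;
        if (dot < -1.0f) dot = -1.0f;
        float angle_deg = std::acos(dot) * 57.29578f;
        return angle_deg / (dt_ms / 1000.0f);
    }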

     

     

     
