Showing results for tags '@daniel_y'.
Found 2 results
Dear HTC support team: After reading the documentation and forum posts, I still have a few questions about the eye data. Could you please tell me whether my understanding is correct? I'm using Native C.

1. Does the (x, y, z) of "Gaze_Direction_Normalized" refer to a range in the virtual world? If so, where are (0,0,0) and (1,1,1)? Is the location of the gazed-at object in the virtual world the same as the value of Gaze_Direction_Normalized?

2. Are "Pupil_Position" and "Gaze_Direction_Normalized" related, and if so, how? As I understand it, "Pupil_Position" is 2-dimensional while "Gaze_Direction_Normalized" is 3-dimensional; do their x and y components correspond?

3. Please refer to the screenshots below: when I turned my head right, only Combine_Gaze_Origin changed significantly. Could you please tell me its range and how to interpret it?

4. I've noticed that the value of Eye_data_validata_bit_mask is always 3.00. What does it mean? The documentation describes it as the bits containing all validity for this frame, but I don't understand what that means. Could you explain?

Thank you very much for your help.

Best Regards,
summerlemon

[Screenshots: eye-data values captured with head not moved vs. head turned right]

@Corvus @Tony PH Lin @Daniel_Y
I'm creating experiments in Unreal to test with the VIVE PRO EYE (which I just bought) to extract analytics of gaze time and gaze points. Does it generate heat maps or visualisations like Tobii Pro Lab? Can you provide any information on how to extract and visualise data with the Vive Pro Eye and Unreal, or a tutorial on how to use eye tracking in Unreal? Thank you.