I want to achieve something like the raycasting video, where a raycast returns the display name of the object being looked at in Unreal. In that video, the first-person camera is used to get the world location and world rotation for the ray. I would now like to do the same using the user's gaze data, to determine which object the user is currently gazing at. I have tried many blueprints using GetGazeData, but none of them works as reliably as the video above. I have attached a blueprint as an example of what I am currently trying to do. Gazedata blueprint Please tell me how I can achieve something like the video above using the SRanipal eye-tracking data with blueprints in Unreal.
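For reference, the transform step such a blueprint has to perform can be sketched outside Unreal. The gaze ray reported by the SRanipal plugin is in the HMD's local (camera) space, so its direction has to be rotated into world space by the camera's world rotation before building the trace endpoints, which can then be fed into a line trace (e.g. LineTraceByChannel). A minimal numeric sketch in plain Python; the function names and the yaw-only rotation are illustrative simplifications, not Unreal or SRanipal API:

```python
import math

def yaw_matrix(yaw_deg):
    # Rotation about the vertical (Z) axis; a full camera transform would
    # also include pitch and roll, omitted here for brevity.
    r = math.radians(yaw_deg)
    c, s = math.cos(r), math.sin(r)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_vec(m, v):
    # 3x3 matrix times 3-vector.
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def gaze_trace_endpoints(cam_pos, cam_yaw_deg, gaze_dir_local, trace_len=10000.0):
    """World-space start/end points for a line trace along the gaze ray.

    cam_pos        -- camera world location (Unreal units)
    cam_yaw_deg    -- camera world yaw
    gaze_dir_local -- normalized gaze direction in HMD/camera space
    """
    d = mat_vec(yaw_matrix(cam_yaw_deg), gaze_dir_local)
    start = list(cam_pos)
    end = [start[i] + d[i] * trace_len for i in range(3)]
    return start, end

# Camera at eye height looking along +X, gaze straight ahead in local space:
start, end = gaze_trace_endpoints([0.0, 0.0, 180.0], 0.0, [1.0, 0.0, 0.0])
# end is 10000 units straight ahead of the camera.
```

In blueprint terms this corresponds to a GetWorldRotation on the camera, a RotateVector on the gaze direction, and a vector add to get the trace end point.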
Tags: sranipal, ray-tracing (and 6 more)
-
Data structure Vive.SR.Eye.SingleEyeData:

ulong eye_data_validata_bit_mask
float eye_openness
UnityEngine.Vector3 gaze_direction_normalized
UnityEngine.Vector3 gaze_origin_mm
float pupil_diameter_mm
UnityEngine.Vector2 pupil_position_in_sensor_area

In theory it should be possible to obtain the focused point by computing the intersection of the two lines defined by each eye's gaze_origin_mm and gaze_direction_normalized. While debugging the data, I noticed that gaze_direction_normalized is identical for both eyes, which doesn't make any sense. Do you have an explanation for why this is so? Did I misunderstand something in the calculation of the focused point?

Debugged values:

                           Combined            Left                 Right
gaze_direction_normalized  (0.0, -0.1, 1.0)    (0.0, -0.1, 1.0)    (0.0, -0.1, 1.0)
gaze_origin_mm             (3.0, 5.4, -44.3)   (32.4, 5.3, -44.3)  (-28.6, 5.6, -44.2)

@Daniel_Y @Corvus
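The intersection described above is usually computed as a closest-point-of-approach problem: two 3D gaze rays almost never intersect exactly, so a common choice is the midpoint of the shortest segment between the left and right rays. A small sketch in plain Python, not tied to the SDK; the inputs correspond to gaze_origin_mm and gaze_direction_normalized. Note that with identical directions for both eyes, as in the debugged values above, the rays are parallel and no unique fixation point exists, which the degenerate-case check below makes explicit:

```python
def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two 3D lines.

    o1/o2 -- gaze origins (e.g. gaze_origin_mm per eye)
    d1/d2 -- gaze directions (e.g. gaze_direction_normalized per eye)
    Returns None when the rays are (near-)parallel.
    """
    def dot(a, b):   return sum(x * y for x, y in zip(a, b))
    def sub(a, b):   return [x - y for x, y in zip(a, b)]
    def add(a, b):   return [x + y for x, y in zip(a, b)]
    def scale(a, t): return [x * t for x in a]

    w0 = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        # Parallel rays: identical gaze directions give no intersection.
        return None
    # Standard line-line closest-point parameters.
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, t1))
    p2 = add(o2, scale(d2, t2))
    return scale(add(p1, p2), 0.5)

# Eyes 60 mm apart, both converging on a point 400 mm ahead:
fixation = closest_point_between_rays([-30, 0, 0], [30, 0, 400],
                                      [30, 0, 0], [-30, 0, 400])
# fixation is approximately (0, 0, 400)
```

The directions do not need to be normalized for this computation; only the line geometry matters.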