
Federica

Verified Members
  • Posts

    14
  • Joined

  • Last visited

Reputation

0 Neutral


  1. Hi, I'm working on a project in Unity with the HTC Vive Pro Eye headset and the SRWorks SDK. I would like to use OpenCV to process the camera image textures created in the ViveSR_DualCameraImageCapture.cs script. For example, I would like to apply a Gaussian filter to produce a blurred view through the lenses of the headset. Do you know how and where in the scripts provided by the SRWorks SDK I can work to obtain this effect? (A conversion-and-blur sketch follows this list.) Thank you.
  2. @Sean_Su @Daniel_Y Hi, I can't understand how, in the ViveSR_DualCameraImageCapture.cs script, I can convert the texture to an OpenCV Mat. I would like to apply a Gaussian blur filter to the left and right textures, as in the sketch after this list. Can you help me? Thank you.
  3. Hi, thank you for your answer. Let's hope so!
  4. Hello, I have imported the ViveSR Experience package into my Unity project. When I try to run any of the included scenes, I get the error shown in the attached picture. I'm using: - Unity 2019.4.11 - SteamVR 1.16.10 - SRWorks SDK 0.9.3.0. Can someone help me solve this? Thank you.
  5. Thank you for your answer; I'll do as you suggested.
  6. Hi, I'm working with an HTC Vive Pro Eye headset. I need to know the distance between the gaze origin and the lens (segment a) and the distance between the centre of the eye and the lens (segment b), as shown in the attached schema. Is there documentation with the rest of the hardware's technical specifications? Thank you.
  7. Thank you for your answer. Anyway, can you please tell me what the pupil position is? Is it the position of the pupil on the eye tracker's camera sensor, or does it indicate where the user is looking on the headset screen? Another question: I have plotted the (x, y) gaze directions for the left and right eye, normalized with respect to the z gaze direction, obtaining the result in the attached picture. Why is there that offset between the plot and the red squares? I expected the plot to be overlaid on the squares. I've read that the origin of the HTC Vive headset is located 15 mm "behind" the eye tracker's origin. Could the observed offset be caused by this? Is there a way to correct it so that the plot lies over the red squares? Thank you.
  8. Hi, I'm using the SRanipal SDK in Unity to get eye tracking data from the HTC Vive Pro Eye headset. I have a question regarding the pos_sensor_Lx, pos_sensor_Ly, pos_sensor_Rx, pos_sensor_Ry variables (normalized by default to [0,1]). They are only generically described as the pupil position. I tried to follow a square figure with the Vive Pro Eye and then overlaid the plot of the pupil position for each eye, as in the picture attached below (where the background image shows the left and right eye view as seen in the Unity game view). However, the result does not convince me: I expected the plot to be overlaid on the red square. So my question is: what do (pos_sensor_Lx, pos_sensor_Ly) and (pos_sensor_Rx, pos_sensor_Ry) represent? Are they the position of the pupil on the eye tracker's camera sensor, or do they indicate where the user is looking on the headset screen? Are these the variables I have to consider to get the gaze coordinates, or do I have to consider something else (see the gaze-logging sketch after this list)? Thank you!
  9. Hello, I'm using the Vive Pro Eye headset with Unity and the SRWorks SDK for an Augmented Reality project. I would like to get the video stream seen through the front-facing cameras and then apply some processing techniques to it, to show a modified view of the real world in real time (see the frame-grabbing sketch after this list). Do you know where I can get the video stream and how I can use it? I have already read the documentation provided with SRWorks and the previous topics, but I didn't find a solution. Thank you!
  10. Hello, I'm using the VIVE Pro Eye headset for an Augmented Reality project in Unity (version 2019.4.11) with the SRWorks SDK. To see the outside world through the front-facing cameras in pass-through mode, I have imported the SRwork_FrameWork prefab into my Unity project. I need to obtain the video stream and apply some image processing to it. I suppose I have to look in the scripts related to the SRwork_FrameWork prefab, but I can't understand where in these scripts the video stream is obtained and stored. I searched the previous topics but found nothing suitable. Can someone help me solve this problem? Thank you.
  11. Hello, I am using the HTC Vive Pro Eye with Unity (version 2019.4.11) in pass-through mode. To do so, I have imported the SRWorks SDK. I want to see separate displays for the right and left cameras, as in the attached picture (see the per-eye display sketch after this list). How can I do so? Thank you.
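
The first two posts ask how to get a Unity texture into OpenCV and blur it. Below is a minimal sketch, assuming the OpenCvSharp binding has been added to the Unity project (the SRWorks SDK does not ship an OpenCV wrapper, and the Mat constructor used here exists in OpenCvSharp 4.x; check your version) and that the texture is in RGBA32 format:

    using System.Runtime.InteropServices;
    using OpenCvSharp;
    using UnityEngine;

    public static class TextureBlur
    {
        // Blurs an RGBA32 Texture2D in place; the kernel size must be odd.
        public static void GaussianBlurInPlace(Texture2D tex, int kernel = 15)
        {
            byte[] rgba = tex.GetRawTextureData();     // raw RGBA32 pixel bytes
            // Wrap the managed byte array in a Mat header (no pixel copy).
            using (var src = new Mat(tex.height, tex.width, MatType.CV_8UC4, rgba))
            using (var dst = new Mat())
            {
                Cv2.GaussianBlur(src, dst, new Size(kernel, kernel), 0);
                Marshal.Copy(dst.Data, rgba, 0, rgba.Length);  // copy result back
            }
            tex.LoadRawTextureData(rgba);
            tex.Apply();                                // re-upload to the GPU
        }
    }

If you use the "OpenCV for Unity" asset instead, its Utils.texture2DToMat and Utils.matToTexture2D helpers replace the manual byte copies above.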
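
For posts 7 and 8, here is a sketch of the distinction both questions circle around, using the SRanipal Unity API (method and field names as in SRanipal SDK v1; v2 renames some of them): pupil_position_in_sensor_area is what the pos_sensor_* columns log, i.e. the pupil's normalized position on the eye tracker's camera sensor, while gaze_direction_normalized is the ray to use for "where the user is looking", which is why a plot of the sensor position does not land on the on-screen squares:

    using UnityEngine;
    using ViveSR.anipal.Eye;

    public class GazeLogger : MonoBehaviour
    {
        void Update()
        {
            VerboseData data;
            if (!SRanipal_Eye.GetVerboseData(out data)) return;

            // pos_sensor_* corresponds to this field: the pupil's position
            // on the eye tracker's camera sensor, normalized to [0,1].
            Vector2 pupilL = data.left.pupil_position_in_sensor_area;
            Vector2 pupilR = data.right.pupil_position_in_sensor_area;

            // "Where the user is looking" comes from the gaze ray instead.
            // Note: SRanipal reports directions in its own right-handed
            // frame; depending on SDK version you may need to negate x
            // to match Unity's convention.
            Vector3 dirL = data.left.gaze_direction_normalized;
            Vector3 dirR = data.right.gaze_direction_normalized;

            // Project onto the z = 1 plane to compare against on-screen
            // targets, as in the plots from the question.
            Vector2 gazeL = new Vector2(dirL.x / dirL.z, dirL.y / dirL.z);
            Vector2 gazeR = new Vector2(dirR.x / dirR.z, dirR.y / dirR.z);

            Debug.Log($"pupil L {pupilL} R {pupilR} | gaze L {gazeL} R {gazeR}");
        }
    }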
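
For posts 9 and 10, the camera frames surface in Unity through the ViveSR_DualCameraImageCapture script mentioned in post 1. A per-frame grab could look like the sketch below; the getter and its out-parameter list are as found in SRWorks 0.9.x, so treat the exact signature as an assumption and verify it against the script in your copy of the SDK:

    using UnityEngine;
    using Vive.Plugin.SR;

    public class PassThroughFrameGrabber : MonoBehaviour
    {
        void Update()
        {
            Texture2D left, right;
            int frameIndex, timeIndex;
            Matrix4x4 poseLeft, poseRight;

            // Fetch the latest undistorted frames from the front cameras.
            ViveSR_DualCameraImageCapture.GetUndistortedTexture(
                out left, out right,
                out frameIndex, out timeIndex,
                out poseLeft, out poseRight);

            if (left == null || right == null) return;

            // Hand the frames to a processing step, e.g. the blur sketch above:
            // TextureBlur.GaussianBlurInPlace(left);
            // TextureBlur.GaussianBlurInPlace(right);
        }
    }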
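
For post 11, one way to show each camera feed to one eye only uses stock Unity APIs: put each feed on a quad in its own layer, then render with two cameras whose stereo target and culling mask exclude the other eye's quad. The layer names here are assumptions; create them under Tags & Layers first:

    using UnityEngine;

    public class PerEyeDisplaySetup : MonoBehaviour
    {
        public Renderer leftQuad;   // quad textured with the left camera image
        public Renderer rightQuad;  // quad textured with the right camera image
        public Camera leftEyeCam;
        public Camera rightEyeCam;

        void Start()
        {
            int leftLayer = LayerMask.NameToLayer("LeftEyeOnly");
            int rightLayer = LayerMask.NameToLayer("RightEyeOnly");

            leftQuad.gameObject.layer = leftLayer;
            rightQuad.gameObject.layer = rightLayer;

            // Render each camera to one eye and hide the other eye's quad.
            leftEyeCam.stereoTargetEye = StereoTargetEyeMask.Left;
            leftEyeCam.cullingMask &= ~(1 << rightLayer);

            rightEyeCam.stereoTargetEye = StereoTargetEyeMask.Right;
            rightEyeCam.cullingMask &= ~(1 << leftLayer);
        }
    }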