Showing results for tags 'eye tracking'.



Found 7 results

  1. Vive Pro Eye Calibration Initialization Error Troubleshooting
     Problem: Launching calibration shows the error "Initialization Failed".
     Solutions:
     - Run "EyeCalibration.exe" manually from C:\Program Files\VIVE\SRanipal\tools\eye_calibration.
     - Power the PC and Link Box off and on again.
     - Run "sr_runtime.exe" as Administrator. The default install path is C:\Program Files (x86)\VIVE\SRanipal.
     - Update the SteamVR runtime.
     - Update your graphics card drivers.
     - Some MSI laptop models have an issue that is fixed by rolling back to an earlier NVIDIA driver: do a fresh install of NVIDIA driver 417.71 (the GPU BIOS is not being updated and does not support the latest NVIDIA driver).
     - Uninstall "VIVE_SRanipalInstaller" and "Tobii VRU02 Runtime", restart, and install the latest "VIVE_SRanipalInstaller_1.1.0.1.msi". Plug in the HMD and wait for any updates to complete.
     - Update the motherboard's integrated Intel graphics driver (this fixes the system referencing the incorrect Intel OpenCL.dll instead of the NVIDIA one).
     - Disable the integrated graphics card.
     - There may be an issue with the wireless adapter; try running calibration wired.
     - Early dev kits may have issues.
     Check the system requirements:
     - Use a DisplayPort, mini-DisplayPort, or USB-C output from the dedicated GPU.
     - Windows 8.1 or later (64-bit).
  2. Hello. Is it possible to display the eyes of a user wearing the Vive Pro Eye as an infrared image on the monitor? It would be helpful if someone could tell me.
  3. Is it possible to spectate during the HTC eye calibration? I use Unity and start the eye calibration by calling SRanipal_Eye_API.LaunchEyeCalibration() (a minimal launch sketch appears after this list), but the HTC eye calibration is neither visible in the Unity game window nor picked up by a screen recorder such as Unity Recorder v2. I need to help participants (especially children!) during the calibration step, and it would be great if I could see what they see. Is it also possible to programmatically change the language that the eye calibration screen uses? Thanks, @Corvus @Daniel_Y @zzy
  4. Hello, I bought a Vive Pro Eye to do eye tracking in Unity, but when I start the eye tracking calibration in SteamVR and adjust the headset up and down to fit the frame, the image does not change no matter how I move the headset, and I can't go to the next step; it just stops. In addition, the camera in SR_Runtime is not recognized. (I connect my Vive port to the graphics card that is connected to the monitor.) I have reinstalled several times after deleting all the programs (SteamVR, SR_Runtime, etc.), and SteamVR also reports 'Camera communication failed.' I think it is a problem with headset recognition or camera recognition. No matter how many times I search, I can't find a solution. Please help me. (I have already tried this: ) I'm using the following desktop computer: GeForce NVIDIA GTX 1080 Ti 11GB VRAM, Intel Core i5-4670 CPU, 16 GB RAM, Windows 10 Pro (64-bit).
  5. How do I get the current timestamp from the eye tracker in Unity? (See the timestamp sketch after this list.) @Corvus @Daniel_Y @Andy.YC_Wang
  6. Specifically, I need to know where my participants are looking at a precise time, so the video and the samples from the EyeData object must be perfectly aligned in time. I am also wondering what the equation is for calculating where the participant is looking from the left and right gaze_origins and the left and right gaze_directions, or whether there is a better way to compute where the participant is looking in VR than from those four sets of values (a combined-gaze sketch appears after this list). @Corvus @Daniel_Y
  7. Cory, from HTC VIVE, will conduct a free workshop on eye tracking development using HTC VIVE's SRanipal SDK. Topics will include eye tracking, foveated rendering, variable-rate shading, and using eye tracking for avatar animations. If you are interested in using eye tracking or foveated rendering in your VR content, then come to learn, network, ask questions, and try a demo! This workshop is free and open to the public; you will not need a SIGGRAPH badge to attend. RSVP for your free ticket here. This workshop is in partnership with LA Unity 3D Meetup, XRLA Meetup, and Two Bit Circus. There's going to be a strong and passionate community of XR developers attending, and it will be a great opportunity to connect and network. Location: Two Bit Circus – 634 Mateo St, Los Angeles, California. Date: Tuesday, 7/30/2019. Time: 6:30pm - 9:00pm. Hope to see you there! Check back here on our developer blog for more workshop dates in the future. - Global Developer Relations Team
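
For result 3 above, a minimal sketch of launching the built-in eye calibration from a Unity script, assuming the SRanipal Unity SDK is imported. The LaunchEyeCalibration call is the one named in the post; its parameter list varies between SDK versions (some builds take an IntPtr argument), so treat this as an illustration rather than the definitive API.

```csharp
using System;
using UnityEngine;
using ViveSR.anipal.Eye;

// Attach to any GameObject; press C to launch the built-in eye calibration.
// Assumes the SRanipal Unity SDK (SRanipal_Eye_API) is imported into the project.
public class CalibrationLauncher : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.C))
        {
            // NOTE: the parameter list differs between SDK versions; some builds
            // take an IntPtr argument, others none. Adjust to match your SDK.
            SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);
        }
    }
}
```

The calibration overlay itself is rendered by the SRanipal/SteamVR runtime rather than by Unity, which is consistent with it not appearing in the game window or in a Unity-based recorder.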
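For result 5, a minimal sketch of reading the timestamp delivered with each eye-tracking sample. Field names (timestamp, frame_sequence) and the GetEyeData call follow SRanipal SDK 1.x; verify them against the EyeData struct in your SDK version.

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

// Polls the eye tracker each frame and logs the timestamp of the latest sample.
// Assumes SRanipal SDK 1.x, where EyeData carries a millisecond timestamp and
// a frame_sequence counter filled in by the runtime.
public class EyeTimestampLogger : MonoBehaviour
{
    private EyeData eyeData = new EyeData();

    void Update()
    {
        ViveSR.Error error = SRanipal_Eye_API.GetEyeData(ref eyeData);
        if (error == ViveSR.Error.WORK) // WORK indicates success in ViveSR
        {
            // timestamp is the runtime's device timestamp in milliseconds;
            // frame_sequence increments with every new eye-tracking sample.
            Debug.Log($"Eye sample {eyeData.frame_sequence} at t = {eyeData.timestamp} ms");
        }
    }
}
```

Polling from Update only sees the samples that coincide with rendered frames; the SDK also offers a callback-based path that delivers samples at the tracker's full rate, which may be preferable for precise alignment with video.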
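For result 6, one common way to turn the four values named in the post (left/right gaze_origin and gaze_direction) into a single point of regard: build a combined ray from the midpoint of the two origins along the normalized average of the two directions, then raycast it into the scene. This is plain vector math plus Unity's Physics.Raycast, not an SRanipal-specific API; the headset transform, unit conversion, and axis convention noted in the comments are assumptions to check against your SDK version.

```csharp
using UnityEngine;

// Combines left/right eye gaze samples into a single world-space gaze point.
// The origins/directions are assumed to be in the headset's local space as
// reported by the eye tracker; 'head' is the transform of the HMD camera.
public static class GazeMath
{
    // Returns true and the hit point if the combined gaze ray hits scene geometry.
    public static bool TryGetGazePoint(
        Vector3 leftOrigin, Vector3 leftDirection,
        Vector3 rightOrigin, Vector3 rightDirection,
        Transform head, float maxDistance, out Vector3 gazePoint)
    {
        // If the SDK reports origins in millimeters, scale them to meters first
        // (e.g. multiply by 0.001f) before combining.
        Vector3 localOrigin = (leftOrigin + rightOrigin) * 0.5f;
        Vector3 localDirection = (leftDirection.normalized + rightDirection.normalized).normalized;

        // Transform from headset-local space into world space.
        // Note: the eye tracker may use a different handedness than Unity,
        // so an axis flip (e.g. negating x) can be required first.
        Vector3 worldOrigin = head.TransformPoint(localOrigin);
        Vector3 worldDirection = head.TransformDirection(localDirection);

        if (Physics.Raycast(worldOrigin, worldDirection, out RaycastHit hit, maxDistance))
        {
            gazePoint = hit.point;
            return true;
        }

        gazePoint = worldOrigin + worldDirection * maxDistance; // fallback: point at max range
        return false;
    }
}
```

Some SRanipal SDK versions also expose a precomputed combined gaze ray directly; if your version does, using it avoids the averaging step entirely.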