Showing results for tags 'api'.

Found 3 results

  1. Hi, I am unable to receive eye data at 120 Hz. On several Vive Pro Eye headsets I am getting very low data report rates with the C++ SRAnipal API. Using the C++ sample project included in the SRanipal SDK (link), I print gaze data once every 120 samples. The data reports at a rate much slower than 120 Hz: ~30 Hz when calling ViveSR::anipal::Eye::GetEyeData in a loop, and ~60 Hz when using the callback. My SRanipal runtime and Tobii platform information is attached; Vive software and drivers are up to date.
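     One way to confirm the effective report rate is to poll faster than the headset updates and count unique sample timestamps: duplicate timestamps mean the poll outpaced the tracker. A minimal sketch, assuming a hypothetical sampler signature — in a real app it would wrap ViveSR::anipal::Eye::GetEyeData and return the timestamp field of the returned eye data struct:

     ```cpp
     #include <cstdint>
     #include <functional>

     // Hypothetical sampler: returns the device timestamp (ms) of the latest
     // eye sample. In a real app this would wrap GetEyeData and read its
     // timestamp field; the name and signature here are illustrative.
     using SampleFn = std::function<int64_t()>;

     // Count how many *unique* samples arrive across `polls` polls spanning
     // `window_seconds`. Polling faster than the device updates returns
     // duplicate timestamps, so deduplicating reveals the true report rate.
     double measure_report_rate_hz(SampleFn next_timestamp_ms,
                                   int polls, double window_seconds) {
         int64_t last = -1;
         int unique = 0;
         for (int i = 0; i < polls; ++i) {
             int64_t ts = next_timestamp_ms();
             if (ts != last) { ++unique; last = ts; }
         }
         return unique / window_seconds;
     }
     ```

     If this reports ~30 Hz even at a high poll rate, the bottleneck is upstream of your loop (runtime settings or headset mode) rather than your polling code.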
  2. Application: Academic Research
     Goals:
       [X] Install SDK
       [ ] Get Eye Gaze
       [ ] Get Fixation
       [ ] Get Pupil Dilation
       [ ] Run Subjects & Get Tenure
     Question: How do I use the SDK's framework/API to extract near-real-time eye tracking data and print it to a data.frame or CSV file?
     Hi there, I've been able to get VR and eye tracking set up and working with UE 4.25.3 per a previous thread, and I have a functional VR demo. I'm just not sure where to go from here to get the SDK to write data to a CSV or similar file. I don't currently need any eye-tracking interactions within the VR environment, so the dartboards and mannequin head are useful in showing that the data exists, but I need to get that data out of whatever loop it is in and write it to a data file for processing in statistical programs and the like. @MariosBikos_HTC, you've been a great help so far; let me know if you or another HTC fellow are the right people to ask about this. Once I have something functional, I'll definitely share it for future folks in my position.
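     The general pattern is to buffer one record per eye-tracking tick and flush the buffer to a CSV file. A minimal sketch — the struct fields are illustrative, not the SDK's own names; in UE you would fill them each tick from the SRanipal plugin's gaze output:

     ```cpp
     #include <fstream>
     #include <string>
     #include <vector>

     // Illustrative gaze record; field names are assumptions, not the SDK's.
     // In UE, populate one of these per eye-tracking tick.
     struct GazeSample {
         double time_s;                        // sample time in seconds
         double origin_x, origin_y, origin_z;  // gaze ray origin
         double dir_x, dir_y, dir_z;           // normalized gaze direction
         double pupil_diameter_mm;             // pupil diameter
     };

     // Write buffered samples to a CSV file with a header row, ready for
     // import into R, pandas, SPSS, etc.
     void write_gaze_csv(const std::string& path,
                         const std::vector<GazeSample>& samples) {
         std::ofstream out(path);
         out << "time_s,origin_x,origin_y,origin_z,dir_x,dir_y,dir_z,pupil_mm\n";
         for (const auto& s : samples) {
             out << s.time_s << ',' << s.origin_x << ',' << s.origin_y << ','
                 << s.origin_z << ',' << s.dir_x << ',' << s.dir_y << ','
                 << s.dir_z << ',' << s.pupil_diameter_mm << '\n';
         }
     }
     ```

     Buffering in memory and writing at session end (or in chunks) avoids stalling the render thread with per-frame disk I/O.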
  3. Hi, I have an HTC Vive Pro Eye and I am interested in recording eye tracking data to see where the user's attention is focusing in the VR environment, for research purposes. Recording the API's focus check, for example, may be useful to this end. The user will experience different levels, so the recording needs to capture what is being looked at across the different levels of the simulated environment. What is the best way to go about this? I have seen that the API focus check works for any object in the scene that has a collider, but I could not find how to set this up and record it. Is there a tutorial available that explains the workflow? (I am currently using UE, but it would be useful to know how this would work in Unity as well.) I hope this makes sense, and thanks so much for your help.
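     One common approach: each frame, record the name of the collider the focus check hit (or nothing), then collapse those per-frame hits into dwell intervals tagged with the current level. A sketch under that assumption — the function and field names are hypothetical; the per-frame names would come from the SDK's focus result in UE or Unity:

     ```cpp
     #include <string>
     #include <vector>

     // One dwell interval: which object held gaze focus, in which level,
     // and over what time span.
     struct FocusEvent {
         std::string level;
         std::string object;   // name of the collider hit by the gaze ray
         double start_s, end_s;
     };

     // Collapse per-frame focus hits (one object name per frame, empty string
     // = no hit) into dwell intervals. frame_dt is seconds per frame.
     std::vector<FocusEvent> collapse_focus(const std::string& level,
                                            const std::vector<std::string>& frames,
                                            double frame_dt) {
         std::vector<FocusEvent> events;
         std::string current;
         double start = 0.0;
         for (size_t i = 0; i <= frames.size(); ++i) {
             // One past the end acts as a sentinel that closes any open interval.
             const std::string name =
                 (i < frames.size()) ? frames[i] : std::string();
             if (name != current) {
                 if (!current.empty())
                     events.push_back({level, current, start, i * frame_dt});
                 current = name;
                 start = i * frame_dt;
             }
         }
         return events;
     }
     ```

     Tagging each event with the level name means one continuous log file works across level transitions; the resulting intervals export naturally to the same CSV workflow used for raw gaze samples.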