Showing results for tags 'api'.
Found 6 results

  1. I am thinking of developing an API in Python using the Flask library. So far I have learnt the following: it has an /auth endpoint that takes any username and password and saves them in a database (i.e. registration); a /login endpoint that takes a username and password, validates them against the database, and then returns a token; and a /xx endpoint that takes the token, validates the user's identity, and returns the requested data. What I am now trying to understand is this: if I call this API via custom script execution on Splunk, how do I pass the Splunk-side login credentials to the API, and how will the API make sure that they are correct? (A minimal client-side sketch of this token flow follows these results.)
  2. There are many inquiries about how to use the Passthrough image, or how to access raw camera images, to create a Mixed Reality experience or MR effects. Due to our privacy policy we do not offer a way to access the raw camera images, but here are some examples of how to use the existing WAVE Passthrough Underlay/Overlay APIs. The following explains the difference between the Passthrough Underlay and Overlay, so you can choose whichever is best suited to your content design. Passthrough Underlay: by calling the "WVR_ShowPassthroughUnderlay" API, you can show the Passthrough image underneath your rendered content. Examples that use the Passthrough Underlay include VIVE Room Setup, VR boundaries, and 3D objects. Passthrough Overlay: by calling "WVR_ShowPassthroughOverlay", the Passthrough image is shown on top of everything in the scene. (A native-code sketch of both calls follows these results.)
  3. Application: Academic Research. Goals: Install SDK [X]; Get Eye Gaze [ ]; Get Fixation [ ]; Get Pupil Dilation [ ]; Run Subjects & Get Tenure [ ]. Question: how do I reference the SDK's framework/API to extract close-to-real-time eye tracking data and print it either to a data.frame or a CSV file? Hi there, I've been able to get the VR and eye tracking set up and working per a previous thread with UE 4.25.3, and I have a functional VR demo. The only thing is that I'm not sure where to go from here to get the SDK to write/print data to a CSV or similar file. At present I don't need any eye tracking interactions within the VR environment, so the dartboards and mannequin head are useful in that they show the data exists, but I need to get that data out of whatever loop it is in and write it to a data file for processing in statistical programs and the like. @MariosBikos_HTC , you've been a great help so far. Let me know if you or another HTC fellow are the right people to ask about this. Once I get something functional I'll definitely share it for future folks in my position. (A minimal polling-to-CSV sketch follows these results.)
  4. Hello everyone, at my company, QBranch, we make various tangible devices for AR/VR entertainment and professional training. Following our customers' requests, and generally our own needs, we "link virtual and real" with VIVE technologies, and especially with the Vive Tracker (3.0 for now). To feed inputs into the application we usually work with the pogo pins, but for our current project we need analog inputs that we can't get via the pogo pins. So we decided to investigate the USB API to transfer to the PC, via the Vive Tracker, all of the axes you would find on a classic VIVE Controller. Here are the different resources I've tried to work from to achieve this:
     https://dl.vive.com/Tracker/FAQ/HTC Vive Tracker 3.0_FAQ_v1.0_01192021.pdf
     https://dl.vive.com/Tracker/Guideline/HTC_Vive_Tracker_Developer_Guidelines_v1.3.pdf
     https://www.manualslib.com/manual/1730181/Htc-Vive-Tracker.html
     http://www.talariavr.com/blog/
     https://www.brightdevelopers.com/communication-htc-vive-tracker-android/
     https://github.com/matzman666/USBHost
     and many others, but these are the best I've found. I've been trying to get it running on my own for about three weeks now, so I'm finally turning to this forum to ask for help, as I'm running out of time to finish the device. We are currently driving the device from a Texas Instruments development board, the LaunchPad TM4C1294XL (an Arduino-like board). The USB connection is established, everything proceeds correctly during the enumeration phase, we can select and query the different USB layers, and we managed to read various pieces of device information. For example, we retrieved the "Default Config" information and opened the interface at index 2 (string descriptor: "Controller"). We then send the final packet with the payload defining the "Controller inputs" we want to transfer, and nothing happens in SteamVR (tracking, the top button, and the pogo pins still work, but the USB input produces no result in the "Controller Test Panel"). Here are some questions to start this topic (a PC-side hidapi test sketch follows these results):
     - Is the Vive Tracker 3.0 still able to accept input information through this API (like the 2018 version), or does it no longer work and the documentation (which is very similar to the 2018 one) carries a copy/paste error?
     - Is there a USB test application (on PC or Android) for sending test inputs to the Vive Tracker, so we can check whether anything arrives on the other side (in SteamVR inputs)?
     - Is there any technical detail I'm missing (timing for sending data, payload tricks, the way to send it, etc.)? The documentation has some mistakes, and I've tried to cross-reference many sources to get something running.
     - Do you have code samples for an embedded device doing this (Arduino-like or anything else)? I've only found old projects from before 2018, so I'm not sure about what I'm reading.
     - Is there a specific way to see the "Controller inputs" in SteamVR? The Tracker inputs shown are only the ones linked to the pogo pins. For now we cheated in the tracker firmware so that it shows up as a VIVE Controller, which exposes all the inputs; from my point of view this is more a workaround than a real solution, so I was wondering whether something better is possible.
     - Is there anybody here facing the same issues, or able to tell me "hey, we got it working!"?
     - Any help, please!
     Hope you will be able to help me and anyone else trying to do the same around the world! Thanks for any help, and I wish everybody a nice day, hoping we will be able to do nice things with this device soon! Cheers, Quentin MAYET
  5. Hi, I am unable to receive eye data at 120 Hz. On several Vive Pro Eye headsets I am getting very low data report rates using the C++ SRanipal API. Using the C++ sample project included in the SRanipal SDK (link), I print out gaze data once every 120 samples. The data arrives at a rate much slower than 120 Hz: roughly 30 Hz when executing ViveSR::anipal::Eye::GetEyeData in a loop, and roughly 60 Hz when using the callback. My SRanipal runtime and Tobii platform information is attached. Vive software and drivers are up to date. (A callback rate-measurement sketch follows these results.)
  6. Hi, I have an HTC Vive Pro Eye and I am interested in recording eye tracking data as a way of seeing where the user's attention is focused in the VR environment, for research purposes. I am thinking that being able to record the API's Focus check, for example, may be useful to this end. The user would experience several different levels, so the recording would need to be carried out in a way that shows what is being looked at across the different levels of the simulated environment. What is the best way to go about this? I have seen that the Focus check works for any object in the scene that has a collider on it, but I could not find how this can be set up and recorded. Is there any tutorial that explains the workflow? (I am currently using UE, but it would be useful to know how this works in Unity as well.) I hope this makes sense, and thanks so much for your help. (A gaze-trace logging sketch follows these results.)
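
For result 1, here is a minimal client-side sketch of the token flow described in the post. The flow itself is language-agnostic; it is shown in C++ with libcurl, while the Flask side would implement /login and /xx as the poster describes. The endpoint paths come from the post; the host/port, the JSON field names, and the "Authorization: Bearer" header convention are assumptions to adapt to the actual API, and a real Splunk script would read its credentials from secured configuration rather than hard-coding them.

```cpp
// Hypothetical client-side sketch of the token flow from result 1 (C++ / libcurl).
// Endpoint names come from the post; the JSON layout, Bearer-header convention,
// and host are assumptions -- adapt them to the real Flask API.
#include <curl/curl.h>
#include <iostream>
#include <string>

// Collect the HTTP response body into a std::string.
static size_t collect(char *ptr, size_t size, size_t nmemb, void *userdata) {
    static_cast<std::string *>(userdata)->append(ptr, size * nmemb);
    return size * nmemb;
}

static std::string post_json(const std::string &url, const std::string &body,
                             const std::string &bearer_token = "") {
    std::string response;
    CURL *curl = curl_easy_init();
    struct curl_slist *headers = nullptr;
    headers = curl_slist_append(headers, "Content-Type: application/json");
    if (!bearer_token.empty())
        headers = curl_slist_append(headers,
                                    ("Authorization: Bearer " + bearer_token).c_str());
    curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, collect);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);
    curl_easy_perform(curl);
    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return response;
}

int main() {
    // 1. The caller (e.g. a Splunk custom script) sends its stored credentials to /login...
    std::string login_reply =
        post_json("http://localhost:5000/login",
                  R"({"username": "splunk_svc", "password": "***"})");
    // 2. ...extracts the token from the reply (JSON parsing omitted in this sketch)...
    std::string token = login_reply;  // placeholder: parse the "token" field here
    // 3. ...and presents that token on each later call; the API validates the token
    //    (signature/expiry) instead of re-checking the password on every request.
    std::cout << post_json("http://localhost:5000/xx", "{}", token) << std::endl;
    return 0;
}
```

The credential check itself happens only once, at /login; everything after that rides on the token, which is why the token must be validated (and ideally expired) server-side on each call.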
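For result 2, here is a sketch of toggling the two passthrough modes from native code in a Wave (Android headset) app. It assumes the C functions named in the post, WVR_ShowPassthroughUnderlay and WVR_ShowPassthroughOverlay, are available in your Wave SDK version; exact signatures and return types vary between releases, so check the wvr_system.h shipped with your SDK, and in Unity or Unreal you would call the corresponding Wave XR plugin wrappers instead.

```cpp
// Sketch: switching between Passthrough Underlay and Overlay in a native Wave app.
// Signatures/return types differ between Wave SDK releases; verify against your
// wvr_system.h before relying on this.
#include <wvr/wvr.h>
#include <wvr/wvr_system.h>

enum class PassthroughMode { None, Underlay, Overlay };

void SetPassthroughMode(PassthroughMode mode) {
    switch (mode) {
    case PassthroughMode::Underlay:
        // Camera image is drawn *behind* your rendered content; typically you
        // render/clear with alpha = 0 wherever the real world should show through.
        WVR_ShowPassthroughOverlay(false);
        WVR_ShowPassthroughUnderlay(true);
        break;
    case PassthroughMode::Overlay:
        // Camera image is drawn *on top of* everything in the scene.
        WVR_ShowPassthroughUnderlay(false);
        WVR_ShowPassthroughOverlay(true);
        break;
    default:
        WVR_ShowPassthroughUnderlay(false);
        WVR_ShowPassthroughOverlay(false);
        break;
    }
}
```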
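For result 3 (and anyone else who just needs gaze samples in a file), here is a minimal sketch that polls the SRanipal C++ runtime and appends one CSV row per new sample. It uses the standalone SRanipal C++ API rather than the Unreal plugin, and the struct and field names follow the SRanipal eye-data headers; double-check them (including the vector component names) against the SDK version you installed.

```cpp
// Minimal gaze-to-CSV logger against the SRanipal C++ runtime (not the UE plugin).
// Struct/field names follow the SRanipal headers; component names (x/y/z) and
// exact header names may differ slightly between SDK versions -- verify first.
#include <SRanipal.h>
#include <SRanipal_Eye.h>
#include <SRanipal_Enums.h>
#include <fstream>
#include <iostream>

int main() {
    if (ViveSR::anipal::Initial(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE, nullptr)
            != ViveSR::Error::WORK) {
        std::cerr << "Failed to start the SRanipal eye module\n";
        return 1;
    }

    std::ofstream csv("gaze_log.csv");
    csv << "frame,timestamp,origin_x,origin_y,origin_z,"
           "dir_x,dir_y,dir_z,pupil_left_mm,pupil_right_mm\n";

    ViveSR::anipal::Eye::EyeData eye{};
    int last_frame = -1;
    while (true) {                                        // add your own exit condition
        if (ViveSR::anipal::Eye::GetEyeData(&eye) != ViveSR::Error::WORK) continue;
        if (eye.frame_sequence == last_frame) continue;   // skip repeated samples
        last_frame = eye.frame_sequence;

        const auto &c = eye.verbose_data.combined.eye_data;
        csv << eye.frame_sequence << ',' << eye.timestamp << ','
            << c.gaze_origin_mm.x << ',' << c.gaze_origin_mm.y << ','
            << c.gaze_origin_mm.z << ','
            << c.gaze_direction_normalized.x << ',' << c.gaze_direction_normalized.y << ','
            << c.gaze_direction_normalized.z << ','
            << eye.verbose_data.left.pupil_diameter_mm << ','
            << eye.verbose_data.right.pupil_diameter_mm << '\n';
    }
    ViveSR::anipal::Release(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE);
    return 0;
}
```

Fixation is not a raw field; it is usually derived offline from consecutive gaze-direction rows, which is another reason to log every sample with its frame number and timestamp.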
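For result 4, one of the questions was whether there is a simple PC-side way to push test input into the Tracker over USB. Below is a heavily hedged sketch using the open-source hidapi library: the hid_* calls are real hidapi functions, but the vendor/product IDs and every byte of the report payload are placeholders that must be filled in from the HTC Vive Tracker Developer Guidelines already linked in the post. It only demonstrates the mechanics of opening the "Controller" HID interface and sending a feature report; whether the Tracker 3.0 firmware still accepts such reports is exactly the open question in the post.

```cpp
// PC-side test-harness sketch for pushing an input report to a Vive Tracker over
// USB with hidapi. The hid_* calls are real; the VID/PID and all payload bytes
// are PLACEHOLDERS -- take the real values and report layout from the
// HTC Vive Tracker Developer Guidelines.
#include <hidapi/hidapi.h>
#include <cstdio>

int main() {
    if (hid_init() != 0) return 1;

    // Placeholder IDs: enumerate with hid_enumerate() and pick the interface
    // whose string descriptor reads "Controller".
    const unsigned short kVendorId  = 0x0000;   // TODO: tracker vendor id
    const unsigned short kProductId = 0x0000;   // TODO: tracker product id
    hid_device *tracker = hid_open(kVendorId, kProductId, nullptr);
    if (!tracker) {
        std::fprintf(stderr, "Tracker HID interface not found\n");
        return 1;
    }

    // Placeholder feature report: byte 0 is the report ID, the remaining bytes
    // carry the button bitmask / trackpad X,Y / trigger payload defined in the
    // developer guidelines.
    unsigned char report[64] = {0};
    report[0] = 0x00;                           // TODO: report ID from the guidelines

    int sent = hid_send_feature_report(tracker, report, sizeof(report));
    std::fprintf(stderr, "hid_send_feature_report returned %d\n", sent);

    hid_close(tracker);
    hid_exit();
    return 0;
}
```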
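For result 5, here is a small sketch that registers the SRanipal eye-data callback and counts delivered samples per second, so you can tell whether the runtime itself is producing 120 Hz or whether the bottleneck is in the consuming loop. Function and type names follow the SRanipal C++ headers; double-check them against your SDK version, and note that the callback path still depends on the SRanipal runtime and Tobii platform actually delivering samples at 120 Hz.

```cpp
// Sketch: measure the rate at which the SRanipal runtime delivers eye samples via
// the registered callback. Names follow the SRanipal C++ headers; verify against
// your installed SDK version.
#include <SRanipal.h>
#include <SRanipal_Eye.h>
#include <SRanipal_Enums.h>
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

static std::atomic<int> g_samples{0};

// Invoked by the SRanipal runtime for every new eye sample.
static void OnEyeData(ViveSR::anipal::Eye::EyeData const &eye_data) {
    (void)eye_data;          // gaze fields are available here if you want to log them
    ++g_samples;
}

int main() {
    if (ViveSR::anipal::Initial(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE, nullptr)
            != ViveSR::Error::WORK) return 1;

    ViveSR::anipal::Eye::RegisterEyeDataCallback(OnEyeData);

    // Print the delivered sample rate once per second; if this already reads well
    // below 120 Hz, the limit is in the runtime/headset setup, not your own loop.
    for (int i = 0; i < 10; ++i) {
        g_samples = 0;
        std::this_thread::sleep_for(std::chrono::seconds(1));
        std::cout << "eye samples/s: " << g_samples.load() << std::endl;
    }

    ViveSR::anipal::Eye::UnregisterEyeDataCallback(OnEyeData);
    ViveSR::anipal::Release(ViveSR::anipal::Eye::ANIPAL_TYPE_EYE);
    return 0;
}
```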
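For result 6, here is a hedged Unreal-side sketch of what the "Focus" check amounts to: cast the gaze ray against anything with collision and append whatever it hits to a CSV row, together with a timestamp and the current level name so recordings stay interpretable across levels. The gaze origin and direction are assumed to come from whichever world-space gaze-ray function your SRanipal plugin version exposes; the file path and column layout are illustrative only. A Unity version would be the same idea using Physics.Raycast and a StreamWriter.

```cpp
// Hedged UE sketch: trace the eye-gaze ray against colliders and append the
// focused actor to a CSV. GazeOrigin/GazeDirection are assumed to come from the
// SRanipal plugin's gaze-ray output, already transformed into world space.
#include "Engine/World.h"
#include "GameFramework/Actor.h"
#include "HAL/FileManager.h"
#include "Kismet/GameplayStatics.h"
#include "Misc/DateTime.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"

void LogGazeFocus(UWorld* World, const FVector& GazeOrigin, const FVector& GazeDirection)
{
    if (!World) return;

    // Trace the gaze ray 10 m (1000 UU) into the scene against visible colliders.
    const FVector TraceEnd = GazeOrigin + GazeDirection * 1000.0f;
    FHitResult Hit;
    const bool bHit = World->LineTraceSingleByChannel(Hit, GazeOrigin, TraceEnd,
                                                      ECC_Visibility);

    const FString FocusedActor =
        (bHit && Hit.GetActor()) ? Hit.GetActor()->GetName() : TEXT("none");
    const FString LevelName = UGameplayStatics::GetCurrentLevelName(World);

    // Columns: timestamp, level, focused actor, hit location (x, y, z).
    const FString Row = FString::Printf(TEXT("%s,%s,%s,%f,%f,%f\n"),
        *FDateTime::UtcNow().ToString(), *LevelName, *FocusedActor,
        Hit.ImpactPoint.X, Hit.ImpactPoint.Y, Hit.ImpactPoint.Z);

    // Append to a CSV in the project's Saved/ folder (illustrative path).
    const FString CsvPath = FPaths::ProjectSavedDir() / TEXT("gaze_focus_log.csv");
    FFileHelper::SaveStringToFile(Row, *CsvPath,
        FFileHelper::EEncodingOptions::AutoDetect,
        &IFileManager::Get(), FILEWRITE_Append);
}
```

Calling this once per tick (or on a fixed timer) from whatever actor already owns the gaze data gives a per-frame record of the focused object in every level the participant visits.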