Activity

Showing topics in Vive News and Announcements, Vive Community Guidelines and FAQs, General Vive Discussion, Vive Technical Support, Developer Blog, Developer Discussion, Developer Support, Viveport SDKs and Downloads, Vive Wave SDK, Vive SRWorks SDK, Vive Audio SDKs, Vive Input Utility, Vive Hand Tracking SDK, Vive Cosmos Developer FAQs and Vive Eye Tracking SDK posted in the last 365 days.


  1. Yesterday
  2. @alforno Like this? This is an overlay application that lets you composite pass-through frames into any SteamVR app. https://store.steampowered.com/app/864790/FragmentVR/
  3. Thank you VibrantNebula, but what I meant is much simpler. I would just need a camera showing me the real piano keys and my real hands, while the virtual space is created around them; no hand tracking necessary. In a way it's just like augmented reality, where I have part of my real environment plus the VR.
  4. Greetings. Currently I am having an issue where my app crashes whenever my client uses it on his side. I tested and casted from my side every time, and there are no crashes. But when my client uses it, it crashes, especially while casting via Windows Connect on his PC with a wireless display adapter. My client's PC hardware specs are: Lenovo P330, i7 8th gen, 16 GB memory, Leadtek P400 graphics card with 2 GB memory. If anyone has had this kind of experience, please share your solutions. Thank you.
  5. Last week
  6. Developers are the most important pillar of the Viveport ecosystem, so we're helping you kickstart 2021 with an extra special boost. Throughout 2021, Viveport developers will receive bigger payouts. From January 1st to December 31st we are increasing the net revenue share from 70/30 to 80/20 (developers receive 80%, Viveport receives 20%). That's right! Developers will receive an 80% revenue share for all titles that are opted in to any of the following Viveport programs:
     – Viveport Store: one-time purchase
     – Viveport Arcade Program
     – Viveport In-App Purchase service
     And of course, the revenue share of titles opted in to the Viveport Infinity program will remain at the 80% split. We hope this extra boost will make way for a fresh and bright new year.
     Already published on Viveport? The new revenue share will be automatically applied to titles currently published under the eligible distribution programs.
     1. Log in to the Developer Console.
     2. Select your VR title.
     3. On the default 'Program Opt-ins' tab, make sure your titles are opted in to at least one of the eligible Viveport distribution programs.
     New to Viveport? Three easy steps to start making more from your VR titles!
     1. Sign up as a Viveport developer via the Developer Console.
     2. On the default My Titles page, select "Add New Title" and craft a new store listing.
     3. At the 'Program Opt-ins' step, make sure you have opted in to at least one of the eligible Viveport distribution programs.
  7. @LochoChoco Were you able to fix the issue?
  8. @jboss It is possible to have the runtime start with Windows startup and only require the UAC prompt once: copy the app shortcut (the one created on the desktop during install) to the Windows Startup folder (%APPDATA%\Microsoft\Windows\Start Menu\Programs\Startup).
  9. @Ashish, it's in the SDK download. Under each version of the SDK (Unity or Unreal), there is a "Document" folder with interactive HTML documentation. For example, Unity is at .../SDK/02_Unity/Document/Eye/html/index.html and Unreal is at .../SDK_v1.3.1.1/SDK/03_Unreal/Document/Eye/html/index.html
  10. Since writing my previous response, I've become a little more bullish about taking frequent breaks when using VR. I now strongly think it's wise, every hour or so, to take the headset off and spend a few minutes focusing on objects in your environment at various distances. This is a play on the "20-20-20 rule." Current VR headsets only have one focal plane, and focusing on that single plane for too long can definitely contribute to eye strain.
  11. @alforno - It's technically possible, but the technology is not fully there yet. There is a SideQuest app called VRtous that demos the idea, but at the end of the day hand-tracking technology isn't fully baked right now, and there's a limit on how much precision you can get. In another generation or two, pass-through and hand tracking will be good enough for this to work really well.
  12. @Corvus I did not find it in the SRanipal SDK documentation. Could you please send me the link to this documentation? Thanks in advance.
  13. @Asish Here is the EyeData image included with the SDK documentation which has information about the available output data and gaze angles.
  14. @Annabell @killerbee @AstroVR If someone is not focusing on an object (i.e. "staring off into the air") then each eye's gaze vector will generally not intersect. Even with perfect tracking quality, the eyes may be looking parallel at a far enough away point and parallel lines do not intersect. https://en.wikipedia.org/wiki/Vergence
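The closest-approach construction behind that answer can be sketched in plain C#. This is an illustrative standalone snippet using System.Numerics, not SRanipal SDK code, and the names in it are made up: it estimates a convergence point as the midpoint of the shortest segment between the two gaze rays, and returns nothing when the rays are (near-)parallel, which is exactly the "staring off into the air" case where no intersection exists.

```csharp
using System;
using System.Numerics;

static class Vergence
{
    // Estimates the convergence point of two gaze rays as the midpoint of
    // their shortest connecting segment. Returns null when the rays are
    // (near-)parallel, i.e. there is no usable convergence point.
    public static Vector3? EstimateConvergence(
        Vector3 leftOrigin, Vector3 leftDir,
        Vector3 rightOrigin, Vector3 rightDir)
    {
        Vector3 a = Vector3.Normalize(leftDir);
        Vector3 b = Vector3.Normalize(rightDir);
        Vector3 w = leftOrigin - rightOrigin;

        float d = Vector3.Dot(a, b);
        float denom = 1f - d * d;        // approaches 0 as the rays become parallel
        if (denom < 1e-6f) return null;  // parallel gaze: lines never intersect

        // Parameters of the closest points on each ray (standard
        // closest-point-between-two-lines derivation).
        float s = (d * Vector3.Dot(b, w) - Vector3.Dot(a, w)) / denom;
        float t = (Vector3.Dot(b, w) - d * Vector3.Dot(a, w)) / denom;

        Vector3 pLeft = leftOrigin + s * a;
        Vector3 pRight = rightOrigin + t * b;
        return (pLeft + pRight) * 0.5f;
    }
}
```

With converging rays the two closest points coincide (or nearly so) and the midpoint is a reasonable fixation estimate; with parallel rays the denominator collapses, which is why a robust implementation must handle that case explicitly rather than assume an intersection.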
  15. I'm trying to measure gaze angles this way from the scene. Please let me know what you think, @Corvus.
  16. @Asish Were you able to measure the gaze angles? It should be possible with the gaze direction and Focus api.
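For reference, once you have a normalized gaze direction vector, the horizontal and vertical gaze angles can be recovered with basic trigonometry. The sketch below is plain C# with System.Numerics, not SDK code; the axis and sign conventions (+Z forward, +Y up) are assumptions, so check them against the EyeData diagram in the SDK documentation before relying on the signs.

```csharp
using System;
using System.Numerics;

static class GazeAngles
{
    // Decomposes a normalized gaze direction into a horizontal (yaw) and a
    // vertical (pitch) angle in degrees, relative to straight ahead (+Z).
    // Conventions here are illustrative: positive yaw is toward +X,
    // positive pitch is toward +Y.
    public static (double yawDeg, double pitchDeg) FromDirection(Vector3 gazeDir)
    {
        Vector3 d = Vector3.Normalize(gazeDir);
        double yaw = Math.Atan2(d.X, d.Z) * 180.0 / Math.PI;  // left/right
        double pitch = Math.Asin(d.Y) * 180.0 / Math.PI;      // up/down
        return (yaw, pitch);
    }
}
```

For example, a direction of (0, 0, 1) gives (0, 0), i.e. looking straight ahead, and (1, 0, 1) gives a 45-degree horizontal gaze angle with zero vertical offset.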
  17. @qxxxb @jboss Thanks for reporting this bug. The team will look into it.
  18. @monetl I'll try to respond to each of these questions, but let me know if you need further clarification on anything.
      1. I don't fully understand this question. Would you like clarification on the "Focus" API functionality, or are you looking for suggestions for an alternative solution? Please explain a little more or include a diagram.
      2. The IPD of the virtual cameras is generally handled automatically by the game engine and the plugins used. Are you using the legacy SteamVR plugin, the Unity XR SteamVR plugin, or the built-in SteamVR support in older Unity versions? Modifying the virtual camera IPD is not normal, but it was previously possible depending on the plugin/engine/rendering method.
      3. The Unity Update loop is called once per rendered frame, which is usually ~90 Hz when running with the Vive Pro Eye (unless performance issues are causing frames to drop). The callback method allows multi-threaded data access at the fastest rate the eye-tracking device supports, on average ~120 Hz.
      4. convergence_distance_mm is not implemented and will not report valid results. I believe this feature is available in the Tobii XR SDK, which does work with the VIVE Eye Tracking SDK (SRanipal).
      5. A bug related to timestamps was fixed in the latest release. Please test and confirm whether it is fixed for you.
      6. Are you able to reproduce this issue in the sample scene with the mirror and gaze ray? Can you take a screenshot or video?
      7. There is no access to the raw eye camera images. Some developers have explored this option with Tobii directly.
      8. Filtering is added to prevent gaze tremble (e.g. when looking at distant objects). Use SetEyeParameter to set the filtering level via "sensitive_factor"; adjusting this variable affects the cutoff slope of the filter. A value of 1 is the raw gaze vector with no post-processing filter; 0 is post-processed with the strongest filter to remove gaze tremble. The default value is 0.007; multiply or divide the variable by 10 until you notice the effect on the gaze data. Note: SetEyeParameter filtering only works on gaze rays.

         void SetFiltering(double filteringLevel)
         {
             EyeParameter parameter = new EyeParameter
             {
                 gaze_ray_parameter = new GazeRayParameter(),
             };
             parameter.gaze_ray_parameter.sensitive_factor = filteringLevel;
             Error error = SRanipal_Eye_API.SetEyeParameter(parameter);
         }

      9. This looks related to a previously reported bug: https://forum.vive.com/topic/9172-gaze-origin-bug-in-getgazeray-of-unity-package/?ct=1610732330
      10. You can use the "Combined Gaze Origin" to get the center point between the eyes. See the EyeData.png diagram in the documentation included with the SDK download.
  19. @Ihshan Gumilar What version of the eye tracking SDK & Runtime are you using? There are bug fixes in the latest release that fixed timestamp issues.
  20. @Davey I will try to look into the research to see if there have been tests of eye tracking with strabismus/esotropia conditions, it may also be helpful for you to contact Tobii about this inquiry. The SDK does report gaze direction for each eye independently.
  21. @Davey @nbhatia Calibration is a 3-step process that requires both eyes: Step 1 is HMD placement, Step 2 is IPD adjustment, Step 3 is the gaze dots.
      Can the calibration be done with only one eye? No, the calibration will not complete without both eyes.
      Can I modify the calibration data? No, this is currently not possible, but a number of devs have requested this feature, so we can re-examine it.
      Can I skip the calibration? For accurate eye tracking, the calibration is performed for each user when they use the HMD. Every face and HMD placement is different, so calibration optimizes for this variance. If calibration is skipped, the eye tracking usually functions but is less accurate.
      Is the data produced by each eye then assumed to be relative to the sensors, with the assumption that the HMD and sensors are centered perfectly over the eyes (obviously never the case)? The calibration and runtime report the eye position relative to the HMD (usually not perfectly centered, but ideally close).
      Feel free to check the documentation included with the SDK, or the webinars/guides, for information about the calibration specifics.
  22. @Emagnetto Thanks for sharing your feedback. Foveated rendering does not support the new Unity render pipelines (HDRP & URP). We will post in the forums if we have any updates to the roadmap or release schedule that we can share publicly. Also, if any other developers would like the plugin to support this feature in a future release please share your feedback here.
  23. Hi, would it be possible to play a real (physical) grand piano, i.e. see the whole keyboard, and at the same time be in a VR environment? Has this been done?
  24. For any devs experiencing error 50001: it is usually either the wrong app ID/key copied from the dev console, or not being logged into the Viveport PC client with the main developer account (the one that created the title listing). Most Viveport SDK API calls don't actually work fully with beta testers yet (only the DRM function fully works). If anyone else is experiencing this issue, please post on the forums or contact support.
  25. @jaalto1 You can find some helpful steps in the SRanipal getting-started guide included with the SDK download.
      1. Download the SRanipal SDK.
      2. Import the Unity plugin (.unitypackage) into the Unity project.
      3. Open the sample scene and play in the editor, or build.
      4. Check that eye tracking works in the sample scene and that the robot eyes are green in the notification tray.
  26. @ArAnish, if you start from a Blueprint project, you can always convert it to a C++ project by creating a new C++ class from the Unreal Editor (Add New --> C++ Class). Here is a link: https://allarsblog.com/2015/11/05/converting-bp-project-to-cpp/ You can use our SRanipal SDK in both a Blueprint-only project and a C++ project. We recommend the latter, because you can then easily extend what is available in the plugin, e.g. you can create extra BP nodes for your project.