Daniel_Y

Employee
  • Posts: 251
  • Joined
  • Last visited

Everything posted by Daniel_Y

  1. @mstengel You could leave a request in this thread, List of Supported GPUs, for the v0.8.5.0.0 beta program.
  2. Accuracy is 0.5° to 1.1° within a 20° FOV, and is not related to distance. Eye surgery, eye disease, heavy makeup, and high myopia may affect eye-tracking performance. @dagillespie
  3. May I have your detailed test scenario? Can it always be reproduced?
  4. A simpler way: 1. Perform 3D reconstruction as in this video first and save the scanned result as an OBJ. 2. Reload the OBJ into your scene. 3. Through the SRWorks Unity API, e.g. GetAllHorizontal(Vertical) / GetLargestHorizontal(Vertical) / GetColliderWithProperties(property array), get the desired collider properties for your virtual-object placement (a minimal placement sketch is included after this list). 4. Or you could even perform 3D semantic segmentation as in this video to find a wall or floor. Sample code is provided in Experience_Unity > Assets > ViveSR_Experience > Scenes\Sample9 – Semantic Segmentation.
  5. The current version on the developer website, v0.8.0.2, does not support the Turing architecture, e.g. the RTX 2080. You could leave a request in this thread, List of Supported GPUs, for the beta trial program.
  6. May I know what your display resolution is? The Vive Pro Eye's front camera is a 640x480 fisheye camera and the HMD's panel is 1440x1600 per eye, so the image may not look as good as it does on the desktop display.
  7. Do you use EyeData.timestamp in your code? v1.1.2.0 addressed an issue that existed in v1.1.0.1 where the timestamp was returned in the wrong unit. If you added your own workaround for that in your code with v1.1.0.1, it may behave unexpectedly with v1.1.2.0 (a small sanity-check sketch is included after this list).
  8. v2 is recommended. The major differences are: the O-mouth is split into multiple blend shapes for better animation; the tongue is supported; and the eye's Wide blend shape is supported.
  9. The Vive Cosmos's passthrough uses a different pipeline from SRWorks with the Vive Pro. The passthrough latency of SRWorks with the Vive Pro has been ~100 ms since v0.7.5.0.0.
  10. It has been addressed in SR_runtime v1.1.2.0. Please update it here: https://hub.vive.com/en-US/download.
  11. You could try eroding and dilating in OpenCV to remove the small islands (a short sketch is included after this list), referring to https://docs.opencv.org/2.4/doc/tutorials/imgproc/erosion_dilatation/erosion_dilatation.html
  12. If you do not plan to use SteamVR, how do you render the content to the HMD?
  13. "eye_frown" and "eye_squeeze" data has not been supported in the current version.
  14. @wky1995 SRWorks is not formally supported on the Cosmos at this time. We will have more communication about SDK support on Cosmos in the coming weeks.
  15. As its name suggests, SingleEyeData only gives you a single eye's information, without the relation between the two eyes. Thus, you need to use the OpenVR API to get the current distance between the HMD's two lenses, i.e. the IPD (see the OpenVR sketch after this list).
  16. A callback mode running at 120 FPS is now supported: check "Enable Eye Data Callback", then register a callback function with SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(). You can refer to its usage in the sample included in the SDK; a minimal registration sketch is also included after this list.
  17. After you perform eye calibration, you can read the IPD via SteamVR/OpenVR (see the OpenVR sketch after this list).
  18. Leave your request for v0.8.5.0 in this thread: List of Supported GPUs.
  19. @yanscheu You could treat the gaze vector as mapping onto an HFOV x VFOV of ~110 degrees covering the 1440x1600 resolution per eye, although this is not precise, because the image is shown on the panel through a Fresnel lens and is distorted before it reaches your eye (a rough mapping sketch is included after this list).
  20. One possibility is that OpenCL is not configured correctly on your PC. You could use a tool like GPU-Z to check whether OpenCL is detectable on your PC. @Mila
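
A minimal Unity sketch for item 4, showing one way to place a virtual object once the reconstructed OBJ is back in the scene: raycast down onto the scanned mesh and snap the object to the hit point. The layer name "ScannedMesh" and the component layout are example assumptions, not part of SRWorks; the SRWorks collider queries named in that post would replace or refine the plain raycast.

    using UnityEngine;

    // Sketch: drop a virtual object onto the scanned (reconstructed) mesh.
    // Assumes the loaded OBJ has a MeshCollider and sits on a layer named
    // "ScannedMesh" (example name, not an SRWorks convention).
    public class PlaceOnScannedSurface : MonoBehaviour
    {
        public GameObject objectToPlace;          // the virtual object to position
        public Vector3 dropPoint = Vector3.zero;  // a point above the expected floor/table

        void Start()
        {
            int mask = LayerMask.GetMask("ScannedMesh");
            // Cast straight down to find the first surface of the scan below the drop point.
            if (Physics.Raycast(dropPoint + Vector3.up * 2f, Vector3.down, out RaycastHit hit, 10f, mask))
            {
                objectToPlace.transform.position = hit.point;
                // Optionally align the object with the surface normal.
                objectToPlace.transform.up = hit.normal;
            }
        }
    }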
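
A small sketch for item 7, assuming you obtain EyeData.timestamp either by polling or from the callback described in item 16: it logs the delta between consecutive timestamps so you can confirm which unit your installed runtime reports. The class and method names are just examples, and the int type for the timestamp is an assumption.

    using UnityEngine;

    // Sketch: pass each new EyeData.timestamp value into Report() and watch the deltas.
    public static class TimestampCheck
    {
        private static int lastTimestamp = -1;

        public static void Report(int timestamp)
        {
            if (lastTimestamp >= 0)
            {
                int delta = timestamp - lastTimestamp;
                // At 120 Hz, a millisecond timestamp should advance by roughly 8 per sample.
                Debug.Log("EyeData.timestamp delta: " + delta);
            }
            lastTimestamp = timestamp;
        }
    }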
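
A short sketch for item 11 of the erosion-plus-dilation (morphological opening) approach. It uses OpenCvSharp as the .NET wrapper purely for illustration; the linked tutorial shows the equivalent calls in C++.

    using OpenCvSharp;

    // Sketch: remove small isolated blobs ("islands") from a binary mask with
    // erosion followed by dilation, i.e. a morphological opening.
    class IslandRemoval
    {
        static void Main()
        {
            Mat mask = Cv2.ImRead("mask.png", ImreadModes.Grayscale);   // example binary mask
            Mat kernel = Cv2.GetStructuringElement(MorphShapes.Ellipse, new Size(5, 5));
            Mat cleaned = new Mat();

            // Opening = erosion followed by dilation: islands smaller than the kernel
            // disappear, while larger regions keep roughly their original size.
            Cv2.MorphologyEx(mask, cleaned, MorphTypes.Open, kernel);

            Cv2.ImWrite("mask_cleaned.png", cleaned);
        }
    }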
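
A minimal sketch for items 15 and 17, reading the current IPD through the OpenVR C# bindings (Valve.VR) that ship with the SteamVR Unity plugin. Prop_UserIpdMeters_Float is the lens-separation property; polling it every frame is just for illustration.

    using UnityEngine;
    using Valve.VR;

    // Sketch: read the HMD's current lens separation (IPD) in meters via OpenVR.
    public class IpdReader : MonoBehaviour
    {
        void Update()
        {
            CVRSystem system = OpenVR.System;
            if (system == null) return; // OpenVR not initialized yet

            ETrackedPropertyError error = ETrackedPropertyError.TrackedProp_Success;
            float ipdMeters = system.GetFloatTrackedDeviceProperty(
                OpenVR.k_unTrackedDeviceIndex_Hmd,
                ETrackedDeviceProperty.Prop_UserIpdMeters_Float,
                ref error);

            if (error == ETrackedPropertyError.TrackedProp_Success)
                Debug.Log("Current IPD: " + (ipdMeters * 1000f) + " mm");
        }
    }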
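
A minimal sketch for item 16, registering the eye-data callback with SRanipal_Eye_v2.WrapperRegisterEyeDataCallback() following the pattern of the SDK sample. It assumes "Enable Eye Data Callback" is checked on the SRanipal eye framework component and omits the sample's framework status checks.

    using System.Runtime.InteropServices;
    using UnityEngine;
    using ViveSR.anipal.Eye;

    // Sketch: register a 120 FPS eye-data callback with SRanipal_Eye_v2.
    public class EyeCallbackRegistration : MonoBehaviour
    {
        private static bool registered = false;

        void Update()
        {
            if (!registered)
            {
                // Pass a function pointer to the static callback, as in the SDK sample.
                SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
                    Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                registered = true;
            }
        }

        // Called by the SRanipal runtime at up to 120 Hz, on a runtime thread
        // (not Unity's main thread), so keep the work here lightweight.
        private static void EyeCallback(ref EyeData_v2 eyeData)
        {
            float leftOpenness = eyeData.verbose_data.left.eye_openness;
            // Hand the data off to the main thread before calling Unity APIs.
        }
    }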
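
A rough sketch for item 19 of the approximation described there: mapping a normalized gaze direction onto per-eye pixel coordinates by assuming the 1440x1600 panel spans about 110 degrees. It ignores lens distortion, exactly as the post cautions, and the constants are the nominal values quoted in the post.

    using UnityEngine;

    // Sketch: approximate mapping from a normalized gaze direction (in eye/HMD space,
    // +z forward) to a per-eye pixel coordinate, assuming ~110 degrees across 1440x1600.
    // Lens distortion is ignored, so the result is only a rough estimate.
    public static class GazeToPixel
    {
        const float FovDegrees = 110f;
        const float Width = 1440f;
        const float Height = 1600f;

        public static Vector2 Map(Vector3 gazeDirection)
        {
            // Horizontal and vertical angles of the gaze ray relative to straight ahead.
            float yawDeg = Mathf.Atan2(gazeDirection.x, gazeDirection.z) * Mathf.Rad2Deg;
            float pitchDeg = Mathf.Atan2(gazeDirection.y, gazeDirection.z) * Mathf.Rad2Deg;

            // Map [-FOV/2, +FOV/2] linearly onto [0, resolution].
            float px = (yawDeg / FovDegrees + 0.5f) * Width;
            float py = (pitchDeg / FovDegrees + 0.5f) * Height;
            return new Vector2(px, py);
        }
    }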