Everything posted by Daniel_Y

  1. A simpler way:
     1. Perform 3D reconstruction (as in this video) first and save the scanned result as an OBJ.
     2. Reload the OBJ into your scene.
     3. Through the SRWorks Unity API, e.g. GetAllHorizontal(Vertical) / GetLargestHorizontal(Vertical) / GetColliderWithProperties(property array), get the collider properties you want for placing your virtual object.
     4. Or you could even perform 3D semantic segmentation (as in this video) to find a wall or floor. Sample code is provided in Experience_Unity > Assets > ViveSR_Experience > Scenes\Sample9 – Semantic Segmentation.
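To illustrate the placement idea in step 3 above, here is a hedged, generic sketch (plain Python, not the SRWorks API): given plane candidates from a reconstruction, pick the largest horizontal plane as a placement surface, which is the same idea GetLargestHorizontal() exposes. The plane representation (normal, area) is an assumption for illustration.

```python
# Generic sketch, NOT the SRWorks API: select the largest roughly
# horizontal plane from reconstructed plane candidates.

def largest_horizontal(planes, up=(0.0, 1.0, 0.0), cos_tol=0.95):
    """planes: list of (normal, area) tuples with unit normals.

    Returns the largest plane whose normal is nearly parallel to
    'up' (i.e. a horizontal surface), or None if there is none.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    horizontal = [p for p in planes if abs(dot(p[0], up)) >= cos_tol]
    return max(horizontal, key=lambda p: p[1], default=None)
```

A floor (normal pointing up) and a ceiling (normal pointing down) both count as horizontal here, which is why the dot product is taken with abs().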
  2. The current version on the developer website, v0.8.0.2, does not support the Turing architecture, e.g. the RTX 2080. You could leave a request in this thread, List of Supported GPUs, for the beta trial program.
  3. May I know what your display resolution is? The Vive Pro's front camera is a 640x480 fisheye camera, while the HMD's panel is 1440x1600 per eye, so the passthrough may not look as good as it does on the desktop display.
  4. Do you use EyeData.timestamp in your code? v1.1.2.0 addressed an issue that existed in v. where the timestamp was returned in a wrong unit. If you wrote your own workaround for it against v., it may behave unexpectedly with v1.1.2.0.
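If you are unsure which unit your installed runtime is reporting, a hedged sanity check like the following (plain Python, not part of SRanipal; the 120 Hz assumption and thresholds are illustrative) can flag it from the delta between two consecutive frame timestamps: at 120 Hz, consecutive timestamps should differ by roughly 8.3 in milliseconds or roughly 8333 in microseconds.

```python
# Hedged sketch, NOT part of the SRanipal SDK: infer whether two
# consecutive eye-data timestamps are in milliseconds or microseconds,
# assuming the eye tracker runs at ~120 Hz.

def infer_timestamp_unit(t_prev, t_curr, fps=120.0):
    """Guess the timestamp unit from one inter-frame delta."""
    delta = abs(t_curr - t_prev)
    expected_ms = 1000.0 / fps          # ~8.33 ms per frame at 120 Hz
    if delta == 0:
        return "unknown"
    ratio = delta / expected_ms
    if 0.5 <= ratio <= 2.0:             # delta looks like milliseconds
        return "ms"
    if 500.0 <= ratio <= 2000.0:        # delta looks like microseconds
        return "us"
    return "unknown"
```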
  5. v2 is recommended. The major differences: the O-mouth is split into multiple blend shapes for better animation, the tongue is supported, and the eye's Wide blend shape is supported.
  6. The VIVE Cosmos's passthrough uses a different pipeline from SRWorks with the Vive Pro. The passthrough latency of SRWorks with the Vive Pro has been ~100 ms since v0.
  7. @marcellotham VIVE_SRanipalInstaller_1.1.2.0.msi + SRanipal_SDK_1.1.0.1.zip
  8. It has been addressed in SR_runtime v1.1.2.0. Please update it here: https://hub.vive.com/en-US/download.
  9. You could try eroding and dilating in OpenCV to remove small islands; see https://docs.opencv.org/2.4/doc/tutorials/imgproc/erosion_dilatation/erosion_dilatation.html
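The idea above (erosion followed by dilation, i.e. a morphological opening) can be sketched in pure Python so it runs without OpenCV installed; in practice you would use cv2.erode / cv2.dilate instead. The 3x3 kernel and binary-grid representation here are illustrative assumptions.

```python
# Pure-Python sketch of morphological opening (erode then dilate) with
# a 3x3 rectangular kernel, mirroring what cv2.erode / cv2.dilate do.
# Erosion removes islands smaller than the kernel; dilation restores
# the size of the surviving regions.

def _neighbourhood(mask, y, x):
    h, w = len(mask), len(mask[0])
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = y + dy, x + dx
            # out-of-bounds cells count as background (0)
            yield mask[ny][nx] if 0 <= ny < h and 0 <= nx < w else 0

def erode(mask):
    """Keep a cell only if its entire 3x3 neighbourhood is set."""
    return [[1 if all(_neighbourhood(mask, y, x)) else 0
             for x in range(len(mask[0]))]
            for y in range(len(mask))]

def dilate(mask):
    """Set a cell if any cell in its 3x3 neighbourhood is set."""
    return [[1 if any(_neighbourhood(mask, y, x)) else 0
             for x in range(len(mask[0]))]
            for y in range(len(mask))]

def remove_small_islands(mask):
    """Opening: erosion deletes small islands, dilation restores shape."""
    return dilate(erode(mask))
```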
  10. If you do not plan to use SteamVR, how do you render the content to the HMD?
  11. The "eye_frown" and "eye_squeeze" data are not supported in the current version.
  12. @wky1995 SRWorks is not formally supported on the Cosmos at this time. We will have more communication about SDK support on Cosmos in the coming weeks.
  13. As its name suggests, SingleEyeData only gives you a single eye's information, without the relation between the two eyes. Thus, you need to use the OpenVR API to get the current distance between the HMD's two lenses, i.e. the IPD.
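Once you have the IPD from OpenVR, one thing the two-eye relation buys you is an estimate of the convergence (fixation) distance, by intersecting the two gaze rays. This is a hedged, generic sketch in plain Python, not the SRanipal or OpenVR API; the coordinate convention (eyes on the x-axis, z forward) is an assumption.

```python
# Hedged sketch, NOT an SDK API: estimate how far away the user is
# fixating from the IPD and the two gaze directions. Eyes are placed
# at (-ipd/2, 0, 0) and (+ipd/2, 0, 0); gaze directions are (x, y, z)
# with z pointing forward.

def convergence_distance(ipd_m, left_dir, right_dir):
    """Distance (in metres) from the eye baseline to where the two
    gaze rays cross in the horizontal plane, or inf if parallel."""
    lx, _, lz = left_dir
    rx, _, rz = right_dir
    sl = lx / lz                    # horizontal slope of the left ray
    sr = rx / rz                    # horizontal slope of the right ray
    if sl == sr:
        return float("inf")         # parallel gaze: looking at infinity
    # x_left(z) = -ipd/2 + z*sl and x_right(z) = +ipd/2 + z*sr
    # are equal at z = ipd / (sl - sr)
    return ipd_m / (sl - sr)
```

For converging eyes the left eye looks slightly right (sl > 0) and the right eye slightly left (sr < 0), so the result is positive.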
  14. Callback mode at 120 FPS is now supported: check "Enable Eye Data Callback" as shown in the figure below, then register a callback function via SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(). You can refer to its usage in the sample included in the SDK.
  15. After you perform eye calibration, you can read the IPD via SteamVR/OpenVR.
  16. Leave your request for v0.8.5.0 in this thread: List of Supported GPUs.
  17. @yanscheu You could approximate the mapping by assuming your gaze vector's HFOVxVFOV is ~110 degrees covering the 1440x1600 resolution per eye, although this is not precise, because the image shown on the LCD panel is distorted by a Fresnel lens before it reaches your eye.
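The approximation above can be sketched as a simple pinhole projection: map a gaze direction to a pixel on the 1440x1600 panel assuming a symmetric ~110 degree field of view. As the post says, the Fresnel-lens distortion is not modelled, so this is only a rough mapping; the symmetric-FOV pinhole model is an assumption.

```python
import math

# Hedged sketch: project a normalized gaze direction (x, y, z), with z
# pointing forward, onto approximate 1440x1600 per-eye panel pixel
# coordinates, assuming a symmetric ~110 deg FOV and a pinhole model.
# The Fresnel lens warp is deliberately ignored, so this is imprecise.

PANEL_W, PANEL_H = 1440, 1600
FOV_DEG = 110.0

def gaze_to_pixel(gx, gy, gz, width=PANEL_W, height=PANEL_H, fov_deg=FOV_DEG):
    """(0, 0) is the top-left pixel; gaze (0, 0, 1) maps to the centre."""
    half_fov = math.radians(fov_deg / 2.0)
    # tangent-plane projection, normalized so the FOV edge maps to +/-1
    u = (gx / gz) / math.tan(half_fov)
    v = (gy / gz) / math.tan(half_fov)
    px = (u + 1.0) * 0.5 * width
    py = (1.0 - v) * 0.5 * height       # panel y grows downward
    return px, py
```

Looking straight ahead lands on the panel centre (720, 800), and a gaze tilted 55 degrees horizontally (half of the assumed 110 degree FOV) lands on the panel edge.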
  18. One possibility is that OpenCL is not configured correctly on your PC. You could use a tool like GPU-Z to check whether OpenCL is detectable on your PC. @Mila
  19. Which 2D system are you referring to? The LCD panel?
  20. Have you checked whether you enabled the camera in SteamVR, as described here: https://developer.vive.com/resources/knowledgebase/vive-pro-srworks-sdk-qa/?
  21. This is unexpected behavior; the timestamp is expected to be in ms. We will provide a fix for it.