
Tony PH Lin
Employee · 885 posts

Everything posted by Tony PH Lin

  1. Hi @Thomas B., Do you mean you're using an enterprise service (such as DMS/VBC) to do batch configuration? Please describe your setup in more detail or attach photos so we can direct you to the right support. Thanks.
  2. Hi @psychicvrlab, Same result here when we tried on Focus 3. 2022.1 works; 2022.2 does not, because Unity API parameters changed. We're still investigating where the conflict lies and trying to find a solution or workaround, but it may take some time. We'll keep you posted here. Thanks.
  3. Hi @Lacota, Thanks for reporting the issue. We have released version 5.2.1 to resolve the DLLNotFound issue. Please check below. https://developer.vive.com/resources/vive-wave/download/latest/ Let us know if this addresses the issue, thanks.
  4. Hi there, We're pleased to share an update enabling the Depth Sensor via the Beta ROM, so developers can start blending the real and virtual worlds to create new mixed-reality experiences. (How to join the Beta program.) Check out how it works in the Julbee MR demo video we showed at MWC and GDC. Note: In the latest Beta version, we have simplified the setup by integrating the steps into MR Room Setup in the VRS Launcher. MRJelbee_forDeveloper.mp4 You may also be interested in the features and SDK APIs for MR development, how to tune the performance of MR content, and how to manage alignment between virtual objects and passthrough. Here is more info to help you implement further: Tutorials for Mixed Reality Development, Unity Development, Unreal Development.
  5. Hi @jcm01, Scene Mesh can be broken down into two parts: scanning and reading. The demo scene you are looking at can only read a scene mesh, so before it can read anything, you need to scan and construct a scene mesh first. Scanning a scene mesh can only be done through a Beta ROM on XR Elite, which includes a version of MR Room Setup that supports scene mesh scanning. If you have access to that Beta ROM, you should be able to scan a scene mesh with MR Room Setup and then read the scanned mesh in the demo scene. Also, the visual and collider meshes are scene mesh data at different levels of detail, which can be used for different purposes as their names imply. To apply for the Beta ROM, please refer to
  6. Hi @vethetheth, Sorry for the late reply; we're just back from GDC. The newest Direct Preview, which supports both USB/Wi-Fi and hand tracking, is online in Wave 5.2.0. Please refer to the link for the new driver and setup steps. https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRDirectPreview.html Let us know if you hit any issue, and we will follow up to resolve it. Thanks.
  7. Hi @Dan Lauer, Thanks for the inquiry. OpenXR support for Unreal 5.1 is available now. https://developer.vive.com/resources/openxr/openxr-mobile/download/latest/ We keep working on the Wave version, and it will be in the next release (scheduled around May).
  8. Hi @Dan Lauer, Unreal 5.1 support in our next update in March is under development. However, we still have some integration issues with UE 5.1, so we may first release our support in the OpenXR plugin. For the WaveVR plugin, once we make progress, we'll let you know whether we can provide a Beta version for your trial. We'll keep you posted.
  9. Hi @julie.helbecque, @vethetheth, @Austin Sullivan Unity, Sorry for the late response. Currently we're refactoring Direct Preview to improve connection stability and user flow, and to add new features such as hand tracking support. We plan to ship the updated version with the next SDK release (in the March timeframe). URP support is on our roadmap, but it requires more effort and investigation to make the whole pipeline work. We'll post an update once the schedule is confirmed. Thanks.
  10. Hi @bjarmenzeit, @Austin Sullivan Unity, Currently we're refactoring Direct Preview to improve connection stability and user flow, and to add new features such as hand tracking support. We plan to ship the updated version with the next SDK release (in the March timeframe). URP support is on our roadmap, but it requires more effort and investigation to make the whole pipeline work. We'll post an update once the schedule is confirmed. Thanks.
  11. Hi @jalley, May I know your target device and intended application? We have moved most of our resources to OpenXR-related functionality, so once we know more about your application we can recommend the proper solution for you. Thanks.
  12. Hi @wzs, Are you looking for a way to disable the Chaperone when the user is out of the boundary? For example, an LBE or arcade use case with larger-space movement? If so, I'll route you to the enterprise forum, since we have different configurations for that kind of application. Thanks.
  13. Hi @Identical Josh, Thanks for your inquiries. You're right that we disabled direct access to the raw camera image data due to our privacy policy. But we do understand that many applications require the camera image for further development, so we provide alternative ways for developers to achieve similar goals and results. For example, we provide several passthrough methods in the latest Wave 5.1.0 and have also added a Scene SDK session for processing planes, anchors, meshes, etc. https://hub.vive.com/storage/docs/en-us/ReleaseNote.html#release-5-1-0 If you have any specific goals that are difficult to reach with the latest SDK, we'd like to learn about the use cases and discuss how to support those scenarios. Thanks.
  14. We did encounter a similar symptom when one of our projects targeted both Oculus and Wave: sometimes, even with Wave disabled in the XR Management settings, related entries still appeared in the logs. However, it wasn't 100% reproducible across Unity versions, and we suspect it is also related to the order in which the plugins are imported. Since it may be a Unity environment sequencing problem, we have no definitive answer at this moment. The workaround we used internally was to remove one of the plugins from the project and redo the target build. Maybe you can try that first; if the issue disappears, it's likely the same issue we met. This is the experience and suggestion we can share.
  15. Hi Sir, You can refer to our OpenXR-based tutorials to check whether they help with your case. Unity: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/interact-real-world-openxr-scene-understanding/ Unreal: https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unreal-engine/interact-real-world-openxr-scene-understanding/ Thanks.
  16. Hi All, The VRS-Studio open-source project has been updated to V0.3.3a on GitHub. Please visit the link, thanks. https://github.com/ViveSoftware/VRS-Studio
  17. Hi @vethetheth, Sorry for the late response. The delay may be due to hand occlusion. We have a FOTA 5.3 ROM update, so maybe you can try again with this latest ROM. As for the development question, we have run numerous builds and tests inside the headset. Regarding 32-bit builds, we think there is no functional difference; however, a 64-bit build should be less prone to out-of-memory issues (not that we have encountered memory issues on 32-bit so far). I'd personally recommend using 64-bit anyway. Hope this is helpful. Thanks.
  18. Hi @focus3fan, Thanks for bringing this question to us. Currently OpenXR has no defined way to trigger a re-center. For a system-level re-center, Focus 3 provides "Reset View" in the VIVE menu. For an in-app re-center, the app has to record the pose transform itself and convert each runtime pose with this re-center transform to obtain the target pose. Here is more description from OpenXR: When a user needs to recenter LOCAL space, a runtime may offer some system-level recentering interaction that is transparent to the application, but which causes the current leveled head space to become the new LOCAL space. When such a recentering occurs, the runtime must queue the XrEventDataReferenceSpaceChangePending event, with the recentered LOCAL space origin only taking effect for xrLocateSpace or xrLocateViews calls whose XrTime parameter is greater than or equal to the changeTime provided in that event. When the user redefines the origin or bounds of the current STAGE space, or the runtime otherwise switches to a new STAGE definition, the runtime must queue the XrEventDataReferenceSpaceChangePending event, with the new STAGE space origin only taking effect for xrLocateSpace or xrLocateViews calls whose XrTime parameter is greater than or equal to the changeTime provided in that event. The XrEventDataReferenceSpaceChangePending event is provided to the app by the OpenXR runtime. Thanks.
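  The in-app re-center approach described above (record the pose at re-center time, then convert later runtime poses with its inverse) can be sketched engine-agnostically. This is a minimal illustration, not the actual OpenXR or Wave API: poses here are simplified to yaw-plus-translation on the horizontal plane, and the `Pose` class and its methods are hypothetical helpers.

  ```python
  import math
  from dataclasses import dataclass

  @dataclass
  class Pose:
      """Simplified pose: yaw about the up axis plus translation on the floor plane.
      (A real app would use a full quaternion + 3D position.)"""
      yaw: float  # radians
      x: float
      z: float

      def inverse(self) -> "Pose":
          # Inverse transform: negate the yaw, then rotate the negated
          # translation into the inverted frame.
          c, s = math.cos(-self.yaw), math.sin(-self.yaw)
          return Pose(-self.yaw,
                      -(c * self.x - s * self.z),
                      -(s * self.x + c * self.z))

      def compose(self, other: "Pose") -> "Pose":
          # Apply `other` first, then `self` (translation of `other`
          # is rotated by this pose's yaw).
          c, s = math.cos(self.yaw), math.sin(self.yaw)
          return Pose(self.yaw + other.yaw,
                      self.x + c * other.x - s * other.z,
                      self.z + s * other.x + c * other.z)

  # When the user triggers re-center, record the current head pose and
  # keep its inverse as the re-center transform.
  head_at_recenter = Pose(yaw=math.pi / 2, x=1.0, z=2.0)
  recenter = head_at_recenter.inverse()

  # Every later runtime pose is pre-multiplied by that transform; the pose
  # captured at re-center time maps back to the origin.
  centered = recenter.compose(head_at_recenter)
  print(centered)  # approximately Pose(yaw=0, x=0, z=0)
  ```

  The same idea carries over to Unity or Unreal: cache the inverse of the head transform when the user asks to re-center, and apply it to tracked poses (or to a parent "play space" node) from then on.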
  19. Hi @ibrews, Thanks for pointing this out. We have also updated the project and made it public for all developers now.
  20. Hi @EddyChan, Please contact the call center for troubleshooting support. Thanks.
  21. Hi @Knase, Thanks for asking. Basically this post is mainly for PCVR content developers who use the OpenXR PC or SRanipal SDK, so they can stream PC content via VBS on Focus 3. We plan to roll out an official VIVE Wave SDK next week to support eye expression, along with an OpenXR Mobile plugin. You're welcome to get a pre-release version through our server; I'll PM you to set up access so you can try it and give feedback.
  22. Here is the tutorial for reference. https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unity/how-integrate-hand-tracking-data-your-hand-model/
  23. Hi All, We're pleased to share a new APK (v0.0.3) for Vive Reality System (VRS) Studio. VRSStudio_0.0.3_20220914.apk The main change adds a new mini game: dice and poker. The mini game demonstrates how to use your hands to interact with small objects and how to handle the case when two hands are close together. Thanks, and let us know your feedback. Note: You may need to log in to download the APK.