
dario


Everything posted by dario

  1. Most of our effort is going into the new Wave XR plugin (2019.3+), especially since the current XR support is deprecated and will be removed in 2020.1, which is also forthcoming. It was only via the new XR architecture (2019.3+) that we were able to support LWRP/URP. Unity kept the LWRP namespace so as not to break existing apps.
  2. Yes, as referenced in this thread, you can start using the Mock HMD XR plugin for now to access single-pass URP, until the Wave XR plugin is available.
  3. Yes, we've been testing it; stay tuned.
  4. LWRP == URP. Regarding SRP, we'll have a new XR plugin shortly, and we'll check on the two signals you mentioned - thanks!
  5. Hi @razoredge Thanks for the feedback; many of these settings depend on the application. Additionally, for best performance, both Unity plugins (Wave and VIU) will prompt you for recommended settings that you can accept or ignore. For the most part these haven't been changing with each Unity release. Please do note that if you're using recent Unity features like ECS or DOTS, you may need to change your scripting backend to IL2CPP and .NET to 4.x (e.g. our latest SRWorks SDK now requires .NET 4.x). Most of the time, following Unity's defaults (and the Wave SDK's recommended defaults) should suffice, depending on your application's requirements. Unity documentation: https://docs.unity3d.com/Manual/class-PlayerSettingsAndroid.html#Configuration
  6. Good news on the URP and single-pass support: the current early access SDK (3.1.94) and preview ROM (see the pinned thread for details on how to get access) have it working with the Mock HMD XR plugin (the Wave XR plugin is still forthcoming). As with all transitions to URP, it depends on your shaders (if you haven't transitioned yet); the shader graph is tested as working, and URP does improve performance. Here are a couple of screenshots of the project settings for enabling URP, as well as the XR plugin settings. Here's a quick review of how to add URP to your project (after installing URP and the Mock HMD XR plugin via the package manager): 1) Create a pipeline asset under Assets - right-click on Assets -> Create -> Rendering -> URP. 2) Add it to the Graphics project setting (top screenshot). 3) Optionally convert shaders for specific materials or the whole project: Edit -> Render Pipeline -> URP...
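In addition to assigning the pipeline asset in Project Settings -> Graphics (step 2 above), the same assignment can be made from a script. This is a minimal sketch, assuming a URP pipeline asset named "MyUrpAsset" exists in a Resources folder (that name is a placeholder, not something from the SDK):

```csharp
// Sketch: assigning a URP pipeline asset at startup from code.
// "MyUrpAsset" is a hypothetical asset name for illustration only.
using UnityEngine;
using UnityEngine.Rendering;

public class EnableUrp : MonoBehaviour
{
    void Awake()
    {
        var asset = Resources.Load<RenderPipelineAsset>("MyUrpAsset");
        if (asset != null)
        {
            // Equivalent to setting the asset in Project Settings -> Graphics.
            GraphicsSettings.renderPipelineAsset = asset;
        }
    }
}
```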
  7. It shouldn't be necessary to modify the AndroidManifest dynamically; you should be able to edit the one in your project (e.g. to add permissions, etc.). Or you can avoid the manifest approach entirely and just call the API directly: wvr.Interop.WVR_SetInteractionMode(wvr.WVR_InteractionMode.WVR_InteractionMode_Gaze);
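For reference, this is the kind of static manifest edit meant above - a hedged sketch of a project-level AndroidManifest.xml (typically under Assets/Plugins/Android in Unity); the package name and permission shown are placeholders, not values required by the SDK:

```xml
<!-- Hypothetical fragment: Assets/Plugins/Android/AndroidManifest.xml.
     com.example.myapp and RECORD_AUDIO are illustrative placeholders;
     declare whichever permissions your app actually needs. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.myapp">
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
</manifest>
```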
  8. The package and activity names should be discoverable, along with those of all installed apps.
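One way to discover those names from within a Unity app is to query the Android PackageManager through Unity's JNI wrappers. A minimal sketch (Android builds only; the class and method names used are standard Android/Unity APIs):

```csharp
// Sketch: logging the package names of all installed apps on the device,
// using Unity's AndroidJavaClass/AndroidJavaObject JNI bridge.
using UnityEngine;

public static class PackageLister
{
    public static void LogInstalledPackages()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var pm = activity.Call<AndroidJavaObject>("getPackageManager"))
        using (var apps = pm.Call<AndroidJavaObject>("getInstalledApplications", 0))
        {
            int count = apps.Call<int>("size");
            for (int i = 0; i < count; i++)
            {
                var app = apps.Call<AndroidJavaObject>("get", i);
                Debug.Log(app.Get<string>("packageName"));
            }
        }
#endif
    }
}
```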
  9. Just to clarify: the new XR plugin architecture from Unity will be supported. In the meantime, for cross-platform development you can continue using both the VIU and Wave SDKs for best results.
  10. @lv1 We would need to update the streaming app to provide that additional option. In the meantime, you could write a mobile app that sets that feature and then launches the streaming app, which you could publish on Viveport and brand.
  11. @lv1, Right - so in this case, until we add that option to the Viveport Streaming app, you would have to write an app that calls that API before starting Viveport Streaming. Technically, you should be able to launch Viveport Streaming from this app after calling the API, and this app could be published and branded by you.
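The "launch another app" part above can be sketched in Unity like this, using the standard Android PackageManager.getLaunchIntentForPackage API. The package name passed in is a placeholder - the real Viveport Streaming package name is not given here:

```csharp
// Sketch (Android builds only): launching another installed app by its
// package name after setting your option. "com.example.streaming" is a
// hypothetical placeholder, not the actual Viveport Streaming package name.
using UnityEngine;

public static class AppLauncher
{
    public static void Launch(string packageName) // e.g. "com.example.streaming"
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var pm = activity.Call<AndroidJavaObject>("getPackageManager"))
        {
            var intent = pm.Call<AndroidJavaObject>("getLaunchIntentForPackage", packageName);
            if (intent != null)
            {
                activity.Call("startActivity", intent);
            }
        }
#endif
    }
}
```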
  12. For those interested in another way to disable the pairing popup dialog, we also have this API: wvr.Interop.WVR_SetInteractionMode(wvr.WVR_InteractionMode.WVR_InteractionMode_Gaze); Remember that the original approach using intents no longer works; the current approach using the manifest should still work.
  13. Please check out last year's GDC Vive Developer Day presentations; we'll also have new GDC 2020 sessions shortly as webinars. You're not limited to that old Unity blog anymore (plus OpenVR has changed since). Again, the SRWorks SDK is your best bet for the Vive Pro and Vive Cosmos, and for the upcoming XR faceplate for Cosmos. Upcoming webinar sessions: https://blg.vive.com/us/2020/03/24/htc-vives-gdc-sessions-go-live/ Last year's sessions: https://www.youtube.com/watch?list=PL1JC6N9u4ERirCMPljaSzud3UNNrM2PNB Again, this is AR for developers to include in their apps - for overlay solutions that work across any app, I would recommend the aardvarkxr project: https://github.com/aardvarkxr/aardvark You can always play games with pass-through on if you set the mode to transparent (no filter effects); it works well for applications with black backgrounds - e.g. Tilt Brush.
  14. Just want to emphasize that stereo separation / IPD can be handled with SRWorks, and that we include an overlay example. Yes, this has to be included in your app and is not a third-party overlay solution for all existing apps, but you can definitely experiment with more AR/MR features until a universal overlay solution is available.
  15. Hi @jakelogg, For MR to be properly managed in your application, you need the camera characteristics and, as you mentioned, the ability to manage undistorted and distorted images. There's limited support in the OpenVR APIs for everything XR that can be done with the front-facing cameras. Starting with the Vive Pro, we introduced our AR/MR "XR" SDK called SRWorks. You can read more here, with example videos: https://developer.vive.com/resources/knowledgebase/intro-vive-srworks-sdk/ This SDK supports the Vive Pro, Vive Pro Eye, and Vive Cosmos, as well as the Vive Cosmos + XR faceplate (higher-res cameras). You can even manage the calibration between the camera positions and store it so that every app can use that calibration.
  16. In case this is related: https://steamcommunity.com/app/250820/discussions/3/1752402219403291891/
  17. https://forum.vive.com/topic/2408-android-debug-bridge-tips/?tab=comments#comment-2415
  18. https://forum.vive.com/topic/2408-android-debug-bridge-tips/?tab=comments#comment-2415
  19. Gaze control (a cursor in the center of the view, driven by head tracking) is available if the application developer adds it. We have sample code for this in the Wave SDK. Ideally, developers should consider fallback support: controllers -> hand tracking -> eye tracking or basic gaze support (roughly in that order, based on hardware). @Keane0411
  20. Did you see the Focus example (dart boards)? Check out the Focus callback on the dartboard object's script, as well as the script on its parent.
  21. Yes, you need the base station and controller to run setup. To make sure your PC is VR-ready and capable, you can run the SteamVR Performance Test on Steam. @affan101
  22. If you can, try testing with different cables to confirm it's not a cable issue (check for loose connections). Are you using USB 3 ports?
  23. If you hear the USB connected/disconnected sounds please check your cables (try swapping them with other cables if possible to confirm) or check for loose connections.
  24. Is this with Unity? If so, please consider using the VIU plugin from the Asset Store: just drop ViveRig into your scene, and controllers for the Vive Focus, Focus+, and all the Vive PC HMDs (original, Pro, Pro Eye), as well as all other platforms (Quest, WMR, etc.), will be supported. @dvp_dominic