  1. Hi, I have a Vive Focus Plus, and recently it stopped detecting one of the controllers. I have changed the batteries, tried rebooting, etc. The controller vibrates if I press the home button, but nothing is detected by the headset, and the controller's white light blinks. At some point I had the brilliant idea of doing a factory reset. Now not only is one of the controllers not detected, but I cannot complete the setup (I'm stuck because it seems to require two working controllers). I thought it was a problem with the controller and sent it to Vive for repair, but today it came back and it's still not working (exactly the same issue). We use it for testing our products, so I really, really need a solution asap 😕 Any ideas? Thanks
  2. Yesterday I managed to set up the VIU on URP and it seems to work perfectly. Just to update everyone who, like me, was struggling with a 2019+ setup (after learning how to do it, it's pretty easy and straightforward btw), these are the final steps I'm using:
     1) Create a new URP project on Android
     2) Go to Project Settings -> XR Plug-in Management, and install it
     3) Open the Package Manager and enable Show preview packages
     4) Under Unity Registry, install XR Interaction Toolkit
     5) Open the Asset Store, search for VIVE Input Utility and install it
     6) Under Preferences -> VIU Settings, search for WaveVR, click on Add Vive Registry and press Add
     7) Press Add Wave XR Plugin Package
     8) Select WaveVR
     9) Right click on the Hierarchy -> XR -> Room-scale XR Rig
     10) Build and Run
     Thanks @Dario & @lawwong 🙂 Francesco
  3. Thanks Dario for the tip, I'll just write down what I did here in case someone needs it in the future. I'm using Unity 2019.4, and I cannot guarantee that all these steps are necessary, but at least now I have a few samples I can work with to understand how it works 🙂
     1) Open a new Android project, of course
     2) Go to Project Settings -> XR Plug-in Management, and install it
     3) Open the Asset Store, search for VIVE Registry Tool and install it
     4) Open the Package Manager and enable Show preview packages
     5) Under Unity Registry, install XR Interaction Toolkit
     6) Under My Registry, install all the VIVE Wave XR Plugin packages (default, essence and native)
     7) Return to Project Settings, and under XR Plug-in Management select Wave XR
     8) Open Wave XR (from the list on the left) and select Essence
     9) Here you can download a lot of samples
     Cheers, Francesco
  4. Hi, my only VR dev experience is with the Wave SDK in Unity; I've never tried any other framework. I was used to importing the WaveVR SDK, dragging in the WaveVR prefab, and manipulating it. Looking at the new release, I saw that it is now listed as "SDK for Unity Legacy Plugin (Unity 2017.4+)", and I would really like to use a 2019+ version of Unity. So I tried the other method, the VIU approach, but I'm a little lost. As far as I understand, this approach is more generic, and it will allow my program to compile for other devices as well. Of course, more generic implies that a lot of features already implemented in the SDK are not there, and I'm wondering if anyone can point me in the right direction. My questions are:
     1) Does the Origin setting from WaveVR_Render (WVR pose origin on the ground/head/etc.) exist somewhere? Or should I just place the prefab at an average human height?
     2) Are the four cameras (center, both, left, right) simply replaced by the camera inside the ViveCameraRig? Will any change there be applied to both eyes?
     3) If I want to change the rendering path to deferred, should I modify Project Settings -> Graphics directly?
     Thanks for any information you can provide, Francesco @Dario, @chengnay
  5. Update: I got a slightly better result with the beforeRenderEye callback:

     private void BeforeRenderEye(WaveVR_Render render, WVR_Eye eye, WaveVR_Camera wvrcamera)
     {
         var fakePlayerCamera = eye == WVR_Eye.WVR_Eye_Left ? fakePlayerCameraLeftEye : fakePlayerCameraRightEye;
         var playerCamera = eye == WVR_Eye.WVR_Eye_Left ? playerCameraLeftEye : playerCameraRightEye;
         var localToWorldMatrix = playerCamera.transform.localToWorldMatrix;
         fakePlayerCamera.projectionMatrix = playerCamera.projectionMatrix;
         localToWorldMatrix = otherPortal.transform.localToWorldMatrix * transform.worldToLocalMatrix * localToWorldMatrix;
         fakePlayerCamera.transform.SetPositionAndRotation(localToWorldMatrix.GetColumn(3), localToWorldMatrix.rotation);
         fakePlayerCamera.Render();
     }

     But there's a slight delay between the right and the left eye, causing an annoying asymmetric lag. Any suggestion is welcome 🙂
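For anyone trying to reproduce this: the per-eye method above has to be registered with the Wave renderer. The exact hook differs between Wave SDK versions, so treat the static event `WaveVR_Render.beforeRenderEye` below as an assumption to be checked against the WaveVR_Render source of your SDK; the field names match the snippet above. A minimal sketch:

```csharp
using UnityEngine;

public class PortalEyeRenderer : MonoBehaviour
{
    // Cameras and destination portal assigned in the Inspector,
    // as in the snippet above.
    public Camera fakePlayerCameraLeftEye, fakePlayerCameraRightEye;
    public Camera playerCameraLeftEye, playerCameraRightEye;
    public Transform otherPortal;

    private void OnEnable()
    {
        // Assumption: the SDK exposes the callback as a static event.
        WaveVR_Render.beforeRenderEye += BeforeRenderEye;
    }

    private void OnDisable()
    {
        // Always unsubscribe, or a destroyed object keeps receiving calls.
        WaveVR_Render.beforeRenderEye -= BeforeRenderEye;
    }

    private void BeforeRenderEye(WaveVR_Render render, WVR_Eye eye, WaveVR_Camera wvrcamera)
    {
        // Per-eye pose update and fakePlayerCamera.Render(),
        // exactly as in the snippet above.
    }
}
```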
  6. Hi, I'm trying to create a portal through which the user can see what's on the other side. To do that, I'm rendering two cameras (one for each eye) at the "destination", and the RenderTexture is projected with a simple custom shader onto a surface (two, actually, but it doesn't matter). My issue is that while it works in the simulator (both in the editor and in Direct Preview), on the device the portal cameras render at a different time than the surroundings, giving the impression of a delayed render and breaking the effect of a continuous area. Right now this is the LateUpdate code I'm using; does anyone have a clue why this doesn't render correctly?

     private void LateUpdate()
     {
         var localToWorldMatrixLeftEye = playerCameraLeftEye.transform.localToWorldMatrix;
         var localToWorldMatrixRightEye = playerCameraRightEye.transform.localToWorldMatrix;
         fakePlayerCameraLeftEye.projectionMatrix = playerCameraLeftEye.projectionMatrix;
         fakePlayerCameraRightEye.projectionMatrix = playerCameraRightEye.projectionMatrix;
         localToWorldMatrixLeftEye = otherPortal.transform.localToWorldMatrix * transform.worldToLocalMatrix * localToWorldMatrixLeftEye;
         localToWorldMatrixRightEye = otherPortal.transform.localToWorldMatrix * transform.worldToLocalMatrix * localToWorldMatrixRightEye;
         fakePlayerCameraLeftEye.transform.SetPositionAndRotation(localToWorldMatrixLeftEye.GetColumn(3), localToWorldMatrixLeftEye.rotation);
         fakePlayerCameraRightEye.transform.SetPositionAndRotation(localToWorldMatrixRightEye.GetColumn(3), localToWorldMatrixRightEye.rotation);
     }

     In the Start method I copy the settings of the right and left eye cameras, just to be sure to get the same FOV, position, etc. Any idea? Thanks, Francesco
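The matrix chain in the snippet is the standard portal "teleport": `transform.worldToLocalMatrix` expresses the eye camera's world pose relative to this portal, and `otherPortal.transform.localToWorldMatrix` re-expresses that same relative pose in the destination portal's frame. Factored into a helper, it reads (names here are illustrative, not from the post):

```csharp
using UnityEngine;

public static class PortalMath
{
    // World pose a portal camera should take at the destination portal so
    // that it sees the destination exactly as the player sees the source.
    public static Matrix4x4 MirrorPose(Transform source, Transform destination, Matrix4x4 cameraLocalToWorld)
    {
        // camera world pose -> source-portal local pose -> destination world pose
        return destination.localToWorldMatrix * source.worldToLocalMatrix * cameraLocalToWorld;
    }
}
```

With this, each eye update in LateUpdate reduces to one call, e.g. `PortalMath.MirrorPose(transform, otherPortal.transform, playerCameraLeftEye.transform.localToWorldMatrix)`, followed by SetPositionAndRotation as above.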
  7. Thanks, is there a model for the Focus Plus & controllers as well? @Tony PH Lin
  8. Amazing, thanks for the 3.1.1 🙂 The docs are still the ones for 3.0.2; I think the link has not been updated 🤨
  9. Hi everyone, I am trying to add camera shake following this tutorial: http://www.zulubo.com/gamedev/2019/1/5/vr-screen-shake-the-art-of-not-throwing-up Sadly, while it works inside the Unity editor (I'm using Unity 2018.2.21f1), it does nothing when deployed to the Vive Focus Plus. You can find the slightly modified script attached here; if anyone could point me in the right direction it would be amazing, and perhaps it would benefit other users as well 🙂 Thank you, Francesco ScreenShakeVR.shader ScreenShakeVR.cs
  10. Hi, I'm having trouble downloading the SDK; I get a permission denied. Is it just me, or is the download link broken? Thanks, Francesco