
Dominique

Verified Members
  • Posts: 5
  • Reputation: 0 Neutral
  1. Hi, I didn't know such objects existed! I will order one and try it out. Thanks!
  2. Hello there, I am working on a small simulation bay with one master computer and two worker computers. The worker computers have the Vive Wireless adapters installed, and when I plug an HDMI screen into a worker computer, the headset is detected and works fine.

     However, in the final workflow the worker computers would have no HDMI screen connected at all; they would be accessed through Windows's "Remote Desktop" utility. The issue is that when no screen is connected, such as at startup when the Vive Wireless app launches at boot, or when I launch it manually over Remote Desktop, the headset is detected but doesn't work and asks me to check the cable between the Vive Wireless adapter and the HMD. I am fairly sure the cable isn't the problem, since if I simply reconnect a screen over HDMI and restart the Vive Wireless software, everything works fine again.

     Is there any workaround for this issue? Thanks! Small visual representation of the setup (yeah, my paint skills are definitely not great)
  3. Hi everyone! I am asking again how I could set up a VR experience with 2-3 VR systems in the same physical space. For a more complete picture, here is the intended setup in Unreal Engine:

     - One computer (without HMD) that only tracks Vive Tracker 3.0 devices, to represent and manage complex simulations of a 3D object such as a turret
     - One computer (with HMD) where the player can interact with and manipulate the 3D object
     - One computer (with HMD) where the player can oversee the other player and lightly interact with the 3D object

     Right now I am trying to make only 1 and 2 work together by manually syncing the world positions of the Vive trackers over the network, but it doesn't work as expected. When I tried to copy and paste the config files (chaperone and lighthousedb), the VR player did not end up at the correct distance from the 3D model. I'm not sure I did everything right; I have some experience in 3D graphics, but game engines and gameplay are very recent territory for me. I also read something about LBE (location-based entertainment) but didn't understand what it is. Thanks in advance!

     Here are some links I already found that didn't help:
     https://forum.vive.com/topic/6443-vive-cosmos-multiplayer-in-same-room/
     > SteamVR has implemented a bunch of error correction methodologies within the last year which actually makes a chaperone file dynamic - SteamVR now makes adjustments to the file at run-time without the user being aware of the changes which makes co-presence VR rather tricky (we have a tool for arcades that attempts to restrict these changes).
     https://forum.vive.com/topic/2828-4-players-in-same-play-area/
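     Manually syncing tracker positions between machines generally needs a frame-alignment step, because each machine's SteamVR origin differs by a yaw rotation plus a translation (assuming both floors were levelled by room setup). A minimal self-contained sketch of recovering that offset from two samples of the same tracker as seen by both machines; all coordinates here are made-up example values, not real tracking data:

     ```cpp
     #include <cmath>
     #include <cstdio>

     const double kPi = 3.14159265358979323846;

     struct Vec3 { double x, y, z; };

     // Rotate v around the vertical (y) axis by theta radians.
     Vec3 RotateY(const Vec3& v, double theta) {
         return { std::cos(theta) * v.x + std::sin(theta) * v.z,
                  v.y,
                 -std::sin(theta) * v.x + std::cos(theta) * v.z };
     }

     int main() {
         // The SAME tracker sampled at two moments, as reported by
         // machine A and machine B (hypothetical coordinates, metres).
         Vec3 p1A{0, 0, 0}, p1B{1, 0, 2};
         Vec3 p2A{1, 0, 0}, p2B{1, 0, 1};

         // Yaw offset: compare the heading of the tracker's displacement
         // in each frame (assumes both floors share the same "up").
         double headingA = std::atan2(p2A.z - p1A.z, p2A.x - p1A.x);
         double headingB = std::atan2(p2B.z - p1B.z, p2B.x - p1B.x);
         double theta = headingA - headingB;   // rotation taking frame A into frame B

         // Translation: whatever offset remains after rotating one sample.
         Vec3 r = RotateY(p1A, theta);
         Vec3 t{p1B.x - r.x, p1B.y - r.y, p1B.z - r.z};

         // Map a third point from frame A into frame B as a sanity check.
         Vec3 p3A{0, 0, 1};
         Vec3 m = RotateY(p3A, theta);
         Vec3 p3B{m.x + t.x, m.y + t.y, m.z + t.z};

         std::printf("%.1f %.1f %.1f\n", theta * 180.0 / kPi, p3B.x, p3B.z);
         return 0;
     }
     ```

     Once the yaw and translation are known, one machine can convert every pose it receives into the other machine's frame, which avoids copying chaperone/lighthousedb files at all.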
  4. Depending on your use case, yes, it is possible. On my side, using Unreal Engine, I can set up a VRPawn with a MotionControllerComponent (with a Tracker_[RoleInSteamVr] motion source) that gives me position and rotation in Unreal world space, with correct real-world distances between the HMD and controllers. I use two trackers: one to locate the base of a turret-like object, and another attached to the turret to compute its facing rotation. I am currently investigating how the OpenVR API works for a network-calibration solution; if you are using OpenVR, once you find a tracker's device ID you can simply query its pose. Hope it helps!
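     The facing rotation from the two-tracker arrangement described above can be reduced to a yaw angle in the horizontal plane. A minimal self-contained sketch, assuming SteamVR's standing-universe conventions (y up, -z forward); the coordinates are made-up examples, and in a real application they would come from the trackers' poses:

     ```cpp
     #include <cmath>
     #include <cstdio>

     const double kPi = 3.14159265358979323846;

     struct Vec3 { double x, y, z; };

     // Yaw (degrees) of the barrel tracker as seen from the base tracker,
     // measured in the horizontal x-z plane. Zero means pointing along -z,
     // SteamVR's "forward"; positive values turn toward +x.
     double FacingYawDegrees(const Vec3& base, const Vec3& barrel) {
         double dx = barrel.x - base.x;
         double dz = barrel.z - base.z;
         return std::atan2(dx, -dz) * 180.0 / kPi;
     }

     int main() {
         Vec3 base{0.0, 0.0, 0.0};
         Vec3 barrel{1.0, 0.0, -1.0};   // one metre right, one metre forward
         std::printf("%.1f\n", FacingYawDegrees(base, barrel));
         return 0;
     }
     ```

     Using two trackers this way means the turret's heading stays correct even if the whole base is picked up and moved, since only the relative displacement matters.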