Search the Community
Showing results for tags 'rendering'.
Found 3 results
In my project it is essential to use separate cameras for each eye. Any camera with Target Eye set to Both or Left works as intended, but when it is set to Right, the Android HMD behaves as if it were a left-eye camera: only the left half of the display renders, and the right half does not render at all. My project has successfully used per-eye cameras on other platforms; this is my first time targeting the Wave platform. I can't see how to fix this issue, which is probably not a concern for most developers, but for me it is critical. I am using Wave XR Plugin 1.0.0 (com.htc.upm.wave.xrsdk) on Unity 2019.4.8 LTS. @Tony PH Lin
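For reference, a minimal sketch of the per-eye setup being described, assuming a rig with two child cameras (the class and field names are illustrative; `Camera.stereoTargetEye` is the scripting equivalent of the Target Eye dropdown in the Inspector):

```csharp
using UnityEngine;

// Minimal sketch of a per-eye camera rig. Attach to the rig root and
// assign the two eye cameras in the Inspector. Names are illustrative.
public class PerEyeCameraRig : MonoBehaviour
{
    public Camera leftEyeCamera;   // should render only the left eye
    public Camera rightEyeCamera;  // should render only the right eye

    void Awake()
    {
        // Equivalent to setting "Target Eye" on each camera in the Inspector.
        leftEyeCamera.stereoTargetEye  = StereoTargetEyeMask.Left;
        rightEyeCamera.stereoTargetEye = StereoTargetEyeMask.Right;
        // On the Wave runtime described above, the Right mask reportedly
        // ends up rendering like a left-eye camera instead.
    }
}
```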
Hi there, we are developing an eye test on a Vive Pro, and for comparison with an existing test we need to capture exactly what the viewport is rendering, at the native resolution (1440x1600 per eye). Has anybody thought of a solution or approach for this? Putting another camera in the rig is not a solution; we need to know exactly which pixels the user is seeing and which they are not. @Corvus @MariosBikos_HTC
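One common approach (a sketch, not a complete answer to the visibility question) is to render the eye camera into a `RenderTexture` at the panel's per-eye resolution and read the pixels back. The camera reference and output path below are illustrative assumptions:

```csharp
using System.IO;
using UnityEngine;

// Hedged sketch: capture one eye's view at the native per-eye
// resolution by rendering into a RenderTexture and reading it back.
public class EyeCapture : MonoBehaviour
{
    public Camera eyeCamera;               // the per-eye camera to capture
    const int Width = 1440, Height = 1600; // Vive Pro native per-eye resolution

    public void Capture(string path)
    {
        var rt = new RenderTexture(Width, Height, 24);
        eyeCamera.targetTexture = rt;
        eyeCamera.Render();                // render one frame into the texture

        var prev = RenderTexture.active;
        RenderTexture.active = rt;
        var tex = new Texture2D(Width, Height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Width, Height), 0, 0);
        tex.Apply();
        RenderTexture.active = prev;
        eyeCamera.targetTexture = null;

        File.WriteAllBytes(path, tex.EncodeToPNG());
        Destroy(rt);
        Destroy(tex);
    }
}
```

Note the caveat: this captures the camera's pre-distortion raster output, not the post-lens-distortion image on the panel, so by itself it does not tell you which of those pixels are actually visible to the user (the hidden-area mesh and lens distortion discard some of them).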
Hi there :) I'm a graduate student studying ray tracing in VR with Unity. My program renders an image for each eye and sends them to Unity, which then displays each image on the corresponding camera. My question is how I can produce a correct stereoscopic rendered image. I looked for an official HTC Vive paper on this but couldn't find one. For now I am using the off-axis projection stereoscopic method. If you know of an official paper or thesis on this topic, please give me some feedback. Thank you 🙂 @Tony PH Lin @Cotta
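The off-axis method mentioned above can be sketched as follows: each eye shares the same view plane, but its frustum is shifted sideways by half the interpupillary distance (IPD), yielding an asymmetric frustum per eye. All parameter values here are illustrative assumptions, not Vive calibration data:

```csharp
using UnityEngine;

// Hedged sketch of off-axis (asymmetric-frustum) stereo projection.
public static class OffAxisStereo
{
    // eyeSign: -1 for the left eye, +1 for the right eye.
    // screenDist: distance from the eyes to the virtual screen plane.
    public static Matrix4x4 EyeProjection(
        float eyeSign, float ipd, float screenDist,
        float screenWidth, float screenHeight,
        float near, float far)
    {
        float eyeX  = eyeSign * ipd * 0.5f;  // eye offset from center
        float scale = near / screenDist;     // project screen edges onto near plane
        float left   = (-screenWidth * 0.5f - eyeX) * scale;
        float right  = ( screenWidth * 0.5f - eyeX) * scale;
        float bottom = -screenHeight * 0.5f * scale;
        float top    =  screenHeight * 0.5f * scale;
        // Matrix4x4.Frustum builds the asymmetric projection directly.
        return Matrix4x4.Frustum(left, right, bottom, top, near, far);
    }
}
// Apply per eye with, e.g.:
// eyeCamera.projectionMatrix = OffAxisStereo.EyeProjection(-1f, 0.064f, 1f, 2f, 1.5f, 0.1f, 100f);
```

Because the frustum edges are measured relative to the shifted eye position, the left and right eyes get mirror-asymmetric frusta that converge on the same screen plane, which is what makes the projection off-axis rather than toe-in.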