pablocael

Verified Members
  • Content Count: 16
  • Joined
  • Last visited

Community Reputation: 0 Neutral

About pablocael
  • Rank: Settler


  1. After a brief analysis, it seems it might be an update-ordering problem. The camera position appears to be updated in the C++ driver, and the Unity object is updated later, so the plane that renders the image background moves with some delay relative to the tracked camera within Unity.
  2. Hi there, this shifting problem is preventing us from using Vive Pro trackers to match virtual and real objects. If any progress is made on this front, please let us know! It's very frustrating not to be able to use the Vive Pro headset because of this problem. The shifting is very strong and virtual and real objects become totally unmatched. Thanks!
  3. Hi there, I notice a delay between the tracking update and the camera image update. This makes virtual objects "oscillate" when you move your head: they don't match the image perfectly because the image plane moves after the image gets updated. I'm trying to match the delays. However, I cannot change the camera FPS settings! Nothing I've tried works: it just restarts SteamVR and the same value (something about 30 and 60) is set. Any hint? Thanks
  4. I just noticed the layer is set to DualCameraLeft, but that was just a test before taking the screenshot. It doesn't work on Default either... ;/
  5. Hi there. It's been a month since I tried the solution, but now I came back to try the exact same steps and it's not working anymore. I'm posting the setup of my DepthMaskQuad. It just has no effect on virtual objects in the scene.
  6. Thanks. Using my own camera set to MainCamera works. If you set it as LEFT or RIGHT, the system changes the FOV. Now I'm struggling to access the final device texture. I realize that the final displayed scene is not equal to the preview of the Dual Left or Right cameras: it has some cropping. The final device texture is 1535 by 1706 and is just a cropped part of the full ImagePlane quad. Can I find out which rectangle of the quad (and its center) is used? I managed to manually set up an orthographic camera that generates exactly the same render texture as the final game screen (give or take a pixel, since it was set manually). But since it is a render texture, it is slow to access as a Texture2D. Since I already have a Texture2D of the undistorted image, it would be easier to just crop it if I knew the exact scale and crop center of the final device texture. Can they be found in any easy way? Thanks
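If the crop rectangle and its center were known, extracting the device view from the undistorted Texture2D would reduce to a simple centered crop. A minimal sketch (in Python/NumPy rather than Unity, and with a purely hypothetical source frame size and crop center, since the post's actual values are unknown):

```python
import numpy as np

def crop_centered(image: np.ndarray, out_w: int, out_h: int,
                  center_x: float, center_y: float) -> np.ndarray:
    """Cut an out_w x out_h window from `image`, centered at (center_x, center_y)."""
    x0 = int(round(center_x - out_w / 2))
    y0 = int(round(center_y - out_h / 2))
    return image[y0:y0 + out_h, x0:x0 + out_w]

# Hypothetical: crop the 1535x1706 device rectangle out of a larger
# undistorted frame; 2048x2048 and a centered crop are made-up values.
frame = np.zeros((2048, 2048, 3), dtype=np.uint8)  # stand-in for the Texture2D
device = crop_centered(frame, 1535, 1706, 1024, 1024)
print(device.shape)  # (1706, 1535, 3)
```

The same two numbers (crop center and output size) are all a Unity-side `Graphics.CopyTexture`-style crop would need, which is why knowing the device rectangle is the crux of the question.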
  7. Hi there, I'm currently using OpenCV to track markers from the Vive Pro camera. I've been successful so far and I'm working on an integration script. However, the FOVY reconstructed from the images is about 33.12 degrees, while the DualCamera Left and Right currently have a 110.05 FOVY that cannot be changed. Is there any way of changing it? Thanks
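For reference, the vertical FOV that a calibration reconstructs follows from the pinhole model: fovy = 2·atan(h / (2·fy)), where h is the image height and fy the focal length in pixels. A small sketch with hypothetical numbers (the 480-pixel height and fy ≈ 807 px are made up to land near the ~33° the post mentions, not values from the Vive Pro):

```python
import math

def fovy_deg(image_height_px: float, fy_px: float) -> float:
    """Vertical field of view of a pinhole camera: 2 * atan(h / (2 * fy))."""
    return math.degrees(2.0 * math.atan(image_height_px / (2.0 * fy_px)))

# Hypothetical calibration: a 480-px-tall image with fy ~ 807 px
# gives roughly the 33-degree FOVY reported in the post.
print(round(fovy_deg(480, 807), 1))  # 33.1
```

The mismatch in the post is then just two different intrinsics: the physical camera's fy implies ~33°, while the SRWorks DualCamera rigs are fixed at 110.05°.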
  8. Thanks for the reply! What I really want is to position the camera based on a tracked marker in the real world. So I need to disable the Vive trackers and use the Vive Pro cameras to calibrate the view and projection matrices; then I want to set them myself. So in the case of a tracking-override driver, I would need to access the textures of the front cameras as well.
  9. What are the distorted and undistorted textures? Are they the images with and without radial distortion correction? Thanks
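For context on that naming: camera lenses apply (approximately) a radial mapping, and an "undistorted" texture is the raw image resampled to invert it. A toy sketch of the forward Brown–Conrady radial model, with a made-up distortion coefficient (not a Vive Pro calibration value):

```python
def distort_radius(r: float, k1: float, k2: float = 0.0) -> float:
    """Forward radial model: a point at normalized radius r from the optical
    center lands at r * (1 + k1*r^2 + k2*r^4) in the raw (distorted) image."""
    return r * (1.0 + k1 * r**2 + k2 * r**4)

# Hypothetical barrel distortion (k1 < 0): off-center points are pulled inward,
# which is why straight lines bow in the raw camera feed.
print(round(distort_radius(0.5, -0.2), 3))  # 0.475
```

Undistortion is the numerical inverse of this mapping, applied per pixel, which is what tools like OpenCV's `undistort` do.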
  10. Hi, I'm trying to position the cameras using a pose reconstructed from an AR marker printed on paper. So I need to set the projection and view matrices myself based on the reconstructed camera parameters. 1) Can I disable Vive Pro tracking? 2) Can I set the projection/view matrix for the Vive Pro displays (eye cameras)? Thanks!
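Building the view matrix from a marker pose usually means taking the world-to-camera rotation R and translation t (e.g. from an OpenCV-style `solvePnP`) and packing them into a 4x4 matrix; the camera center in world space is then C = -Rᵀt. A minimal NumPy sketch with a hypothetical pose (generic pinhole math, not the SRWorks or SteamVR API):

```python
import numpy as np

def view_matrix(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a world-to-camera pose (R, t) into a 4x4 view matrix."""
    V = np.eye(4)
    V[:3, :3] = R
    V[:3, 3] = t
    return V

def camera_position(V: np.ndarray) -> np.ndarray:
    """Camera center in world (marker) coordinates: C = -R^T t."""
    R, t = V[:3, :3], V[:3, 3]
    return -R.T @ t

# Hypothetical pose: camera looking straight at the marker from 2 units away,
# so the marker origin sits 2 units down the camera's forward axis.
V = view_matrix(np.eye(3), np.array([0.0, 0.0, 2.0]))
print(camera_position(V))  # camera center is 2 units from the marker
```

Overriding the HMD's own matrices with such a V is exactly what the post's question 2) asks about; whether Unity/SRWorks allows it is the open part.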
  11. Thanks! What is the distorted texture? Is it the webcam texture with radial distortion correction? Another two questions: 1) Can I disable Vive Pro tracking? 2) Can I set the projection/view matrix for the Vive Pro displays (eye cameras)? I'm trying to position the cameras using a pose reconstructed from an AR marker printed on paper, so I need to set the projection and view matrices myself based on the reconstructed camera parameters. Thanks!
  12. Hi, is it possible to access the Vive Pro's webcams as textures in Unity? I know SRWorks for Unity does that in a shader and applies it to a quad rendered in front of the screen. Is this webcam texture (left, right, or both) available from the API? Thanks!
  13. Hi! I'm trying to use the Depth Image Texture to discard pixels of virtual objects whose depth is behind real objects. For instance, if my hand passes in front of a virtual cube, it would occlude it. The first naive idea is to use a shader to directly compare depth values from the virtual scene (using the depth buffer) and the real world (using the HTC depth texture). However, those textures do not have the same proportions. Is there a way of doing this using resources from SRWorks? Thinking about it more deeply, I suspect I would need to sample world points and generate a depth texture for each eye (since occlusion occurs differently for each eye), and I think I would need some kind of "raycast" to generate the depth values. Thanks!
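The naive shader comparison described above can be prototyped on the CPU: resample the real-world depth texture to the virtual depth buffer's resolution, then mask every pixel where the real surface is closer. A toy NumPy sketch with made-up depth values and nearest-neighbor resampling (not the actual SRWorks texture formats or the per-eye reprojection the post suspects is needed):

```python
import numpy as np

def occlusion_mask(virtual_depth: np.ndarray, real_depth: np.ndarray) -> np.ndarray:
    """True where the real surface is closer than the virtual fragment,
    i.e. where virtual pixels should be discarded. The real depth map is
    nearest-neighbor resampled first, since the two textures differ in size."""
    h, w = virtual_depth.shape
    rh, rw = real_depth.shape
    ys = np.arange(h) * rh // h          # nearest-neighbor row indices
    xs = np.arange(w) * rw // w          # nearest-neighbor column indices
    real_resampled = real_depth[ys[:, None], xs[None, :]]
    return real_resampled < virtual_depth

# Hypothetical depths in meters: a "hand" at 0.5 m in front of a cube at 1.0 m,
# seen through a low-resolution (2x2) real-world depth map.
virtual = np.full((4, 4), 1.0)           # virtual cube fills the 4x4 eye buffer
real = np.full((2, 2), 2.0)              # background wall at 2 m
real[0, 0] = 0.5                         # hand covers the top-left quadrant
mask = occlusion_mask(virtual, real)
print(int(mask.sum()))  # 4
```

In a shader the same comparison runs per fragment, with the resampling done implicitly by sampling the real-depth texture at the fragment's screen UV.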
  14. Yes, the Vive (DisplayPort) and the monitor must be on the same board (usually side by side). At least that is what worked here. Good luck.