Showing results for tags 'camera'.
Found 9 results

  1. Hello, we have been testing the VIVE Hand Tracking API + Unreal plugin with our VIVE Cosmos, and after some profiling we found that as soon as detection starts, the FPS drops from 90 to 45~60. Here is the test information:
     - Test computer: Intel i7 7700K, Nvidia GTX 1060.
     - Unreal 4.24: a blank map with nothing but a SkySphere, the Hand Tracking Provider Blueprint Actor, and the Cartoon Hands Blueprint Actor (both provided by the VIVE Hand Tracking Unreal plugin).
     - Performance: even in this blank level the FPS drops to 45~60, which is not an acceptable frame rate for a comfortable VR experience.
     - What we found: the camera resolution largely determines the cost of the image processing, which is why the original VIVE (lower-resolution camera) outperforms the VIVE Cosmos (higher-resolution camera) in hand-tracking performance and in overall scene rendering. If the resolution is decreased as far as possible, we get back to 90 FPS (although that might only be due to performance gains in overall scene rendering).
     - What we want to ask: is there a way to change the camera resolution of the VIVE Cosmos? We want the image processing to be faster and to take fewer GPU resources. It doesn't matter if this requires another SDK; we just need a way to do it.
  2. Hello, I am trying to integrate a "passthrough" feature that emulates the passthrough accessed by double-tapping the power button on the Vive Focus Plus, so I can use the camera image to set up the virtual environment for my users. However, we are running into a problem: if the headset enters power-saving mode after the application is launched and the application is paused, the camera stops working and no longer updates. The Unity version is 2018.3.0f2 and the WaveSDK version is 3.1.94. If anyone has achieved more stability with camera usage in their application, I would love to know how you did it. (A minimal pause/resume sketch appears after this list.)
  3. Hi, could you tell me the Vive Pro camera's specs? I would like to know the number of pixels.
  4. I just received my Vive Pro yesterday and the setup was flawless. The controllers and the HMD were updated by SteamVR to their latest firmware, so everything looked fine. But while playing around with the Chaperone, I noticed that my dual front cameras are not working, or at least cannot be enabled: I always get a "Camera Communication Error" in SteamVR when testing the camera. Here is the setup and a description of everything; all the info is also in the attached screenshots. I am using Win10 64-bit (win10_1909_build_18363.535) and SteamVR version 1.9.16. Windows Device Manager shows the Vive Pro Multimedia Camera (driver version 10.0.18362.1, dated 21.06.2006) with VID/PID 0BB4&PID_030E, connected to a USB 3 SuperSpeed port. I also disabled power management for this port (as instructed in the VIVE online documentation). The Windows USB drivers for the SuperSpeed USB hubs are up to date (version 10.0.18362.1, dated 18.03.2019). Also attached is the log file from the SteamVR developer module. I checked the Windows 10 privacy settings for cameras (everything is enabled, but no specific entry for Vive or Steam appears there). To make sure, I tested whether the camera is visible to other Windows applications: it is detected as a device in e.g. Corel VideoStudio, but I could not capture anything, which may be normal. So what can be done? Are there any tools or programs where I can test whether the camera is simply dead or whether this is only a software issue, or anything else I could try? (A quick device-enumeration sketch appears after this list.) Best regards and thank you for any help; much appreciated. Steam_VR_LOG.txt
  5. I received the Cosmos on Friday, set it up, and played a few hours with it; everything went perfectly. Yesterday I went to use it but couldn't get the Cosmos to track at all. It's almost as if the cameras won't turn on. Room setup just shows a black screen behind the instructions, and if I follow the instructions it gets to the scanning phase and then stays at 0%. While room setup is running, SteamVR says the headset and controllers are tracking, but when room setup isn't running they flash "not tracking". On the VIVE Console they are always not tracking. I've tried the "reset headset and settings" and "clear the environment information" options in troubleshooting multiple times. I've tried rebooting the PC, restarting Steam, restarting SteamVR and the Vive Console, restarting the link box, and moving USB cables around. I'm probably just missing something simple, but I don't know where to go from here. Any suggestions on how to fix this are welcome. @stvnxu
  6. Hello, I was wondering if anyone could share the algorithm or workflow used, for example by SRWorks, to implement a see-through mode. First, is there any way to get each camera's intrinsics from OpenVR, or do I need to call another API for that? (See the intrinsics sketch after this list.) In OpenVR I am able to acquire the current front-facing camera framebuffer, which is upside down and laid out top/bottom for the left and right cameras in the same image (I actually use the distorted framebuffer). The problem comes after that: what can I do with this image? In Unity, I apply the image to a mesh material as a texture. This is a particular mesh: an anamorphic plane (96° horizontally, 80° vertically). The mesh is transformed by the HMD matrices and offset a few centimeters along the forward axis. Then I scale the mesh up so it sits very far away without changing anything in the field of view. Tested this way, everything seems promising, but as soon as I add 3D objects to the scene, you can clearly see it is not working: the apparent rate of rotation/translation of the camera image does not match that of the virtual objects. Any idea? Thank you. @Daniel_Y @reneeclchen @Andy.YC_Wang @Jad
  7. There has been quite a bit of speculation on the types of cameras used on the Cosmos. Windows Mixed Reality and Oculus headsets reportedly use IR cameras, while the Cosmos uses RGB cameras. IR cameras are known for better low light performance. Many report poor exposure and dynamic range with the Cosmos RGB cameras while using them as a passthrough, which may explain why the headset has so much trouble with lighting conditions. If the root problem is the type of cameras used, then this can't be solved by software fixes, correct? Is there anything the team can say to assuage worries that core problems with Cosmos tracking can be resolved via software fixes, specifically addressing the types of cameras used on the Cosmos?
  8. I'm trying to show two kinds of feeds (an external camera/webcam, and the front camera on the HMD) on the lenses. I found Unity's WebCamTexture class (https://docs.unity3d.com/2020.1/Documentation/ScriptReference/WebCamTexture.html) and a project that uses it with an Oculus headset, so I imitated that project to study how it works. I assumed the front camera is connected to the PC and would therefore be captured as a webcam, but it isn't. How can I project video from the front camera as a texture onto an object? Is WebCamTexture even the right approach? As an example of what I want to do: the two Main Cameras are the two lenses for stereoscopy, and the two quads act as screens for the feed (from the webcam or front camera). I'm a beginner in VR and Unity, so I may be thinking about this wrong; I'd appreciate your wisdom and know-how. Thanks, developers! Below is the (cleaned-up) code I found:

     using UnityEngine;

     // Plays a webcam feed on this object's material; optionally flips the
     // plane and scales it to fill the parent orthographic camera's view.
     public class WebcamTexture : MonoBehaviour
     {
         WebCamTexture webcamTexture;
         public int webcamNumber;          // index into WebCamTexture.devices
         private float margin = 0f;
         public float scaleFactor = 1f;
         public bool rotatePlane = false;
         public bool fitting = false;

         void Start()
         {
             WebCamDevice[] devices = WebCamTexture.devices;
             webcamTexture = new WebCamTexture();
             if (devices.Length > webcamNumber)   // guard the index, not just devices.Length > 0
             {
                 webcamTexture.deviceName = devices[webcamNumber].name;
                 GetComponent<Renderer>().material.mainTexture = webcamTexture;
                 webcamTexture.Play();
             }
             if (rotatePlane)
                 transform.Rotate(Vector3.forward, 180);   // some feeds arrive upside down
             if (fitting)
                 FitScreen();
         }

         // Scales the plane so the feed fills the parent camera's view.
         void FitScreen()
         {
             Camera cam = transform.parent.GetComponent<Camera>();
             float height = cam.orthographicSize * 2.0f;   // only meaningful on an orthographic camera
             float width = height * Screen.width / Screen.height;
             float fix = Mathf.Max(width, height) + margin;
             // A 4:3 aspect ratio is assumed for the feed.
             transform.localScale = new Vector3((fix / scaleFactor) * 4 / 3, fix / scaleFactor, 0.1f);
         }
     }

     @Corvus @Daniel_Y
  9. I'm trying to track a webcam streaming live video using a Vive Tracker puck in Unreal Engine. I got most of it to work, but because there is some delay in the video feed, I need to delay the tracker data by an equal amount (several frames) so that it matches up with the video. Is there a way to do this? (See the pose-delay sketch after this list.) @Jad @Dario @foo
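A note on #2: one hedged way to recover the camera after power saving is to stop and restart the feed from Unity's OnApplicationPause callback (a standard Unity hook). This is a minimal sketch only; StartCameraFeed/StopCameraFeed are hypothetical placeholders for whatever start/stop calls your WaveSDK version exposes on its camera helper, so check the names against SDK 3.1.94.

    using UnityEngine;

    // Sketch: restart the HMD camera when the app resumes from power saving.
    public class PassthroughPauseHandler : MonoBehaviour
    {
        bool wasRunning;

        void OnApplicationPause(bool paused)   // Unity invokes this on pause and resume
        {
            if (paused)
            {
                wasRunning = true;
                StopCameraFeed();              // release the camera before the headset sleeps
            }
            else if (wasRunning)
            {
                StartCameraFeed();             // reacquire the camera on resume
                wasRunning = false;
            }
        }

        // Hypothetical wrappers: route these to your WaveSDK camera start/stop calls.
        void StartCameraFeed() { }
        void StopCameraFeed() { }
    }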
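A note on #4: to check whether the Vive Pro camera at least enumerates, a quick probe with Unity's standard WebCamTexture.devices list is below. Keep in mind that SteamVR may hold the camera exclusively while its camera support is enabled, so "listed but won't capture" is not necessarily a hardware fault.

    using UnityEngine;

    // Logs every camera Windows exposes to Unity; a live Vive Pro camera
    // should appear here by name.
    public class CameraProbe : MonoBehaviour
    {
        void Start()
        {
            foreach (WebCamDevice device in WebCamTexture.devices)
                Debug.Log("Found camera: " + device.name);
        }
    }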
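A note on #6: OpenVR does expose per-camera intrinsics through IVRTrackedCamera (GetCameraIntrinsics, plus GetCameraProjection for a matching projection matrix, which is the usual alternative to hand-tuning an anamorphic plane). A minimal sketch using the managed C# bindings follows; the exact signature, notably the per-eye camera index, differs between OpenVR versions, so treat this as a sketch rather than the definitive call.

    using UnityEngine;
    using Valve.VR;   // OpenVR managed bindings (openvr_api.cs)

    public class IntrinsicsQuery : MonoBehaviour
    {
        void Start()
        {
            CVRTrackedCamera trackedCamera = OpenVR.TrackedCamera;
            if (trackedCamera == null) return;

            HmdVector2_t focalLength = new HmdVector2_t();
            HmdVector2_t center = new HmdVector2_t();

            // Camera index 0/1 selects the left/right camera on stereo headsets
            // (older OpenVR versions lack this parameter).
            EVRTrackedCameraError err = trackedCamera.GetCameraIntrinsics(
                OpenVR.k_unTrackedDeviceIndex_Hmd, 0,
                EVRTrackedCameraFrameType.Distorted,
                ref focalLength, ref center);

            if (err == EVRTrackedCameraError.None)
                Debug.Log("fx=" + focalLength.v0 + " fy=" + focalLength.v1 +
                          " cx=" + center.v0 + " cy=" + center.v1);
        }
    }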
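A note on #9: the usual fix is a ring buffer of poses — record the tracker pose every frame and apply the pose from N frames ago to the video quad, where N matches the measured video latency. A Unity-flavored C# sketch is below (the same idea ports directly to Unreal C++ or Blueprints); delayFrames is a value you would tune by eye or by measurement.

    using System.Collections.Generic;
    using UnityEngine;

    // Delays a tracked pose by a fixed number of frames so it lines up
    // with a laggy video feed. Attach to the video quad.
    public class DelayedPose : MonoBehaviour
    {
        public Transform tracker;       // the live Vive Tracker transform
        public int delayFrames = 5;     // tune to the measured video latency

        readonly Queue<Pose> history = new Queue<Pose>();

        void LateUpdate()
        {
            history.Enqueue(new Pose(tracker.position, tracker.rotation));

            // Once enough frames are buffered, apply the oldest pose.
            if (history.Count > delayFrames)
            {
                Pose delayed = history.Dequeue();
                transform.SetPositionAndRotation(delayed.position, delayed.rotation);
            }
        }
    }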