xulzee

Verified Members
  • Posts: 17
  • Joined
  • Last visited

Reputation: 0 Neutral

  1. Hi @Tony PH Lin, I am sorry for the late reply. I have tried your solution and done a lot of testing, but it doesn't work; the judder seems to happen randomly. I want to add a VIVE Focus option to ALVR (https://github.com/polygraphene/ALVR), so I need to display the streamed video inside the VIVE Focus. ALVR works well on other headsets (Gear VR and Oculus Go), but on the VIVE Focus, whenever the video stream loses some frames, the picture in the headset jitters, which hurts the experience. I don't know how to solve this problem. Or is the problem caused by dropped frames in the decoder? Thank you for your generous assistance.
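     One way to check whether the decoder is to blame is to count how many frames actually come out of it per second while the judder is visible. Below is a minimal sketch, assuming the client decodes with the NDK AMediaCodec API and already has a decoder handle; the function and variable names are illustrative, not ALVR's or Wave's actual code.

     #include <media/NdkMediaCodec.h>
     #include <android/log.h>
     #include <stdint.h>
     #include <time.h>

     static int64_t nowNs() {
         timespec ts;
         clock_gettime(CLOCK_MONOTONIC, &ts);
         return ts.tv_sec * 1000000000LL + ts.tv_nsec;
     }

     // Call this from the decoder's drain loop. It releases decoded frames to the
     // attached Surface and logs how many frames were produced in the last second.
     void drainAndCountFrames(AMediaCodec* decoder) {
         static int framesThisSecond = 0;
         static int64_t windowStart = nowNs();

         AMediaCodecBufferInfo info;
         ssize_t idx = AMediaCodec_dequeueOutputBuffer(decoder, &info, 0 /* no wait */);
         if (idx >= 0) {
             ++framesThisSecond;
             AMediaCodec_releaseOutputBuffer(decoder, static_cast<size_t>(idx), true /* render */);
         }

         int64_t now = nowNs();
         if (now - windowStart >= 1000000000LL) {
             __android_log_print(ANDROID_LOG_DEBUG, "DecodeStats",
                                 "decoded %d frames in the last second", framesThisSecond);
             framesThisSecond = 0;
             windowStart = now;
         }
     }

     If the decoded frame count stays close to the stream's frame rate while the judder is visible, the drops are more likely happening on the render/submit side than in the decoder.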
  2. Hi, I checked the FPS; my app runs at about 75 fps, but the judder still occurs. Yes, I really want to show a still image (or video) that always stays in front of your head even when the pose changes. I looked at the Overlay solution, but it might not work for me, because I need the left and right eye images to differ, and the Overlay method always shows the same image centered on both screens. Can I use another method to render this the way an Overlay does? Thanks.
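     If the Overlay API cannot show a different image per eye, one possible alternative (a sketch only, not the Wave Overlay mechanism) is to draw a head-locked quad into each eye's render target, binding a different texture per eye and using an MVP that never includes the HMD pose. The program, uniform, VAO, and textures below are assumed to be created during initialization; their names are illustrative.

     #include <GLES3/gl3.h>

     // Assumed to exist from init code (illustrative names, not Wave SDK identifiers).
     extern GLuint sQuadProgram;   // shader with a single "uMvp" mat4 uniform
     extern GLint  sUniformMvp;    // location of "uMvp"
     extern GLuint sQuadVao;       // VAO holding a 6-vertex textured quad

     // Draw one eye: the MVP contains only a fixed projection and a static model
     // transform (no HMD pose), so the quad stays locked in front of the head.
     void renderHeadLockedQuad(GLuint eyeFbo, GLuint eyeTexture,
                               const float* fixedMvp, int width, int height) {
         glBindFramebuffer(GL_FRAMEBUFFER, eyeFbo);
         glViewport(0, 0, width, height);
         glClearColor(0, 0, 0, 1);
         glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

         glUseProgram(sQuadProgram);
         glUniformMatrix4fv(sUniformMvp, 1, GL_FALSE, fixedMvp);

         glActiveTexture(GL_TEXTURE0);
         glBindTexture(GL_TEXTURE_2D, eyeTexture);   // left/right get different textures
         glBindVertexArray(sQuadVao);
         glDrawArrays(GL_TRIANGLES, 0, 6);

         glBindVertexArray(0);
         glBindFramebuffer(GL_FRAMEBUFFER, 0);
     }

     // Per frame:
     //   renderHeadLockedQuad(leftFbo,  leftTexture,  mvpLeft,  eyeWidth, eyeHeight);
     //   renderHeadLockedQuad(rightFbo, rightTexture, mvpRight, eyeWidth, eyeHeight);
     //   ...then submit the two eye textures with WVR_SubmitFrame as usual.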
  3. Hi everyone, I used the VIVE Focus to render a still picture (a single image) in front of my eyes. It displays correctly, but when I turn my head from side to side or up and down, judder occurs. I guessed it might be related to the head pose, so I updated the HMD pose at a higher frequency; that helped (the judder became less frequent), but it still occurs. Here is my rendering process.

     In the render thread:

     ...
     while (true) {
         if (app->handleInput()) break;
         if (app->renderFrame()) break;
     }

     In the pose-update thread:

     ...
     while (true) {
         WVR_DevicePosePair_t mVRDevicePairs[WVR_DEVICE_COUNT_LEVEL_1];
         WVR_GetSyncPose(WVR_PoseOriginModel_OriginOnHead, mVRDevicePairs, WVR_DEVICE_COUNT_LEVEL_1);
     }

     bool MainApplication::renderFrame() {
         LOGENTRY();
         if (!gMultiview) {
             mIndexLeft = static_cast<uint32_t>(WVR_GetAvailableTextureIndex(mLeftEyeQ));
             mIndexRight = static_cast<uint32_t>(WVR_GetAvailableTextureIndex(mRightEyeQ));
         }

         // for now as fast as possible
         renderStereoTargets();

         if (!gMultiview) {
             // Left eye
             WVR_TextureParams_t leftEyeTexture = WVR_GetTexture(mLeftEyeQ, mIndexLeft);
             WVR_SubmitError e;
             updateHMDMatrixPose(); // update hmd pose to reduce judder
             e = WVR_SubmitFrame(WVR_Eye_Left, &leftEyeTexture, &(mVRDevicePairs[0].pose));
             if (e != WVR_SubmitError_None) return true;

             // Right eye
             WVR_TextureParams_t rightEyeTexture = WVR_GetTexture(mRightEyeQ, mIndexRight);
             e = WVR_SubmitFrame(WVR_Eye_Right, &rightEyeTexture, &(mVRDevicePairs[0].pose));
             if (e != WVR_SubmitError_None) return true;
         }

         updateTime();

         // Clear.
         // We want to make sure the glFinish waits for the entire present to complete,
         // not just the submission of the command, so we do a clear right here so the
         // glFinish will wait fully for the swap.
         glClearColor(0, 0, 0, 1);
         glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

         usleep(1);
         return false;
     }

     In the rendering process I do not pass the HMD pose matrix to the shader, so my image should be unrelated to the HMD pose. What else could lead to judder? Here are the specifications of the tools I'm using: developer kit: VIVE Focus, updated to 1.93.1400.2; Wave SDK version 3.0.2; Android native SDK.
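     For comparison, here is a sketch of an arrangement where the pose is fetched on the render thread immediately before rendering and the same pose is passed to both WVR_SubmitFrame calls, instead of reading mVRDevicePairs that a second thread updates without synchronization. It reuses the member names from the post above (mLeftEyeQ, mRightEyeQ, renderStereoTargets, and so on) and is only a sketch, not a verified drop-in fix.

     // Wave native SDK headers, paths as used in the native samples.
     #include <wvr/wvr.h>
     #include <wvr/wvr_device.h>
     #include <wvr/wvr_render.h>

     bool MainApplication::renderFrame() {
         // 1. Get a fresh predicted pose for this frame on the render thread itself.
         WVR_DevicePosePair_t pairs[WVR_DEVICE_COUNT_LEVEL_1];
         WVR_GetSyncPose(WVR_PoseOriginModel_OriginOnHead, pairs, WVR_DEVICE_COUNT_LEVEL_1);
         const WVR_PoseState_t* hmdPose = &pairs[0].pose;   // index 0: HMD, as in the post above

         mIndexLeft  = static_cast<uint32_t>(WVR_GetAvailableTextureIndex(mLeftEyeQ));
         mIndexRight = static_cast<uint32_t>(WVR_GetAvailableTextureIndex(mRightEyeQ));

         // 2. Render both eyes (a head-locked quad simply ignores the pose here).
         renderStereoTargets();

         // 3. Submit each eye together with the pose that was current when the frame
         //    was rendered, so the compositor reprojects against the right reference.
         WVR_TextureParams_t leftTex = WVR_GetTexture(mLeftEyeQ, mIndexLeft);
         if (WVR_SubmitFrame(WVR_Eye_Left, &leftTex, hmdPose) != WVR_SubmitError_None)
             return true;

         WVR_TextureParams_t rightTex = WVR_GetTexture(mRightEyeQ, mIndexRight);
         if (WVR_SubmitFrame(WVR_Eye_Right, &rightTex, hmdPose) != WVR_SubmitError_None)
             return true;

         return false;
     }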
  4. Hi everyone, I ran some timing tests on the rendering and found that WVR_SubmitFrame(WVR_Eye_Right, &rightEyeTexture) takes a long time. Can anyone tell me where something went wrong, or whether this is expected? My test code is as follows:

     // Left eye
     start_1 = nanotime();
     WVR_TextureParams_t leftEyeTexture = WVR_GetTexture(mLeftEyeQ, mIndexLeft);
     WVR_SubmitError e;
     end_1 = nanotime();

     start_2 = nanotime();
     updateHMDMatrixPose(); // update hmd pose to reduce judder
     e = WVR_SubmitFrame(WVR_Eye_Left, &leftEyeTexture);
     if (e != WVR_SubmitError_None) return true;
     end_2 = nanotime();

     // Right eye
     start_3 = nanotime();
     WVR_TextureParams_t rightEyeTexture = WVR_GetTexture(mRightEyeQ, mIndexRight);
     end_3 = nanotime();

     start_4 = nanotime();
     e = WVR_SubmitFrame(WVR_Eye_Right, &rightEyeTexture);
     if (e != WVR_SubmitError_None) return true;
     end_4 = nanotime();

     LOGD("render a frame part 1 cost %.2f ms part 2 cost %.2f ms part 3 cost %.2f ms part 4 cost %.2f ms sum %.2f ms",
          (end_1 - start_1) / 1.0e6, (end_2 - start_2) / 1.0e6,
          (end_3 - start_3) / 1.0e6, (end_4 - start_4) / 1.0e6,
          (end_1 - start_1) / 1.0e6 + (end_2 - start_2) / 1.0e6 +
          (end_3 - start_3) / 1.0e6 + (end_4 - start_4) / 1.0e6);

     I got the following results:

     render a frame part 1 cost 0.01 ms part 2 cost 0.07 ms part 3 cost 0.01 ms part 4 cost 10.20 ms sum 10.29 ms
     render a frame part 1 cost 0.01 ms part 2 cost 0.07 ms part 3 cost 0.01 ms part 4 cost 11.78 ms sum 11.87 ms
     render a frame part 1 cost 0.01 ms part 2 cost 0.06 ms part 3 cost 0.01 ms part 4 cost 11.58 ms sum 11.66 ms
     render a frame part 1 cost 0.01 ms part 2 cost 0.06 ms part 3 cost 0.01 ms part 4 cost 10.96 ms sum 11.04 ms
     render a frame part 1 cost 0.01 ms part 2 cost 0.06 ms part 3 cost 0.01 ms part 4 cost 10.93 ms sum 11.01 ms
     render a frame part 1 cost 0.01 ms part 2 cost 0.09 ms part 3 cost 0.01 ms part 4 cost 11.74 ms sum 11.85 ms
     render a frame part 1 cost 0.02 ms part 2 cost 0.06 ms part 3 cost 0.01 ms part 4 cost 11.72 ms sum 11.81 ms

     As you can see, WVR_SubmitFrame(WVR_Eye_Left, &leftEyeTexture) takes only a short time (~0.1 ms), while WVR_SubmitFrame(WVR_Eye_Right, &rightEyeTexture) takes a long time (~12 ms).
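     The ~11 ms is probably not pure submission work. At roughly 75 Hz the frame period is about 13.3 ms, and if the runtime paces frames inside the last per-frame WVR_SubmitFrame call (which these numbers suggest, but which is an assumption, not confirmed here), the second submit would simply block until the next refresh slot. A hedged way to check is to force the GPU work to complete before timing the submits, so any time left inside the second submit is the pacing wait rather than rendering cost. This fragment reuses nanotime(), LOGD, and the members from the post above.

     glFinish();                       // make sure all GPU work for both eyes is done
     int64_t t0 = nanotime();

     WVR_TextureParams_t leftEyeTexture = WVR_GetTexture(mLeftEyeQ, mIndexLeft);
     WVR_SubmitError e = WVR_SubmitFrame(WVR_Eye_Left, &leftEyeTexture);
     if (e != WVR_SubmitError_None) return true;
     int64_t t1 = nanotime();

     WVR_TextureParams_t rightEyeTexture = WVR_GetTexture(mRightEyeQ, mIndexRight);
     e = WVR_SubmitFrame(WVR_Eye_Right, &rightEyeTexture);
     if (e != WVR_SubmitError_None) return true;
     int64_t t2 = nanotime();

     LOGD("submit left %.2f ms, submit right (incl. any pacing wait) %.2f ms",
          (t1 - t0) / 1.0e6, (t2 - t1) / 1.0e6);

     If the right-eye submit still takes close to one frame period after glFinish(), the time is being spent waiting for the display rather than doing extra work.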