juanyvos

Verified Members
  • Content Count: 17
  • Joined
  • Last visited

Community Reputation: 0 Neutral

About juanyvos
  • Rank: Settler


  1. Up @Cotta @Tony PH Lin @Sean Lu @Cotta
  2. Hi, I fixed it for the most part, though it can still happen occasionally. Three things helped:
     1. Using ASTC for textures. It saves a lot of memory, but it decreases FPS by up to 15%.
     2. A 64-bit build, for extra addressable memory.
     3. Getting rid of AsyncOperation.allowSceneActivation = false. Always leave it set to true; you'll have to manage scene activation yourself.
     As mentioned above, there's still a lot of memory to spare, so it's most likely a concurrent CPU & GPU call to shared RAM that breaks it. The cameras weren't the issue. And there's just so much more that Unity fails to support robustly on the Adreno 540, especially with LWRP. Can't really be helped, though, since it's quite new. @Cotta @Tony PH Lin
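Point 3 above (leaving allowSceneActivation at true and handling activation timing yourself) could look something like this minimal sketch; the class and method names are hypothetical, not from the project:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical loader illustrating point 3: leave allowSceneActivation at its
// default of true and let Unity activate the scene as soon as it is ready,
// instead of holding activation back with allowSceneActivation = false.
public class SceneLoader : MonoBehaviour
{
    public IEnumerator LoadNextScene(string sceneName)
    {
        AsyncOperation op = SceneManager.LoadSceneAsync(sceneName);
        // Do NOT set op.allowSceneActivation = false here; on some Android
        // GPUs (the Adreno 540 at least) that pattern triggers out-of-memory
        // errors during the scene switch.
        while (!op.isDone)
        {
            // op.progress runs from 0 to ~1; drive a loading screen here.
            yield return null;
        }
        // The scene is now active; run any per-scene setup after this point.
    }
}
```

Any work that previously waited for the deferred activation (fade-outs, "press any key" prompts) would instead run before the load starts or after `isDone`.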
  3. Hi @Cotta, @Tony PH Lin, I set up a new project using Unity 2019.2, LWRP 6.9.0, and Wave SDK 3.0.2, and foveation does indeed work on my DK headset! I'll try to figure out what went wrong in the original project.
  4. Hi, I'm trying to update to 3.1.1, but I'm having trouble building the project with Unity. Here is the output: Here is the AndroidManifest.xml. It worked fine with 3.0.2 and Unity 2019.2. Is there any extra step needed? @Tony PH Lin
  5. Hi @Tony PH Lin, Sorry I've been on vacation. I'll check tomorrow!
  6. Ah! Wrong account. Hi @Cotta, I use Mono for dev builds & IL2CPP for releases. I checked with Mono but it doesn't seem to work. Thanks a lot for the info nonetheless!
  7. I forgot to mention that I use Unity LWRP. Could that impact foveation? Especially since it doesn't work in the editor either.
  8. Hi @Cotta, I have a DK on 1.95.xx.
  9. Hi @VibrantNebula, Thanks for the prompt reply! I'm using Unity 2019.2, Wave Unity SDK 3.0.2, with render mask & multi pass. I'll take a look into the native SDK. Thanks for the info!
  10. Hi, I have a question regarding foveated rendering in Unity. Is it a feature that works on all HTC headsets, or only some? I'm trying to use it with the Vive Focus, but no matter the parameters, nothing changes in what's displayed, whether in the Unity Editor or inside the headset. Has anyone had any success with it?
  11. I finally found a solution. The whole thing is actually a Unity 2019 bug: basically, you can't use allowSceneActivation = false with some Android GPUs, specifically the Adreno 540 at least.
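For reference, the pattern that triggers the bug is the standard deferred-activation idiom, sketched below (scene names and class names are hypothetical):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal illustration of the pattern that breaks on the Adreno 540:
// holding back scene activation with allowSceneActivation = false.
public class DeferredActivationExample : MonoBehaviour
{
    IEnumerator LoadDeferred(string sceneName)
    {
        AsyncOperation op = SceneManager.LoadSceneAsync(sceneName);
        op.allowSceneActivation = false;  // <- the problematic part
        // Async loading stalls at progress ~0.9 until activation is allowed.
        while (op.progress < 0.9f)
            yield return null;
        // ... wait for a fade-out, user input, etc. ...
        op.allowSceneActivation = true;   // activation finally happens here
        // On the Adreno 540 this deferred activation can end in
        // GL_OUT_OF_MEMORY when switching to a larger scene.
    }
}
```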
  12. Hi, it's now been 4 days that I've been trying to fix this issue. I'm using Wave SDK for Unity 3.0.2 and Unity 2019.1.7f1. Basically, I get an "out of memory" error even though there should still be plenty to spare. It only happens when switching from one scene to another, and not every time; the only mandatory condition is that the new scene be larger in memory than the previous one. Sometimes it's an OpenGL ES error, "GL_OUT_OF_MEMORY"; sometimes an Android one, "Could not allocate memory: System out of memory!" Here is some more info from the Android one:
     [ ALLOC_DEFAULT ] used: 559739971B | peak: 1137576874B | reserved: 575390594B
     [ ALLOC_TEMP_JOB_1_FRAME ] used: 0B | peak: 0B | reserved: 1048576B
     [ ALLOC_TEMP_JOB_2_FRAMES ] used: 0B | peak: 0B | reserved: 1048576B
     [ ALLOC_TEMP_JOB_4_FRAMES ] used: 3090535B | peak: 0B | reserved: 11534336B
     [ ALLOC_TEMP_JOB_ASYNC ] used: 0B | peak: 0B | reserved: 1048576B
     [ ALLOC_GAMEOBJECT ] used: 1681104B | peak: 2817667B | reserved: 1691554B
     [ ALLOC_GFX ] used: 6388369B | peak: 400481140B | reserved: 6398078B
     [ ALLOC_TEMP_THREAD ] used: 105416B | peak: 0B | reserved: 3440640B
     We can quickly determine that total usage doesn't have to exceed 2GB for this to happen. The OpenGL error doesn't actually crash the application, but the image displayed from that point on is best described as "funky"; it can go away if you manage to leave the current scene for a lighter one. Meanwhile, the Unity Profiler shows a "Total System Memory Usage" very close to 3.5GB, sometimes going over it without crashing the app, though most of the time it does crash. The "Reserved total" is well below 3.5GB, usually around 1.2GB. My money is on an unsafe threaded concurrent memory call. I suspect the camera rendering is the issue: the camera from the previous scene is deleted, and a new one is created. I tried having a "no current camera" gap between the two, and having no gap at all (or at least the closest I could manage, since so much is threaded). Disabling Multithreaded Rendering or Graphics Jobs didn't do anything; I tried every Unity graphics option, actually. I also tried overlapping the two cameras. In every scenario, this message always appears, whether it crashes or not:
     [EGL] Unable to acquire context: EGL_BAD_SURFACE: An EGLSurface argument does not name a valid surface (window, pixel buffer or pixmap) configured for GL rendering.
     (Filename: ./Runtime/GfxDevice/egl/WindowContextEGL.cpp Line: 287)
     What I will test next is reusing the same camera throughout all the scenes. Hopefully that works! I haven't gone back to Unity 2018 yet, and I really hope I won't have to. I have one last test as well, which is to move from ETC2 to ASTC texture compression; I should normally be able to try that tomorrow. Has anyone encountered something similar? I'd really appreciate it if you have and could comment :) JY (tag added by forum moderator)
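The "reuse the same camera throughout all the scenes" test mentioned at the end could be set up like this hypothetical sketch (the class name is illustrative, not the project's actual code):

```csharp
using UnityEngine;

// Hypothetical sketch of reusing one camera across all scenes instead of
// destroying and recreating one per scene: mark the camera's GameObject as
// persistent, and make sure extra cameras shipped inside scenes are removed.
public class PersistentCamera : MonoBehaviour
{
    static PersistentCamera instance;

    void Awake()
    {
        if (instance != null)
        {
            // A scene contains its own camera; drop it so the persistent
            // one stays the only camera in play.
            Destroy(gameObject);
            return;
        }
        instance = this;
        DontDestroyOnLoad(gameObject); // survives every scene switch
    }
}
```

With this setup, the previous scene's camera is never deleted during a switch, which would sidestep the suspected delete-then-recreate window described above.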