Posts posted by zzy


  1. Hi @Verlinch

    Are you using Vive Input Utility in your project? I'm not very familiar with VIU; I'll see if I can ask a colleague to look into this as well.

    Meanwhile, I think we can check with the following:

    1. I assume your scene works without hand tracking, i.e. with the GestureProvider script disabled.
    2. Can you please try the sample scenes in VIU and check whether the problem still exists? That way I can try to reproduce it on my side.

    Please also share the version of VIU and SteamVR on your PC.


  2. Hi @andrewchilicki

    Unfortunately, the Cosmos Elite is currently not supported by the Hand Tracking SDK, because the camera stream is not available through the current API. We are investigating the possibility of enabling the camera stream internally, but there is no ETA at the moment.

    For now, if you want an HMD with Lighthouse tracking, we recommend the Vive Pro or Valve Index, both of which are supported by the Vive Hand Tracking SDK.

    However, the wireless adapter is not recommended: with the wireless adapter, the camera stream stops and cannot be resumed if tracking is lost. The only way to get the camera stream back is to restart SteamVR.


  3. Hi everyone,

    Vive Hand Tracking SDK has a new release v0.9.3 today. You can download it from here.

    Highlight of changes:

    1. Update deep learning models for all platforms
    2. Add support for Valve Index
    3. Speed up detection on Windows
    4. Support defining custom gestures in skeleton mode in the Unity and Unreal plugins
    5. Update the mesh for the cartoon hand
    6. Support simulating hands without VR devices in the Unity Editor

    For detailed changes, please refer to release notes.


  4. Hi @tired

    You can select a collider type on the hand renderer to add colliders; this adds a collider for every joint.

    (screenshot: collider type setting on the hand renderer)

    If you want colliders on the fingertips only, you need to write a custom script for this. You can reference the hand renderer script to see how to add colliders. Basically, you create a few sphere colliders and update their positions each frame to match the fingertip positions.
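    As a rough illustration of the approach above, here is a minimal sketch (not the SDK's own code; the class name, field wiring, and radius value are assumptions — map the fingertip transforms to the joints your GestureProvider exposes):

    ```csharp
    using UnityEngine;

    // Hypothetical helper: creates one sphere collider per fingertip and
    // keeps each collider at its fingertip's current position every frame.
    public class FingertipColliders : MonoBehaviour
    {
        // Assign the five fingertip transforms of one hand in the Inspector.
        public Transform[] fingertips;
        public float radius = 0.008f; // ~8 mm; adjust to your hand model scale

        private SphereCollider[] colliders;

        void Start()
        {
            colliders = new SphereCollider[fingertips.Length];
            for (int i = 0; i < fingertips.Length; i++)
            {
                // One collider object per fingertip, parented under this object.
                var go = new GameObject("FingertipCollider" + i);
                go.transform.SetParent(transform, false);
                colliders[i] = go.AddComponent<SphereCollider>();
                colliders[i].radius = radius;
            }
        }

        void Update()
        {
            // Follow the tracked fingertip positions.
            for (int i = 0; i < fingertips.Length; i++)
                colliders[i].transform.position = fingertips[i].position;
        }
    }
    ```

    You may also want to mark the colliders as triggers or put them on a dedicated physics layer, depending on how you intend to interact with the scene.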


  5. Hi @tired

    Please select the OnStateChanged method listed under the dynamic int section (red box), not the one under static parameters (black box). The dynamic version passes the actual state to the Grab script, while the static one always passes a fixed value (in your case 0).

    (screenshot: UnityEvent dropdown showing the dynamic int and static parameter sections)
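    To illustrate the difference in code, here is a minimal sketch (the class names here are hypothetical, and it assumes Unity 2020.1+ where a generic UnityEvent<int> serializes directly in the Inspector):

    ```csharp
    using UnityEngine;
    using UnityEngine.Events;

    // Hypothetical event source: invokes its event with a runtime value.
    public class StateSource : MonoBehaviour
    {
        // Listeners bound in the "Dynamic int" section of the Inspector
        // receive the value passed to Invoke; listeners bound under
        // "Static Parameters" always receive the constant typed into the
        // Inspector field instead.
        public UnityEvent<int> OnStateChangedEvent = new UnityEvent<int>();

        public void SetState(int state)
        {
            OnStateChangedEvent.Invoke(state); // dynamic listeners get this value
        }
    }

    // Hypothetical listener: select this method in the "Dynamic int"
    // section of the event dropdown so it receives the invoked state.
    public class StateListener : MonoBehaviour
    {
        public void OnStateChanged(int state)
        {
            Debug.Log("state = " + state);
        }
    }
    ```

    Binding the same method under the static section would log the same constant on every invocation, which is exactly the behavior described above.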


  6. Hi @duelgame

    I can see all the images now. I think both Unreal and Hand Tracking need to optimize their GPU usage:

    1. The next release of Vive Hand Tracking will improve GPU usage (targeted for the end of this month).
    2. From a quick Google search, I found that Instanced Stereo Rendering support is already available for VR in Unreal: https://docs.unrealengine.com/en-US/Platforms/VR/DevelopVR/VRPerformance/index.html. Is there any reason you cannot enable this feature?

    We will keep improving GPU usage in future versions, but please understand that improving accuracy currently has higher priority.



  7. Hi @duelgame

    I cannot see most of the pictures you attached, but here is my answer based on the text of your question.

    To make it clear, here is the hand tracking pipeline: camera frame -> hand tracking processing (GPU) -> result output in the engine plugin. This has nothing to do with the render texture size or vr.PixelDensity.

    However, VR rendering also uses the GPU. Hand tracking and rendering share GPU resources, so reducing the workload on either side yields a performance improvement. That is what happens when you reduce the rendering workload (e.g. by lowering vr.PixelDensity or enabling Instanced Stereo Rendering). You can observe the same effect by using 3D point mode instead of skeleton mode in Vive Hand Tracking.

    I'm not an expert on Unreal, but I think you need to check the SteamVR plugin for Instanced Stereo Rendering support in Unreal.


  8. Hi @Xyah

    Thanks for the test. Since the deep learning models are computationally heavy, it's natural that they might eat up a lot of GPU resources.

    If you want a lighter version of hand tracking, you can change the mode to 3D Point, which should be much faster than skeleton mode, although it returns more limited results.

    We plan to release a new version before the end of this month, which should bring some performance improvements. You can test again when the new version is released. We will also keep improving performance, as there are several items in our backlog, but given the current SDK status, accuracy takes higher priority.

    Here are some references:

    Differences between the modes: https://hub.vive.com/storage/tracking/overview/feature.html#hand-positions-modes

    Information about how to change the mode in Unreal: https://hub.vive.com/storage/tracking/unreal/usage.html#start-stop-detection


  9. Hi @Xyah

    I have tested on our test machines (1060 & 2060) today. I'll answer your questions first and then share my findings.

    1. You can find the prebuilt samples in the zip you downloaded from here: https://developer.vive.com/resources/knowledgebase/vive-hand-tracking-sdk/. The "Prebuilt samples" folder contains binaries built with Unity.
    2. I think this requires the camera API to support input resolution settings, but currently no API (SteamVR or Vive Console) supports that, so we probably can't do anything about it.
    3. We did some performance tuning, but I don't think the improvement is big enough to solve your problem.

    About my findings: I think this might be a problem on the Unreal side instead of in the Hand Tracking SDK. Here is my experiment: I used the SteamVR frame timing tool to check the FPS and GPU times. In all my experiments, the GPU was the bottleneck.

    1. I ran the Unity sample on the 1060 and 2060; GPU time is larger when hands are visible, but frame time stays within 11 ms (90 FPS).
    2. I ran the Unreal app; its GPU time is significantly larger than the Unity version's, about 2x.
    3. I created a new default level without any hand-related content. It still consumes a lot of GPU time, even more than the Unity sample with hands. This makes me think the problem is on the Unreal side.

    Please let me know if this is the same situation on your side.


  10. Hi @Xyah

    Thanks for the test. As far as I know, there is no way to change the camera resolution of the original Cosmos.

    From your results, it seems the FPS drop only happens when a hand is visible. We will check whether we can reproduce it on our side. Meanwhile, can you confirm whether the behavior is the same in Unreal and in Unity (pre-built sample)?


  11. Hi @Xyah

    Are you using the Cosmos XR or the original Cosmos? The camera resolution of the original Cosmos is similar to the Vive Pro's, so I don't think that is the root cause.
    Here are some questions/suggestions for the general frame drop problem:
    1. How about running the pre-built sample (built with Unity) to see if the frame drop occurs there as well? You can use SteamVR to check for frame drops.
      1. I would like to know whether this is specific to Unreal.
      2. We do have a test laptop with a 1060, and the Unity sample seems to work fine with the Vive Pro, Cosmos and Cosmos XR.
    2. Does this mean that the Vive (or Vive Pro?) works fine on your PC without frame drops?
    3. Can you check whether the frame drop is CPU-bound or GPU-bound? We also use some CPU resources in our pipeline, and if the CPU is at 100%, this can easily cause lag.
      1. From what I know, if Vive Console runs for a long time, it may use up to 100% of the CPU and slow down the PC. You can restart Vive Console in that case.

  12. Hi @Tima

    I've spent some time checking whether we can support x86 in our current code base. There are some blockers:

    1. Some dependencies (internal and external) are currently missing x86 support.
    2. x86 performance is worse than x64, so we need some time for tuning before release. Please note that even after tuning, x64 will still run faster than x86.

    Based on the current situation, I would expect at least 2-3 months before we can release the x86 version.
