Posts posted by zzy

  1. Hi @okocha1337

    I tested locally on my computer, and SRWorks and Vive Hand Tracking can work together.

    Here is my configuration: GTX 1070, Unity 2017.4.16f1, SRWorks v0.8.0.2, Vive Hand Tracking v0.8.2

    Here are the steps to set up the project:

    1. Follow the steps here to set up the graphics card driver and SteamVR.
    2. Create a new 3D project in Unity.
    3. Import both SDKs.
    4. Open the "Assets/ViveSR/Scenes/Sample/Sample.unity" scene, and accept all the prompts.
    5. In the scene, add the GestureProvider script to the Camera with the MainCamera tag. This should be the "[ViveSR]/DualCamera (head)/Camera (eye offset)/Camera (eye)" GameObject.
    6. Add the LeftHandRenderer and RightHandRenderer prefabs to the scene from the "Assets/ViveHandTracking/Prefab" folder.
    7. Play in Unity.

    Some notes:

    1. Please first make sure your hands can be detected using the "Assets/ViveHandTracking/Sample/Sample.unity" scene.
    2. The first time it runs on each machine, Vive Hand Tracking may take 20-30 seconds to set up, so your hands will not be shown immediately after you click play. Wait a moment and see if your hands appear; the sketch after this list logs the detection status.
    3. Please share your configuration so we can help identify the problem you encounter.
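
    If it helps, here is a minimal sketch for logging the detection status from a script. It assumes the static GestureProvider.LeftHand / RightHand properties from the SDK's Unity API, so please verify the names against your SDK version.

    // Sanity-check script (a sketch, not part of the SDK): attach it to any
    // GameObject in the scene after adding GestureProvider as in step 5.
    using UnityEngine;
    using ViveHandTracking;

    public class HandDetectionLogger : MonoBehaviour {
      void Update() {
        // LeftHand / RightHand are null until the corresponding hand is detected.
        if (GestureProvider.LeftHand != null)
          Debug.Log("Left hand at " + GestureProvider.LeftHand.position);
        if (GestureProvider.RightHand != null)
          Debug.Log("Right hand at " + GestureProvider.RightHand.position);
      }
    }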
  2. Hi @JCSegula

    If you only need the front camera video frames, you can use the OpenVR camera API to get the frames and display them in your scene. This is what double-clicking the home button on the controller does: it lets you see the outside, but it does not map the outside exactly as in the photo you provided.

    If you need to map real objects onto virtual objects in your app, the easiest way right now is to use the SRWorks SDK. If you need to show hand skeleton results, you can use the Vive Hand Tracking SDK. You can use both SDKs together if you want to map virtual hands onto the see-through image, though there might be some misalignment. To my knowledge, this is the only solution so far. However, if you do have the necessary knowledge of computer vision and computer graphics, you can try to create the mapping yourself, using the OpenVR camera frames and the hand detection results, to achieve the effect you want.
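
    For reference, here is a minimal sketch of reading the front camera feed with the SteamVR Unity plugin, based on its SteamVR_TestTrackedCamera example. Class and method names may differ between plugin versions (older versions keep SteamVR_TrackedCamera in the global namespace), so treat this as an outline rather than a definitive implementation.

    // Displays the headset front camera feed on a material (a sketch).
    using UnityEngine;
    using Valve.VR;

    public class FrontCameraView : MonoBehaviour {
      public Material targetMaterial;  // e.g. on a quad placed in front of the user
      private SteamVR_TrackedCamera.VideoStreamTexture videoStream;

      void OnEnable() {
        videoStream = SteamVR_TrackedCamera.Undistorted();
        videoStream.Acquire();  // start streaming frames
      }

      void Update() {
        if (videoStream.texture != null)
          targetMaterial.mainTexture = videoStream.texture;  // latest frame
      }

      void OnDisable() {
        videoStream.Release();  // stop the stream when done
      }
    }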

  3. Hi @JCSegula

    I've tried on my side but failed to reproduce the problem. I tried Unity 2019.2.9f1 and 2019.3.0b7, and both work fine in the built application, though I'm testing on the same machine that has Unity installed. Here are my steps:

    1. Create a new Unity project with the 3D template.
    2. Import the Vive Hand Tracking SDK.
    3. Enable Virtual Reality Supported in XR Settings.
    4. Build and run for the Windows platform. (Steps 3-4 can also be scripted; see the editor sketch after this list.)
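
    In case it helps, this editor sketch automates steps 3-4. The menu name and output path are placeholders of my own; PlayerSettings.virtualRealitySupported is the scripting equivalent of the Virtual Reality Supported checkbox in the Unity versions mentioned above.

    // Editor script (a sketch): enables VR support and builds a Windows player.
    using UnityEditor;

    public static class VRBuildHelper {
      [MenuItem("Build/Windows VR Build")]
      public static void Build() {
        // Same as checking "Virtual Reality Supported" in XR Settings.
        PlayerSettings.virtualRealitySupported = true;

        BuildPipeline.BuildPlayer(
            new[] { "Assets/ViveHandTracking/Sample/Sample.unity" },
            "Build/HandTrackingSample.exe",   // placeholder output path
            BuildTarget.StandaloneWindows64,
            BuildOptions.None);
      }
    }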

    Please let me know if you have any new findings.

  4. Then that seems strange. I'll see if I can reproduce the situation on my side. In the meantime, can you please try another Unity version to see if it works or not? I would recommend 2017.4 or 2018.4, since these are LTS versions. I also tested with 2019.2.0f1 on my computer.

  5. Hi @JCSegula

    Since you can run it using the prebuilt sample package, I would assume your SteamVR setup is correct. The problem is possibly on the Unity side.

    1. What version of Unity are you using?
    2. Are there any error logs in the Unity Console window after you play the scene?
    3. Does the display (skybox) change as you move your headset?
  6. Hi everyone,

    Vive Hand Tracking SDK has a new release v0.8.2 today. You can download it from here.

    Highlights of the changes:

    1. Update deep learning models for all platforms.
    2. Add support for the Vive Cosmos headset on Windows, supporting 2D/3D/skeleton modes.
    3. Add support for WaveVR armv8; this requires WaveVR 3.1.1 or newer.
    4. Speed up skeleton mode by 30%.
    5. Provide pre-made blueprints for displaying the detected skeleton as a 3D hand model in the Unreal plugin.

    For detailed changes, please refer to the release notes.

    Best Regards,

    zzy

  7. Hi @Danisinu

    There is no need for you to install the NDK. The NDK API is used by our native Android library. From the adb log, it seems your mobile phone does not support the NDK camera API, which makes the detection fail to start. I would suggest changing to another mobile phone, or trying to upgrade the system ROM if possible.

    Another way to double-check whether the API is available is to try the NDK Camera2 sample, as I mentioned in my previous post. You can also query the camera hardware level from inside Unity; see the sketch below.
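
    The sketch below uses standard Android camera2 classes via JNI. My assumption here is that the NDK camera API cannot open cameras reporting the LEGACY hardware level, which is the most common reason it fails on older phones.

    // Logs the camera2 hardware level of each camera (a sketch).
    using UnityEngine;

    public class CameraLevelCheck : MonoBehaviour {
      void Start() {
    #if UNITY_ANDROID && !UNITY_EDITOR
        var player = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
        var activity = player.GetStatic<AndroidJavaObject>("currentActivity");
        var manager = activity.Call<AndroidJavaObject>("getSystemService", "camera");
        var keyClass = new AndroidJavaClass("android.hardware.camera2.CameraCharacteristics");
        var levelKey = keyClass.GetStatic<AndroidJavaObject>("INFO_SUPPORTED_HARDWARE_LEVEL");
        foreach (var id in manager.Call<string[]>("getCameraIdList")) {
          var characteristics = manager.Call<AndroidJavaObject>("getCameraCharacteristics", id);
          var level = characteristics.Call<AndroidJavaObject>("get", levelKey);
          // 2 == INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY: NDK camera2 cannot open it.
          Debug.Log("Camera " + id + " hardware level: " + level.Call<int>("intValue"));
        }
    #endif
      }
    }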

    Best Regards,

    zzy

  8. Hi @Danisinu

    The screenshot looks correct to me, as far as the Vive Hand Tracking part is concerned.

    For other ideas, I would suggest trying the sample scene in Vive Hand Tracking to see if it can detect your hand.

    By the way, which mobile phone are you using? Some mobile phones might not support the NDK camera2 API, so we cannot start the camera on those phones. To test whether the native camera2 API is supported, I recommend following the steps in this post.

    Best Regards,

    zzy

  9. Hi @Sai

    Unfortunately, we don't provide an option to enable the Vive Hand Tracking SDK for the front camera on Android phones. The main reason is that our deep learning model is trained for first-person-view hand detection, which is normally not the case when using the front camera. Therefore, we cannot detect your hands when you are using the front camera.

    Best Regards,

    zzy

  10. Hi @Danisinu

    To be honest, we haven't tested the Vive Hand Tracking SDK with the Vuforia AR SDK. Here are some things that I think might be the cause:

    1. Make sure camera permission is granted to the app. We need to use the rear camera, so the app needs camera permission. If you are using Unity 2018.2 or newer, the SDK can display the permission dialog at runtime; otherwise, I would suggest granting the permission manually. (A minimal permission-request sketch follows this list.)

    2. From my understanding, Vuforia also uses the rear camera, and I don't think the Android API supports sharing one camera stream between two callers. To identify this, I would suggest checking the log using "adb logcat -s Unity Aristo", which shows the logs of both Unity and our plugin (you can show Vuforia's too if you know its log tag). If Vive Hand Tracking fails to start the camera, the log will contain errors, which might give a hint.
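
    For point 1, here is a minimal sketch of requesting the camera permission at runtime using UnityEngine.Android.Permission (note this API is available since Unity 2018.3):

    // Requests camera permission on Android before hand detection starts.
    using UnityEngine;
    #if UNITY_ANDROID && !UNITY_EDITOR
    using UnityEngine.Android;
    #endif

    public class CameraPermissionCheck : MonoBehaviour {
      void Start() {
    #if UNITY_ANDROID && !UNITY_EDITOR
        // Shows the system permission dialog if not yet granted.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
          Permission.RequestUserPermission(Permission.Camera);
    #endif
      }
    }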

    Please let me know about your findings, thank you.

    Best Regards,

    zzy

  11. Hi @runout

    Since you have no experience with either Unity or Unreal, I would suggest starting with Unity, which is easier to learn. Please follow the steps below (on Windows, with the Vive Pro plugged in):

    1. Install Unity. You can install Unity Hub first, a Unity management tool that helps you install Unity. You can follow the instructions in the Unity Official Manual.
      1. When selecting the Unity version to install, please scroll down and select one of the 2017.4.x (LTS) versions.
      2. When selecting modules, you can deselect all of them to reduce the download size.
    2. Create a new Unity project in Unity Hub. You can choose the 3D template.
    3. Follow the SDK document about SteamVR camera setup.
    4. In the created Unity project, import the Vive Hand Tracking SDK: open menu Assets - Import Package - Custom Package ..., and select the Vive Hand Tracking Unity.unitypackage file.
    5. Modify the project settings to support the Vive Pro.
      1. Open menu Edit - Project Settings - Player; the player settings are shown in the Inspector window.
      2. Scroll down to the XR Settings tab at the bottom, check Virtual Reality Supported, and make sure Virtual Reality SDKs contains only OpenVR. You can use the +/- buttons to add or remove entries.
    6. Open our sample scene and run it inside the Unity Editor (the check script after this list can confirm the setup).
      1. In the Project tab, open the Assets/ViveHandTracking/Sample/Sample.unity scene.
      2. Press the play button on the toolbar at the top to run the scene.
      3. You can refer to the Unity manual about the Project Window, Open Scenes (at the bottom of the page), and the Toolbar & play button.
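
    To confirm the setup, here is a small check script (a sketch using the standard UnityEngine.XR API, available in Unity 2017.2 and newer) that you can attach to any GameObject in the sample scene:

    // Logs whether VR is enabled and which VR SDK was loaded.
    using UnityEngine;
    using UnityEngine.XR;

    public class VRSetupCheck : MonoBehaviour {
      void Start() {
        // Expect "VR enabled: True, loaded device: OpenVR" if step 5 worked
        // and SteamVR is running.
        Debug.Log("VR enabled: " + XRSettings.enabled +
                  ", loaded device: " + XRSettings.loadedDeviceName);
      }
    }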

    Please let me know if you have any problems following these steps.

  12. Hi @A Hobbit

    We have provided the 3D hand model rendering function in the Unity plugin since v0.8.1. For UE4, it will be supported in the next version (targeting early October).

    If you need to use it before October, you can try to implement the function yourself in UE4. Here are some references for you:

    1. We are currently using a Poseable Mesh Component to adjust the model's bones using the skeleton results.

    2. Instead of using the raw skeleton data, you can use a smoothed version of the result, which will make the 3D hand model look more natural. You can refer to the ModelRenderer.cs file in the Unity plugin for how to do so; a generic smoothing sketch follows below.
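
    For illustration, here is one simple smoothing approach (exponential smoothing). This is a sketch, not the actual ModelRenderer.cs implementation, and the class and method names here are hypothetical.

    // Exponentially smooths raw skeleton joint positions between frames.
    using UnityEngine;

    public class SkeletonSmoother {
      private Vector3[] smoothed;
      private readonly float factor;  // 0..1; higher follows the raw data faster

      public SkeletonSmoother(float factor = 0.5f) { this.factor = factor; }

      // Call once per frame with the raw joint positions; returns smoothed ones.
      public Vector3[] Smooth(Vector3[] raw) {
        if (smoothed == null || smoothed.Length != raw.Length)
          smoothed = (Vector3[])raw.Clone();
        for (int i = 0; i < raw.Length; i++)
          smoothed[i] = Vector3.Lerp(smoothed[i], raw[i], factor);
        return smoothed;
      }
    }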
