Everything posted by zzy

  1. Hi @JCSegula The current release does have some jitter in the results, and we are trying to improve it. Internally we already have some improvements to the jitter and expect to release a new version before the end of the year. For now, I would suggest making the buttons (or only their trigger areas) slightly larger so the buttons are easier to trigger.
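  As a minimal sketch of the workaround above, you could enlarge only a button's trigger collider while leaving the visible mesh unchanged. The script name and scale value here are hypothetical, for illustration only:

  ```csharp
  using UnityEngine;

  // Attach to a button object to enlarge its trigger area only,
  // so jittery hand results still land inside the collider.
  public class EnlargeTrigger : MonoBehaviour {
    // How much larger the trigger should be than the visual button.
    public float triggerScale = 1.5f;

    void Start() {
      var box = GetComponent<BoxCollider>();
      if (box != null && box.isTrigger)
        box.size *= triggerScale;
    }
  }
  ```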
  2. Hi @JCSegula and @Kikuto As I said in my answer in a related post, the detected hand and the see-through hand should align. There might be some lag in the detection when you move your hand fast. Please follow the steps in the linked post and see if it suits your needs.
  3. Hi @okocha1337 I tested locally on my computer, and SRWorks and Vive Hand Tracking can work together. Here is my configuration: GTX 1070, Unity 2017.4.16f1, SRWorks v0.8.0.2, Vive Hand Tracking v0.8.2. Here are the steps to set up the project:
     1. Follow the steps here to set up the graphics card driver and SteamVR.
     2. Create a new 3D project in Unity.
     3. Import both SDKs.
     4. Open the "Assets/ViveSR/Scenes/Sample/Sample.unity" scene and accept all prompts.
     5. In the scene, add the GestureProvider script to the Camera with the MainCamera tag. This should be the "[ViveSR]/DualCamera (head)/Camera (eye offset)/Camera (eye)" GameObject.
     6. Add the LeftHandRenderer and RightHandRenderer prefabs to the scene from the "Assets/ViveHandTracking/Prefab" folder.
     7. Play in Unity.
     Some notes: First make sure your hands are detected using the "Assets/ViveHandTracking/Sample/Sample.unity" scene. On the first run on each machine, Vive Hand Tracking may take 20-30 seconds for setup, so your hands are not shown immediately after you click play. Wait for some time and see if your hands appear. Please share your configuration so we can help identify the problem you encounter.
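  Step 5 above can also be done from code instead of in the editor. This is only a sketch, assuming the plugin's `ViveHandTracking` namespace and `GestureProvider` class from the Unity package:

  ```csharp
  using UnityEngine;
  using ViveHandTracking; // Vive Hand Tracking SDK namespace

  // Adds GestureProvider to the MainCamera-tagged camera at runtime.
  // Assumes the SRWorks camera hierarchy is already set up in the scene.
  public class SetupHandTracking : MonoBehaviour {
    void Start() {
      var cam = Camera.main; // the "Camera (eye)" object with the MainCamera tag
      if (cam != null && cam.GetComponent<GestureProvider>() == null)
        cam.gameObject.AddComponent<GestureProvider>();
    }
  }
  ```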
  4. Hi @JCSegula If you only need the front camera video frames, you can use the OpenVR camera API to get the frames and display them in your scene. This is what double-clicking the home button on the controller does: it lets you see the outside, but does not map the outside exactly as in the photo you provided. If you need to map real objects to virtual objects in your app, the easiest way now is to use the SRWorks SDK. If you need to show hand skeleton results, you can use the Vive Hand Tracking SDK. You can use both SDKs together if you want to map virtual hands onto the see-through image, though there might be some misalignment. To my knowledge, this is the only solution so far. However, if you have the necessary knowledge of computer vision and computer graphics, you can try to create the mapping yourself to achieve the effect you want, using OpenVR camera frames and the hand detection results.
  5. Hi @zuhair Please download the latest v0.8.2 version of the SDK, which contains a pre-built sample for Windows. Before running the sample, please follow the SteamVR camera setup on our document page: https://developer.viveport.com/documents/sdk/en/misc/steamvr.html For the first run on each machine, the SDK may need 20-30 seconds before showing hand results. Subsequent launches should complete within 2-3 seconds.
  6. Hi @JCSegula I've tried on my side but failed to reproduce the problem. I tried Unity 2019.2.9f1 and 2019.3.0b7, and the built application works fine, though I'm testing on the same machine that has Unity installed. Here are my steps: 1. Create a new Unity project with the 3D template. 2. Import the Vive Hand Tracking SDK. 3. Enable Virtual Reality Supported in XR Settings. 4. Build and run for the Windows platform. Please let me know if you have any new findings.
  7. Hi @JCSegula Thanks for your test. I will investigate why Unity 2019.3 does not work on my side.
  8. Then that seems strange. I'll see if I can reproduce the situation on my side. In the meantime, can you please try another Unity version to see whether it works? I would recommend 2017.4 or 2018.4, since these are LTS versions. I also tested with 2019.2.0f1 on my computer.
  9. From the error log, the camera failed to start, i.e. our SDK failed to get camera frames. Please make sure you followed the camera setup steps on the test computer (where you run the .exe). Please also try the prebuilt sample to see if it works on the test computer.
  10. Hi @JCSegula Since you can run the prebuilt sample package, I would assume your SteamVR setup is correct. The problem is probably on the Unity side. What version of Unity are you using? Is there any error log in the Unity console window after you play the scene? Does the display (skybox) change as you move your headset?
  11. Hi @zuhair The latest (v0.8.2) SDK contains a pre-built sample in the package. You can download the package and find the Windows sample (one binary for Vive/Vive Pro/Vive Cosmos) in the "Prebuilt samples" folder in the zip file.
  12. Hi @MDV Games Not in the current 0.8.2 version. We are still testing and improving the performance internally. We plan to release it by the end of this year.
  13. Hi @A Hobbit 3D model support in Unreal plugin is available in the v0.8.2 release.
  14. Hi @YahyaDan and @jds Cosmos support is available in the v0.8.2 release.
  15. Hi everyone, Vive Hand Tracking SDK has a new release, v0.8.2, today. You can download it from here. Highlights of the changes:
     - Updated deep learning models for all platforms.
     - Added support for the Vive Cosmos headset on Windows, with 2D/3D/Skeleton modes.
     - Added support for WaveVR armv8; this requires WaveVR 3.1.1 or newer.
     - Sped up skeleton mode by 30%.
     - Provided pre-made blueprints for displaying the detected skeleton as a 3D hand model in the Unreal plugin.
     For detailed changes, please refer to the release notes. Best Regards, zzy
  16. Hi @YahyaDan and @jds Vive Cosmos requires a different camera API than Vive and Vive Pro, so the current 0.8.1 version cannot work on Cosmos. We plan to release a new version with Vive Cosmos support this month. There is still some internal process to complete before we can release the SDK. Best Regards, zzy
  17. Hi @Danisinu There is no need for you to install the NDK. The NDK API is used by our native Android library. From the adb log, it seems your mobile phone does not support the NDK camera API, which makes the detection fail to start. I would suggest switching to another mobile phone or upgrading the system ROM if possible. To double-check whether the API is available, you can try the NDK Camera2 Sample as I mentioned in a previous post. Best Regards, zzy
  18. Hi @Danisinu The screenshot looks correct to me, as far as the Vive Hand Tracking part is concerned. For other ideas, I would suggest trying the Sample scene in Vive Hand Tracking and seeing if it can detect your hand. By the way, which mobile phone are you using? Some phones do not support the NDK camera2 API, so we cannot start the camera on those phones. To test whether the native camera2 API is supported, I recommend following the steps in this post. Best Regards, zzy
  19. Hi @Sai Unfortunately, we don't provide an option to enable the Vive Hand Tracking SDK for the front camera on Android phones. The main reason is that our deep learning model is trained for first-person-view hand detection, which is normally not the case when using the front camera. Therefore, we cannot detect your hands when you are using the front camera. Best Regards, zzy
  20. Hi @Danisinu To be honest, we haven't tested the Vive Hand Tracking SDK with the Vuforia AR SDK. Here are some possible causes: 1. Make sure the camera permission is granted to the app. We use the rear camera, so the camera permission is required. If you are using Unity 2018.2 or newer, the SDK can display the permission dialog at runtime; otherwise, I would suggest granting the permission manually. 2. From my understanding, Vuforia also uses the rear camera, and I don't think the Android API supports sharing one camera stream between two callers. To check this, view the log using "adb logcat -s Unity Aristo" to show the logs of both Unity and our plugin (you can include Vuforia too if you know its tag). If Vive Hand Tracking failed to start the camera, the log will contain errors, which might be a hint. Please let me know about your findings, thank you. Best Regards, zzy
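  For the permission point above, on Unity 2018.3 or newer you can check and request the camera permission yourself from script. This is a sketch using Unity's `UnityEngine.Android.Permission` API:

  ```csharp
  #if UNITY_ANDROID
  using UnityEngine;
  using UnityEngine.Android;

  // Requests the Android camera permission before hand tracking starts,
  // so the SDK does not fail silently when the permission is missing.
  public class CameraPermissionCheck : MonoBehaviour {
    void Start() {
      if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        Permission.RequestUserPermission(Permission.Camera);
    }
  }
  #endif
  ```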
  21. Hi @akira fan This problem is caused by a bug in the HandProvider script. It should be fixed in the next version, expected to be released in mid-October. Thanks for your report.
  22. Hi @runout Since you have no experience with either Unity or Unreal, I would suggest starting with Unity, which is easier to learn. Please follow the steps below (on Windows with a Vive Pro plugged in):
     1. Install Unity. You can install Unity Hub first, a Unity management tool that helps you install Unity; follow the instructions in the Unity Official Manual. When selecting a Unity version to install, scroll down and select one of the 2017.4.x (LTS) versions. When selecting modules, you can deselect all of them to reduce the download size.
     2. Create a new Unity project in Unity Hub. You can choose the 3D template.
     3. Follow the SDK document about SteamVR camera setup.
     4. In the created Unity project, import the Vive Hand Tracking SDK. Open the menu Assets - Import Package - Custom Package ..., and select the Vive Hand Tracking Unity.unitypackage file.
     5. Modify the project settings to support Vive Pro. Open the menu Edit - Project Settings - Player; the player settings are shown in the Inspector window. Scroll down to the XR Settings tab at the bottom, check Virtual Reality Supported, and make sure Virtual Reality SDKs contains only OpenVR. You can use the +/- buttons to add/remove other entries.
     6. Open our sample scene and run it inside the Unity Editor. In the Project tab, open the Assets/ViveHandTracking/Sample/Sample.unity scene. Press the play button on the toolbar at the top to run the scene. You can refer to the Unity manual about the Project Window, Open Scenes (at the bottom of the page), and the Toolbar & Play button.
     Please let me know if you have any problems following these steps.
  23. Hi @A Hobbit We have provided the 3D hand model rendering function in Unity plugin v0.8.1. For UE4, it will be supported in the next version (targeting early October). If you need it before October, you can try to implement the function yourself in UE4; here are some references for you: 1. We are currently using a poseable mesh component to adjust the model nodes using the skeleton results. 2. Instead of using the raw skeleton data, you can use the smoothed version of the result, which makes the 3D hand model look more natural. You can refer to the ModelRenderer.cs file in the Unity plugin for how to do so.
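  To illustrate the smoothing idea in point 2, here is a minimal sketch of exponential smoothing over joint positions. This is an illustration of the technique only, not the actual ModelRenderer.cs code; the class name and smoothing factor are made up:

  ```csharp
  using UnityEngine;

  // Exponentially smooths raw skeleton joint positions across frames,
  // trading a little latency for a steadier 3D hand model.
  public class JointSmoother {
    Vector3[] smoothed = null;
    // 0 = no smoothing; values closer to 1 = smoother but laggier.
    public float smoothing = 0.5f;

    public Vector3[] Smooth(Vector3[] rawJoints) {
      if (smoothed == null || smoothed.Length != rawJoints.Length)
        smoothed = (Vector3[])rawJoints.Clone();
      else
        for (int i = 0; i < rawJoints.Length; i++)
          smoothed[i] = Vector3.Lerp(rawJoints[i], smoothed[i], smoothing);
      return smoothed;
    }
  }
  ```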
  24. Hi @dipankerJuego Skeleton mode is currently not supported on Vive Focus or Android phones. We plan to add support in a future release. For now, each hand only has a 3D position and a gesture classification on these platforms.
  25. Hi @Ellen If you need to write your own logic for handling teleport based on gesture input, you can disable the remote grab game object so the same gesture does not trigger remote grab. You can refer to the HandStateChecker script for triggering actions based on gesture type.
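  A minimal sketch of triggering an action from a gesture, in the spirit of HandStateChecker. The `GestureProvider`, `GestureType`, and `gesture` names are assumed from the Unity plugin, and `DoTeleport` is a hypothetical placeholder for your own logic:

  ```csharp
  using UnityEngine;
  using ViveHandTracking; // SDK namespace; type names assumed from the Unity plugin

  // Fires a teleport on the frame the right hand switches to a point gesture,
  // so holding the gesture does not re-trigger the action every frame.
  public class GestureTeleport : MonoBehaviour {
    GestureType lastGesture = GestureType.Unknown;

    void Update() {
      var hand = GestureProvider.RightHand;
      if (hand == null) return;
      if (hand.gesture == GestureType.Point && lastGesture != GestureType.Point)
        DoTeleport();
      lastGesture = hand.gesture;
    }

    void DoTeleport() { /* your teleport logic here */ }
  }
  ```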