zzy

Employee
  • Content Count: 70
  • Joined
  • Last visited
  • Days Won: 1

zzy last won the day on September 29

zzy had the most liked content!

Community Reputation: 5 Neutral

About zzy

  • Rank: Explorer


  1. Hi @JCSegula The current release does have some jittering in the results, and we are trying to improve it. Internally, we already have some improvements to the jittering and expect to release a new version before the end of the year. For now, I would suggest making the buttons (or just the trigger area) slightly larger so the buttons are easier to trigger.
  2. Hi @JCSegula and @Kikuto As in my answer in a related post, I think the detected hand and the see-through hand align together. There might be some lag in the detection when you move your hand fast. Please follow the steps in the linked post and see if it suits your needs.
  3. Hi @okocha1337 I tested locally on my computer, and SRWorks and Vive Hand Tracking can work together. Here is my configuration: GTX 1070, Unity 2017.4.16f1, SRWorks v0.8.0.2, Vive Hand Tracking v0.8.2. Here are the steps to set up the project:
       1. Follow the steps here to set up the graphics card driver and SteamVR.
       2. Create a new 3D project in Unity.
       3. Import both SDKs.
       4. Open the "Assets/ViveSR/Scenes/Sample/Sample.unity" scene and accept all the prompted steps.
       5. In the scene, add the GestureProvider script to the Camera with the MainCamera tag. This should be the "[ViveSR]/DualCamera (head)/Camera (eye offset)/Camera (eye)" GameObject.
       6. Add the LeftHandRenderer and RightHandRenderer prefabs to the scene from the "Assets/ViveHandTracking/Prefab" folder.
       7. Play in Unity.
     Some notes:
       • Please first make sure you can see your hands detected using the "Assets/ViveHandTracking/Sample/Sample.unity" scene.
       • The first time it runs on each machine, Vive Hand Tracking may take 20-30 seconds to set up, so your hands are not shown immediately after you click play. Please wait a while and see if your hands appear.
       • Please share your configuration so we can help identify the problem you encounter.
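     The editor steps above can also be sketched as a small runtime script. This is an illustrative sketch only: it assumes the GestureProvider component and the LeftHandRenderer/RightHandRenderer prefabs from the Vive Hand Tracking SDK are reachable under the names shown (and, for Resources.Load, that copies of the prefabs exist in a Resources folder); exact class names and namespaces may differ between SDK versions.

     ```csharp
     using UnityEngine;

     // Illustrative sketch: wires up Vive Hand Tracking in the SRWorks sample
     // scene at runtime instead of through the editor. Component and prefab
     // names are taken from the steps above and may differ between versions.
     public class HandTrackingSetup : MonoBehaviour {
       void Start() {
         // Step 5: add GestureProvider to the camera tagged MainCamera
         // (the "[ViveSR]/DualCamera (head)/.../Camera (eye)" GameObject).
         var eyeCamera = Camera.main;
         if (eyeCamera != null &&
             eyeCamera.GetComponent<ViveHandTracking.GestureProvider>() == null)
           eyeCamera.gameObject.AddComponent<ViveHandTracking.GestureProvider>();

         // Step 6: add the hand renderer prefabs. In a real project you would
         // normally drag them in from "Assets/ViveHandTracking/Prefab"; loading
         // them here assumes copies were placed in a Resources folder.
         var left = Resources.Load<GameObject>("LeftHandRenderer");
         var right = Resources.Load<GameObject>("RightHandRenderer");
         if (left != null) Instantiate(left);
         if (right != null) Instantiate(right);
       }
     }
     ```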
  4. Hi @JCSegula If you only need the front camera video frames, you can use the OpenVR camera API to get the frames and display them in your scene. This is what double-clicking the home button on the controller does: it lets you see the outside, but does not map the outside exactly as in the photo you provided. If you need to map real objects to virtual objects in your app, the easiest way now is to use the SRWorks SDK. If you need to show hand skeleton results, you can use the Vive Hand Tracking SDK. You can use both SDKs together if you want to map virtual hands onto the see-through image, though there might be some misalignment. To my knowledge, this is the only solution so far. However, if you have the necessary knowledge of computer vision and computer graphics, you can try to create the mapping yourself to achieve the effect you want, using the OpenVR camera frames and the hand detection results.
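     Getting front-camera frames through the OpenVR tracked camera interface looks roughly like the sketch below, using the C# bindings that ship with the SteamVR Unity plugin. This is a hedged outline rather than the SDK's own code: it assumes OpenVR is already initialized (e.g. by SteamVR), abbreviates error checking, and leaves out uploading the buffer to a Texture2D for display.

     ```csharp
     using System.Runtime.InteropServices;
     using UnityEngine;
     using Valve.VR; // OpenVR C# bindings from the SteamVR Unity plugin

     // Rough sketch of reading undistorted front-camera frames from the HMD
     // via IVRTrackedCamera. Assumes OpenVR is initialized elsewhere.
     public class FrontCameraReader : MonoBehaviour {
       ulong streamHandle = 0;
       byte[] buffer;
       uint width, height, bufferSize;

       void Start() {
         var cam = OpenVR.TrackedCamera;
         if (cam == null) return;
         bool hasCamera = false;
         cam.HasCamera(OpenVR.k_unTrackedDeviceIndex_Hmd, ref hasCamera);
         if (!hasCamera) return;
         cam.GetCameraFrameSize(OpenVR.k_unTrackedDeviceIndex_Hmd,
             EVRTrackedCameraFrameType.Undistorted,
             ref width, ref height, ref bufferSize);
         buffer = new byte[bufferSize];
         cam.AcquireVideoStreamingService(OpenVR.k_unTrackedDeviceIndex_Hmd,
             ref streamHandle);
       }

       void Update() {
         if (streamHandle == 0) return;
         var header = new CameraVideoStreamFrameHeader_t();
         var pinned = GCHandle.Alloc(buffer, GCHandleType.Pinned);
         // Copy the latest frame into our buffer; a real app would upload it
         // to a Texture2D here to show the see-through view in the scene.
         OpenVR.TrackedCamera.GetVideoStreamFrameBuffer(streamHandle,
             EVRTrackedCameraFrameType.Undistorted, pinned.AddrOfPinnedObject(),
             bufferSize, ref header, (uint)Marshal.SizeOf(header));
         pinned.Free();
       }

       void OnDestroy() {
         if (streamHandle != 0 && OpenVR.TrackedCamera != null)
           OpenVR.TrackedCamera.ReleaseVideoStreamingService(streamHandle);
       }
     }
     ```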
  5. Hi @zuhair Please download the latest v0.8.2 version of the SDK, which contains a pre-built sample for Windows. Before running the sample, please follow the SteamVR camera setup on our documentation page: https://developer.viveport.com/documents/sdk/en/misc/steamvr.html For the first run on each machine, the SDK may need 20-30 seconds before showing hand results. Subsequent launches should complete in 2-3 seconds.
  6. Hi @JCSegula I've tried on my side but failed to reproduce the problem. I tried Unity 2019.2.9f1 and 2019.3.0b7, and both work fine in the built application, though I'm testing on the same machine that has Unity installed. Here are my steps:
       1. Create a new Unity project with the 3D template.
       2. Import the Vive Hand Tracking SDK.
       3. Enable Virtual Reality Supported in XR Settings.
       4. Build and run for the Windows platform.
     Please let me know if you have any new findings.
  7. Hi @JCSegula Thanks for your test. I will investigate why Unity 2019.3 does not work on my side.
  8. Then that seems strange. I'll see if I can reproduce the situation on my side. In the meantime, can you please try another Unity version to see whether it works? I would recommend 2017.4 or 2018.4, since these are LTS versions. I also tested with 2019.2.0f1 on my computer.
  9. From the error log, the camera failed to start, i.e. our SDK failed to get camera frames. Please make sure you followed the camera setup steps on the test computer (where you run the .exe). Please also try the prebuilt sample to see if it works on the test computer.
  10. Hi @JCSegula Since you can run it using the prebuilt sample package, I would assume your SteamVR setup is correct. The problem is possibly on the Unity side. What version of Unity are you using? Are there any error logs in the Unity console window after you play the scene? Does the display (skybox) change as you move your headset?
  11. Hi @zuhair The latest (v0.8.2) SDK contains a pre-built sample in the package. You can download the package and find the Windows sample (one binary for Vive/Vive Pro/Vive Cosmos) in the "Prebuilt samples" folder in the zip file.
  12. Hi @MDV Games Not in the current 0.8.2 version. We are still testing and improving performance internally, and we plan to release it by the end of this year.
  13. Hi @A Hobbit 3D model support in the Unreal plugin is available in the v0.8.2 release.
  14. Hi @YahyaDan and @jds Cosmos support is available in the v0.8.2 release.
  15. Hi everyone, Vive Hand Tracking SDK has a new release, v0.8.2, today. You can download it from here. Highlights of the changes:
       • Updated deep learning models for all platforms.
       • Added support for the Vive Cosmos headset on Windows (2D/3D/Skeleton modes).
       • Added support for WaveVR armv8; this requires WaveVR 3.1.1 or newer.
       • Sped up skeleton mode by 30%.
       • Added pre-made blueprints for displaying the detected skeleton as a 3D hand model in the Unreal plugin.
     For detailed changes, please refer to the release notes. Best Regards, zzy