zzy

Employee
  • Content Count

    170
  • Joined

  • Last visited

Community Reputation

17 Good

About zzy

  • Rank
    Constructor

  1. Hi @Tima Let me try to explain this. Please remember that hand tracking runs at 40-60 fps, so the raw hand data is updated at a lower frequency than VR rendering. When working in camera space, you apply the current HMD transform to GestureResult every frame. When the HMD moves fast, GestureResult does not change as often, so the virtual hand moves with the HMD while the real hand stays still. When working in HMD space, the HMD transform is not applied to GestureResult every frame; instead, it is applied only when the raw hand position is calculated. Therefore, no matter how your HMD moves, the hand stays still in the virtual environment, so in this case it's better to use HMD space. There are also cases where it's better to use camera space, for example when you move the HMD and hand together, so that the hand's position relative to the HMD does not change.
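     To make the difference concrete, here is a minimal sketch (illustrative only, not SDK code; Vec3, Quat, Pose and HandPointToWorld are made-up names) of transforming a raw hand point with the HMD pose captured at detection time, which is what keeps the virtual hand still in HMD space:
        struct Vec3 { float x, y, z; };
        struct Quat { float x, y, z, w; };
        struct Pose { Vec3 position; Quat rotation; };   // one HMD pose sample

        // Rotate vector v by unit quaternion q: v' = v + w*t + u x t, with t = 2*(u x v).
        static Vec3 Rotate(const Quat& q, const Vec3& v) {
          Vec3 u{q.x, q.y, q.z};
          Vec3 t{2.0f * (u.y * v.z - u.z * v.y),
                 2.0f * (u.z * v.x - u.x * v.z),
                 2.0f * (u.x * v.y - u.y * v.x)};
          Vec3 c{u.y * t.z - u.z * t.y, u.z * t.x - u.x * t.z, u.x * t.y - u.y * t.x};
          return {v.x + q.w * t.x + c.x, v.y + q.w * t.y + c.y, v.z + q.w * t.z + c.z};
        }

        // Camera space: rawPoint would be transformed with the CURRENT HMD pose every frame,
        // so a stale GestureResult gets dragged along when the HMD moves fast.
        // HMD space: rawPoint is transformed with the HMD pose captured when this
        // GestureResult was produced, so the hand stays put until new data arrives.
        Vec3 HandPointToWorld(const Vec3& rawPoint, const Pose& hmdPoseAtDetection) {
          Vec3 p = Rotate(hmdPoseAtDetection.rotation, rawPoint);
          return {p.x + hmdPoseAtDetection.position.x,
                  p.y + hmdPoseAtDetection.position.y,
                  p.z + hmdPoseAtDetection.position.z};
        }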
  2. Hi @Tima To solve this problem, please add the following lines to interface_gesture.hpp:
        struct GestureVector3 { float x, y, z; };
        struct GestureQuaternion { float x, y, z, w; };
        INTERFACE_GESTURE_EXPORTS void SetCameraTransform(GestureVector3 position, GestureQuaternion rotation);
     Please call the SetCameraTransform function every VR frame (i.e. 90 times per second) to pass the HMD transform to the native side. Like GestureResult, the position and rotation use the Unity axis convention. After calling SetCameraTransform, you no longer need to apply the HMD transform to GestureResult. Note: this change will also be available starting from the next release.
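     A rough usage sketch for the declarations above (the surrounding loop and the way you obtain the HMD pose are up to your application):
        #include "interface_gesture.hpp"

        // Call once per rendered VR frame (~90 Hz). hmdPosition/hmdRotation are the current
        // HMD pose in the Unity axis convention, obtained from your VR runtime (e.g. OpenVR).
        void OnVRFrame(GestureVector3 hmdPosition, GestureQuaternion hmdRotation) {
          SetCameraTransform(hmdPosition, hmdRotation);
          // GestureResult can now be used directly; no extra HMD transform is needed on it.
        }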
  3. Hi @Tima Do you call UseExternalTransform(true) before StartGestureDetection? This could be the cause of the problem, as you are applying the HMD transform to GestureResult, making your hands move with your HMD. By the way, since the raw camera fps is 60, setting maxFPS above 60 has no effect.
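     A rough sketch of the intended call order (assuming the 0.9.x C++ interface; check interface_gesture.hpp for the exact signatures and GestureOption fields):
        #include "interface_gesture.hpp"

        void StartDetectionWithExternalTransform(GestureOption* option) {
          // option: your detection options; since the raw camera runs at 60 fps, a maxFPS
          // above 60 brings no benefit.
          UseExternalTransform(true);        // must be called BEFORE StartGestureDetection
          StartGestureDetection(option);
        }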
  4. Hi @Tima The current plan for the next release is around early February. About grab strength, do you have any special use case that cannot use pinch strength instead? Also, do you have any suggestion about how to calculate grab strength from the skeleton? I would think of using the average closing angle of each finger, but I'm not sure how to handle the case when some fingers are closed while others are straight (e.g. the point/two gesture).
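     For what it's worth, here is a rough sketch of that averaging idea (purely illustrative, not SDK code; the joint layout, helper names and normalisation are assumptions):
        #include <algorithm>
        #include <cmath>

        struct Vec3 { float x, y, z; };
        const float kPi = 3.14159265f;

        // Angle at joint b formed by segments b->a and b->c, in radians.
        static float Angle(const Vec3& a, const Vec3& b, const Vec3& c) {
          Vec3 u{a.x - b.x, a.y - b.y, a.z - b.z};
          Vec3 v{c.x - b.x, c.y - b.y, c.z - b.z};
          float dot = u.x * v.x + u.y * v.y + u.z * v.z;
          float lu = std::sqrt(u.x * u.x + u.y * u.y + u.z * u.z);
          float lv = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
          return std::acos(std::max(-1.0f, std::min(1.0f, dot / (lu * lv))));
        }

        // joints: 4 points per finger (knuckle, two middle joints, tip); fingers 0-3 = index..pinky.
        // A straight finger has joint angles near pi, so its curl is ~0; a fully bent finger
        // approaches 1. Averaging means a point/two gesture still lowers the overall strength,
        // which is exactly the open question mentioned above.
        float EstimateGrabStrength(const Vec3 joints[4][4]) {
          float total = 0.0f;
          for (int f = 0; f < 4; ++f) {
            float bend = 2.0f * kPi - Angle(joints[f][0], joints[f][1], joints[f][2])
                                    - Angle(joints[f][1], joints[f][2], joints[f][3]);
            total += std::min(1.0f, std::max(0.0f, bend / kPi));   // rough normalisation to [0, 1]
          }
          return total / 4.0f;
        }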
  5. @Siyo Just as my colleague mentioned, this is a developer feature. If you want to see it in any of the games you are playing, you need to suggest it to the developers. Neos VR adopted our SDK this September after their users requested it: https://github.com/Frooxius/NeosPublic/issues/339
  6. Hi @Tima Thanks for the suggestion. We plan to add support for joint rotation/pinch direction in the C++ interface in the next release.
  7. Hi @Tima Thanks for spotting the bug. We have fixed this internally.
  8. Hi @Tima After some debugging this week, I found the problem is caused by a calling convention error in the openvr library. After upgrading openvr from 1.0.14 to 1.3.22, the problem is solved. Please find the attached patch for win32 (it only includes aristo_interface.lib and aristo_interface.dll). Let me know if this works on your side. 0.9.4_win32_patch.zip
  9. Hi @Tima Thanks for the minidump.
  10. Hi @Tima Thanks for the feedback. I'll check this week to see if I can reproduce it on my side.
  11. Hi @kilik For the first question, another thing I can think of is to install the MSVC 2015 Update 3 runtime on your PC. The dll-not-found problem might be caused by a failure to load a dependency dll. For the second question, if your image is not correct, then it's possible hand detection fails. If you can see a hand in the editor, is that using EditorEngine (which generates fake hand data)? For the camera problem, I would suggest following the steps here: https://hub.vive.com/storage/tracking/misc/steamvr.html
  12. Hi @kilik Here are some points I can think of to check: make sure you are using the latest 0.9.4 version; make sure you are running the Windows Unity Editor (Linux/Mac is not supported); and try running the pre-built Windows sample to see if that works.
  13. Hi @Fangh The Hand Tracking SDK provides some extra functions compared to the Hand package in the WaveVR package. You can see the document here for details: https://hub.vive.com/storage/tracking/unity/advanced.html Since the hand package in WaveVR might change its API in the future, I would suggest using the Hand Tracking SDK for your application.
  14. Hi @ericvids Yes, it's not normal that the program picked up the Intel graphics but not the NVIDIA one. I would suggest checking whether your NVIDIA driver and OpenCL support are installed correctly. I recommend downloading the latest driver from https://www.nvidia.com/Download/index.aspx?lang=en-us Also, thanks for your suggestion about the log. I'll add it in the next version. For now, you can modify `Assets\ViveHandTracking\Scripts\Engine\ViveHandTrackingEngine.cs` (about line 127) and add the log:
         if (index <= lastIndex) return;
         if (index > 0 && lastIndex == 0) Debug.Log("Detection initialization completed");
         lastIndex = index;
  15. Hi @ericvids Indeed, there is no error message in the Unity log, so I assume hand tracking is running (or still initializing). To run on Windows, our SDK needs to compile OpenCL kernels at runtime, which can be quite time consuming on the first run. It might take a few minutes before hands can be detected. Please make sure you let the program run for a few minutes and check whether hands are detected. There can also be other reasons for detection failure; I would suggest upgrading the GPU driver to the latest version from the NVIDIA/AMD official site.