
Everything posted by zzy

  1. Hi @sjpt Thanks for the suggestion. I will see if we can provide pre-built binaries for common platforms in the next release. As for WebVR, we don't have plans to support it yet, so I think you can stick to your existing plan.
  2. Hi @sjpt The sample used in the video is provided in the Unity plugin. You can import the Vive Hand Tracking plugin and find the example scene in the Assets/ViveHandTracking/Sample folder. Just build and run on a supported platform, including Windows (with OpenVR enabled), WaveVR and Android phones (not all functions are supported on phones).
  3. Hi @pcasanova On Android, hand tracking should run at 30 fps, including the time for waiting for a new camera frame, calculating hand results, and returning results to the C# interface. If `GestureProvider.UpdatedInThisFrame` is true, it means a new frame has been calculated. You can record the time of the last frame and calculate the FPS from that. Even at 30 fps, the latency is <33 ms, but it might still feel laggy in VR, since rendering happens at 60 FPS or more.
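     The measurement described above can be sketched as a small Unity script. This is a minimal illustration, assuming the `GestureProvider.UpdatedInThisFrame` API as described in the post; the class name is hypothetical.

     ```csharp
     using UnityEngine;
     using ViveHandTracking;

     // Hypothetical helper: counts frames where a new detection result
     // arrived over a one-second window and logs the detection FPS.
     public class DetectionFpsCounter : MonoBehaviour {
       private int updateCount = 0;
       private float windowStart = 0f;

       void Update() {
         if (GestureProvider.UpdatedInThisFrame)
           updateCount++;
         if (Time.time - windowStart >= 1f) {
           Debug.Log("Detection FPS: " + updateCount);
           updateCount = 0;
           windowStart = Time.time;
         }
       }
     }
     ```

     Attach this to any active game object; on Android you would expect the logged value to stay near 30 if detection keeps up with the camera.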
  4. Hi @MDV Games Skeleton support is not available on Focus and Android phones at the moment. We plan to support it in the future, but we are still working on it.
  5. Hi @pcasanova I don't think a glove would help in this situation. In fact, using a glove would make detection of finger key points worse, since the texture features would be harder to detect. As for inconsistent detection, if you have a Vive Pro, you can try it to see if that improves the situation. Using a single camera to estimate depth can make the result quite strange in some situations.
  6. Hi Thanks for the feedback. Currently, the best hand tracking experience is available on Vive Pro. If you are not using v0.8.1, we recommend updating to this latest version, since it has better accuracy (especially against black walls). Since you are using a Vive (which has only one front camera), the depth of the finger points is not very accurate. This may cause the hand to lean further outwards than your actual hand. Here are some suggestions:
     1. Using a Vive Pro gives much better results.
     2. If you need to use a Vive, putting both hands down completely for 1-2 seconds and then bringing them back up can sometimes ease the situation.
     3. When moving your hand from outside into the front camera FOV, please make sure your hand is open and all five fingers are clearly visible.
     As for the hand shaking while your head moves and your hand stays still, this problem is a little complicated and might have multiple causes:
     1. When your head is moving, motion blur is significant in the front camera image, making it very hard to detect the exact position of your fingers, so the detection results from deep learning might shake a bit.
     2. Our calculations are first done in the front camera frame and the HMD transform is applied afterwards, so there might be some mismatch between them due to motion smoothing of the HMD position.
     Currently, I don't have a solution for this problem, but we are definitely looking into whether we can improve the situation. Best Regards, zzy
  7. Hi , Currently the predefined gestures cannot be customized in the SDK, since adding each new gesture requires a lot of effort. However, if you are using skeleton mode on Windows, you can build custom gestures from the skeleton results. You can use the 3D skeleton positions to calculate the straight/folded state of each finger and determine the gesture. For example, if all fingers are folded, it's a fist gesture. A recommended way is to use the `ModelRenderer` script to get the smoothed skeleton and calculate the bending angle of each finger. Below is sample code for calculating the bending angle of each finger. Note: the code below uses raw hand data for illustration purposes; you can switch to the bone objects in the hand model for better results.

     ```csharp
     // fingerIndex is 0-4, from thumb to pinky
     // result is in [0 (straight), 180 (folded)]
     float GetFingerAngle(GestureResult hand, int fingerIndex) {
       int startIndex = fingerIndex * 4 + 1;
       Vector3 root = hand.points[startIndex];
       Vector3 joint1 = hand.points[startIndex + 1];
       Vector3 joint2 = hand.points[startIndex + 2];
       Vector3 top = hand.points[startIndex + 3];
       Vector3 vec1 = joint1 - root;
       Vector3 vec2 = joint2 - joint1;
       Vector3 vec3 = top - joint2;
       float angle = Vector3.Angle(vec1, vec2) + Vector3.Angle(vec2, vec3);
       return Mathf.Clamp(angle, 0, 180);
     }
     ```
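     As a hedged illustration of the fist-gesture idea mentioned in the post, the per-finger angle could be combined like this. The `IsFist` helper and the 120-degree threshold are assumptions for the sketch, not part of the SDK; tune the threshold for your application.

     ```csharp
     // Hypothetical usage: treat a hand as a fist when all five fingers
     // are bent past a threshold. Assumes the GetFingerAngle function
     // from the snippet above and the plugin's GestureResult type.
     bool IsFist(GestureResult hand) {
       if (hand == null)
         return false;
       const float foldThreshold = 120f;  // assumption: tune per application
       for (int finger = 0; finger < 5; finger++) {
         if (GetFingerAngle(hand, finger) < foldThreshold)
           return false;  // this finger is still mostly straight
       }
       return true;
     }
     ```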
  8. Hi I think you encountered a problem when you tried to import the Unity package. Below are the general steps to use the Vive Hand Tracking plugin on Vive Pro with Unity:
     1. Install SteamVR and run room setup. Please also follow this guide to enable the camera in SteamVR and upgrade your GPU driver if needed.
     2. Create a new project in Unity with the 3D template.
     3. In XR Settings in Player Settings, enable virtual reality support and make sure OpenVR is first in the list.
     4. Download the Vive Hand Tracking plugin from the download page and extract the zip file.
     5. Import the package from the Unity menu: Assets - Import Package - Custom Package... - select Vive Hand Tracking Unity.unitypackage in the open dialog.
     6. Unity will start unpacking the package and prompt a dialog listing all the files. Click the Import button to import all of them.
     7. Open the sample scene in ViveHandTracking/Sample/Sample.unity.
     8. You can now press the play button. Please note that on the first run on each machine, it may take 30-60 seconds before the hand shows up.
  9. Hi Actually, we don't have the translation matrix. In the Hand Tracking SDK, we calculate 3D points based on the front camera intrinsics and extrinsics. The Unity plugin renders the 3D points based on the rendering camera settings, and they are thus displayed on your screen and HMD. I think you may want to ask on the SRWorks side whether they have a mapping for virtual objects that need to appear at a specific place on the texture. Can you help with this question? Thank you. Best Regards, Zhongyang
  10. Hi I would recommend trying SDK v0.8.1 to see if this problem still exists. Prior to v0.8.1, we used hand-calibrated intrinsics for Vive Pro, which might differ from the parameters used in SRWorks. In v0.8.1, we changed to using the OpenVR API to get the intrinsics, which should be similar to what SRWorks uses. As for the hand material, if you don't need the hand, you can simply disable the HandRenderer, or change its layer to an invisible layer.
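      Hiding the hand as suggested above could be sketched as follows. This is a minimal example under the assumption that the renderer object in your scene is named "HandRenderer"; the object name and script are hypothetical, so adjust them to match your scene hierarchy.

      ```csharp
      using UnityEngine;

      // Hypothetical sketch: hide the rendered hand without stopping
      // detection, either by deactivating the object or moving it to a
      // layer the camera does not render.
      public class HideHandRenderer : MonoBehaviour {
        void Start() {
          // Assumption: the sample scene names this object "HandRenderer"
          var hand = GameObject.Find("HandRenderer");
          if (hand != null)
            hand.SetActive(false);
          // Alternative: hand.layer = <index of a layer excluded from
          // the camera's culling mask>
        }
      }
      ```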
  11. Hi Thanks for trying it out. For the "OK" gesture to work, please make sure: 1. the circle made by the thumb and index finger is visible to the camera; 2. the other three fingers are all visible to the camera, i.e. the ring and pinky fingers are not hidden by the middle finger.
  12. Hi I think a simple script will do:

      ```csharp
      void Update() {
        // do nothing if results have not changed
        if (!GestureProvider.UpdatedInThisFrame)
          return;
        if (GestureProvider.LeftHand != null)
          Debug.Log("Left gesture is " + GestureProvider.LeftHand.gesture);
        if (GestureProvider.RightHand != null)
          Debug.Log("Right gesture is " + GestureProvider.RightHand.gesture);
      }
      ```

      You can also refer to our usage document or sample document for more usage examples.
  13. I'll check with the team about the Unreal project. For now, I would suggest following our document to set up your Unreal project: https://developer.viveport.com/documents/sdk/en/unreal/setup.html
  14. Hi everyone, Vive Hand Tracking SDK has a new release, v0.8.1, today. You can download it from here. Summary of changes: 1. Updated deep learning models for all platforms. 2. Increased camera FOV on Windows (10-15%). 3. Skeleton results can now be rendered onto a 3D hand model in Unity (the model is provided in the plugin). 4. Full support for the Unreal plugin. For detailed changes, please refer to the release notes. The plugin structure has changed for the Unity and Unreal plugins; please remove the old version before installing the new one. Best Regards, zzy
  15. Hi , yes, the next version includes Android phone and Focus support in the Unreal plugin.
  16. OK, it seems that on your phone it takes a longer time before the camera stream starts to output frames. We internally have a 1.5 second timeout for waiting for one frame, which seems to be the reason. I'll try to increase the timeout for the first frame in the next version (expected to release in a few days).
  17. Hi You are right, the problem is in the Hand Tracking SDK now, since we cannot get camera frames from your rear camera. From the log, we can find a rear camera but cannot get frames from it. If you are building a 32-bit binary, using Mono is definitely fine. Please make sure: 1. Your app has camera permission (I recommend granting permission manually in Settings). 2. Some phones may not support the NDK Camera2 API, which is what we use to get frames; a simple check is to try this example and see if it works.
  18. Hi Are you still experiencing the same error in the adb log? I think you might not have Daydream set up on your Samsung S8. You need to install the Daydream app from Google Play and set up the viewer to make a Daydream app run. If you don't have a Daydream viewer, I suggest you use Cardboard as your XRDevice instead.
  19. Hi I have tested on my side and it seems to work with the latest Unity and GoogleVR. Here are my steps:
      1. Create a new 3D project using Unity 2019.1.3f1.
      2. Import Vive Hand Tracking SDK v0.8.0 and Google VR SDK v1.200.0. Please note that after you have imported both SDKs, the ARISTO_WITH_GOOGLEVR define is added automatically.
      3. Switch the platform to Android and change some player settings to make sure the build can succeed: change the company name, product name and Android package name; enable Daydream in XR Settings and remove Vulkan from the Android Graphics APIs (since Unity does not support Vulkan + XR yet); change the Android Min API level to 24.
      4. Build and run on a Pixel. When the app runs for the first time, it calls the Daydream API to grant camera permission; you need to take the phone out of the viewer, hold it in portrait, accept the camera permission and put it back in the viewer.
      Note: you don't need to change the binaries or defines manually. Please let me know whether these steps work on your side. If you are using different versions, please let me know.
  20. From the log, the problem is caused by the AndroidJavaException part, and detection is never started. If you have already granted camera permission, you can try removing the ARISTO_WITH_GOOGLEVR script define symbol and try again. In the meanwhile, I'll try to reproduce it on my side and see if it can be fixed in the next version. Can you please share some of your project info: 1. Version of Unity. 2. Version of the Google VR plugin. 3. Current active XRDevice (I assume it's Daydream).
  21. Hi One thing first: if you are only using the Vive Hand Tracking SDK with the GoogleVR plugin and not with the WaveVR plugin, the default editor script should work for you; there is no need to manually remove libraries or modify script define symbols. Now back to your question: 1. Please make sure camera permission is granted to your app. 2. Are you enabling Daydream as XR? If so, please make sure your phone's rear camera is not blocked by the Daydream viewer. To my knowledge, the Daydream viewer blocks the rear camera, and thus we cannot detect any hands. I would recommend testing without the Daydream viewer, or using a Cardboard viewer that does not block the camera. If you still encounter the problem after you unblock the camera, please attach the adb log for us to investigate. You can use the command `adb logcat -s Unity Aristo` to include all the logs from Unity and our native plugin. Best Regards, zzy
  22. Hi Currently in the Unity plugin, we apply the camera transform (of the game object the GestureProvider script is attached to) to the detected result. The difference is that the transform is passed into the native plugin and the final result is returned a few frames later, which makes an animated camera look very strange. Can you please check whether applying the camera transform in Unity makes it work better? An easy way to do so is to attach the GestureProvider script to a game object positioned at the origin (not moving), and apply the camera transform manually in Unity by modifying the Update function of GestureProvider. You can edit all the points of LeftHand and RightHand at the end of the Update function.
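      The manual transform step suggested above could look roughly like this. This is a sketch under the assumption that `hand.points` is a writable `Vector3[]` in camera-local space; the helper name is hypothetical, and you would call it for both `GestureProvider.LeftHand` and `GestureProvider.RightHand` at the end of Update.

      ```csharp
      // Hypothetical sketch: convert detected hand points from the
      // (stationary, origin-placed) GestureProvider space into the
      // current camera's space each frame.
      void ApplyCameraTransform(GestureResult hand, Transform cameraTransform) {
        if (hand == null)
          return;
        for (int i = 0; i < hand.points.Length; i++)
          hand.points[i] = cameraTransform.TransformPoint(hand.points[i]);
      }
      ```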
  23. Hi We have basic support for Android devices. Please read the Android support documents for the Unity plugin, Android plugin or C++ plugin. One thing to note: your screen orientation must be Landscape Left. Best Regards, zzy
  24. Hi , We have released a patch release (forum link) which should fix your problem. Please have a try. Best Regards, zzy