Everything posted by zzy

  1. Hi @WannerDev From the log file, it says: "Plugins: Failed to load 'C:/Users/cm/Downloads/ViveHandTracking_0.10.0/Prebuilt samples/ViveHandTrackingSample_Win64/Sample_Data/Plugins/aristo_interface.dll' with error 'Das angegebene Modul wurde nicht gefunden.'" ("The specified module could not be found.") So it's definitely missing some dependencies, but the problem right now is that we cannot tell which module is the root cause. I haven't tried this on a Windows 10 Home PC, but that could be a possible cause. If you have another PC with a Windows 10 Professional/Enterprise edition, you can give it a try (even without a VR-ready GPU or HMD) to see if the same error appears in the log.
  2. Hi @WannerDev After some googling for the DLLs, it seems these 2 DLLs are system DLLs that are loaded dynamically at runtime. I also double-checked on my machine, and Dependency Walker complains about these 2 missing DLLs there as well. So I think it has nothing to do with the DLL-not-found error. If these 2 DLLs are the only missing ones (excluding API-MS-*.dll and EXT-MS-*.dll), then I believe this is the same behavior as on my machine. Can you please check the pre-built samples and see if they work on your PC? You can check the Unity log at %USERPROFILE%\AppData\LocalLow\HTC\ViveHandTrackingSample\output_log.txt or %USERPROFILE%\AppData\LocalLow\HTC\ViveHandTrackingUISample\output_log.txt
  3. Hi @WannerDev The EXT-MS-* and API-MS-* DLLs can be ignored for this purpose, as they seem to be Windows internals. Whether they are found or not, they don't cause problems when loading the DLL. You can check the module list at the bottom and see if any other DLL is missing.
  4. Hi @WannerDev If you don't see OpenVR in the XR settings, then you probably haven't installed the OpenVR package yet. You need to install it in the Package Manager UI. Since it comes from a custom package registry, you might need to find it under My Registries. But if you can see the VR scene in your HMD and the HMD tracking works fine, I think you can stick with the deprecated OpenVR support. Now back to the DLL-not-found problem: this seems unrelated to the OpenVR setup. It looks like some DLL is missing when you try to load aristo_interface.dll. Please check the following:
     - Make sure aristo_interface.dll and cosmos_camera.dll exist under the Assets/ViveHandTracking/Plugins/x86_64 folder.
     - Make sure you have installed the Visual C++ 2015 Redistributable Update 3 (64-bit); please note the "Update 3" and "64-bit" parts. Update 3 can be downloaded here: https://www.microsoft.com/en-us/download/details.aspx?id=53587, please make sure to select the 64-bit installer.
     - You can use Dependency Walker to check whether the DLL can be loaded and whether any dependency is missing. Dependency Walker can be downloaded from here: https://www.dependencywalker.com/ (a small script-based check is also sketched just below).
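For reference, another quick way to reproduce the load failure is to call the Win32 LoadLibrary function from a small script and print the error code. This is only a rough sketch, not part of the SDK: the class name DllLoadCheck is made up for illustration, and the path assumes the default plugin folder mentioned above.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

public class DllLoadCheck : MonoBehaviour
{
    // Win32 LoadLibrary: returns IntPtr.Zero on failure and sets the last error code.
    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    private static extern IntPtr LoadLibrary(string lpFileName);

    void Start()
    {
        // Adjust the path if your plugin folder is different.
        string path = Application.dataPath + "/ViveHandTracking/Plugins/x86_64/aristo_interface.dll";
        IntPtr handle = LoadLibrary(path);
        if (handle == IntPtr.Zero)
            // Error 126 means "The specified module could not be found",
            // i.e. the DLL itself or one of its dependencies is missing.
            Debug.LogError("LoadLibrary failed, Win32 error " + Marshal.GetLastWin32Error());
        else
            Debug.Log("aristo_interface.dll loaded successfully");
    }
}
```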
  5. Hi @WannerDev To start with the Hand Tracking SDK on Windows, please follow the steps below:
     - I would suggest using a Unity LTS version, i.e. the 2018.4 and 2019.4 series are recommended.
     - Follow the steps here to install OpenVR support for the Unity XR system.
     - Follow the steps in the Quickstart/Install section to enable OpenVR in your project; these are the steps required for "checking if OpenVR is loaded" (a small script check for this is sketched after this post).
     - You can follow the steps in Quickstart/Standalone (no input) to build a sample scene and test that the VR setup is correct. Note that if you don't need controllers in your project, you can skip the "Quickstart/SteamVR Input System" part.
     - Import the Vive Hand Tracking SDK into your project.
     - You can now run the sample scenes (Sample or UISample) under the Assets/ViveHandTracking/Sample folder to see if hand tracking is working. Please note that on the first startup, it might take a few minutes before hands are displayed.
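As a rough way to do the "checking if OpenVR is loaded" step from script, you can print Unity's loaded XR device using the built-in XRSettings class. A minimal sketch (the class name is just for illustration):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class XrLoadedCheck : MonoBehaviour
{
    void Start()
    {
        // With the OpenVR setup described above, loadedDeviceName should report OpenVR
        // once VR is active; an empty name means no XR device has been loaded.
        Debug.Log("XR enabled: " + XRSettings.enabled +
                  ", loaded device: " + XRSettings.loadedDeviceName);
    }
}
```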
  6. Hi @Shr Do you see any errors in the Unity log console? Can you also provide the hand tracking log file, which is located at %USERPROFILE%\AppData\Local\Aristo\Aristo.log? It seems a little strange to see cartoon hands right after the start, since the default hand style is the skeleton (spheres and links). The cartoon hand is only shown after both hands trigger the thumbs-up gesture. By the way, we have a newer version, 0.10.0; you can give it a try and see if that helps.
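For reference, a rough sketch of how a script could watch for that thumbs-up trigger. The names GestureProvider.LeftHand, GestureProvider.RightHand and GestureType.Like are my assumption of the SDK's C# API and should be double-checked against the SDK documentation.

```csharp
using UnityEngine;
using ViveHandTracking;  // assumed SDK namespace

public class ThumbsUpLogger : MonoBehaviour
{
    void Update()
    {
        // LeftHand/RightHand are assumed to be null while that hand is not detected.
        var left = GestureProvider.LeftHand;
        var right = GestureProvider.RightHand;

        // GestureType.Like is assumed to be the thumbs-up gesture that
        // switches the sample to the cartoon hand style.
        if (left != null && right != null &&
            left.gesture == GestureType.Like && right.gesture == GestureType.Like)
            Debug.Log("Both hands show thumbs-up");
    }
}
```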
  7. Hi @Shr Have you tried running the sample scenes first? You can try the UI sample scene, which is located at `Assets/ViveHandTracking/Sample/UISample.unity`. Just double-click it in the Project window and hit Play. Please also note that the Hand Tracking SDK may take about 1-2 minutes to start up on the first run on each machine. Please wait a few minutes, then put on the HMD and verify that you can see your hands.
  8. Hi @atea OK, I think this is an incompatibility between Unity 2020 and WaveXR 1.0.1. If you can downgrade to Unity 2019.4, I would suggest doing so (it is an LTS version). Otherwise, you might need to manually remove the `SetVirtualRealitySDKs` part in the code.
  9. Hi @atea Wave SDK 4.0.0 is not supported by the Hand Tracking SDK yet. We plan to add support in the next release. For now, you can use Wave SDK 1.0.1. You can go to the Package Manager, find each Wave package, click "other versions", and select a previous version to install. Please make sure to install the same version for all 3 Wave packages.
  10. Hi @jjoyner In SRWorks 0.9.7.1, please make sure the GestureProvider script is attached to the "[SRwork_FrameWork]/DualCamera (head)/RenderCamera (Left)" GameObject. Please also make sure to attach the hand renderer prefabs so that detected hands are visible. Some hints for further troubleshooting (a small debug sketch follows below):
     - Are the hand renderer objects enabled when you put up your hands? If an object is not enabled or its position is not updated, please check whether hand detection started correctly.
     - If the hand object position is updating, is the hand position correct? If not, it's most likely because the GestureProvider script is attached to the wrong object.
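To help tell these two cases apart, a small debug script can log the detection status and the detected hand position every frame. This is only a sketch; GestureProvider.Status, GestureProvider.LeftHand and the position field are my assumption of the C# API and may need adjusting.

```csharp
using UnityEngine;
using ViveHandTracking;  // assumed SDK namespace

public class HandTrackingDebugLog : MonoBehaviour
{
    void Update()
    {
        // If the status never reaches a running state, detection did not start correctly (assumed API).
        Debug.Log("Hand tracking status: " + GestureProvider.Status);

        var left = GestureProvider.LeftHand;
        if (left != null)
            // Position never updating -> detection problem;
            // position updating but wrong -> GestureProvider is likely on the wrong camera object.
            Debug.Log("Left hand position: " + left.position);
    }
}
```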
  11. Hi @jjoyner Have you tried the hand tracking sample scene? Can you see your hands? It might take some time on the first run, so please wait about 2 minutes and see if your hands appear. It's recommended to wear the HMD and test with your hands, rather than holding the HMD in one hand and putting the other hand in front of the camera. Which HMD are you using, Vive Pro or Cosmos?
  12. Hi @EniacMlezi Thanks for the work on the vcpkg portfile. Since the current release is still 0.x, we are not considering integration with other build systems at the moment. This might change after the v1.0 release, and we may consider shipping the license in the provided zip as well.
  13. Hi @Tima The new version (including the binary and the documentation) is now online. Please have a try.
  14. Hi everyone, Vive Hand Tracking SDK has a new release, v0.10.0, today. You can download it from here. Highlights of the changes:
     - Update deep learning model
     - Use the same binary for Android phones and WaveVR devices
     - Add joint rotation / pinch info / palm position info to the result
     - Add skeleton support on Android phones
     - Update pinch direction calculation (Unreal)
     - Support simulating hands without VR devices in the Unreal Editor
     For detailed changes, please refer to the release notes.
  15. Hi @Tima The new version is still being uploaded, so although you may have seen the new release page, the binary is still 0.9.4. I think it will be uploaded this week. I will let you know when it's completed.
  16. Hi @Tima The new version should be online this week and will contain the changes you mentioned.
  17. Hi @Nikolay The current Hand Tracking SDK does not work with ARCore. The main problem is that we currently cannot get the camera stream while ARCore is running. We are still investigating how to integrate with ARCore; I believe the ARCore API does provide some level of camera access, but we need to integrate it into our existing binary. I cannot promise a target date at the moment, but it's indeed on our roadmap for the future.
  18. @ericvids Thanks for the quick feedback. Indeed, the score shouldn't be this high for an iGPU. I have added a fix for this and added more logs in case the fix doesn't work. Please help test this again, thank you. aristo_interface2.zip
  19. Hi @ericvids Thanks for your test. I'm still struggling to find a device that reproduces the bug, so it's really helpful that you can test with a debug DLL. I have added more logs to the OpenCL selection part. You just need to replace the DLL in the Assets\ViveHandTracking\Plugins\x86_64 folder of your Unity project, or in the <exe_name>_Data\Plugins\ folder of your built player. All logs will be in our log file: %USERPROFILE%\AppData\Local\Aristo\Aristo.log. aristo_interface.zip
  20. Hi @ericvids An update with results from some quick checks: after checking the code, the log from the Unity Editor does seem to come from our binary, not from Unity. The OpenCL check is called before logging is redirected to Aristo.log, so it outputs directly to stdout. So the Unity plugin is indeed using the iGPU. I'm still trying to find devices to reproduce this on my side. I'm currently using the pre-built sample on a notebook with both an Intel iGPU and an NVIDIA GPU, and it does select the NVIDIA GPU. I'll see if I can find more devices to test with. Can you please verify whether you observe the same behavior in the pre-built sample?
  21. Hi @ericvids Can you please share your Unity version? I'll see if I can reproduce this on my side.
  22. Hi @ericvids From the output log (first line), the program is indeed using the NVIDIA GPU over the Intel iGPU. I can also confirm from the OpenCL properties that the GTX 1050 Ti has a much higher score than the HD 630. So this problem may not be caused by the wrong OpenCL GPU being selected. You can check the log file (%USERPROFILE%\AppData\Local\Aristo\Aristo.log) to see whether the 1050 Ti is selected for hand detection. Have you checked whether the GPU assigned to Unity.exe (or the exe of your built app) is the 1050 Ti? This can be checked in the NVIDIA Control Panel.
  23. Hi @ericvids Thanks for the details. In Vive Hand Tracking, we do have a system to select the best GPU. In this system we don't consider the OpenCL version (i.e. 1.2 vs 2.x); instead we use the OpenCL device properties to calculate a computing score and select the GPU with the best score. In the score calculation we favor NVIDIA and AMD GPUs over Intel, so normally Intel GPUs are not selected. Since the program selects the Intel GPU on your system, it seems we still need to adjust our scoring system to pick the right GPU. I have uploaded an exe which prints out the fields we use to decide which GPU to use. Can you please help run the exe on your PC (with Intel OpenCL enabled) and send me back the output? The exe is a console application, so please run it from cmd. device.zip
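To illustrate the idea only (this is not the SDK's real scoring code), such a score could combine a vendor weight with standard OpenCL device properties such as CL_DEVICE_MAX_COMPUTE_UNITS and CL_DEVICE_MAX_CLOCK_FREQUENCY. The formula and weights below are purely hypothetical:

```csharp
// Purely illustrative sketch of a GPU scoring scheme, not the SDK's implementation.
public static class GpuScoreExample
{
    public static float DeviceScore(string vendor, int maxComputeUnits, int maxClockFrequencyMHz)
    {
        // Favor discrete NVIDIA/AMD GPUs over Intel iGPUs, as described above.
        float vendorWeight = vendor.Contains("NVIDIA") || vendor.Contains("AMD") ||
                             vendor.Contains("Advanced Micro Devices") ? 4.0f : 1.0f;

        // Rough throughput proxy from the OpenCL device properties.
        return vendorWeight * maxComputeUnits * maxClockFrequencyMHz;
    }
}
```

With a weighting like this, a discrete GPU would normally beat an iGPU even when the iGPU reports more compute units, which is the intended behavior; the bug report above suggests the real scoring still needs tuning for some systems.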
  24. Hi @MadisV We will have a try and get back to you when we have results.
  25. Hi @Tima Let me try to explain this. Please remember that hand tracking runs at 40-60 fps, so the raw hand data is updated at a lower frequency than the VR rendering. When working in camera space, you apply the current HMD transform to the GestureResult every frame. When the HMD moves quickly, the GestureResult does not change as often, so the virtual hand moves with the HMD while the real hand is staying still. When working in HMD space, the HMD transform is not applied to the GestureResult every frame; instead, it is applied only when the raw hand position is calculated. Therefore, no matter how the HMD moves, the hand stays still in the virtual environment. So in this case, it's better to use HMD space. There are also other cases where it's better to use camera space, for example when you move the HMD and your hand together, so that the hand's position relative to the HMD does not change. A small sketch of the difference follows below.
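A rough sketch of the difference, assuming a rawHandPosition that arrives from detection at 40-60 fps and is expressed relative to the HMD; the class, field and method names here are illustrative, not the SDK's actual API:

```csharp
using UnityEngine;

public class HandSpaceExample : MonoBehaviour
{
    public Transform hmd;            // the headset / main camera transform
    Vector3 rawHandPosition;         // latest raw result, relative to the HMD, updated at 40-60 fps
    Vector3 handInHmdSpace;          // world position, computed once per raw result

    // In a real project this would be hooked to the detection callback and
    // called only when a new raw result arrives (40-60 fps), not every rendered frame.
    void OnNewRawResult(Vector3 rawPosition)
    {
        rawHandPosition = rawPosition;
        // "HMD space": apply the HMD transform once, at the moment the raw result is produced.
        handInHmdSpace = hmd.TransformPoint(rawPosition);
    }

    void Update()
    {
        // "Camera space": apply the *current* HMD transform every rendered frame.
        // Between raw updates the virtual hand follows HMD motion, even though the real hand is still.
        Vector3 handInCameraSpace = hmd.TransformPoint(rawHandPosition);

        // handInHmdSpace stays put in the world no matter how the HMD moves,
        // because it is only recomputed when a new raw result comes in.
        Debug.DrawLine(handInCameraSpace, handInHmdSpace, Color.red);
    }
}
```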