zzy

Employee

Everything posted by zzy

  1. Hi @毛.sy We don't support changing the camera scale in the current version. Can you please share your use case for why you need to scale the camera in a VR application? Also, can you please verify whether setting the camera scale to (1, 1, 1) solves the problem? A minimal sketch of what I mean is below.
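     For reference, a minimal Unity sketch of resetting the scale (the way the rig transform is found here is only an example; use whatever transform your project actually scales):

         using UnityEngine;

         // Resets the VR camera rig scale to (1, 1, 1) so the hand positions
         // reported by the SDK line up with the rendered scene.
         public class ResetCameraScale : MonoBehaviour {
             void Start() {
                 // Example lookup: assumes the root of the main camera is the rig being scaled.
                 Transform cameraRig = Camera.main.transform.root;
                 cameraRig.localScale = Vector3.one;
             }
         }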
  2. Hi @davide445 Sorry for the late reply, we just came back from a long holiday. Due to internal schedule changes, v1.0 has been postponed to be released together with future hardware. In the meantime, given the SDK improvements we made this year, I think you can start trying out the SDK and use it in your application. Please let me know if any feature you need is missing from the SDK. As for commercial projects, there are indeed ongoing projects, but I cannot say much about that.
  3. Hi @Tima We plan to release a new version before the end of October.
  4. Hi @Tima We have migrated all the dependencies to support x86 now. This should be available in the next update. We will let you know when we have an ETA.
  5. Hi @Verlinch Are you using Vive Input Utility in your project? I'm not very familiar with VIU, so I'll see if I can ask a colleague to look into this as well. Meanwhile, I think we can check the following: I assume your scene works without hand tracking, i.e. with the GestureProvider script disabled. Can you please try the sample scenes in VIU to see if this problem still exists, so I can try to reproduce it on my side? Please also share the versions of VIU and SteamVR on your PC.
  6. Hi @aleks Ideally a wireless headset can work with the Hand Tracking SDK, but stability is quite low: issues include SteamVR hangs and the camera stream stopping after tracking loss (it can only be resumed by restarting SteamVR). Therefore, we recommend a wired connection for hand tracking.
  7. Hi @andrewchilicki Unfortunately, Cosmos Elite is currently not supported by the Hand Tracking SDK, because the camera stream is not available through the current API. We are investigating the possibility of enabling the camera stream internally, but there is no ETA at the moment. For now, if you want to use an HMD with lighthouse tracking, we recommend the Vive Pro or Valve Index, which are both supported by the Vive Hand Tracking SDK. However, the wireless adapter is not recommended: with the wireless adapter, the camera stream stops and cannot be resumed if tracking is lost. The only way to get the camera stream back is to restart SteamVR.
  8. Hi @Mohsen Asghari v0.9.3 is now released; you can download it from here. The new version should fix the problem of hands disappearing after teleporting.
  9. Hi @Xyah v0.9.3 is now released; you can download it from here. It improves GPU usage of the hand tracking SDK on Windows. Please give it a try.
  10. Hi @duelgame v0.9.3 is now released; you can download it from here. It improves GPU usage of the hand tracking part. Please give it a try.
  11. @Frank_Cough v0.9.3 is now released; you can download it from here.
  12. Hi everyone, Vive Hand Tracking SDK has a new release, v0.9.3, today. You can download it from here. Highlights of the changes:
       • Updated deep learning models for all platforms
       • Added support for Valve Index
       • Sped up detection on Windows
       • Support for defining custom gestures in skeleton mode in the Unity and Unreal plugins
       • Updated mesh for the cartoon hand
       • Support for simulating hands without VR devices in the Unity Editor
      For detailed changes, please refer to the release notes.
  13. Hi @Liyea You can disable the hand renderer script so that the hand is not rendered. If you need the hand renderer script to stay active (e.g. to use the collider), you can change the hand material to a no-op material that always draws a transparent color. See the sketch below for a generic alternative.
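     As a generic Unity alternative (not an SDK-specific mechanism), you can also disable the Unity Renderer components on the hand objects: nothing is drawn, but other components such as colliders stay active. A minimal sketch, with illustrative field names:

         using UnityEngine;

         public class HideHands : MonoBehaviour {
             // Assign the left/right hand objects of your hand prefab here (names are illustrative).
             public GameObject leftHand;
             public GameObject rightHand;

             public void SetHandsVisible(bool visible) {
                 // Only the Renderer components are toggled, so colliders keep working.
                 foreach (var r in leftHand.GetComponentsInChildren<Renderer>(true))
                     r.enabled = visible;
                 foreach (var r in rightHand.GetComponentsInChildren<Renderer>(true))
                     r.enabled = visible;
             }
         }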
  14. Hi @tired You can select a collider type in the hand renderer to add colliders, but that adds colliders for all joints. If you want colliders for fingertips only, you need to create a custom script for this. You can reference the hand renderer script to see how it adds colliders. Basically, you need to create a few sphere colliders and update their positions every frame to match the fingertip positions; see the sketch below.
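     A minimal sketch of such a custom script (skeleton mode assumed); GetFingertipPositions() is a placeholder you would fill in from the skeleton result, the same way the hand renderer script reads joint positions:

         using System.Collections.Generic;
         using UnityEngine;

         public class FingertipColliders : MonoBehaviour {
             public float radius = 0.01f;  // roughly a 1 cm sphere per fingertip
             private List<SphereCollider> colliders = new List<SphereCollider>();

             void Start() {
                 // One sphere collider per fingertip: thumb, index, middle, ring, pinky.
                 for (int i = 0; i < 5; i++) {
                     var go = new GameObject("Fingertip" + i);
                     go.transform.SetParent(transform, false);
                     var col = go.AddComponent<SphereCollider>();
                     col.radius = radius;
                     colliders.Add(col);
                 }
             }

             void Update() {
                 Vector3[] tips = GetFingertipPositions();
                 if (tips == null) {  // hand not detected this frame
                     foreach (var col in colliders) col.gameObject.SetActive(false);
                     return;
                 }
                 for (int i = 0; i < colliders.Count; i++) {
                     colliders[i].gameObject.SetActive(true);
                     // Assumes the joint positions are given in world space.
                     colliders[i].transform.position = tips[i];
                 }
             }

             // Placeholder: return the 5 fingertip positions from the detection result,
             // or null when no hand is detected.
             Vector3[] GetFingertipPositions() {
                 return null;
             }
         }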
  15. Hi @tired Please select the OnStateChanged method listed under the dynamic int section (red box), not the one under static parameters (black box). The dynamic version sends the actual state to the Grab script, while the static one always sends a fixed state (0 in your case) to the Grab script.
  16. Hi @SanityGaming Sorry for your bad experience with Vive customer support. I'm an R&D engineer on hand tracking, so I really don't know the product sales schedule. @VibrantNebula Do you know who can help with this question about when the Cosmos Elite will be available on Amazon?
  17. Hi @KnotEgan Sorry for the late response. I think this is caused by the wireless adapter. I'm not an expert on the wireless adapter, so I suggest posting your questions on the Vive Pro sub-forum here: https://forum.vive.com/forum/81-vive-and-vive-pro/
  18. Hi @SanityGaming For general Cosmos questions, please contact the Vive support team. You can click the link above the post.
  19. Hi @duelgame I can see all the images now. I think both Unreal and Hand Tracking need to optimize their GPU usage: the next release of Vive Hand Tracking will improve GPU usage (targeted for the end of this month). From a quick Google search, I found that Instanced Stereo Rendering support is already available for VR in Unreal: https://docs.unrealengine.com/en-US/Platforms/VR/DevelopVR/VRPerformance/index.html. Is there any reason you cannot enable this feature? We will keep improving GPU usage in future versions, but please understand that improving accuracy currently has higher priority.
  20. Hi @Mohsen Asghari Thanks for the report. I think the hand will gradually shift towards the new camera position, but it could be a little slow. The quickest fix is to put the hand down and raise it again so it appears at the new camera position. I'll check if we can improve this in the next release.
  21. Hi @duelgame I cannot see most of the pictures you attached, but here is my answer based on the text of your question. To make it clear, here is the hand tracking process: camera frame -> hand tracking processing (GPU) -> output result in the engine plugin. This has nothing to do with the render texture size or vr.PixelDensity. However, VR rendering also uses the GPU. Hand tracking and rendering share GPU resources, so if you reduce the workload on either side, you can observe a performance improvement. This is the case when you reduce the rendering workload (e.g. by reducing vr.PixelDensity or using Instanced Stereo Rendering). You can observe the same thing by using 3D point mode instead of skeleton mode in Vive Hand Tracking. I'm not an expert on Unreal, but I think you need to check the SteamVR plugin for Instanced Stereo Rendering support in Unreal.
  22. Hi @Xyah Thanks for the test. Since the deep learning models are quite computationally heavy, it's natural that they might eat up a lot of GPU resources. If you want a lighter version of hand tracking, you can change the mode to 3D Point, which should be much faster than skeleton mode, although it returns more limited results. We plan to release a new version before the end of this month, which should bring some performance improvements; you can test again when it is released. We will also keep improving performance, as there are several items in our backlog, but considering the current SDK status, accuracy takes higher priority. Here are some references:
       • Differences between the modes: https://hub.vive.com/storage/tracking/overview/feature.html#hand-positions-modes
       • How to change the mode in Unreal: https://hub.vive.com/storage/tracking/unreal/usage.html#start-stop-detection
  23. Hi @Xyah I tested on our test machines (1060 & 2060) today. I'll answer your questions first and then describe my findings. You can find the prebuilt samples in the zip you downloaded from here: https://developer.vive.com/resources/knowledgebase/vive-hand-tracking-sdk/. The "Prebuilt samples" folder contains binaries built using Unity. I think this would require the camera API to support input resolution settings, but currently no API (SteamVR or Vive Console) has such support, so we probably can do nothing about this. We did some performance tuning, but I don't think it's big enough to solve your problem. As for my findings, I think this might be a problem on the Unreal side rather than in the hand tracking SDK. Here is my experiment, using the SteamVR frame timer to check FPS and GPU times (in all my experiments, the CPU is the bottleneck):
       • Running the Unity sample on the 1060 & 2060, the GPU time is larger when hands are visible, but the frame time stays within 11 ms (90 FPS).
       • Running the Unreal app, the GPU time is significantly larger than in the Unity version, about 2x.
       • A newly created default Unreal level, without any hand-related content, still consumes very large GPU times, even worse than the Unity sample with hands visible.
      This makes me think the problem is on the Unreal side. Please let me know if you see the same situation on your side.
  24. Hi @Xyah Thanks for the test. As far as I know, there is no way to change the camera resolution of the original Cosmos. From your results, it seems the FPS drop only happens when a hand is visible. We will check if we can reproduce it on our side. Meanwhile, can you confirm whether the behavior is the same in Unreal and Unity (pre-built sample)?
  25. Hi @aze I'm not sure if this is a Unity bug, but it seems using Unity 2019.1 is a workaround for now. @Cotta Can you please also help check whether there is any workaround in WaveVR? It seems a memory leak happens with the WaveVR prefab in some Unity versions.