Kikuto

Hand Tracking SDK + SRWorks


With the SDK patch 0.8.0p1, it is now possible to use the Hand Tracking SDK together with SRWorks in see-through mode. However, when I configure both SDKs together as in this Unity example, I get a different position for the 3D hand and my real hand, as in the image below.

 

 

This project was built on the ViveSR/Sample/Sample scene.

This problem occurs for both hands in both eyes. How can I find the translation matrix that fixes this position?

 

I intend to use the handmat material, which makes the virtual hand invisible, to interact with virtual objects. Once I finish the integration with the Leap Motion Interaction Engine, I will be able to use all the interactive features of the Leap SDK. I will share the interaction code as soon as I fix some small details.

 

Thanks.


Hi  

 

I would recommend you try SDK v0.8.1 to see if this problem still exists. Prior to v0.8.1, we used hand-calibrated intrinsics for the Vive Pro, which might differ from the parameters used in SRWorks. In v0.8.1, we changed to using the OpenVR API to get the intrinsics, which should be similar to what SRWorks uses.
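For reference, the intrinsics mentioned above can be queried through OpenVR's `IVRTrackedCamera` interface. This is only a hedged sketch: the exact signature of `GetCameraIntrinsics` has changed between OpenVR releases (newer headers add a camera-index parameter), so check it against your installed `openvr_api`.

```csharp
using Valve.VR;

// Hedged sketch: reading the front camera intrinsics via OpenVR's
// IVRTrackedCamera interface. The GetCameraIntrinsics signature differs
// between OpenVR versions, so verify against your openvr_api before use.
public static class CameraIntrinsicsProbe
{
    public static void Print()
    {
        var cam = OpenVR.TrackedCamera;
        if (cam == null) return;

        HmdVector2_t focal = default, center = default;
        var err = cam.GetCameraIntrinsics(
            OpenVR.k_unTrackedDeviceIndex_Hmd,        // camera is on the HMD
            0,                                        // camera index (dual-camera HMDs)
            EVRTrackedCameraFrameType.Undistorted,
            ref focal, ref center);

        if (err == EVRTrackedCameraError.None)
            UnityEngine.Debug.Log($"fx={focal.v0} fy={focal.v1} cx={center.v0} cy={center.v1}");
    }
}
```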

 

As for the hand material, if you don't need the hand, you can simply disable the HandRenderer, or change its layer to an invisible layer.
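Both options above can be done in a few lines of Unity script. A minimal sketch, assuming you drag the SDK's HandRenderer component into the field in the Inspector; the layer name "Hidden" is an assumption and must exist in your project and be excluded from every camera's culling mask:

```csharp
using UnityEngine;

// Hedged sketch of the two approaches described above.
// "HandRenderer" is the SDK component mentioned in the post;
// the layer name "Hidden" is illustrative, not part of the SDK.
public class HandVisibility : MonoBehaviour
{
    public MonoBehaviour handRenderer;   // drag the HandRenderer component here

    public void HideByDisabling()
    {
        // Option 1: disable the renderer component so the hand is not drawn.
        handRenderer.enabled = false;
    }

    public void HideByLayer()
    {
        // Option 2: move the hand object to a layer no camera renders.
        handRenderer.gameObject.layer = LayerMask.NameToLayer("Hidden");
    }
}
```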


Hi  

 

Thanks for the fast reply. I tried your suggestion, but as the image below shows, the 3D hand is still misplaced relative to my real hand with Hand Tracking SDK 0.8.1. So, how can I find the translation matrix that aligns these hands?

 

 

Here is my Unity example scene; its dependencies are SRWorks SDK 0.8.0.2 and Hand Tracking SDK 0.8.1.

 

In this new SDK version, the hand detection is much better! Thanks for the hard work.

 

Thanks.


Hi  

 

Actually, we don't have such a translation matrix. In the Hand Tracking SDK, we calculate the 3D points based on the front camera's intrinsics and extrinsics. The Unity plugin renders those 3D points based on the rendering camera's settings, and they are thus displayed on your screen and in the HMD.
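The pipeline described above amounts to a standard pinhole back-projection followed by an extrinsic transform. The following is only an illustration of that computation, not the SDK's actual code; all parameter values are made up:

```csharp
using UnityEngine;

// Hedged illustration of the pipeline described above, not SDK code:
// a 2D detection (u, v) plus a depth estimate is lifted into the camera
// frame with pinhole intrinsics (fx, fy, cx, cy), then moved into world
// space with the camera's extrinsic pose. The intrinsic values below are
// invented examples, not a real Vive Pro calibration.
public static class HandPointLift
{
    const float fx = 278f, fy = 278f, cx = 306f, cy = 230f;

    public static Vector3 PixelToWorld(float u, float v, float depth,
                                       Matrix4x4 cameraToWorld)
    {
        // Back-project the pixel to a 3D point in the camera frame.
        var camPoint = new Vector3((u - cx) * depth / fx,
                                   (v - cy) * depth / fy,
                                   depth);
        // Apply the extrinsics (camera pose) to get a world-space point,
        // which the Unity plugin then renders with the scene camera.
        return cameraToWorld.MultiplyPoint3x4(camPoint);
    }
}
```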

 

I think you may need to ask on the SRWorks side about this, in case they have some mapping for virtual objects that need to appear at a specific place on the see-through texture.

 

Can you help with this question? Thank you.

 

Best Regards,

Zhongyang


Hello,

 

I read all the topics in the SRWorks section, but I didn't get any clue about how to solve this transformation problem. I also tested the SRHands example, but I had the same problem (leftward displacement) with the depth images.


Another related problem is the position of the depth image in the SRWorks samples: on my device it looks like it is displaced the same way as the hand object. In the provided SRWorks sample scene, only the right eye can see the depth-map occlusion, and the virtual camera is set on the left eye, so we can't see the depth occlusion by default in the Unity Game window. After making the proper adjustments, I get the same problem: the depth mask is displaced to the left, just like the virtual objects, both in the Game window and when I look through the right eye of the headset.


I will post a photo of this problem. I believe these problems are related.

 

Thanks.


First, you could verify whether the controller's 3D model aligns well with the controller in the camera image.


Hi. This also happens with SRWorks on its own. If you have a motion controller visible in-game with the overlay active, there is an offset between the controller model and the controller visible in the pass-through. Similarly, if you look at reality and then compare with the pass-through on screen, there is also an offset.

That doesn't mean there isn't a mismatch with the Hand Tracking SDK too, but there is definitely one with SRWorks.


@zzy

Hi Zzy.

I have some questions about all of this thread.

1.- Is SRWorks the only way to create a mixed reality in which you can place your virtual hands, or any other virtual object, over your real background?

2.- I was confused because I can see the controllers when I activate the front camera on the Vive Pro with a double click on the controller button (not the SRWorks see-through mode). So I thought: if the controllers can be shown, why can't I place any other virtual object there? Is it, maybe, because of the controllers' own tracking?

3.- The main question is: how can I achieve this with HTC? -> Photo

 


Hi @JCSegula

If you only need the front camera video frames, you can use the OpenVR camera API to get the frames and display them in your scene. This is what double-clicking the home button on the controller does. It lets you see the outside, but it does not map the outside exactly as in the photo you provided.
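If you go the OpenVR route in Unity, the SteamVR plugin ships a tracked-camera helper that wraps the stream for you. A minimal sketch, assuming the SteamVR Unity plugin is imported and the camera is enabled in SteamVR settings; the material field is illustrative:

```csharp
using UnityEngine;
using Valve.VR;

// Hedged sketch of displaying front-camera frames with the SteamVR Unity
// plugin's SteamVR_TrackedCamera helper. Assumes the SteamVR plugin is
// imported and the HMD camera is enabled in the SteamVR settings.
public class FrontCameraView : MonoBehaviour
{
    public Material targetMaterial;   // e.g. on a quad in front of the user
    private SteamVR_TrackedCamera.VideoStreamTexture source;

    void OnEnable()
    {
        source = SteamVR_TrackedCamera.Undistorted();
        source.Acquire();             // starts the video stream service
    }

    void Update()
    {
        var tex = source.texture;     // latest camera frame, if available
        if (tex != null)
            targetMaterial.mainTexture = tex;
    }

    void OnDisable()
    {
        source.Release();             // stops the stream when unused
    }
}
```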

If you need to map real objects and virtual objects in your app, the easier way right now is to use the SRWorks SDK. If you need to show hand skeleton results, you can use the Vive Hand Tracking SDK. You can use both SDKs together if you want to map virtual hands onto the see-through image, though there might be some misalignment. To my knowledge, this is the only solution so far. However, if you have the necessary knowledge of computer vision and computer graphics, you can try to create the mapping yourself to achieve the effect you want, using the OpenVR camera frames and the hand detection results.



Hi @JCSegula and @Kikuto

In my answer in a related post, I think the detected hand and the see-through hand align together. There might be some lag in the detection when you move your hand fast. Please follow the steps in the linked post and see if it suits your needs.

