Win32 version

Started by Tima


Hi @Tima

The current plan for next release is about early February.

About grab strength: do you have a specific use case that pinch strength cannot cover?

Also, do you have any suggestion on how to calculate grab strength from the skeleton? I would think of using the average closing angle of the fingers, but I'm not sure how to handle the case where some fingers are closed while others are straight (e.g. the point/two gesture).
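
As a rough illustration of the averaging idea (a hypothetical sketch, not SDK code; it assumes a 21-joint skeleton with the wrist at index 0 and four joints per finger ordered base to tip, and it does not yet handle the point/two case):

#include <algorithm>
#include <array>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Length(const Vec3& a) { return std::sqrt(Dot(a, a)); }

// Hypothetical sketch: estimate grab strength as the average closing angle of
// the four non-thumb fingers. Assumes 21 joints: wrist at index 0, then 4
// joints per finger (thumb, index, middle, ring, pinky) from base to tip.
float EstimateGrabStrength(const std::array<Vec3, 21>& joints) {
  const float kOpenAngle = 0.2f;    // radians; roughly a straight finger
  const float kClosedAngle = 2.0f;  // radians; roughly a fully curled finger
  float sum = 0.0f;
  for (int finger = 1; finger < 5; ++finger) {  // skip the thumb (finger 0)
    const int base = 1 + finger * 4;
    const Vec3 proximal = Sub(joints[base + 1], joints[base]);    // base -> second joint
    const Vec3 distal = Sub(joints[base + 3], joints[base + 2]);  // third joint -> tip
    const float cosAngle =
        Dot(proximal, distal) / (Length(proximal) * Length(distal) + 1e-6f);
    const float angle = std::acos(std::clamp(cosAngle, -1.0f, 1.0f));
    // Map the closing angle to [0, 1] and accumulate.
    sum += std::clamp((angle - kOpenAngle) / (kClosedAngle - kOpenAngle), 0.0f, 1.0f);
  }
  return sum / 4.0f;  // average over the four fingers
}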


Hi @zzy

I’ve got some problems.

If I put my physical hand on the table (in a fixed pose) and turn my headset up and down very slowly, my virtual hand stays at the same position relative to the virtual camera. But if I turn my headset a bit faster (not very fast), my virtual hand moves with the virtual camera and sometimes returns back to its initial position in the virtual world.

1.) My frame rate is 90 fps.

2.) No joints; I use only the origin of the first point.

3.) The Unity sample has no such issue.

4.) I tried setting maxFPS to 90 in GestureOption; the results are the same.

5.) The same test runs without issues if I replace the Vive-wrapper with the LM-wrapper (both wrappers return only the left hand position in the headset coordinate system).

Any help would be appreciated.

BR

Tima


Hi @Tima

Do you call UseExternalTransform(true) before StartGestureDetection? And are you applying the HMD transform to GestureResult yourself? That could be the cause of the problem, since applying the HMD transform this way makes your hands move with your HMD.

By the way, since the raw camera fps is 60, setting maxFPS above 60 has no effect.
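
For reference, the intended setup looks roughly like this (a minimal sketch based on interface_gesture.hpp; exact signatures and option fields may differ in your SDK version, and error handling is omitted):

#include "interface_gesture.hpp"

// Sketch of the intended call order when you apply the HMD transform yourself.
void StartHandTracking() {
  UseExternalTransform(true);   // results stay in HMD space; you transform them
  GestureOption option;         // defaults; raw camera is 60 fps, so maxFPS > 60 gains nothing
  StartGestureDetection(&option);
}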


Hi @zzy

Do you call UseExternalTransform(true) before StartGestureDetection?

Yes.

Are you applying the HMD transform to GestureResult, making your hands move with your HMD?

Yes.


Hi @Tima

To solve this problem, please add the following lines to interface_gesture.hpp:

struct GestureVector3 {
  float x, y, z;
};

struct GestureQuaternion {
  float x, y, z, w;
};

INTERFACE_GESTURE_EXPORTS void SetCameraTransform(GestureVector3 position,
                                                  GestureQuaternion rotation);

Please call the SetCameraTransform function every VR frame (i.e. 90 times per second) to pass the HMD transform to the native side. Like GestureResult, the position and rotation use the Unity axis convention. After calling SetCameraTransform, you no longer need to apply the HMD transform to GestureResult yourself.
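
A minimal usage sketch (the HMD pose query below is a placeholder for whatever your engine provides, not part of the SDK):

#include "interface_gesture.hpp"  // GestureVector3, GestureQuaternion, SetCameraTransform

// Placeholders for your engine's HMD pose query, already converted to the
// Unity axis convention (x right, y up, z forward, meters). Not SDK functions.
GestureVector3 GetHmdPositionUnity();
GestureQuaternion GetHmdRotationUnity();

// Call once per rendered VR frame (i.e. roughly 90 times per second).
void OnVRFrame() {
  SetCameraTransform(GetHmdPositionUnity(), GetHmdRotationUnity());
  // GestureResult can now be used directly; do not apply the HMD transform again.
}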

Note: This change will also be available starting from the next release.


Hi @Tima

Let me try to explain this. Please remember that hand tracking runs at 40-60 fps, so the raw hand data is updated at a lower frequency than the VR rendering.

When working in camera space, you apply the current HMD transform to GestureResult every frame. When the HMD moves quickly, GestureResult does not update as often, so the virtual hand moves with the HMD even though the real hand is staying still.

When working in HMD space, the HMD transform is not applied to GestureResult every frame; instead, it is applied only at the moment the raw hand position is calculated. Therefore, no matter how your HMD moves, the hand stays still in the virtual environment.

So in this case, it's better to use HMD space. There are also cases where camera space is better, for example when you move the HMD and your hand together, so that the hand's position relative to the HMD does not change.
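
To make the difference concrete, here is a rough sketch of the two update paths (illustrative names only; Rotate is a standard quaternion rotation, and in the HMD-space case the native side effectively does this for you when you call SetCameraTransform):

struct Vec3f { float x, y, z; };
struct Quatf { float x, y, z, w; };

// Rotate v by the unit quaternion q (standard q * v * q^-1 expansion).
static Vec3f Rotate(const Quatf& q, const Vec3f& v) {
  const Vec3f u = {q.x, q.y, q.z};
  const float s = q.w;
  const float uv = u.x * v.x + u.y * v.y + u.z * v.z;
  const float uu = u.x * u.x + u.y * u.y + u.z * u.z;
  const Vec3f c = {u.y * v.z - u.z * v.y, u.z * v.x - u.x * v.z, u.x * v.y - u.y * v.x};
  return {2 * uv * u.x + (s * s - uu) * v.x + 2 * s * c.x,
          2 * uv * u.y + (s * s - uu) * v.y + 2 * s * c.y,
          2 * uv * u.z + (s * s - uu) * v.z + 2 * s * c.z};
}

// Camera space: every render frame applies the *current* HMD pose to a hand
// sample that may be one or two camera frames old, so a fast head turn drags
// the virtual hand along with the HMD.
Vec3f HandWorldCameraSpace(const Vec3f& handInHmdSpace,
                           const Vec3f& hmdPosNow, const Quatf& hmdRotNow) {
  const Vec3f r = Rotate(hmdRotNow, handInHmdSpace);
  return {r.x + hmdPosNow.x, r.y + hmdPosNow.y, r.z + hmdPosNow.z};
}

// HMD space (SetCameraTransform): the HMD pose from the moment the camera frame
// was captured is baked in once, so later head motion does not move the hand.
Vec3f HandWorldHmdSpace(const Vec3f& handInHmdSpace,
                        const Vec3f& hmdPosAtCapture, const Quatf& hmdRotAtCapture) {
  const Vec3f r = Rotate(hmdRotAtCapture, handInHmdSpace);
  return {r.x + hmdPosAtCapture.x, r.y + hmdPosAtCapture.y, r.z + hmdPosAtCapture.z};
}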

