
Vive Tracker - SteamVR setup + Unreal Engine 4 setup + Noitom Hi5 VR Gloves setup


EnterRealityVR


  • 1 month later...

I'm having issues and confusion.

Device IDs 1 and 2 (or 3 and 4) are not working. I can confirm that 3 and 4 are my controllers, but the trackers are not working with IDs 1 and 2.

Also, for debugging I'm using Get Valid Tracked Device Ids, and only the two controllers (3 and 4) show up. I'm assuming the trackers are supposed to show up as well? I've also tried a Device Type of Static and Other with no luck. Static shows my lighthouses, I'm assuming?

And of course the trackers are showing as paired in SteamVR.
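
In case it helps, this is roughly the C++ equivalent of what I'm doing with the Get Valid Tracked Device Ids node, plus a position/orientation query for each device (just a sketch on my side; the device type that trackers report under may differ between SteamVR plugin versions):

```cpp
// Rough C++ equivalent of the "Get Valid Tracked Device Ids" and
// "Get Tracked Device Position and Orientation" blueprint nodes.
// Assumes the SteamVR plugin is enabled and the module is a dependency.
#include "CoreMinimal.h"
#include "SteamVRFunctionLibrary.h"

void DumpTrackedDevices()
{
	TArray<int32> DeviceIds;

	// Controllers report as Controller, lighthouses as TrackingReference;
	// trackers are expected to show up under Other.
	USteamVRFunctionLibrary::GetValidTrackedDeviceIds(ESteamVRTrackedDeviceType::Other, DeviceIds);

	for (const int32 DeviceId : DeviceIds)
	{
		FVector Position;
		FRotator Orientation;
		if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(DeviceId, Position, Orientation))
		{
			UE_LOG(LogTemp, Log, TEXT("Device %d at %s"), DeviceId, *Position.ToString());
		}
	}
}
```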


  • 3 weeks later...

Hey, with regards to the trackers and lighthouses, yes, it is possible. I have it set up now.

8 trackers and 2 lighthouses running off a Surface. All battery powered.

You don't need the headset to define the room scale and calibrate the floor. It can be done with a tracker.


  • 1 month later...

Hi Nicolas, I had a question regarding your integration of the Perception Neuron mocap data; I am trying to accomplish literally this exact setup for my thesis project. I have the suit integrated and am bringing the BVH data into UE4, but I am currently using a full-body avatar. Did you have to retarget the hand skeletal mesh to get it to work right with the suit, or did it work with the Vive hand mesh without having to do that? I am also wondering how you are sending the BVH data to two separate skeletal meshes at the same time?

My current setup requires a Vive Tracker for tracking a digital MIDI keyboard, and I am currently only using one tracker at the root of the Perception Neuron suit to track it. I believe that your setup would be much more accurate in terms of identifying the exact location of the hands, and it would also be a huge time saver when showing my thesis to my panel, since I wouldn't have to recalibrate the suit for people of different heights. Thanks for the work you have done so far, and I really appreciate any insight you are willing to provide on your setup.


The setup for the PN suit works like this:

 

The Trackers give the position/orientation of the hands, while the fingers are driven by the mocap data being streamed from Axis Pro to UE4 in real time.

In short, what I did was modify the VR Pawn and add what I need, which in my case is the two hands (they come from the same skeleton, not two different skeletons, since it's the Mannequin character modified in Maya by removing the entire body except for the hands), the Perception Neuron node inside the VR Pawn BP, and then a Perception Neuron Manager inside the scene, and that's it.

The actual position of the player is given by the Vive itself, so the PN mocap data is used just to track the fingers.
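
If you want to see the same thing outside of Blueprints, the hand update each Tick is roughly this (just a sketch; the mesh and device ID are placeholders, and the fingers are still handled entirely by the Perception Neuron plugin):

```cpp
// Rough sketch of driving a hand mesh root from a Vive Tracker each Tick,
// assuming the SteamVR plugin is enabled. HandMesh and TrackerDeviceId are
// placeholders; the finger animation is not touched here, it keeps coming
// from the Axis Pro stream through the Perception Neuron plugin.
#include "CoreMinimal.h"
#include "Components/SkeletalMeshComponent.h"
#include "SteamVRFunctionLibrary.h"

static void UpdateHandFromTracker(USkeletalMeshComponent* HandMesh, int32 TrackerDeviceId)
{
	FVector TrackerPosition;
	FRotator TrackerOrientation;

	if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(TrackerDeviceId, TrackerPosition, TrackerOrientation))
	{
		// Tracker poses are relative to the tracking origin, so the hand mesh
		// should be parented to the same component the VR camera uses
		// (the pawn's VR origin) for this relative transform to line up.
		HandMesh->SetRelativeLocationAndRotation(TrackerPosition, TrackerOrientation);
	}
}
```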

 

If you want very precise results you need to use multiple Trackers, because the suit gives you an approximate position, which is never the same every time you move and try to go back to the same spot (a downside of an IMU tracking system).

 

Regarding the retargeting: if you're using the plugin from the Noitom website, the retargeting is already done for you (check the tutorial about it; it's very easy to set up the entire thing).


  • 2 weeks later...

Thank you very much for the information. I believe I'm really close to getting it working; I just need to go through the retargeting process for the model. I think my biggest mistake was trying to track the entire suit instead of just the hands. I will do some testing with it this week and see if I can get it working. I was using the spawner in the scene, when I guess I needed to figure out how to send the BVH data to a model inside of an actor blueprint. Thanks again, hopefully I can get this worked out.

 


  • 4 months later...

Hi Nicolas,

 

I was trying to set up the HTC Vive Trackers in Unreal Engine 4 to control a virtual camera, with the tracker itself attached to a real-world camera (for a mixed reality setup).

 

But I've encountered a problem. It seems that the tracker's rotations are Y-up (pitch, yaw, roll) while Unreal uses a Z-up system (as you certainly know). This led to the problem that when I rotated my real-world camera, the virtual camera's rotation was wrong (flipped on one axis). I wasn't able to figure out how to solve that myself.

 

Do you have any suggestions or experiences to share?

 

As additional information:

My setup was quite similar to yours. I attached a Scene Component to the Blueprint as the root for the tracked virtual camera; this root got its rotation from the "Get Tracked Device Orientation and Location" blueprint node. My virtual camera was attached as a child of that root.
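
The direction I've been experimenting with is composing the tracker rotation with a fixed correction before applying it to that root, something like this (a rough sketch only; the 90° roll is just an example value, since the real correction depends on how the tracker is mounted):

```cpp
// Sketch of applying a fixed correction to the tracker orientation before
// assigning it to the virtual camera root. Assumes the SteamVR plugin;
// CameraRoot, TrackerDeviceId and the 90 degree roll are placeholders.
#include "CoreMinimal.h"
#include "Components/SceneComponent.h"
#include "SteamVRFunctionLibrary.h"

static void UpdateCameraRootFromTracker(USceneComponent* CameraRoot, int32 TrackerDeviceId)
{
	FVector TrackerPosition;
	FRotator TrackerOrientation;

	if (!USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(TrackerDeviceId, TrackerPosition, TrackerOrientation))
	{
		return;
	}

	// Compose as quaternions so the correction is applied in the tracker's
	// local frame instead of being added axis by axis as Euler angles.
	const FQuat Correction = FRotator(0.f, 0.f, 90.f).Quaternion(); // example offset, tune per mounting
	const FQuat Corrected = TrackerOrientation.Quaternion() * Correction;

	CameraRoot->SetRelativeLocationAndRotation(TrackerPosition, Corrected);
}
```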


  • 2 months later...

Good morning,
I'm sorry for my English.
I'm writing to you hoping you can help me. I need to create scenes of a room (only one room) in Unreal Engine and, through an HTC Vive controller mounted on a real camera, synchronize the movement of the virtual scene with the real camera. I don't need any other controllers or trackers, because the person filmed by the real camera doesn't have to wear a headset or interact;
they just need to be composited into the virtual scene with a simple chroma key.
It is important, though, that the quality of the chroma key is excellent, which is why I might not use software but a hardware video mixer. In practice, the fundamental thing is to track the camera well.
Of course, if I could even use augmented reality it would be better, but the quality of the chroma key is fundamental.
I'm asking you for some advice on what to use, and how to use it, to achieve these results. I repeat, at the moment the HTC Vive system that lets you move around the room is beautiful, but I would like to create a workflow of this kind:
create a scene containing a virtual environment;
develop an executable whose virtual scene is synchronized with one of the HTC Vive controllers;
then, via software or hardware, synchronize and composite the two signals: the real camera on a green background
and the virtual scene.
For example, how do I do this?
Could you help me achieve this?
Do you have any advice to give me?
Thank you very much,
Marco
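
P.S. To be clearer about what I have in mind, my rough guess is that the camera-tracking part could be a small actor like this in UE4 C++ (just a sketch on my side; the class name is made up, and the MotionSource value depends on the engine version and on which controller is used):

```cpp
// TrackedCameraActor.h - a made-up class name, just a sketch of the idea:
// a MotionControllerComponent follows the Vive controller strapped to the
// real camera, and a CineCameraComponent attached to it is the virtual camera.
// Assumes UE4 with the SteamVR plugin enabled.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "CineCameraComponent.h"
#include "TrackedCameraActor.generated.h"

UCLASS()
class ATrackedCameraActor : public AActor
{
	GENERATED_BODY()

public:
	ATrackedCameraActor()
	{
		// Follows the physical controller attached to the real camera.
		TrackedController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("TrackedController"));
		TrackedController->MotionSource = FName(TEXT("Right")); // whichever hand the controller registers as
		RootComponent = TrackedController;

		// The virtual camera rides on the tracked controller; a fixed relative
		// offset/rotation can be added here to match how the controller is
		// physically mounted on the real camera.
		VirtualCamera = CreateDefaultSubobject<UCineCameraComponent>(TEXT("VirtualCamera"));
		VirtualCamera->SetupAttachment(TrackedController);
	}

	UPROPERTY(VisibleAnywhere)
	UMotionControllerComponent* TrackedController;

	UPROPERTY(VisibleAnywhere)
	UCineCameraComponent* VirtualCamera;
};
```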


Archived

This topic is now archived and is closed to further replies.
