
Vive Cosmos OpenXR Hand Tracking Support - Unreal Engine 4.26 Sample Project


MariosBikos


Here is a sample project built using Unreal Engine 4.26 showing how to get started using OpenXR Hand Tracking with Vive Cosmos headsets (Project is attached at the bottom of this page).

Before you start, please follow the instructions here to enable OpenXR in the Vive Console runtime.

 

Sample Project

The project comes with two pre-installed project plugins:

  • Vive Cosmos Controller Plugin defines input subcategories for the Cosmos controllers.
  • OpenXR Vive Cosmos Controller Plugin allows using Vive Cosmos controller input in your OpenXR applications by adding the Vive Cosmos controller interaction profile (XR_HTC_vive_cosmos_controller_interaction) to OpenXR Input.



We have also enabled the following plugins in the project:

  • OpenXR Plugin, since we want to build an OpenXR app.
  • OpenXR Hand Tracking, to support the hand tracking extension of OpenXR.


  • XR Visualization Plugin allows quickly rendering HMDs, controllers, and hand meshes using the relevant data as parameters. This makes it easy to render a representation of a virtual hand based on the information we get about each joint. This plugin is optional; you don't have to use it in your project.
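
If you want to reproduce this setup in a fresh project instead of using the attached sample, the engine plugins above can also be enabled directly in the .uproject file. A minimal sketch (the three names below are the UE 4.26 engine plugin names; the two Vive Cosmos project plugins ship with the sample, so they are omitted here):

```json
{
    "FileVersion": 3,
    "EngineAssociation": "4.26",
    "Plugins": [
        { "Name": "OpenXR", "Enabled": true },
        { "Name": "OpenXRHandTracking", "Enabled": true },
        { "Name": "XRVisualization", "Enabled": true }
    ]
}
```

Enabling them through the editor's Plugins window writes the same entries for you.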


 

 

Implementation

After you open the sample project using Unreal Engine 4.26, please check the Event Graph of the Level Blueprint of the default Level "HandTrackingTest".  

We call the GetMotionControllerData function, passing the Left or Right hand as a parameter, and get back a MotionControllerData structure that can be used to render virtual hands. We then pass that structure to the RenderMotionController function from the XRVisualization plugin to draw a virtual representation of the hands. You can also break the MotionControllerData structure and use the hand data in a different way, depending on your use case.
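
The same loop can be written in C++ instead of Blueprint. A minimal sketch, assuming an actor ticking in the level; the actor class name is hypothetical, and the function signatures are from the 4.26 engine headers, so double-check them against your engine version:

```cpp
// Hypothetical actor that draws tracked hands every frame.
// Requires the OpenXR, OpenXR Hand Tracking and XR Visualization plugins.
#include "HeadMountedDisplayFunctionLibrary.h"
#include "XRVisualizationFunctionLibrary.h"

void AHandTrackingTestActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    for (EControllerHand Hand : { EControllerHand::Left, EControllerHand::Right })
    {
        // Mirrors the Blueprint node GetMotionControllerData.
        FXRMotionControllerData Data;
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, Hand, Data);

        if (Data.bValid)
        {
            // Mirrors the Blueprint node RenderMotionController:
            // draws a debug representation of the tracked hand.
            UXRVisualizationFunctionLibrary::RenderMotionController(this, Data);
        }
    }
}
```

Instead of calling RenderMotionController, you could read the joint transforms out of FXRMotionControllerData and drive your own skeletal hand mesh.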


Remember that when you call "GetMotionControllerData", the C++ side will try to get hand tracker data via the function "FOpenXRHMD::GetMotionControllerData". While retrieving OpenXRHandTracking data, the engine gets it from the runtime through the .json and .dll files the runtime provides. This is handled automatically once you enable the OpenXR runtime in Vive Console.
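
For context, the .json file mentioned here is the OpenXR active runtime manifest: a small JSON file that tells the OpenXR loader which runtime .dll to load. A minimal sketch of what such a manifest looks like (the library path below is a made-up example, not the actual Vive Console file name):

```json
{
    "file_format_version": "1.0.0",
    "runtime": {
        "library_path": "ExampleOpenXRRuntime.dll"
    }
}
```

On Windows the loader locates the active manifest through the registry (the ActiveRuntime value under HKEY_LOCAL_MACHINE\SOFTWARE\Khronos\OpenXR\1), which Vive Console should set for you when you enable its OpenXR runtime.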


 

Here's what you should see after hitting Play in VR:

[screenshot: virtual hands rendered in VR]

OpenXRHandTest.zip


10 hours ago, Morton UE4 Eyetracking Dev said:

Many thanks!

Ah so OpenXR effectively replaces the need for the use of SRanipal's plugin?

Amazing work.

No, it doesn't technically replace it; it wraps it into a common interface. I'll be answering questions live in an hour at the VRTO conference.
