About MariosBikos

  1. Here is a sample project built with Unreal Engine 4.26 showing how to get started with OpenXR Hand Tracking on Vive Cosmos headsets (the project is attached at the bottom of this page). Please make sure you first follow the instructions specified here to enable OpenXR in the Vive Console runtime.

Sample Project
The project comes with two pre-installed project plugins:
  • Vive Cosmos Controller Plugin: defines input subcategories for Cosmos controllers.
  • OpenXR Vive Cosmos Controller Plugin: allows using Vive Cosmos controller input in your OpenXR applications by adding the Vive Cosmos controller interaction profile (XR_HTC_vive_cosmos_controller_interaction) to OpenXR Input.

We have also enabled the following plugins in the project:
  • OpenXR Plugin, since we want to build an OpenXR app.
  • OpenXR Hand Tracking, to support the hand tracking extension of OpenXR.
  • XR Visualization Plugin, which allows quickly rendering HMDs, controllers, and hand meshes using the relevant data as parameters. This makes it easy to quickly render a representation of a virtual hand based on the information we get about each joint. This plugin is optional, and you are not required to use it in your project.

Implementation
After you open the sample project in Unreal Engine 4.26, please check the Event Graph of the Level Blueprint of the default level "HandTrackingTest". We use the GetMotionControllerData function, passing the Left or Right hand as a parameter, and get back MotionControllerData that can be used to render virtual hands. After that we use the RenderMotionController function from the XRVisualization plugin to render a virtual representation of the hands. You can also break the MotionControllerData structure and use the data about the hands in a different way, depending on your use case. Remember that when you call "GetMotionControllerData", the C++ side of things will try to get hand tracker data via the function "FOpenXRHMD::GetMotionControllerData".
While trying to get OpenXRHandTracking data, the engine will get data from the runtime via the .json and .dll files provided, as shown below. This is handled automatically after you enable the OpenXR runtime on Vive Console. Here's what you should see after hitting the Play in VR button: OpenXRHandTest.zip
  2. @Linkedoranean The post says that you can't use versions > 2019.3.6, which means you can still use 2019.3.5, for example. Just make sure you use the Unity XR Plugin instead of the Legacy plugin, as this will make things easier. So with the Wave XR Plugin we recommend Unity 2019.4 LTS, which works 100%, or 2020.1 (it will work, but it is not a stable version yet).
  3. @ArAnish, if you start from a Blueprint project you can always convert it to a C++ project by creating a new C++ class from the Unreal Editor (Add New --> C++ Class). Here is a link: https://allarsblog.com/2015/11/05/converting-bp-project-to-cpp/ You can use our SRanipal SDK both in a Blueprint-only project and in a C++ project. We recommend the latter, because then you can easily extend what is available in the plugin, e.g. you can create extra BP nodes for your project.
  4. Did you use a MotionController Component attached to your Player Pawn? Can you share your project so we can have a look? @selor
  5. @selor I moved the post to the right category. Vive Cosmos Elite is using the Vive Controllers. Have you tried selecting the input options underneath the HTC Vive category in Unreal's Project Settings --> Input menu?
  6. Hi @Tesi, we don't provide a scan path at the moment via the SDK but that's something you could implement based on the data provided by the SDK.
  7. Ah, sorry about that @Tomas_TDFM, @C3D, let me re-upload the image here. Let me know if it's visible. What you need to do is go to the SRanipal\Source\SRanipalEye\Private\SRanipalEye_Core.cpp file and replace line 337:

RayCastDirection = PlayerMainCameraRotation.RotateVector(PlayerMainCameraLocation + CameraGazeDirection * maxDistance);

with the following:

RayCastDirection = (PlayerMainCameraRotation.RotateVector(CameraGazeDirection) * maxDistance) + PlayerMainCameraLocation;
  8. Hi @Tomas_TDFM, the next version should include the fix I mentioned in this thread. I am double-checking that, but it looks like the next version should be ready and public early next week. Until then you can always change the code manually if you have a C++ project.
  9. Hi @Stefano, can you share the VRS settings that you are using when you see this issue? Is this happening with every combination of settings (Foveation Pattern Preset, Foveation Shading Rate)? Also, are you using the latest version of the SRanipal SDK? Can you send your logs to marios_bikos@htc.com so that we can have a look? Anything that can help us reproduce the issue would be useful, e.g. the steps you followed.
  10. Hi @C3D, there was an issue indeed in our SDK and the team managed to fix it. This will be updated in SRanipal v1.3.1.1 but here is an image showing the required code change if you want to fix it earlier than that. Please use this and let us know if it worked.
  11. Hi @C3D, I managed to reproduce the issue and have already sent a request for the team to have a look. It looks like the Focus function is not behaving properly but the GetGazeData returns the proper output. Can you check if the GetGazeData works for you until we fix the issue?
  12. Hi @Tesi, we provide an Eye Tracking SDK for Vive Pro Eye that is called the SRanipal SDK. It is free for developers to integrate into their projects and for end-users to access via the SRanipal runtime, as the licensing is included as part of the Vive Pro Eye hardware platform. Our SDK primarily offers feature data (gaze, pupil diameter/position, etc.). You can see the full list of features in the image below. Currently our Vive SRanipal SDK doesn't allow access to the raw data from the eye trackers. Developers can only get access to a specific set of features, not the raw feed from the trackers. Here is a webinar recording to help you understand more about Vive Pro Eye and the SRanipal SDK: https://register.gotowebinar.com/recording/212634959163678731
  13. Hey @zgibsontheia, have you tried initialising the SRanipal runtime first? If you have a Vive Pro Eye device and SRanipal is properly initialised, then ticking the first tick box should initialise eye tracking in Unreal properly. For the Lip framework you need to have a Lip Tracker device, otherwise it won't work, so you need to disable it. Please also try the latest version, SRanipal v1.3.1.0, which comes with some changes on that front: https://developer.vive.com/resources/vive-sense/sdk/vive-eye-tracking-sdk-sranipal/
  14. Hi all, SRanipal v1.3.1.0 is now available that should solve the calibration issues mentioned previously in this thread. Please download it and try again: https://developer.vive.com/resources/vive-sense/sdk/vive-eye-tracking-sdk-sranipal/
  15. Hi @hieutt, if you use the Wave Unity XR Plugin then you can use both Unity 2019.3 and 2019.4 (we recommend 2019.4, as it's an LTS version). However, if you decide to use the Legacy Wave Plugin then you can only use it with versions of Unity up to 2019.3.6, otherwise you will get a memory leak issue. We recommend using the XR Plugin, as that is future-proof considering the changes in the Unity XR Platform. The XR Plugin is also compatible with newer Unity versions (Unity 2020.1, etc.). That's the reason we focused on it.