Posts posted by MariosBikos

  1. Here is a sample project built using Unreal Engine 4.26 showing how to get started using OpenXR Hand Tracking with Vive Cosmos headsets (Project is attached at the bottom of this page).

    Please make sure you follow the instructions specified here first to enable OpenXR in the Vive Console Runtime: 

     

    Sample Project

    The project comes with two pre-installed project plugins:

    • Vive Cosmos Controller Plugin, which defines input subcategories for Cosmos controllers.
    • OpenXR Vive Cosmos Controller Plugin, which allows using Vive Cosmos controller input in your OpenXR applications by adding the Vive Cosmos controller interaction profile to OpenXR Input (XR_HTC_vive_cosmos_controller_interaction).



    We have also enabled the following plugins in the project:

    • OpenXR Plugin, since we want to build an OpenXR app.
    • OpenXR Hand Tracking, to support the hand tracking extension of OpenXR.


    • XR Visualization Plugin, which allows quickly rendering HMDs, controllers and hand meshes using the relevant data as parameters. This makes it easier to quickly render a representation of a virtual hand based on the information we get about each joint. This plugin is optional, so you don't have to use it in your project.


     

     

    Implementation

    After you open the sample project using Unreal Engine 4.26, please check the Event Graph of the Level Blueprint of the default Level "HandTrackingTest".  

    We call the GetMotionControllerData function, passing the Left or Right hand as a parameter, and get back a MotionControllerData structure that can be used to render virtual hands. After that we use the RenderMotionController function from the XRVisualization plugin to render a virtual representation of the hands. You can also break the MotionControllerData structure and use the hand data in a different way, depending on your use case.

    (screenshots: the Level Blueprint Event Graph calling GetMotionControllerData and RenderMotionController)
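
    If you prefer to drive this from C++ instead of the Level Blueprint, here is a minimal sketch of the equivalent logic. It assumes UE 4.26 with the OpenXR, OpenXR Hand Tracking and XR Visualization plugins enabled and the HeadMountedDisplay/XRVisualization modules added to your Build.cs; the actor class name is made up for illustration, so double-check the RenderMotionController signature against your engine version.

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "HeadMountedDisplayFunctionLibrary.h"
    #include "XRVisualizationFunctionLibrary.h"
    #include "HandTrackingVisualizer.generated.h"

    UCLASS()
    class AHandTrackingVisualizer : public AActor
    {
        GENERATED_BODY()

    public:
        AHandTrackingVisualizer() { PrimaryActorTick.bCanEverTick = true; }

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);

            // Ask the active XR system for the latest data of each hand.
            FXRMotionControllerData LeftData, RightData;
            UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Left, LeftData);
            UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Right, RightData);

            // Draw a debug representation (hand joints or controller) for each valid hand.
            if (LeftData.bValid)
            {
                UXRVisualizationFunctionLibrary::RenderMotionController(LeftData, /*bRight=*/false);
            }
            if (RightData.bValid)
            {
                UXRVisualizationFunctionLibrary::RenderMotionController(RightData, /*bRight=*/true);
            }
        }
    };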

    Remember that when you call GetMotionControllerData, the C++ side will try to get hand tracker data via FOpenXRHMD::GetMotionControllerData. While retrieving OpenXRHandTracking data, the engine pulls it from the runtime and the provided .json and .dll files, as shown below. This is handled automatically once you enable the OpenXR runtime in the Vive Console.

    (diagram: how the engine retrieves hand tracking data from the OpenXR runtime and the provided .json/.dll files)

     

    Here's what you should see after hitting the Play in VR button:

    (screenshot: the tracked virtual hands rendered in the headset)

    OpenXRHandTest.zip

    • Like 2
  2. @ArAnish, if you start from a Blueprint project you can always convert it to a C++ project by creating a new C++ class from the Unreal Editor (Add New --> C++ Class). Here is a link: https://allarsblog.com/2015/11/05/converting-bp-project-to-cpp/

    You can use our SRanipal SDK both in a Blueprint-only project and in a C++ project. We recommend the latter, because then you can easily extend what is available in the plugin, e.g. you can create extra Blueprint nodes for your project, as in the sketch below.
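
    For example, here is a minimal sketch of what an extra Blueprint node can look like once you have a C++ project. The class and function names are made up for illustration and are not part of the SRanipal SDK.

    #include "CoreMinimal.h"
    #include "Kismet/BlueprintFunctionLibrary.h"
    #include "MyEyeTrackingHelpers.generated.h"

    UCLASS()
    class UMyEyeTrackingHelpers : public UBlueprintFunctionLibrary
    {
        GENERATED_BODY()

    public:
        // Shows up as a Blueprint node under the "EyeTracking" category after compiling.
        UFUNCTION(BlueprintPure, Category = "EyeTracking")
        static FVector NormalizeGazeDirection(const FVector& GazeDirection)
        {
            return GazeDirection.GetSafeNormal();
        }
    };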

    • Like 1
  3. Ah sorry about that @Tomas_TDFM, @C3D let me re-upload the image here. Let me know if it's visible.

    What you need to do is go to the SRanipal\Source\SRanipalEye\Private\SRanipalEye_Core.cpp file and replace line 337:

    RayCastDirection = PlayerMainCameraRotation.RotateVector(PlayerMainCameraLocation + CameraGazeDirection * maxDistance);

    with the following, which rotates the gaze direction into world space first, then scales it by maxDistance and adds the camera location:

    RayCastDirection = (PlayerMainCameraRotation.RotateVector(CameraGazeDirection) * maxDistance) + PlayerMainCameraLocation;



    (re-uploaded image attachment showing the change)

  4. 19 hours ago, Stefano said:

    @MariosBikos_HTC Thanks for the great tutorial! I'm running into a bit of a problem though. I've followed all the steps to implement it into 4.24 but it seems to only work when one of my eyes is closed. When I close one of my eyes it tracks the other one just fine, but when I try with both eyes open it doesn't work. I've checked Is Stereo Gaze Data Available and Is Eye Tracker Connected and they are both true. Do you know why this is happening and what I should look into to fix it?

    Hi @Stefano,
    can you share the VRS settings that you are using when you see this issue? Is this happening with every combination of settings (Foveation Pattern Preset, Foveation Shading Rate)?
    Also, are you using the latest version of the SRanipal SDK? Can you send your logs to marios_bikos@htc.com so that we can have a look? Anything that can help us reproduce the issue, e.g. the steps you followed, would be useful.

  5. Hi @Tesi,

    we provide an Eye Tracking SDK for Vive Pro Eye called the SRanipal SDK. It is free for developers to integrate into their projects and for end users to access via the SRanipal runtime, as the licensing is included as part of the Vive Pro Eye hardware platform. Our SDK primarily offers feature data (gaze, pupil diameter/position, etc.). You can see the full list of features in the image below.

    Currently our Vive SRanipal SDK doesn't allow access to the raw data from the eye trackers. Developers can only access a specific set of features, not the raw feed from the trackers.

    Here is a webinar recording to help you understand more about Vive Pro Eye and the SRanipal SDK: https://register.gotowebinar.com/recording/212634959163678731

    (image: list of SRanipal SDK eye tracking features)

  6. Hey @zgibsontheia,

    have you tried initialising the SRanipal runtime first? If you have a Vive Pro Eye device and the SRanipal runtime is properly initialised, then ticking the first tick box should initialise eye tracking in Unreal properly.
    For the Lip framework you need to have a Lip Tracker device, otherwise it won't work, so you should disable it.

    Please also try the latest version of SRanipal, v1.3.1.0, which comes with some changes on that front: https://developer.vive.com/resources/vive-sense/sdk/vive-eye-tracking-sdk-sranipal/

     

    • Like 1
  7. Hi @hieutt,

    if you use the Wave Unity XR Plugin then you can use either Unity 2019.3 or 2019.4 (we recommend 2019.4, as it's an LTS version).
    However, if you decide to use the Legacy Wave Plugin, you can only use it with versions of Unity up to 2019.3.6, otherwise you will run into a memory leak issue.

    We recommend using the XR Plugin as it is future-proof considering the changes in the Unity XR platform, and it is also compatible with newer Unity versions (Unity 2020.1, etc.). That's why we focused on it.

    • Like 1
  8. Hi @d.b_mann,

    you are right, the reason you are getting this error is that after SRanipal v1.3.0.9 Lip Tracking is enabled by default, so if you don't want any lip tracking functionality you need to untick the "Enable Lip by Default" box in the SRanipal Project Settings of your UE4 project (see image). Once you untick the box you should be able to package without errors. We will update the documentation to highlight this change. Thanks for reporting this 😉

     

    (screenshot: the "Enable Lip by Default" tick box in the SRanipal Project Settings)

    • Like 1
  9. Hey @Valvoa, what is your current firmware version on the Focus Plus?
    I tried UE 4.24 with the plugin project included in the Wave 3.2 SDK and didn't have any issues creating and running a build. Can you please share some logs using adb logcat so that we can see what is going on when the app starts running? Also, can you provide the Unreal project logs from {ProjectName}/Saved/Logs?

    Did you get any errors during the packaging process of the apk file in Unreal?

    • Like 1
  10. Hey @Franka, you are right, we currently don't have a function exposed to Blueprints for GetPupilDiameter, but I have already informed the SDK team about it so that they can add one in a future version of the SDK. Meanwhile, here are some screenshots that will help you add it yourself.

    What you need to do is define a GetPupilDiameter function inside USRanipalEye_FunctionLibrary (similarly to how GetPupilPosition is structured) and then also create a GetPupilDiameter() function inside SRanipalEye_Core that is called from the first one.

    Here are the cpp files I used to achieve that; you of course need to modify the header files as well. A rough sketch of the idea follows the screenshots below.

     

    (screenshots of the modified .cpp files)
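
    In case the screenshots are hard to read, here is a rough sketch of the idea, for illustration only. The internals are assumptions: the cached EyeData_ member, the Instance() accessor and the exact signatures are modelled on the public SRanipal C API (ViveSR::anipal::Eye::EyeData) and on how GetPupilPosition is implemented, so mirror whatever your copy of the plugin actually does, and remember to declare both functions in the matching header files (with UFUNCTION(BlueprintCallable) on the function library one).

    // SRanipalEye_Core.cpp -- sketch only, not the shipped implementation
    bool SRanipalEye_Core::GetPupilDiameter(EyeIndex eyeIndex, float& diameter)
    {
        // EyeData_ is assumed to be the cached ViveSR::anipal::Eye::EyeData
        // that the plugin refreshes every frame.
        const auto& eye = (eyeIndex == EyeIndex::LEFT)
            ? EyeData_.verbose_data.left
            : EyeData_.verbose_data.right;
        diameter = eye.pupil_diameter_mm; // reported in millimetres
        return diameter > 0.0f;           // or check the relevant validity bit mask if you prefer
    }

    // SRanipalEye_FunctionLibrary.cpp -- the Blueprint-exposed wrapper
    bool USRanipalEye_FunctionLibrary::GetPupilDiameter(EyeIndex eyeIndex, float& diameter)
    {
        return SRanipalEye_Core::Instance()->GetPupilDiameter(eyeIndex, diameter);
    }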

     

    • Thanks 1