Showing results for tags 'unreal engine 4'.

Found 20 results

  1. Hi, Is there a way to start the eye calibration without using the controller and going through the menu? I'm planning to use it in an experiment and want to start it directly (ideally from Unreal 4.23, since that is what I'm using) without having to tell my participants which buttons to click every time. The SRanipal Unreal SDK document lists the function 'LaunchEyeCalibration', which sounds promising, but it does not show up in the list of available functions in the Unreal Blueprint. How do I run this function, then? (A hedged sketch of calling it directly follows this results list.) Thanks @Corvus @Daniel_Y @zzy
  2. In this post, I am going to show you how to integrate Variable Rate Shading (VRS) with your Unreal Engine project in order to enable Foveated Rendering using the HTC Vive Pro Eye headset. This article focuses on Unreal Engine; if you are using Unity instead, you can use the Vive Foveated Rendering plugin from the Unity Asset Store or the Github page. It is assumed that you’re somewhat familiar with Unreal Engine, C++ and Blueprints. Requirements: HTC Vive Pro Eye headset; VR Ready Quadro (Quadro Desktop: Quadro 4000 card and higher, Quadr
  3. Hello everybody, I am developing a simple VR scenario in UNREAL ENGINE 4 which extracts the orientation data from the VIVE TRACKER (2018), and I am trying to understand how the device (and in general the tracking process) actually works. My first question is purely technical: does the VIVE TRACKER use an inertial sensor to obtain the orientation of the device? If so, is there any specification on the marginal errors or drift effect? The second question is for those of you who work in UE4 and concerns the reference system of the TRACKER. While it is clear the ori
  4. Application: Academic Research. Goals: Install SDK [X]; Get Eye Gaze [ ]; Get Fixation [ ]; Get Pupil Dilation [ ]; Run Subjects & Get Tenure [ ]. Question: How do I reference the SDK's framework / API to extract close-to-real-time eye tracking data that prints to either a data.frame or a CSV file? (A hedged logging sketch follows this results list.)
  5. Problem: I'm having some difficulty getting the Unreal Engine plugin installed for the Vive Pro Eye headset. Following the directions in the documentation leads to compile errors when I go to launch the project. In short, I can't get the plugin to compile or the project to load correctly. Done: [X] Installed and running SteamVR; [X] Installed and running SR_Runtime; [X] Calibrated the eye tracker in SteamVR. Error trigger: Pasted the unzipped Unreal plugin into a blank C++ UE4 project. Returned message from Unreal: Result of
  6. Hi there, I am working on an application that uses the SRWorks pass-through mode with a Vive Pro in UE4. Until yesterday everything worked fine, but the cameras have suddenly stopped sending out images. I can't even run the Experience_Unreal sample or create a blank project - all I see are the default textures. I've attached a screenshot of the VR preview, which mirrors the right eye. In the left eye, I am seeing the default brick wall texture. Also, when I turn my head, I can see that those textures are simply placed on a plane that does not follow my head (as expected). I already t
  7. Hi all, We're releasing a demo for Rigel, our "All in One" Full Body Motion Capture Solution for body, fingers and face, so that potential customers can test this solution and evaluate accordingly. Here you can download the executable demo: Rigel Demo Since the introduction of the SteamVR Input plugin, there have been some changes related to the Vive Trackers setup, so we made a video explaining what needs to be done in order to make everything work during Rigel's Calibration. Rigel Demo Guideline
  8. Hi all, In this video you can see the realtime motion capture data smoothing that allows the user to choose the degree of smoothing while recording motion capture. This feature is intended to smooth out data while recording fast movements (like fighting and fast gestures), so that the animation curve will require less cleanup after the mocap has been recorded. Rigel - Features Highlight - Realtime Motion Capture Data Smoothing
  9. Hi all, In this video we're showing some of the advanced features of Rigel, our All in One Full Body Motion Capture Solution using Vive Trackers. We show how fast and easy the calibration process is, and how Rigel is able to retarget the animation data from the Vive Trackers in realtime to characters with different body sizes. Rigel - Features Highlight The Full Body Motion Capture Solution is set for release at the end of June, and a demo will be available shortly. If you want to read more about the entire setup, here is a FAQ page. Rega
  10. The Wave SDK for Unreal Engine is now also available on Github: https://github.com/ViveSoftware/VIVE-Wave-SDK-Unreal INTRODUCTION So far, the Wave SDK for developers using Unreal Engine has been available only on the Vive Developer Website. We decided to release the WaveVR plugin for Unreal Engine as a public Github repository. This will allow developers to report bugs or suggest enhancements using Github Issues, letting us get feedback from the developer community. Developers can also create Pull Requests to suggest bug fixes. The Vive team will review the pull requests and follo
  11. UPDATE: The issue has been addressed in the Beta 1.0.12.2 release. Here is how to change to the BETA stream. -------------- There is currently an issue where Cosmos Elite users will get a crash when running applications that were built with Unreal Engine 4.24 or 4.25. We are tracking the issue and will report back whenever there is an official fix for it. However, you can apply a temporary workaround for the issue as reported here: https://answers.unrealengine.com/questions/953996/does-the-unreal-engine-support-the-cosmos-elite.html The workaround is quite simple but it
  12. Hi all, We recently released an Early Access version of Wave SDK 3.1.94. This version comes with several new Experimental Features for content developers, and one of them is Direct Preview for Unreal Engine. While creating content for the Vive Focus / Focus Plus, developers need to test and tweak their project to make sure that everything works properly. However, this process is often time-consuming, as developers need to repeatedly build, deploy, and install APKs, spending time waiting during the development stage. That’s why we introduced the Direct Preview feature wh
  13. Hi, I've been having some issues with SteamVR crashing which seemed to be somehow caused by the Vive Wireless app, but nothing was reproducible enough that I thought it worth reporting. Over the last few days, however, I've noticed that it is also causing Unreal Engine to crash. If I either close the wireless app, disconnect the battery from the headset, or the battery runs out, Unreal Engine will immediately crash with no warnings and nothing written to the Unreal logs. I'm not sure what information I should provide to help work out what the issue is, but if someone lets me know I'd
  14. Hi, Since the SRanipal documentation is not that great, and it took me a long time to get things up and running, I thought I'd share my solution here to help get other people started. I'm not claiming it's the best solution and it's definitely not the only one, so if there are suggestions to improve it, let me know 🙂. What I'm trying to do here is send the location and rotation of the HMD as OSC messages, as well as the eye angle measured by the HTC Vive Pro Eye. These OSC messages can be logged, or even processed in Matlab in real time, which is very useful when doing research, fo
  15. I would like to confirm something about GetValidity() in the SRanipal Unreal SDK. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime 1.1.2.0, SRanipalSDK 1.1.0.1.) I have checked the source code related to GetValidity() in the SDK (see TEXT1 below) and seem to have discovered that GetValidity() is incorrect in part of the code (a hedged sketch of the expected bit check follows this results list). If the validity passed as an argument to GetValidity() is SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY and its flag is also set in eye_data_validata_bit_mask, the function should return TRUE; however, it returns FALSE, because SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY is defined as 0 in t
  16. Hello. I want to calculate the gaze point in Unreal Engine, and I have some questions about EyeData. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime 1.1.2.0, SRanipalSDK 1.1.0.1) 1. Shouldn't gaze_origin_mm be used to calculate the gaze point? Why doesn't EyeFocusSample use gaze_origin_mm? (A hedged gaze-point sketch follows this results list.) 2. Is eye_data_validata_bit_mask a value whose bits are set according to SingleEyeDataValidity? For example, if only gaze_origin_mm and gaze_direction_normalized are valid, eye_data_validata_bit_mask is ((1<<SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY) | (1<<SINGLE_EYE_DATA_GAZE_DIRECT
  17. Hi, I'm new to this and I'm trying to set up SRanipal in Unreal so that I can get the eye tracking data. I have followed the manual and can get to the point where I have to enable the plugin, but it does not show up in the list of plugins in Unreal. Any ideas what could be wrong? I tried copying the unzipped 'Plugins' folder into the project folder, and also tried copying the 'SRanipal' folder into the 'Plugins' folder of the game engine. Best, Maartje @Daniel_Y @zzy @Corvus
  18. Hello Everyone, So I've been wondering if there is an easy way to basically wrap a model around the base Skeleton of the Vive Hand Tracking. For example, the Leapmotion_Basehand_Rig_Left seems to have a skeleton that lines up with the Vive Hand Tracking Spawn Points (linked in the attachments), so one way I thought of doing this is to use these spawn points to just control the Skeleton. However, I have not been able to do this in Unreal Engine. (Note: I'm not too experienced with UE4 Animations, so there might be something basic I'm missing.) What I've come across so far from looking around
  19. There is a bug in a clean install of the UE4 plugin. I get this error non-stop: WVRSimulator: Error: Failed to load Simulator library. This prevents me from packaging or launching the game. @Tony PH Lin @Cotta
  20. Hi all - I'm trying to integrate two Vive trackers into my Unreal project, but I'm not able to get the trackers to consistently register in SteamVR. When I hover over a tracker in the SteamVR window, it shows "Searching" most of the time rather than actively tracking. When I open my project, the mesh I have matched with the tracker only appears every so often, when it's actively tracking. Does anyone have advice about how to get a consistent track? I have the tracker dongles plugged into my PC and resting over a foot away from it. Thanks!
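
A minimal sketch for result 1, assuming the native SRanipal eye library (SRanipal_Eye.h) is linked into the UE4 project and SR_Runtime is running. LaunchEyeCalibration(nullptr) and ViveSR::Error::WORK are taken from the native SDK as commonly documented; the wrapper function name is a placeholder.

    // Hypothetical helper (the function name is a placeholder): launch the SRanipal
    // eye calibration directly, without going through the SteamVR dashboard menu.
    // Assumes SR_Runtime is running and the native SRanipal eye library is linked.
    #include "SRanipal_Eye.h"   // assumed native SRanipal header

    bool LaunchEyeCalibrationDirectly()
    {
        // nullptr selects the default calibration parameters / UI.
        const int Result = ViveSR::anipal::Eye::LaunchEyeCalibration(nullptr);
        return Result == ViveSR::Error::WORK;   // WORK (0) signals success in ViveSR
    }

Exposing this helper through a BlueprintCallable UFUNCTION on an actor makes it reachable from Blueprints even when the SDK's own Blueprint node does not appear.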
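
For result 4, a sketch of per-frame CSV logging under the assumption that the native SRanipal eye API (GetEyeData and an EyeData struct exposing the gaze_origin_mm, gaze_direction_normalized and pupil_diameter_mm fields mentioned in these results) is reachable from the project. The actor class, field layout and output path are placeholders and may differ between SDK versions.

    // Hypothetical logging actor (class name and file path are placeholders).
    // Polls the SRanipal eye API every Tick and appends one CSV row per frame.
    #include "CoreMinimal.h"
    #include "Misc/FileHelper.h"
    #include "HAL/FileManager.h"
    #include "SRanipal_Eye.h"   // assumed native SRanipal header

    void AEyeLoggerActor::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        ViveSR::anipal::Eye::EyeData Data;
        if (ViveSR::anipal::Eye::GetEyeData(&Data) != ViveSR::Error::WORK)
        {
            return;   // runtime not ready or no new sample this frame
        }

        const auto& Left  = Data.verbose_data.left;
        const auto& Right = Data.verbose_data.right;

        // timestamp, left gaze origin (mm), left gaze direction, pupil diameters (mm)
        const FString Row = FString::Printf(
            TEXT("%d,%f,%f,%f,%f,%f,%f,%f,%f\n"),
            Data.timestamp,
            Left.gaze_origin_mm.x, Left.gaze_origin_mm.y, Left.gaze_origin_mm.z,
            Left.gaze_direction_normalized.x, Left.gaze_direction_normalized.y,
            Left.gaze_direction_normalized.z,
            Left.pupil_diameter_mm, Right.pupil_diameter_mm);

        FFileHelper::SaveStringToFile(Row, TEXT("C:/EyeLogs/gaze.csv"),
            FFileHelper::EEncodingOptions::AutoDetect, &IFileManager::Get(),
            FILEWRITE_Append);
    }

The resulting CSV can then be read into R as a data.frame or loaded into Matlab for offline analysis.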
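
For result 15, a minimal sketch of how a GetValidity-style check has to treat the enum value as a bit index. This illustrates the reasoning in the post rather than the SDK's own implementation; the value 0 for SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY comes from the post, while the second entry and its value are assumptions.

    // SingleEyeDataValidity entries are bit indices into eye_data_validata_bit_mask,
    // so SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY == 0 refers to bit 0. Masking with the
    // raw enum value instead of (1 << value) would always return false for it.
    #include <cstdint>

    enum SingleEyeDataValidity : int
    {
        SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY    = 0,   // bit 0 (value stated in the post)
        SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY = 1,   // bit 1 (assumed)
        // ... remaining validity entries omitted
    };

    bool GetValidityBit(uint64_t eye_data_validata_bit_mask, SingleEyeDataValidity validity)
    {
        return (eye_data_validata_bit_mask & (1ULL << validity)) != 0;
    }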
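
For result 16, a sketch of the gaze-point idea: use gaze_origin_mm as the ray origin and gaze_direction_normalized as the ray direction, then pick a point along that ray at a chosen viewing distance. Function and parameter names are placeholders, and the conversion between SRanipal's eye coordinate space and Unreal's world space is deliberately left out.

    // Hypothetical helper (names are placeholders): compute a point on the gaze ray.
    #include "CoreMinimal.h"

    FVector ComputeGazePoint(const FVector& GazeOriginMm,
                             const FVector& GazeDirectionNormalized,
                             float ViewingDistanceCm)
    {
        // gaze_origin_mm is reported in millimetres; Unreal units are centimetres.
        const FVector OriginCm = GazeOriginMm * 0.1f;

        // A point along the normalized gaze direction at the requested distance.
        return OriginCm + GazeDirectionNormalized * ViewingDistanceCm;
    }

For a world-space hit point, this ray would typically be transformed by the HMD's world transform and line-traced against the scene.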