Showing results for tags 'unreal engine 4'.



Found 9 results

  1. In this post, I am going to show you how to integrate Variable Rate Shading (VRS) with your Unreal Engine project in order to enable Foveated Rendering using the HTC Vive Pro Eye headset. This article focuses on Unreal Engine. If you are using Unity instead, you can use the Vive Foveated Rendering plugin from the Unity Asset Store or the GitHub page. It is assumed that you are somewhat familiar with Unreal Engine, C++ and Blueprints.

Requirements
  • HTC Vive Pro Eye headset
  • A GPU with NVIDIA Turing™ architecture (GeForce™ RTX 20 series or GTX 1660 / GTX 1660 Ti)
  • NVIDIA Driver version: 430.86 or later
  • OS: Microsoft Windows

This post is divided into 4 main parts:
Part 1 – Foveated Rendering & NVIDIA Variable Rate Shading (VRS)
Part 2 – Setting up Unreal Engine Source Code from the Vive GitHub Repository
Part 3 – Setting up the SRanipal Plugin
Part 4 – Combining everything for Foveated Rendering results

So, let's get started!

Part 1 – Foveated Rendering & NVIDIA Variable Rate Shading (VRS)

Allow me to begin by stressing the importance of Foveated Rendering and how beneficial it can be for VR applications. Creative directors and technical artists always try to raise the bar of visual fidelity in immersive VR applications, but engineering teams often hit the bottleneck of achieving the required minimum of 90 FPS as new headsets are released with ever higher display resolutions. From time to time, new smart rendering techniques are invented that allow us to optimize this process without having to drop the quality. This is where foveated rendering comes into play.

If you are not already familiar with the term, Foveated Rendering is a rendering technique that can be used together with VR headsets that support eye tracking, such as the HTC Vive Pro Eye, to reduce the rendering workload, allowing developers to improve the performance of their VR applications or push the visual quality of their content. The term "foveated" derives from the word fovea, the part of the eye's retina that gives us sharp central vision (foveal vision) and lets us focus on the visual details of what is important in a narrow region right in front of us. The foveal and para-foveal areas cover a field of vision of about 5 degrees. Anything outside the fovea region belongs to peripheral vision; although it is also useful, it mainly detects fast-moving content changes and color, which is why it feels comparatively less detailed and blurry. The Foveated Rendering technique exploits this anatomy of the human eye: a computer can drop the quality of graphics in the peripheral vision by lowering the resolution there, while focusing the available processing power on the smaller foveated area of the image.

[Video: VRS_01_v008_white_logo.mp4]

Eye-Tracked Foveated Rendering should not be confused with a variant called Fixed Foveated Rendering, which assumes a fixed focal point without taking eye gaze direction into account. Although the concept of Foveated Rendering has been around for a while, the new NVIDIA Turing™ GPU architecture enables a rendering technique called Variable Rate Shading (VRS) that makes this feature practical. This technique offers the ability to vary the shading rate within a frame and can be combined with eye-tracking data to perform Foveated Rendering in VR experiences, hence why you need a GPU based on the Turing™ architecture.
Graphics cards have a component called the pixel shader (or fragment shader) that determines the visual characteristics of a single pixel in a virtual environment, such as its color, brightness and contrast. Instead of executing the pixel shader once per pixel, with VRS the GPU can not only apply a single pixel shader operation to a whole 16 x 16-pixel region, but also dynamically change the shading rate during actual pixel shading in one of two ways:
  • Coarse Shading: the pixel shader is executed once per group of raster pixels and the same shade is copied to those pixels. Configurations: 1×1, 1×2, 2×1, 2×2, 2×4, 4×2, 4×4.
  • Supersampling: the pixel shader is executed more than once per single raster pixel. Configurations: 2x, 4x, 8x.

For example, in the case of Coarse Shading, we can select a 2×2 coarse shading rate, which means a single shade evaluated by one pixel shader invocation is replicated across a region of 2×2 pixels. So instead of shading each of the 4 pixels in the 2×2 area, we shade once and broadcast the result to all 4 pixels. On the other hand, a Supersampling or Variable Rate Supersampling (VRSS) shading rate such as 4x Supersampling means the pixel shader will be invoked up to 4 times, evaluating up to 4 unique shades for the samples within a single pixel. This is really useful, as we can improve image quality and reduce aliasing, providing a smoother immersive VR experience for users, especially when they need, for example, to read text in VR. This fine level of control enables developers to combine varied shading rates and gaze-tracking capabilities for foveated rendering.
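To make the coarse-shading savings concrete, here is a small back-of-the-envelope sketch (an illustrative calculation, not part of the VRS API) of how many pixel shader invocations a render target needs at different coarse shading rates, assuming one invocation per coarse block and ignoring partial blocks at the edges:

#include <cstdio>

// One pixel shader invocation per coarse block (illustrative approximation).
static long long Invocations(long long Width, long long Height, int BlockW, int BlockH)
{
    return (Width / BlockW) * (Height / BlockH);
}

int main()
{
    const long long W = 1440, H = 1600; // roughly one eye buffer of a Vive Pro-class headset
    std::printf("1x1 (normal shading): %lld invocations\n", Invocations(W, H, 1, 1));
    std::printf("2x2 coarse shading  : %lld invocations (4x fewer)\n", Invocations(W, H, 2, 2));
    std::printf("4x4 coarse shading  : %lld invocations (16x fewer)\n", Invocations(W, H, 4, 4));
    return 0;
}

In a foveated setup only the peripheral regions use the coarse rates, so the real-world saving sits between these extremes and depends on the region sizes you configure in Part 4.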
NVIDIA also supports two other forms of VRS that apply to non-VR games and belong to the Adaptive Shading category: Content Adaptive Shading and Motion Adaptive Shading. Content Adaptive Shading predicts the impact of reduced-rate shading and only applies it where it is expected to be imperceptible; the less contrast and variation the objects in the scene have, the lower the shading rate can be without reducing visual quality. Motion Adaptive Shading takes advantage of motion blur-type effects and moving game objects to reduce the shading rate when moving and turning in-game; the idea is to use VRS when objects are moving fast enough that the reduced shading rate is not noticeable.

Part 2 – Setting up Unreal Engine Source Code from the Vive GitHub Repository

Most engineering teams that use Unreal Engine for VR development have to make a choice depending on the VR project they will work on and the flexibility they need. They can use:
  • A binary version of Unreal Engine provided by Epic Games via the Epic Games Launcher. In that case, the Engine can't be modified and only binary plugins that are verified to work with that version of Unreal Engine can be used. This is usually the case if you want to quickly build a demo using technologies and features that are already supported out of the box.
  • The source code of a specific version or branch of Unreal Engine. The engineering team can introduce entirely new features into their custom copy of the Unreal Engine source code to reach their goals for a project. For example, if Unreal doesn't support a feature, the engineers can modify the engine source code to introduce it; or if they find a bug in Unreal Engine that has not been fixed yet, they can fix it themselves without having to wait for Epic Games.

Most engineering teams that work on projects with new technologies usually prefer the second option, since it gives them more control over their pipeline. Currently, if you want to integrate VRS and Foveated Rendering in your Unreal Engine project, you need to download and use a custom, modified version of the Unreal Engine source code from the HTC Vive GitHub repository. HTC Vive currently supports two different versions of Unreal Engine for Foveated Rendering, so there is a branch for UE 4.21 and another branch for UE 4.23. This modified version of Unreal Engine contains:
  • Changes to the Unreal Engine rendering pipeline source code to support VRS for certain render passes.
  • The VRS plugin that comes with the NVIDIA API libraries.

We realize that you may want to integrate Foveated Rendering in your project while using a binary/installed version of Unreal Engine or a different custom version of Unreal Engine. HTC Vive is working closely with NVIDIA and Epic Games to make this process easier in the future by integrating the required engine changes into the official Unreal Engine branch, so that only the VRS plugin will be needed without any further modifications. Don't forget that in order to access the Unreal Engine source code repository, you need to link your Epic Games account to your GitHub account and get authorized by Epic Games.

For now, if you or your team already use a custom Unreal Engine source code for versions 4.23 or 4.21 and you don't want to switch to a different engine, you can copy the changes we made in the Unreal Engine rendering pipeline source code and, of course, add the extra VRSPlugin. It's only a few lines of code! More specifically, we had to make changes to the following files:
  • Engine/Source/Runtime/Renderer/Private/ScenePrivate.h
  • Engine/Source/Runtime/Renderer/Private/PostProcess/PostProcessing.cpp
  • Engine/Source/Runtime/Engine/Public/SceneViewExtension.h

You can choose to download the source code branch as a .zip file or use GitHub Desktop to clone the repository locally. We recommend using Git to clone the repository, since this way you can quickly fetch the latest updates and bug fixes and update your local copy every time the HTC Vive engineers push a change.

Here is the list of steps required after you download the custom Unreal Engine source code in order to get the Engine up and running:
1. If you downloaded a zip archive of the code, extract the archive contents to a directory where you would like to build the engine, e.g., C:\Unreal\4.23-vive. Make sure you use a short path, otherwise you may get errors in the next step with files that exceed the Windows path length limit. Alternatively, you can map your install directory as a Windows drive to reduce the path length.
2. Navigate to the engine directory and run Setup.bat. You may need to Run as Administrator. The Setup.bat script will start downloading the required third-party dependencies for Unreal Engine and your system. This may take some time; check the window regularly for any warnings or issues.
3. Next, run GenerateProjectFiles.bat to generate the Visual Studio solution. By default, GenerateProjectFiles.bat creates project files for Visual Studio 2017.
4. Launch UE4.sln to open the project solution in Visual Studio.
5. In the menu bar, select Build > Configuration Manager. Verify that Active solution configuration is set to Development Editor and that Active solution platform is set to Win64.
6. In the Solution Explorer, right-click UE4 under Engine and select Set as Startup Project. This makes sure we build the Engine and not one of the other programs that come with Unreal.
7. Right-click UE4 under Engine again and, in the context menu, select Debug > Start New Instance to launch the engine.
8. Once Unreal Engine is launched, you can select the project you would like to open or specify a new project. If you are creating a new project, don't forget to specify a project name.

**There is currently an issue where, if you select a Blank Blueprint or Basic Code C++ project, everything is black unless the view mode is set to "Unlit", so instead select any of the other template projects.

If you already have an Unreal project built with a different version of Unreal Engine and you just want to swap engine versions, you can try to open the project using the Unreal Engine version that supports Foveated Rendering. To do that, find the Unreal Engine project file, right-click on it and switch the Unreal Engine version used for that project to the new version of the engine we installed (browse to the installed path). That's all; you should now be able to use the custom Unreal Engine provided by HTC Vive with your project.

Part 3 – Setting up the SRanipal Plugin

In order to optimize the rendering quality to match the user's gaze, we need to feed eye-tracking gaze data to VRS. To get access to the data provided by the eye-tracking capabilities of the HTC Vive Pro Eye, you need to use the Vive SRanipal SDK. The SRanipal SDK allows developers to track and integrate users' eye and lip movements, empowering them to read intention, model facial expressions, create new interactions and experiences, and improve existing ones by exploiting the precise tracking of eye movement, attention and focus. The SRanipal SDK includes:
  • The required runtime (SR_Runtime), which runs in the notification tray and shows the current eye-tracking status for the HTC Vive Pro Eye.
  • Plugins for Unreal/Unity along with sample code for native C development.

Follow the steps below to integrate the SRanipal plugin in your project:
1. Visit the Vive SRanipal SDK page on the Vive Developer website, read the guidelines and start the download procedure. If you don't have an HTC developer account, you will be asked to register and create one to be able to download the SRanipal SDK.
2. Download the Vive_SRanipalInstaller, which contains the required runtime, and follow the instructions to install SR_Runtime.
3. Once installed, ensure that your Vive Pro Eye headset is connected to your PC and launch SR_Runtime.exe as an Administrator to start the runtime. Wait until the SRanipal status icon, which looks like a robot, appears in the Windows notification tray. The icon's eyes turn orange if a headset with eye-tracking capabilities is detected but the device is idle, and green if a program is retrieving eye data from it.
4. Start SteamVR (if it is not running already).
5. Put on your HMD.
6. The next step is to go through the eye calibration process. This is required for each new user, and the calibration results are saved to the PC. To start eye calibration, press your VIVE controller's system button; the calibration program will show an overlay window in your HMD. Select the Vive Pro button at the bottom grid of the SteamVR menu.
Note that for the highest level of precision, it is recommended to recalibrate for different users, as eye positions and pupillary distances differ for each individual.
7. You will need to read and accept the user agreement before turning on the Vive eye-tracking capabilities.
8. Press Calibrate to start. It will begin by adjusting your HMD position.
*If you have any issues with the calibration process and you get an "Initialization Failed" error message, please follow the steps described in this troubleshooting guide.
9. Next, you need to adjust your IPD value, as shown below.
10. After that, you will be guided through gaze calibration. Look at the blue dot shown sequentially at the center, right, left, upper and lower parts of the panel until calibration succeeds, moving only your eyes and not your head.
11. Calibration is done; you are ready to develop eye-aware applications. You can now try a mini-demo to light up the dots with your eye gaze, or press the Vive controller's system button to quit eye calibration.
12. Download the Vive SRanipal SDK .zip file. It contains the required plugins for each platform as well as documents with detailed instructions.
13. Install the SRanipal Unreal Engine plugin in your Unreal project. To do that, copy the folder located at SRanipal SDK\v1.1.0.1\SRanipal_SDK_1.1.0.1\03_Unreal\Vive-SRanipal-Unreal-Plugin\SRanipal into the Plugins folder of your project. If you don't have a Plugins folder in your project structure, create a new one and name it "Plugins". Then generate the Visual Studio project files again for your project.
14. Open the Unreal project, go to Edit -> Plugins, and you will see the SRanipal plugin under the Project plugins. Make sure it is enabled.
15. Now you can use the SRanipal SDK features for eye tracking with your HTC Vive Pro Eye.

Part 4 – Combining everything for Foveated Rendering results

At this point, the last thing we need to do is tie everything together. We will combine the SRanipal plugin and the VRS plugin to enable Foveated Rendering by feeding eye gaze data to the VRS algorithm each frame. In Part 2 above, we used the custom Unreal Engine source code that comes with the VRS plugin. You can locate the VRS plugin folder at UnrealEngine\Engine\Plugins\Runtime\ViveSoftware\VRSPlugin. You will notice that it contains the required NVIDIA VRS APIs as well as code to easily modify VRS settings in your application. Also, if you open your Unreal Engine project and go to Edit -> Plugins, you will find the Variable Rate Shading plugin under the Rendering category of the Unreal Engine plugins. Again, make sure it is enabled.

In order to feed gaze data to VRS and enable Foveated Rendering, developers need to go through the following steps:
1. Create a new function that returns the normalized eye gaze directions using SRanipal's GetVerboseData() method, and expose it to Blueprints. One way to do this is to add a new function to the list of methods found in the header file SRanipal_FunctionLibrary_Eye.h, located in Plugins\SRanipal\Source\SRanipal\Public\Eye. Of course, you can always create a new static method anywhere else in your project as long as it uses the SRanipal module dependency. Go ahead and type in the following code for the new function GetEyeGazeDirectionsNormalized. This function returns the normalized eye gaze directions for each eye.
UFUNCTION(BlueprintCallable, Category = "SRanipal|Eye")
static bool GetEyeGazeDirectionsNormalized(FVector& LeftEyeGazeDirectionNormalized, FVector& RightEyeGazeDirectionNormalized);

Also, for the implementation of the function, you will need to type in the following code in the source file SRanipal_FunctionLibrary_Eye.cpp:

bool USRanipal_FunctionLibrary_Eye::GetEyeGazeDirectionsNormalized(FVector& LeftEyeGazeDirectionNormalized, FVector& RightEyeGazeDirectionNormalized)
{
    LeftEyeGazeDirectionNormalized = FVector();
    RightEyeGazeDirectionNormalized = FVector();
    ViveSR::anipal::Eye::VerboseData VerboseDataOut;
    bool Result = SRanipal_Eye::Instance()->GetVerboseData(VerboseDataOut);
    LeftEyeGazeDirectionNormalized = VerboseDataOut.left.gaze_direction_normalized;
    RightEyeGazeDirectionNormalized = VerboseDataOut.right.gaze_direction_normalized;
    return Result;
}

2. Feed the normalized eye directions to the VRS plugin's UpdateStereoGazeDataToFoveatedRendering function. This way, the center of the foveated region for every frame will be calculated by the VRS plugin automatically. You will need to connect the function GetEyeGazeDirectionsNormalized from the previous step to the function UpdateStereoGazeDataToFoveatedRendering from the VRS plugin during Event Tick. You can do this by creating a new Blueprint Actor and adding the corresponding Blueprint nodes in its Event Graph (a C++ sketch of the same wiring is shown after this list). Don't forget to place the actor in your scene so that it can tick. Of course, you don't necessarily need an Actor for this; you could use the same Blueprint nodes in your GameMode or your GameInstance.
3. Finally, drag and drop an Actor of type SRanipal_Eye_Framework into your scene and enable eye tracking (tick box) on it. Only then can you get valid eye gaze directions.
4. If the VRS plugin was successfully enabled, VRS settings will appear under Project Settings -> Plugins. Below is a list that describes what each setting is used for. For more information, you should read this detailed article from NVIDIA.
  • Enable VRS: Whether to enable Variable Rate Shading or not.
  • Enable Eye Tracking: If an eye-tracking device is properly set up and its eye-tracker plugin is enabled, checking this will automatically fetch eye-tracking data and move the center of the foveated region accordingly.
  • ScreenPercentage: In Unreal Engine, the VRSPlugin requires r.ScreenPercentage to be larger than 100 to trigger the UpScale render pass so that VRS can take effect, which is why it is set to 101 by default.
  • Foveation Pattern Preset: Adjusts the region size of foveated rendering. There are several pre-defined region sizes (the smaller, the more aggressive): Narrow, Balanced, Wide, Custom.
  • Shading Rate Preset: Adjusts the shading rate of each region. There are several pre-defined shading rates: Highest Performance, High Performance, Balanced, High Quality, Highest Quality, Custom.
  • Custom Region Shading Rates + Sizes: If you pick the Custom option as the Foveation Pattern Preset, the region size for each of the available regions (Innermost, Middle, Peripheral) can be customized. Developers can set the horizontal and vertical radius of each region; a diameter of 1.0 equals the width of a single eye buffer. The region shading rate can also be customized to:
    - Culled: Nothing will be rendered (black).
    - Normal Shading: Each pixel is sampled once.
    - Supersampling options (2x, 4x, 8x, 16x): Each pixel is sampled more than once, which results in less aliasing than normal rendering but takes more computing power.
    - Coarse Shading options (once per 2×1, 1×2, 2×2, 4×2, 2×4, 4×4 pixels): A group of pixels is only sampled once, which results in a performance gain, but with fewer details rendered.
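As referenced in step 2 above, here is a minimal C++ sketch of the same per-frame wiring done from an Actor's Tick instead of Blueprint nodes. The actor class name is made up for illustration, and since this article only gives the name of UpdateStereoGazeDataToFoveatedRendering (not its C++ signature), that call is left as a commented-out assumption; check the VRSPlugin source for the exact class and parameters.

// Hypothetical actor (name invented for this sketch) that forwards SRanipal gaze
// data to the VRS plugin every frame. Requires the SRanipal module dependency and
// the GetEyeGazeDirectionsNormalized function added in step 1.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SRanipal_FunctionLibrary_Eye.h"
#include "SRanipalGazeFeeder.generated.h"

UCLASS()
class ASRanipalGazeFeeder : public AActor
{
    GENERATED_BODY()

public:
    ASRanipalGazeFeeder()
    {
        PrimaryActorTick.bCanEverTick = true; // tick every frame so the gaze data stays fresh
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        FVector LeftGaze, RightGaze;
        if (USRanipal_FunctionLibrary_Eye::GetEyeGazeDirectionsNormalized(LeftGaze, RightGaze))
        {
            // Feed both normalized gaze directions to the VRS plugin so it can
            // recenter the foveated region for this frame. The exact signature is
            // not listed in this article, so the call below is an assumption:
            // UpdateStereoGazeDataToFoveatedRendering(LeftGaze, RightGaze);
        }
    }
};

Place an instance of this actor (or put the equivalent logic in your GameMode or GameInstance) in the level, together with the SRanipal_Eye_Framework actor from step 3.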
All parameters in the VRS settings in Project Settings can be changed during a VR Preview / Play-In-Editor (PIE) session. Please note that the VRS configuration modified in Project Settings will not be carried over to the packaged game. Therefore, the VRS plugin provides a set of Blueprint functions to alter these settings at runtime; developers should call them during the initialization phase. The boolean return value indicates whether the setter function executed successfully. Generally, the VRSPlugin will fall back to Fixed Foveated Rendering if there is any problem with eye tracking (e.g. invalid gaze data, device not connected, tracker module not loaded/implemented, etc.).

You should now be able to apply foveated rendering to your VR application and adjust the shading rate and region size for either better performance or better quality, according to your project requirements. Currently, there isn't a way to visualize each region individually, e.g. with a different color, but if you want to verify that Foveated Rendering is indeed working, you can use "Culled" for the Peripheral and Middle regions so that only the Innermost region is rendered. Using the Balanced option for both the Foveation Pattern and Shading Rate presets should give you the best balance of visual quality and performance.

The performance gain from VRS ranges from 20% to 38% for static scenes. The more dynamic objects in the scene, the less gain from VRS. Keep in mind that for extremely dynamic scenes VRS gains almost nothing, because most of the GPU resources are used to calculate particle emission and skeletal animation, which do not benefit from VRS.

Known Issues

Foveated rendering is not without its own shortcomings. One of the most noticeable artifacts is magnified aliasing in the peripheral region. This artifact is more obvious for thin or glossy objects in the scene. To ease this kind of artifact, we recommend the following:
  • Use temporal anti-aliasing. Developers can choose the AA method under Project Settings > Rendering > Default Settings > Anti-Aliasing Method.
  • Tweak region parameters (size and resolution) based on the content. For scenes with a lot of text or glossy materials, developers should use a less aggressive setting.
  • Some objects will flicker when VRS is enabled. Unchecking "Occlusion Culling" in Project Settings > Rendering can fix this.

Other known issues so far:
  • The performance gain from VRS for static scenes is about 18% based on our tests. VRS gains less performance when the scene is full of dynamic objects such as skeletal meshes, particle emitters and post-processing effects.
  • Keep in mind that Bloom will not take effect.
  • RenderDoc may conflict with the VRS plugin, so please disable the RenderDoc plugin and try again.
  • We support Unreal Engine 4.21 and 4.23 for now. We injected some VRS function calls into the UE rendering pipeline so that VRS happens only for certain render passes. The UpScale pass was introduced in UE 4.19, so the minimum supported version is UE 4.19.

References
What Is Variable Rate Shading? A Basic Definition of VRS
NVIDIA Says New Foveated Rendering Technique is More Efficient, Virtually Unnoticeable
VRWorks Variable Rate Shading (VRS)
Turing™ Variable Rate Shading in VRWorks
Easy VRS Integration with Eye Tracking
Microsoft – Variable Rate Shading
  2. Hi, I've been having some issues with SteamVR crashing which seemed to be somehow caused by the Vive Wireless app, but nothing was reproducible enough that I thought it worth reporting. Over the last few days, however, I've noticed that it is also causing Unreal Engine to crash. If I close the wireless app, disconnect the battery from the headset, or the battery runs out, Unreal Engine immediately crashes with no warnings and nothing written to the Unreal logs. I'm not sure what information I should provide to help work out what the issue is, but if someone lets me know I'd be happy to provide any further details. Thanks. @Jad
  3. Hi, Since the SRanipal documentation is not that great, and it took me a long time to get things up and running, I thought I'd share my solution here to help get other people started. I'm not claiming it's the best solution, and it's definitely not the only one, so if there are suggestions to improve it, let me know 🙂.

What I'm trying to do here is send the location and rotation of the HMD as OSC messages, as well as the eye angle measured by the HTC Vive Pro Eye. These OSC messages can be logged, or even processed in Matlab in real time, which is very useful when doing research, for example if you have to wait for the participants in an experiment to look back to the front before starting the next trial. To get them in Matlab, a tool called LabStreamingLayer (LSL) can be used (https://github.com/sccn/labstreaminglayer/wiki/Tutorial-4-a.-Receive-Data-streams-in-MATLAB). OSC messages can be converted into LSL streams using this tool: https://github.com/gisogrimm/osc2lsl. To check my implementation, I tried putting an object in the measured gaze direction.

I used the following software:
- Unreal Engine 4.23.1
- SRanipal_SDK_1.1.0.1
- SR_Runtime 1.1.2.0
- OSC plugin for Unreal Engine 4 Blueprints (https://github.com/monsieurgustav/UE4-OSC)

To set this up, first create a new C++ Basic Code project in Unreal. Close it, and install the SRanipal and OSC plugins as explained in their documentation (create a 'Plugins' folder in your project folder and copy the plugin folders there); also copy the SRanipal Content folder to the Content folder of your project. Open the project again, check that the two plugins show up and are enabled (Edit --> Plugins), and import the new content.

Now duplicate the EyeSample2 level in the SRanipal content and rename it. Open the new level and delete all the unnecessary things, leaving only: Atmospheric Fog, Light Source, PlayerStart, Sky Sphere and SRanipal_Eye_Framework. Now VR has to be added to the project: click Add New --> Add feature or content pack... --> Blueprint Feature (Virtual Reality) --> Add to project. Find the Motion Controller Pawn in the newly added VirtualRealityBP --> Blueprints and make a copy of it; I named this 'VR_controller'. Add the VR_controller to the level and put it at (0,0,0). Click 'Edit VR_controller' in the World Outliner to open the Blueprint. In the Event Graph, add the components shown in the picture (I hope this is readable), then Compile (ignore the warnings) and Save. Don't forget to enter the IP address of the PC you want to send your OSC messages to in the Add Send Osc Target block (I'm sending it to two PCs simultaneously); it should also be sent to the PC you're running Unreal on for things to work later on.

Now we'll add an object that will point in the measured gaze direction. To do this, click Add New --> Blueprint Class --> Actor, and name it 'GazeSteeredObject'. Add it to the level, also at (0,0,0), and click Edit. Add an OscReceiver (Add Component --> OSC Receiver), and also add an object to sit in the direction of the gaze (I added a sphere). Give it a nice material if you like. The Sphere should be a child of the DefaultSceneRoot. Now edit the Blueprint. First of all, we need to give the object the same location and rotation as the HMD. To do this, add the components shown in the picture to the blueprint. Now the GazeSteeredObject should be following the head direction. To put it in the gaze direction, we need to get the eye angle and add this. We'll also send this data as an OSC message.
Add the components in the blueprint as shown in the picture. Make sure the initial X location of the Sphere is set to 100 (or some other positive non-zero value). The 'Get Gaze Ray' function sets the combined 'Gaze Direction', which is a vector of 3. It is not documented what this is exactly, but the Y-value seems to be the horizontal eye angle and the Z-value the vertical eye angle, in degrees. The X-value is always close to 100; I don't know what this is. After compiling and saving, you can play the level in VR, and the sphere should move with your gaze direction (of course, calibrate the eye tracker first). The sampling rate is determined by the time value in the 'Set Timer by Event' block. I hope this was useful.

Finally, I have some questions for the VIVE staff:
- Is it true that the 'Direction' of the 'Get Gaze Ray' function gives the combined horizontal eye angle as the Y-value and the combined vertical eye angle as the Z-value, in degrees? And what is the X-value, some confidence value?
- Using the time value in 'Set Timer by Event' I can determine the sampling rate with which my OSC messages are sent. Values of 100 Hz and 125 Hz seem to work fine for the GazeDirection, and I even tried 200 Hz for the HMD location and rotation. The data points that come out do have the set sampling rate, and it's not sending the same value multiple times, so this looks good. But what is the maximum sampling rate that the sensors can deliver, and what happens when I set my sampling rate higher? Are these the actual measured data points or are the values extrapolated somehow?

Best, Maartje @Daniel_Y @zzy
  4. Hi, Is there a way to start the eye calibration without using the controller and having to go through the menu? I'm planning to use it in an experiment and want to start it directly (ideally from Unreal 4.23, because that's what I'm using) without having to tell my participants which buttons they have to click all the time. In the SRanipal Unreal SDK document the function 'LaunchEyeCalibration' is listed, which sounds promising, but it does not show up in the list of available functions in the Unreal Blueprint. How do I run this function? Thanks @Corvus @Daniel_Y @zzy
  5. I would like to ask about GetValidity() in the SRanipal Unreal SDK. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime 1.1.2.0, SRanipalSDK 1.1.0.1.) I have checked the source code related to GetValidity() in the SDK (see TEXT1 below) and it seems that GetValidity() is incorrect. If the validity passed to GetValidity() is SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY and its flag is set in eye_data_validata_bit_mask, the function should return TRUE; however, it returns FALSE, because SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY is defined as 0 in the SingleEyeDataValidity enum (TEXT1). Therefore, either the source code should be corrected as in GetValidity() in TEXT2, or SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY should not be 0. What do you think?

-TEXT1 (This is SRanipal_Eyes_Enums.h)-

enum SingleEyeDataValidity {
    SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY,
    SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY,
    SINGLE_EYE_DATA_PUPIL_DIAMETER_VALIDITY,
    SINGLE_EYE_DATA_EYE_OPENNESS_VALIDITY,
    SINGLE_EYE_DATA_PUPIL_POSITION_IN_SENSOR_AREA_VALIDITY
};

typedef struct SingleEyeData {
    /** The bits containing all validity for this frame.*/
    uint64_t eye_data_validata_bit_mask;
    bool GetValidity(SingleEyeDataValidity validity) {
        return (eye_data_validata_bit_mask & (uint64)validity) > 0;
    }
} SingleEyeData;

-TEXT2-

bool GetValidity(SingleEyeDataValidity validity) {
    return (eye_data_validata_bit_mask & ((uint64)1 << (uint64)validity)) > 0;
}

Thank you. @Daniel_Y @zzy
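To illustrate the behaviour described above, here is a small standalone check (not SDK code; the enum names are copied from TEXT1) showing that the original mask test can never report the gaze-origin flag as valid, while the shifted version from TEXT2 can:

#include <cstdint>
#include <cstdio>

// Standalone illustration of the issue: an enum value of 0 masks out everything.
enum SingleEyeDataValidity {
    SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY,      // = 0
    SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY    // = 1
};

static bool GetValidityOriginal(uint64_t mask, SingleEyeDataValidity validity)
{
    return (mask & (uint64_t)validity) > 0;                  // TEXT1: uses the enum value directly
}

static bool GetValidityFixed(uint64_t mask, SingleEyeDataValidity validity)
{
    return (mask & ((uint64_t)1 << (uint64_t)validity)) > 0; // TEXT2: uses the enum value as a bit index
}

int main()
{
    // Pretend the runtime set both the gaze-origin bit (bit 0) and the gaze-direction bit (bit 1).
    const uint64_t mask = ((uint64_t)1 << SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY)
                        | ((uint64_t)1 << SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY);

    std::printf("original, gaze origin: %d\n", GetValidityOriginal(mask, SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY)); // prints 0
    std::printf("fixed,    gaze origin: %d\n", GetValidityFixed(mask, SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY));    // prints 1
    return 0;
}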
  6. Hello. I want to calculate the gaze point in Unreal Engine and I have some questions about EyeData. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime 1.1.2.0, SRanipalSDK 1.1.0.1.)
1. Shouldn't gaze_origin_mm be used to calculate the gaze point? Why doesn't EyeFocusSample use gaze_origin_mm?
2. Is eye_data_validata_bit_mask a value in which a bit is set for each corresponding SingleEyeDataValidity? For example, if only gaze_origin_mm and gaze_direction_normalized are valid, is eye_data_validata_bit_mask equal to ((1<<SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY) | (1<<SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY))? (SingleEyeData::GetValidity() seems to be wrong.)
3. convergence_distance_validity seems to always be FALSE. Is convergence_distance_mm supported now?
Thank you. @Corvus @Daniel_Y
  7. Hi, I'm new to this and I'm trying to set up SRanipal in Unreal so that I can get the eye-tracking data. I have followed the manual and got to the point where I have to enable the plugin, but it does not show up in the list of plugins in Unreal. Any ideas what could be wrong? I tried copying the unzipped 'Plugins' folder into the project folder and also tried copying the 'SRanipal' folder into the 'Plugins' folder of the game engine. Best, Maartje @Daniel_Y @zzy @Corvus
  8. Hello Everyone, I've been wondering if there is an easy way to wrap a model around the base skeleton of the Vive Hand Tracking. For example, the Leapmotion_Basehand_Rig_Left seems to have a skeleton that lines up with the Vive Hand Tracking spawn points (linked in attachments), so one way I thought of doing this is to use these spawn points to drive the skeleton directly. However, I have not been able to do this in Unreal Engine. (Note: I'm not too experienced with UE4 animations, so there might be something basic I'm missing.) What I've come across so far from looking around is that I need to have states for the animations to work, but I want to be able to move the model around according to the spawn points if possible. Can someone help me with this, or point me in the right direction? I've looked at Control Rig and Animation Blending, and I've tried to just get the bone nodes in Blueprints and animate them there, but so far I'm not sure how I should approach this. Thanks! And if there's anything I can elaborate on, let me know!
  9. There is a bug in a clean install of the UE4 plugin. I get this error non-stop: "WVRSimulator: Error: Failed to load Simulator library." This prevents me from packaging or launching the game. @Tony PH Lin @Cotta