Showing results for tags 'unreal engine 4'.
Found 18 results

  1. Application: Academic Research
     Goals:
     [X] Install SDK
     [ ] Get Eye Gaze
     [ ] Get Fixation
     [ ] Get Pupil Dilation
     [ ] Run Subjects & Get Tenure
     Question: How do I use the SDK's framework/API to extract near-real-time eye tracking data and print it to a data frame or CSV file?
     Hi there, I've been able to get VR and eye tracking set up and working (per a previous thread) with UE 4.25.3, and I have a functional VR demo. The only thing is that I'm not sure where to go from here to get the SDK to write data to a CSV or similar file. At present I don't need any eye-tracking interactions within the VR environment, so the dartboards and mannequin head are useful only in that they show the data exists; I need to get that data out of whatever loop it is in and write it to a data file for processing in statistical software. @MariosBikos_HTC , you've been a great help so far. Let me know if you or another HTC fellow are the right people to ask about this. Once I get something functional I'll definitely share it for future folks in my position.
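One possible approach, sketched below as a starting point rather than an official SRanipal recipe, is to poll GetVerboseData() from a ticking Actor and append one CSV row per frame. The class name, file path and field selection here are illustrative assumptions, and the module's Build.cs needs "SRanipal" added to its dependencies (as noted in the foveated-rendering write-up further down this page).

// EyeDataLogger.h -- hypothetical logging actor (names are illustrative, not part of the SDK)
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"
#include "HAL/FileManager.h"
#include "SRanipal_Eye.h"   // SRanipal plugin header providing SRanipal_Eye::Instance() (path assumed)
#include "EyeDataLogger.generated.h"

UCLASS()
class AEyeDataLogger : public AActor
{
    GENERATED_BODY()

public:
    AEyeDataLogger() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Ask SRanipal for the latest eye data (same call used in the foveated-rendering article below).
        ViveSR::anipal::Eye::VerboseData Data;
        if (!SRanipal_Eye::Instance()->GetVerboseData(Data))
        {
            return; // no valid sample this frame
        }

        // Build one CSV row: timestamp, per-eye normalized gaze direction and pupil diameter.
        const FVector L = Data.left.gaze_direction_normalized;
        const FVector R = Data.right.gaze_direction_normalized;
        const FString Row = FString::Printf(
            TEXT("%f,%f,%f,%f,%f,%f,%f,%f,%f\n"),
            GetWorld()->GetTimeSeconds(),
            L.X, L.Y, L.Z, R.X, R.Y, R.Z,
            Data.left.pupil_diameter_mm, Data.right.pupil_diameter_mm);

        // Append the row to a CSV file in the project's Saved directory.
        const FString Path = FPaths::ProjectSavedDir() / TEXT("EyeData.csv");
        FFileHelper::SaveStringToFile(Row, *Path,
            FFileHelper::EEncodingOptions::AutoDetect,
            &IFileManager::Get(), FILEWRITE_Append);
    }
};

Place one of these actors in the level (so that it ticks) and the resulting EyeData.csv can be read straight into R, Matlab or similar statistical tools.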
  2. Problem: I'm having some difficulty getting the Unreal Engine plugin installed for the Vive Pro Eye headset. Following the directions in the documentation leads to compile errors when I go to launch the project. In short, I can't get the plugin to compile or the project to load correctly.
     Done:
     [X] Installed and running SteamVR
     [X] Installed and running SR_Runtime
     [X] Calibrated the eye tracker in SteamVR
     Error trigger: pasted the unzipped Unreal plugin into a blank C++ UE4 project. Returned message from Unreal:
     Result of manual compilation via project.sln (Menu -> Build -> Build Solution):
     I'm also trying to follow the directions in part 3 of:
     For context: I'm a researcher at the University of Michigan and have a series of studies I would like to gather eye tracking data from (especially blink rate and pupil dilation). We are using a fully developed VR environment in UE4 to give several treatments to participants.
  3. In this post, I am going to show you how to integrate Variable Rate Shading (VRS) with your Unreal Engine project in order to enable Foveated Rendering using the HTC Vive Pro Eye headset. This article focuses on Unreal Engine; if you are using Unity instead, you can use the Vive Foveated Rendering plugin from the Unity Asset Store or the Github page. It is assumed that you're somewhat familiar with Unreal Engine, C++ and Blueprints.

Requirements
- HTC Vive Pro Eye headset
- VR Ready Quadro (Quadro Desktop: Quadro 4000 and higher, Quadro Notebook: Quadro 3000 and higher) or VR Ready GeForce Turing-based GPU (GeForce RTX 20 series or GTX 1660 / GTX 1660 Ti)
- NVIDIA driver version 430.86 or later
- OS: Microsoft Windows

This post is divided into 4 main parts:
Part 1 – Foveated Rendering & NVIDIA Variable Rate Shading (VRS)
Part 2 – Setting up Unreal Engine Source Code from the Vive Github Repository
Part 3 – Setting up the SRanipal Plugin
Part 4 – Combining everything for Foveated Rendering results
So, let's get started!

Part 1 – Foveated Rendering & NVIDIA Variable Rate Shading (VRS)
Allow me to begin by stressing the importance of Foveated Rendering and how beneficial it can be for VR applications. Creative directors and technical artists always try to raise the bar of visual fidelity in immersive VR applications, but engineering teams often hit the bottleneck of achieving the required minimum of 90 FPS as new headsets are released with better and better display resolutions. From time to time, new smart rendering techniques are invented that allow us to optimize this process without having to drop the quality. This is where foveated rendering comes into play.
If you're not already familiar with the term, Foveated Rendering is a rendering technique that can be used together with VR headsets that support eye tracking, such as the HTC Vive Pro Eye, to reduce the rendering workload, thus allowing developers to improve the performance of their VR applications or push the visual quality of their content. The term "Foveated" derives from the word fovea, the part of the eye's retina that gives us sharp central vision (foveal vision) and lets us focus on the visual details of what is important in a narrow region right in front of us. The foveal and para-foveal areas cover a field of vision of about 5 degrees. Anything outside the fovea region belongs to the peripheral vision; although it is also useful, it can only detect fast-moving content changes and color, which is why it feels comparatively less detailed and blurry. The Foveated Rendering technique exploits this anatomy of the human eye: computers can drop the quality of graphics in the peripheral vision by lowering the resolution there, while focusing all of the available processing power on the smaller foveated area of the image.
(Video: VRS_01_v008_white_logo.mp4)
Dynamic (eye-tracked) Foveated Rendering should not be confused with a variant called Fixed Foveated Rendering. The latter assumes a forward viewing direction and limits the rendering cost of display areas that will not be clearly visible in the headset, mainly in the lens distortion region. Although the concept of Foveated Rendering has been around for a while, the new NVIDIA Turing architecture for GPUs enabled a rendering technique called Variable Rate Shading (VRS) that makes this feature practical.
This technique offers the ability to vary the shading rate within a frame and can be combined with eye-tracking data to perform Foveated Rendering in VR experiences, which is why you need a GPU based on the Turing architecture. Graphics cards have a component called the pixel shader (or fragment shader) that determines the visual characteristics of a single pixel in a virtual environment, such as its color, brightness and contrast. Instead of executing the pixel shader once per pixel, with VRS the GPU can not only apply a single pixel shader operation to a whole 16×16-pixel region but also dynamically change the shading rate during actual pixel shading in one of 2 ways:
- Coarse Shading: execution of the pixel shader once per multiple raster pixels, copying the same shade to those pixels. Configurations: 1×1, 1×2, 2×1, 2×2, 2×4, 4×2, 4×4
- Supersampling: execution of the pixel shader more than once per single raster pixel. Configurations: 2x, 4x, 8x
For example, in the case of Coarse Shading, we can select a 2×2 coarse shading rate, which indicates that a single shade operation evaluated by one pixel shader invocation will be replicated across a region of 2×2 pixels. So instead of sampling 4 times for each pixel in the 2×2 area, we can sample once and broadcast the result to all 4 pixels. Supersampling, on the other hand, is a method that enables increased sharpness and better anti-aliasing on high-frequency textures, beyond what is possible with simple MSAA: the pixel shader is run at every sample location within a pixel instead of just once per pixel. With VRS, a VR application doesn't need to perform supersampling on the entire render target; a specific region can be selected instead, minimizing the performance impact. A shading rate such as 4x supersampling indicates that the pixel shader will be invoked up to 4 times, evaluating up to 4 unique shades for the samples within 1 pixel. This is really useful and provides a smoother immersive VR experience for users, especially when, for example, they need to read text in VR. This fine level of control enables developers to combine varied shading rates and gaze-tracking capabilities for foveated rendering.
Besides Foveated Rendering, NVIDIA VRS supports the following 2 techniques:
- Content Adaptive Shading, which takes into account factors like spatial and temporal color coherence and object variation every frame to lower the shading rate in successive frames, e.g. for areas where detail remains unchanged from frame to frame, such as sky boxes and walls.
- Lens-Optimized VRS, which can be used to render efficiently to a surface that closely approximates the lens-corrected image, eliminating the need to render pixels that would otherwise be discarded before output to the headset, exploiting the fact that lenses are designed to be sharper at the center and blurrier around the periphery.
It is worth mentioning that all 3 techniques can be used in unison for ultimate customization, and there is no limit on how much supersampling and coarse shading can be done per frame by the developers.

Part 2 – Setting up Unreal Engine Source Code from the Vive Github Repository
Usually, engineering teams that use Unreal Engine for VR development have to make a choice depending on the VR project they will work on and the flexibility they need. They can use:
- a binary version of Unreal Engine provided by Epic Games via the Epic Games Launcher. In that case, the Engine can't be modified and only binary plugins that are verified to work with that version of Unreal Engine can be used.
This is usually the case if you want to quickly build a demo using technologies/features that are already supported out of the box.
- the source code of a specific version or branch of Unreal Engine. The engineering team can introduce entirely new features in their custom build of the Unreal Engine source code to reach their goals for a project. For example, if Unreal doesn't support a feature, the engineers can modify the engine source code to introduce it. Or if they find a bug in Unreal Engine that has not been fixed yet, they can fix it themselves without waiting for Epic Games to fix it.
Most engineering teams that work on projects with new technologies prefer the second option, as it gives them more control over their pipeline. Currently, if you want to integrate VRS and Foveated Rendering in your Unreal Engine project, you need to download and use a custom modified version of the Unreal Engine source code from the HTC Vive Github repository. HTC Vive currently supports 3 different versions of Unreal Engine for Foveated Rendering, so there is a branch for UE 4.21.0, another branch for UE 4.23.1 and one for UE 4.24.2. Don't forget that in order to access the Unreal Engine source code repository, you need to link your Epic Games account to your GitHub account and get authorized by Epic Games.
This modified version of Unreal Engine contains:
- Changes to the Unreal Engine rendering pipeline source code to support VRS for certain render passes.
- The VRS plugin that comes with the NVIDIA API libraries.
We realize that you may want to integrate Foveated Rendering in your project while using a binary/installed version of Unreal Engine or a different custom version of Unreal Engine. HTC Vive is working closely with NVIDIA and Epic Games to make this process easier in the future by integrating the required engine changes into the official Unreal Engine branch, so that only the VRS plugin will be needed without any further modifications. For now, if you or your team already use a custom Unreal Engine source build for versions 4.23 or 4.21 and you don't want to switch to a different engine, you can copy the changes we made in the Unreal Engine rendering pipeline source code and, of course, add the extra VRSPlugin. It's only a few lines of code!
More specifically, for Unreal version 4.23 we had to make changes to the following files:
- Engine/Source/Runtime/Renderer/Private/ScenePrivate.h
- Engine/Source/Runtime/Renderer/Private/PostProcess/PostProcessing.cpp
- Engine/Source/Runtime/Engine/Public/SceneViewExtension.h
For UE 4.24, on the other hand, we modified the following files, as we had to call VRS functions from the RHI thread and not the rendering thread:
- Engine/Source/Runtime/Renderer/Private/ScreenSpaceRayTracing.cpp
- Engine/Source/Runtime/Renderer/Private/ScenePrivate.h
- Engine/Source/Runtime/Renderer/Private/PostProcess/PostProcessUpscale.cpp
- Engine/Source/Runtime/Renderer/Private/PostProcess/PostProcessMaterial.cpp
- Engine/Source/Runtime/Renderer/Private/DeferredShadingRenderer.cpp
- Engine/Source/Runtime/Engine/Public/SceneViewExtension.h
You can choose to download the source code branch as a .zip file or use GitHub Desktop to clone the repository locally. We recommend using Git to clone the repository, since this way you can quickly fetch the latest updates and bug fixes and update your local copy every time the HTC Vive engineers push a change.
Here is the list of steps required after you download the custom Unreal Engine source code in order to get the engine up and running:
1. If you downloaded a zip archive of the code, extract the archive contents to a directory where you would like to build the engine, e.g. C:\Unreal\4.23-vive. Make sure you use a short path, otherwise you may get errors in the next step with files that exceed the Windows path length limit. Alternatively, you can map your install directory as a Windows drive to reduce the path length.
2. Navigate to the engine directory and run Setup.bat. You may need to Run as Administrator. The Setup.bat script will start downloading the required 3rd party dependencies for Unreal Engine and your system. This may take some time, so check the window regularly for any warnings/issues.
3. The next step is to generate the Visual Studio solution, so run GenerateProjectFiles.bat. By default, GenerateProjectFiles.bat creates project files for Visual Studio 2017.
4. Launch UE4.sln to open the project solution in Visual Studio.
5. In the menu bar, select Build > Configuration Manager. Verify that Active solution configuration is set to Development Editor and that Active solution platform is set to Win64.
6. In the Solution Explorer, right-click UE4 under Engine and select Set as Startup Project. This makes sure we build the engine and not one of the other programs that come with Unreal.
7. Now, right-click UE4 under Engine again and, in the context menu, select Debug > Start New Instance to launch the engine.
8. Once Unreal Engine is launched, you can select the project you would like to open or specify a new project. If you are creating a new project, don't forget to specify a project name. **There is currently an issue where, if you select a Blank Blueprint or Basic Code C++ project, everything is black unless the view mode is set to "Unlit", so select any of the other template projects instead.
If you already have an Unreal project built with a different version of Unreal Engine and you just want to swap engine versions, you can open the project using the Unreal Engine version that supports Foveated Rendering. To do that, find the Unreal Engine project file, right-click on it and switch the Unreal Engine version used for that project to the new version of the engine we installed (browse to the installed path). That's all; you should now be able to use the custom Unreal Engine provided by HTC Vive with your project.

Part 3 – Setting up SRanipal Plugin
In order to optimize the rendering quality to match the user's gaze, we need to feed eye-tracking gaze data to VRS. To get access to the data provided by the eye-tracking capabilities of the HTC Vive Pro Eye, you need to use the Vive SRanipal SDK. The SRanipal SDK allows developers to track and integrate users' eye and lip movements, empowering them to read intention, model facial expressions, create new interactions and experiences, and improve existing ones by exploiting the precise tracking of eye movement, attention and focus. The SRanipal SDK includes:
- the required runtime (SR_Runtime), which runs in the notification tray and shows the current eye-tracking status for the HTC VIVE Pro Eye.
- plugins for Unreal/Unity, along with sample code for native C development.
Follow the list of steps below in order to integrate the SRanipal plugin in your project:
1. Visit the Vive SRanipal SDK page at the Vive Developer website, read the guidelines and start the download procedure. If you don't have an HTC Developer account you will be asked to register and create one to be able to download the SRanipal SDK.
2. Download the Vive_SRanipalInstaller, which contains the required runtime, and follow the instructions to install SR_Runtime.
3. Once installed, ensure that your Vive Pro Eye headset is connected to your PC and launch SR_Runtime.exe as an Administrator to start the runtime. Wait until you notice the SRanipal status icon, which looks like a robot, appearing in the Windows notification tray. This icon's eyes turn orange if a headset with eye-tracking capabilities is detected but the device is in idle mode, and green if a program is retrieving eye data from it.
4. Start SteamVR (if it is not running already).
5. Put on your HMD.
6. The next step is to go through the eye calibration process. This is required for each new user and the calibration results are saved to the PC. To start eye calibration, press your VIVE controller's system button; the calibration program will show an overlay window on your HMD. Select the Vive Pro button at the bottom grid of the SteamVR menu. Note that for the highest level of precision, it is recommended to recalibrate for different users, as eye positions and pupillary distances differ for each individual.
7. You will need to read and accept the user agreement before turning on the Vive eye-tracking capabilities.
8. Press Calibrate to start. It will begin by adjusting your HMD position. *If you have any issues with the calibration process and you get an "Initialization Failed" error message, please follow the steps described in this troubleshooting guide.
9. Next, you need to adjust your IPD value, as shown below.
10. After that, you will be guided through gaze calibration. Look at the blue dot shown sequentially at the center, right, left, upper and lower parts of the panel until the calibration has succeeded, moving only your eyes and not your head.
11. Calibration is done; you are ready to develop eye-aware applications. You can now try a mini-demo that lights up the dots with your eye gaze, or press the Vive controller's system button to quit eye calibration.
12. Download the Vive SRanipal SDK .zip file. It contains the required plugins for each platform as well as documents with detailed instructions.
13. You will need to install the SRanipal Unreal Engine plugin in your Unreal project. To do that, copy the folder located at SRanipal SDK\v1.1.0.1\SRanipal_SDK_1.1.0.1\03_Unreal\Vive-SRanipal-Unreal-Plugin\SRanipal into the Plugins folder of your project. If you don't have a Plugins folder in your project structure, create a new one and name it "Plugins". Then generate Visual Studio project files again for your project. If you get a compilation issue about the Dartboard.h file that cannot be included, please make the following change in the SRanipal.Build.cs file:
14. Open the Unreal project, go to Edit->Plugins and you will see the SRanipal plugin under the Project plugins. Make sure it is enabled. Now you can use the SRanipal SDK features for eye tracking with your HTC Vive Pro Eye.

Part 4 – Combining everything for Foveated Rendering results
At this point, the last thing we need to do is tie everything together. We will combine the SRanipal plugin and the VRS plugin to enable Foveated Rendering by feeding eye gaze data to the VRS algorithm each frame.
In Part 2 above, we used the custom Unreal Engine source code that comes with the VRS plugin. You can locate the VRS plugin folder at UnrealEngine\Engine\Plugins\Runtime\ViveSoftware\VRSPlugin. You will notice that it contains the required NVIDIA VRS APIs as well as code to easily modify VRS settings in your application. Also, if you open your Unreal Engine project and go to Edit -> Plugins, you will find the Variable Rate Shading plugin under the Rendering category of the Unreal Engine plugins. Again, make sure it is enabled. In order to feed gaze data to VRS and enable Foveated Rendering, developers need to go through the following list of steps:
1. Create a new function that returns the normalized eye gaze directions using SRanipal's GetVerboseData() method and expose it to Blueprints. One way to do this is to add a new function to the list of methods found in the header file SRanipal_FunctionLibrary_Eye.h, located in Plugins\SRanipal\Source\SRanipal\Public\Eye. Of course, you can always create a new static method anywhere else in your project as long as the module uses the SRanipal module dependency. Go ahead and type in the following code for the new function GetEyeGazeDirectionsNormalized. This function returns the normalized eye gaze directions for each eye.

UFUNCTION(BlueprintCallable, Category = "SRanipal|Eye")
static bool GetEyeGazeDirectionsNormalized(FVector& LeftEyeGazeDirectionNormalized, FVector& RightEyeGazeDirectionNormalized);

For the implementation of the function, type the following code into the source file SRanipal_FunctionLibrary_Eye.cpp:

bool USRanipal_FunctionLibrary_Eye::GetEyeGazeDirectionsNormalized(FVector& LeftEyeGazeDirectionNormalized, FVector& RightEyeGazeDirectionNormalized)
{
    LeftEyeGazeDirectionNormalized = FVector();
    RightEyeGazeDirectionNormalized = FVector();

    ViveSR::anipal::Eye::VerboseData VerboseDataOut;
    bool Result = SRanipal_Eye::Instance()->GetVerboseData(VerboseDataOut);

    LeftEyeGazeDirectionNormalized = VerboseDataOut.left.gaze_direction_normalized;
    RightEyeGazeDirectionNormalized = VerboseDataOut.right.gaze_direction_normalized;
    return Result;
}

2. Feed the normalized eye gaze directions to the VRS plugin's UpdateStereoGazeDataToFoveatedRendering function. This way, the center of the foveated region for every frame will be calculated by the VRS plugin automatically. You will need to connect the function GetEyeGazeDirectionsNormalized from the previous step to the function UpdateStereoGazeDataToFoveatedRendering from the VRS plugin during Event Tick. You can do this by creating a new Blueprint Actor and adding the following Blueprint nodes in its Event Graph. Don't forget to place the actor in your scene so that it can tick. Of course, you don't necessarily need an Actor for this; you could use the same Blueprint nodes in your GameMode or your GameInstance. (A C++ alternative for this step is sketched after the settings list below.)
3. Finally, you need to drag and drop an Actor of type SRanipal_Eye_Framework into your scene and enable eye tracking (tick the box) on it. Only then can you get valid eye gaze directions.
4. If the VRS plugin was successfully enabled, VRS Settings will appear under the Project Settings -> Plugins category. Below you will find a list that describes what each setting can be used for. For more information, you should read this detailed article from NVIDIA.
Enable VRS: Whether to enable Variable-Rate Shading or not.
Enable Eye Tracking: If an eye-tracking device is properly set up with its eye-tracker plugin enabled, checking this will automatically fetch eye-tracking data and move the center of the foveated region according to it.
ScreenPercentage: In Unreal Engine, the VRSPlugin requires r.ScreenPercentage to be larger than 100 to trigger the UpScale render pass so that VRS can take effect; that's why it is set to 101 by default.
Foveation Pattern Preset: This property adjusts the region size of foveated rendering. There are several pre-defined region sizes; the smaller the region, the more aggressive the foveation. If you pick one of the predefined options, the Foveation Pattern Preset Detail section will auto-fill the values for you. If you pick the Custom option, you can customize the internal parameters to perform foveated rendering exactly the way you want. More specifically, the region size (horizontal and vertical radius) for each of the available regions (Innermost, Middle, Peripheral) can be customized from a value of 0 to 10. When the radius is set to 1.0 for both the width and height, the region spans the width and height of the window, forming a circle fitting inside the render target (this also depends on the aspect ratio of the render target). If you pick a value greater than 1, the region will span outside the render target area. The available options are: Narrow, Balanced, Wide, Custom.
Shading Rate Preset: This property adjusts the shading rate of each region automatically according to a pre-defined rate, without having to specify any numerical parameters. The available options are Highest Performance / High Performance / Balanced / High Quality / Highest Quality. There is also a "Custom" option that allows you to manually define the shading rate for each region. The built-in shading rate preset details can be customized to:
- Culled: nothing will be rendered (black).
- Normal Shading: each pixel is sampled once.
- Supersampling (2x, 4x, 8x, 16x): each pixel is sampled more than once, which results in less aliasing than normal rendering (but takes more computing power).
- Coarse Shading (sample once per 2×1, 1×2, 2×2, 4×2, 2×4, 4×4 pixels): a group of pixels is sampled only once, which results in a performance gain but with fewer details rendered.
All parameters in VRS Settings in Project Settings can be changed dynamically during a VR session or a PIE session. This means that developers can also dynamically change the presets according to different game-dependent factors. Please note that the VRS configuration modified in Project Settings will not be carried over to the packaged game. Therefore, a set of Blueprint functions is provided by the VRS plugin to alter them at runtime. Developers should call these functions at the initialization phase. The boolean return value indicates whether the setter function execution was successful or not. Generally, the VRSPlugin will fall back to Fixed Foveated Rendering if there is any problem with eye tracking (e.g. invalid gaze data, device not connected, tracker module not loaded/implemented, etc.).
You should now be able to apply foveated rendering to your VR application and adjust shading rate and region size for either better performance or better quality according to your project requirements.
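For step 2 above, if you prefer C++ over Blueprints, a minimal sketch of the per-frame hookup could look like the following. It assumes the VRS plugin exposes UpdateStereoGazeDataToFoveatedRendering as a static, Blueprint-callable function taking the two normalized gaze directions; the UVRSFunctionLibrary class name and header used here are illustrative, so check the plugin's function library for the exact names and signature.

// FoveationDriver.h -- hypothetical actor that feeds gaze data to VRS every frame
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SRanipal_FunctionLibrary_Eye.h"   // for GetEyeGazeDirectionsNormalized (added in step 1)
#include "VRSFunctionLibrary.h"             // illustrative header name for the VRS plugin's function library
#include "FoveationDriver.generated.h"

UCLASS()
class AFoveationDriver : public AActor
{
    GENERATED_BODY()

public:
    AFoveationDriver() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        FVector LeftGaze, RightGaze;
        // Query SRanipal via the helper added in step 1.
        if (USRanipal_FunctionLibrary_Eye::GetEyeGazeDirectionsNormalized(LeftGaze, RightGaze))
        {
            // Hand the gaze directions to the VRS plugin so it can recenter the foveated region.
            // The class name and exact signature are assumptions; adapt to your VRSPlugin version.
            UVRSFunctionLibrary::UpdateStereoGazeDataToFoveatedRendering(LeftGaze, RightGaze);
        }
    }
};

As with the Blueprint version, the actor has to be placed in the scene (and the SRanipal_Eye_Framework actor enabled) for the gaze data to be valid.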
Currently, there isn't a way to visualize each region individually, e.g. with a different color, but if you want to test that Foveated Rendering is indeed working, you can use "Culled" for the Peripheral and Middle regions so that only the Innermost region is rendered.

Performance Gains & Known Issues
Using the Balanced option for both the Foveation Pattern and Shading Rate presets should give you the best trade-off between visual quality and performance. It performs 4x supersampling in the foveal region, 1x in the middle and 2×2 coarse shading at the periphery, while keeping the foveal region just large enough that the periphery is outside the instantaneous field of view for most users. In general, the performance gain from VRS ranges from 20% to 38% for static scenes; VRS gains less performance when the scene is full of dynamic objects like skeletal meshes, particle emitters and post-processing effects. We tested the performance improvement using some of the scenes included in the Unreal Engine 4 sample scene packs.
SunTemple (Link): This scene was designed to showcase mobile features. This single level contains detailed PBR materials useful for a variety of graphics features and techniques and comes with many static and opaque actors as well as many post-process effects.
- Frame time improved from 11.7ms to 9ms (~23%) with the Narrow/Highest Performance presets
- Frame time improved from 11.7ms to 9.8ms (~16.2%) with the Balanced/Balanced presets
(Screenshots: Default Scene View (no VRS), VRS Enabled (Balanced/Balanced), VRS Enabled (Narrow/Highest Performance))
RealisticRoom (Link): This scene shows off the realistic rendering possibilities of Unreal Engine 4 for architectural rendering. It utilizes physically-based materials, pre-calculated bounce light via Lightmass, stationary lights using IES profiles (photometric lights), post-processing and reflections.
- Frame time improved from 10.5ms to 7.8ms (~25%) with the Narrow/Highest Performance presets
- Frame time improved from 10.5ms to 8.8ms (~19%) with the Balanced/Balanced presets
(Screenshots: No VRS, VRS Enabled (Balanced/Balanced), VRS Enabled (Narrow/Highest Performance))
Virtual Studio (Link): The Virtual Studio scene showcases Unreal Engine's ability to integrate with professional-quality video cards and contains high-quality screen-space reflections, which benefit a lot from VRS.
- Frame time improved from 38.2ms to 20.8ms (~45.5%) with the Narrow/Highest Performance presets
- Frame time improved from 38.2ms to 26.6ms (~30.3%) with the Balanced/Balanced presets
(Screenshots: No VRS, VRS Enabled (Balanced/Balanced), VRS Enabled (Narrow/Highest Performance))
Of course, VRS Foveated Rendering is not without its own shortcomings. One of the most noticeable artifacts is magnified aliasing in the peripheral region. This artifact is more obvious for thin or glossy objects in the scene. To ease this kind of artifact, our recommendations are:
- Use temporal anti-aliasing. Developers can choose the AA method under Project Settings > Rendering > Default Settings > Anti-aliasing Method.
- Tweak the region parameters (size and resolution) based on the content. For scenes with a lot of text or glossy materials, developers should use a less aggressive setting.
- (UE 4.24 version) Some objects may flicker when VRS is enabled. Unchecking "Occlusion Culling" in Project Settings > Rendering can fix it.
Other known issues so far:
- Keep in mind that Bloom will not take effect unless you use the UE 4.24 version.
- RenderDoc may cause a conflict with the VRS plugin, so please disable the RenderDoc plugin and try again.
- (UE 4.24 version) Typing a console command may crash the editor.
- (UE 4.24 version) The editor viewport is affected by VRS.
We currently support Unreal Engine 4.21.0, 4.23.1 and 4.24.2. We injected some VRS function calls into the UE rendering pipeline so that VRS happens only for certain render passes. The UpScale pass was introduced in UE 4.19, so the minimum version that could be supported is UE 4.19. If you want to learn more about the VRS plugin code base, you should read this article from NVIDIA that describes the VRS Wrapper APIs.

References
- What Is Variable Rate Shading? A Basic Definition of VRS
- NVIDIA Says New Foveated Rendering Technique is More Efficient, Virtually Unnoticeable
- VRWorks Variable Rate Shading (VRS)
- Turing Variable Rate Shading in VRWorks
- Easy VRS Integration with Eye Tracking
- Microsoft – Variable Rate Shading
- Realistic Virtual Vision with Dynamic Foveated Rendering - Tobii
  4. Hi there, I am working on an application that uses the SRWorks pass-through mode with a Vive Pro in UE4. Until yesterday everything worked fine; however, the cameras suddenly stopped sending out images. I can't even run the Experience_Unreal sample or create a blank project - all I see are the default textures. I've attached a screenshot of the VR preview, which mirrors the right eye. On the left eye, I see the default brick wall texture. Also, when I turn my head, I can see that those textures are simply put on a plane, which is not following my head (as expected). I have already tried a number of possible solutions, none of which worked, including:
     - Repairing SRWorks
     - Uninstalling and reinstalling SRWorks
     - Reinstalling SteamVR
     - Unplugging the HMD and plugging it into another USB port
     - Rebooting
     - Creating a blank UE project in UE 4.18
     - Creating a blank UE project in UE 4.25
     - Reinstalling Unreal Engine
     - Creating a fresh copy of the Experience_Unreal sample in UE 4.18
     - Unplugging the base stations and re-running the samples (to disable head tracking)
     - ...
     I am now running out of ideas. Using SteamVR to test the cameras never worked for me and always shows a "Communication Error"; however, this was also the case when everything still worked. I also can't tell what could have changed to cause this problem in the first place. I have attached my SR_logs directory. Hope you can help; I'm still searching for a solution. Thanks in advance, Carsten. SR_logs.zip @Daniel_Y, @Marios Bikos
  5. Hi all, We're releasing a demo for Rigel, our "All in One" full body motion capture solution for body, fingers and face, so that potential customers can test the solution and evaluate it accordingly. Here you can download the executable demo: Rigel Demo. Since the introduction of the SteamVR Input plugin, there have been some changes related to the Vive Tracker setup, so we made a video explaining what needs to be done to make everything work during Rigel's calibration: Rigel Demo Guidelines. User feedback is very important at this stage, so once you've tested Rigel, we ask you to spend a couple of minutes on this survey: Rigel Demo Survey. Looking forward to receiving valuable feedback! For any questions regarding Rigel, feel free to contact esposito.n@enter-reality.it Regards, Nicolas Esposito www.enter-reality.it
  6. Hi all, In this video you can see the realtime motion capture data smoothing that allows the user to choose the degree of smoothing while recording motion capture. This feature is intended to smooth out data while recording fast movements (like fighting and fast gestures), so that the animation curves will require less cleanup after the mocap has been recorded. Rigel - Features Highlight - Realtime Motion Capture Data Smoothing
  7. Hi all, In this video we're showing some of the advanced features of Rigel, our All in One full body motion capture solution using Vive Trackers. We show how fast and easy the calibration process is, and how Rigel is able to retarget the animation data from the Vive Trackers in real time to characters with different body sizes: Rigel - Features Highlight. The full body motion capture solution is set for release at the end of June, and a demo will be available shortly. If you want to read more about the entire setup, here is a FAQ page. Regards, Nicolas Esposito
  8. The Wave SDK for Unreal Engine is now also available on Github: https://github.com/ViveSoftware/VIVE-Wave-SDK-Unreal
     INTRODUCTION
     So far, the Wave SDK for developers using Unreal Engine has been available only on the Vive Developer Website. We decided to release the WaveVR plugin for Unreal Engine as a public Github repository. This allows developers to report bugs or suggest enhancements using Github Issues, giving us feedback from the developer community. Developers can also create Pull Requests to suggest bug fixes. The Vive team will review the pull requests and follow up with the developers that created them, but for now the actual merge will take place internally and not directly on GitHub; the repository will then be updated to include those bug fixes.
     STRUCTURE
     The Github repository contains a different branch for each version of Unreal Engine, so developers don't need to pick a specific Wave version, only the version of Unreal Engine that they are using. This should make the process of integrating the WaveVR plugin into Unreal Engine more intuitive. The repository comes with a full Unreal Engine sample project (plugin.uproject), and the WaveVR plugin is already pre-installed in its Plugins folder. This means that if you want to use the WaveVR plugin in your own project, you can simply copy the WaveVR folder from the sample's Plugins folder into your own project's Plugins folder. Only official version releases are pushed to GitHub; developers will still be able to access older Wave versions using tags & releases.
  9. UPDATE: The issue has been addressed with Beta 1.0.12.2 released. Here is how to change to the BETA stream.
     --------------
     There is currently an issue where Cosmos Elite users will get a crash when running applications that were built using Unreal Engine 4.24 or 4.25. We are tracking the issue and will report back whenever there is an official fix for it. However, you can apply a temporary workaround for the issue as reported here: https://answers.unrealengine.com/questions/953996/does-the-unreal-engine-support-the-cosmos-elite.html The workaround is quite simple, but it can't be considered a final fix because it may impact performance for low-end GPU users.
     For game/app end users: Go to %USERPROFILE%\AppData\Local\{ApplicationName}\Saved\Config\WindowsNoEditor\ and open (or create if not present) the file Engine.ini. Then add the following lines at the end of the file:
     [/Script/Engine.RendererSettings]
     vr.HiddenAreaMask=false
     For developers: Do the same thing by editing the file DefaultEngine.ini (in the Config folder) and restart Unreal Engine.
 10. Hi all, We recently released an Early Access version of Wave SDK 3.1.94. This version comes with several new experimental features for content developers, and one of them is Direct Preview for Unreal Engine. While creating content for the Vive Focus / Focus Plus, developers need to test and tweak their project to make sure that everything works properly. However, this process is often time-consuming, as developers need to repeatedly build, deploy and install APKs, spending time waiting during the development stage. That's why we introduced the Direct Preview feature, which enables you to skip the building process and directly preview your content on your AIO HMD via Wi-Fi. You can rapidly preview and iterate on your scene using Unreal's VR Preview Play mode while Direct Preview streams the rendered frames to your Vive AIO device. Head pose, controller input, gestures and similar input are sent from the device to the computer. You are able to effectively preview the app without needing to go through a time-consuming build and deploy process. Here is a video showing the steps: DirectPReview_UE4 Video.mp4
     INSTRUCTIONS
     You can find detailed instructions HERE about Direct Preview. Here's what you need to do:
     - Integrate the WaveVR plugin of the Wave SDK into your Unreal Engine project according to the instructions here.
     - Connect the HMD to your PC/laptop with a USB cable and turn on the HMD. Make sure that the proximity sensor of the HMD is always covered to keep the HMD awake (otherwise it will go to sleep).
     - Find the IP of your HMD using the adb command "adb shell ifconfig".
     - Copy and paste the IP into the Wave VR Project Settings. Also make sure that the Connect Type is set to Wi-Fi.
     - Install the Direct Preview APK to your HMD using the Wave VR menu option "Install Device APK". This will automatically install the wvr_plugins_directpreview_agent_unreal.apk that lives in the {YourProjectName}\Plugins\WaveVR\Prebuilt\DirectPreview\deviceAPK folder. Verify that the apk was installed successfully by checking the library of installed apps on your HMD.
     - Start the DPServer using the WaveVR tab menu option "Start DPServer". You will notice a console window opening that shows logs of the server being set up.
     - Now that your server is up and running, we need to connect the HMD to it. To do that, we need to start the apk we installed earlier. Go ahead and run the "Start Device APK" command from the Unreal menu tab. This will automatically start the apk on your HMD (since the HMD is connected to your PC). Once you start the apk on your HMD, you will notice a screen showing the message "Connecting...". You should also see the dpServer console window updating with logs, showing that a client is connected to the server. You can now disconnect the headset from the PC, as the streaming will take place via Wi-Fi.
     - Now start "Play in VR Preview" from the Unreal Engine menu. This will start a preview of your VR application, and if you enabled "Enable Preview Image" in your WaveVR Project Settings, you should see the same result rendered/streamed on your HMD wirelessly.
     That's it! If you now try to move the headset, the VR Preview window will update accordingly. You can always stop the VR Preview, modify your scene layout or work on your project, and then start the VR Preview again to see the updated results quickly on your HMD. This way you don't have to deploy an apk and wait for things to be compiled just to preview a simple change, saving you precious development time.
     ISSUES/NOTES
     - Make sure that your PC/laptop and your HMD are both connected to the same network domain.
     - If what you see on your HMD during Direct Preview looks blurry, you may need to adjust the image size sent to the HMD by changing the Unreal Engine window size in Editor Preferences > Level Editor - Play in Unreal Editor. For example, if your Unreal Editor VR Preview window is too small, the image sent to the HMD will need to be upscaled and this will cause pixelation.
     - At the moment, your Unreal Engine project folder must be on your Windows C drive (and not an external hard drive), otherwise Direct Preview may not work at all (we are working on a fix for this issue).
     - If you notice that only the left or right eye view is rendered on the HMD during Direct Preview, or that neither of them is rendered at all, that's because Direct Preview needs high bandwidth for streaming, otherwise it is possible to lose frames. However, we provide an update frequency option in the WaveVR Project Settings so developers can adjust the FPS according to their bandwidth and reduce it accordingly. Also, restarting both the apk and the dpServer.exe application can help with the loss of rendering in the HMD during Direct Preview. Keep in mind that the FPS option here refers to the number of frames sent from the dpServer (PC) to the APK (HMD).
     - If you can't see the dpServer.exe window after trying to start it from the WaveVR menu option button, you can always start it manually by running {YourProjectName}\WaveVR\Prebuilt\DirectPreview\dpServer.exe. Similarly, if the Direct Preview apk is not installed automatically after clicking on the Install Device APK option, you can always install it manually using the adb install command. The apk lives inside the plugin at {YourProjectName}\Plugins\WaveVR\Prebuilt\DirectPreview\deviceAPK\wvr_plugins_directpreview_agent_unreal.apk.
     - If you start the DPServer and the window opens and closes immediately, there is probably an error. Try to run the .exe manually from cmd, and if you notice that the log is complaining about NVIDIA drivers, try to update to the latest NVIDIA drivers (and restart your PC if you already have the latest drivers).
     - Remember that you only need to start the dpServer and the Direct Preview apk once; they will keep running in the background. Of course, feel free to restart them if you notice that something went wrong.
     Please give it a go and let us know your thoughts in the comments.
 11. Hi, I've been having some issues with SteamVR crashing which seemed to be somehow caused by the Vive wireless app, but nothing was reproducible enough that I thought it worth reporting. Over the last few days, though, I've noticed that it is also causing Unreal Engine to crash. If I either close the wireless app, disconnect the battery from the headset, or the battery runs out, it will immediately cause Unreal Engine to crash with no warnings and nothing written to the Unreal logs. I'm not sure what information I should provide to help work out what the issue is, but if someone lets me know I'd be happy to provide any further details. Thanks. @Jad
 12. Hi, Since the SRanipal documentation is not that great, and it took me a long time to get things up and running, I thought I'd share my solution here to help other people get started. I'm not claiming it's the best solution and it's definitely not the only one, so if there are suggestions to improve it, let me know 🙂.
     What I'm trying to do here is send the location and rotation of the HMD as OSC messages, as well as the eye angles measured by the HTC Vive Pro Eye. These OSC messages can be logged, or even processed in Matlab in real time, which is very useful when doing research, for example if you have to wait for the participants in an experiment to look back to the front before starting the next trial. To get them into Matlab, a tool called LabStreamingLayer (LSL) can be used (https://github.com/sccn/labstreaminglayer/wiki/Tutorial-4-a.-Receive-Data-streams-in-MATLAB). OSC messages can be converted into LSL streams using this tool: https://github.com/gisogrimm/osc2lsl. To check my implementation, I tried putting an object in the measured gaze direction.
     I used the following software:
     - Unreal Engine 4.23.1
     - SRanipal_SDK_1.1.0.1
     - SR_Runtime 1.1.2.0
     - OSC plugin for Unreal Engine 4 Blueprints (https://github.com/monsieurgustav/UE4-OSC)
     To set this up, first create a new C++ Basic Code project in Unreal. Close it, and install the SRanipal and OSC plugins as explained in their documentation (create a 'Plugins' folder in your project folder and copy the plugin folders there); also copy the SRanipal Content folder to the Content folder of your project. Open the project again, check that the two plugins show up and are enabled (Edit --> Plugins), and import the new content. Now duplicate the EyeSample2 level in the SRanipal content and rename it. Open the new level and delete all the unnecessary things, leaving only: Atmospheric Fog, Light Source, PlayerStart, Sky Sphere and SRanipal_Eye_Framework. Now VR has to be added to the project: click Add New --> Add feature or content pack... --> Blueprint Feature (Virtual Reality) --> Add to project. Find the Motion Controller Pawn in the newly added VirtualRealityBP --> Blueprints and make a copy of it; I named this 'VR_controller'. Add the VR_controller to the level and put it at (0,0,0). Click 'Edit VR_controller' in the World Outliner to open the Blueprint. In the Event Graph, add the components shown in the picture (I hope this is readable), then Compile (ignore the warnings) and Save. Don't forget to enter the IP address of the PC you want to send your OSC messages to in the Add Send Osc Target block (I'm sending it to two PCs simultaneously); it should also be sent to the PC you're running Unreal on for things to work later on.
     Now, we'll add an object that will point in the measured gaze direction. To do this, click Add New --> Blueprint Class --> Actor, and name this one 'GazeSteeredObject'. Add it to the level, also at (0,0,0), and click Edit. Add an OscReceiver (Add Component --> OSC Receiver), and also add an object to be placed in the direction of the gaze (I added a sphere). Give it a nice material if you like. The Sphere should be a child of the DefaultSceneRoot. Now edit the Blueprint. First of all, we need to give the object the same location and rotation as the HMD. To do this, add the following to the blueprint:
     Now the GazeSteeredObject should be following the head direction. To put it in the gaze direction, we need to get the eye angle and add this. We'll also send this data as an OSC message.
     Add the components in the blueprint as shown in the picture. Make sure the initial X location of the Sphere is set to 100 (or some other positive non-zero value). The 'Get Gaze Ray' function sets the combined 'Gaze Direction', which is a vector of 3 components. It is not documented what this is exactly, but the Y-value seems to be the horizontal eye angle and the Z-value the vertical eye angle in degrees. The X-value is always close to 100; I don't know what this is. After compiling and saving you can play the level in VR, and the sphere should now move with your gaze direction (of course, calibrate the eye tracker first). The sampling rate is determined by the time value in the 'Set Timer by Event' block. I hope this was useful.
     Finally, I have some questions for the VIVE staff:
     - Is it true that the 'Direction' output of the 'Get Gaze Ray' function gives the combined horizontal eye angle as the Y-value and the combined vertical eye angle as the Z-value, in degrees? And what is the X-value, some confidence value?
     - Using the time value in 'Set Timer by Event' I can determine the sampling rate at which my OSC messages are sent. Values of 100 Hz and 125 Hz seem to work fine for the GazeDirection, and I even tried 200 Hz for the HMD location and rotation. The data points that come out do have the set sampling rate, and it's not sending the same value multiple times, so this looks good. But what is the maximum sampling rate that the sensors can deliver, and what happens when I set my sampling rate higher? Are these the actual measured data points or are the values extrapolated somehow?
     Best, Maartje @Daniel_Y @zzy
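If you would rather drive a gaze-steered object from C++ than Blueprints, a rough sketch of the same idea is below. It is only an illustration under assumptions: it uses the combined left/right gaze directions from SRanipal's GetVerboseData() (as in the foveated-rendering article above) instead of Get Gaze Ray, the actor and header names are made up, the 100-unit offset mirrors the sphere's X location from the Blueprint version, and the SDK's gaze vector may need an axis flip to match Unreal's coordinate conventions.

// GazeSteeredObject.h -- hypothetical C++ equivalent of the Blueprint actor described above
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "SRanipal_Eye.h"   // SRanipal plugin header providing SRanipal_Eye::Instance() (path assumed)
#include "GazeSteeredObject.generated.h"

UCLASS()
class AGazeSteeredObject : public AActor
{
    GENERATED_BODY()

public:
    AGazeSteeredObject() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Pose of the HMD relative to the tracking origin (the pawn sits at (0,0,0) as in the post,
        // so this approximates world space here).
        FRotator HmdRotation;
        FVector HmdPosition;
        UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HmdRotation, HmdPosition);

        // Combined gaze direction in eye/HMD space, averaged from both eyes.
        ViveSR::anipal::Eye::VerboseData Data;
        if (!SRanipal_Eye::Instance()->GetVerboseData(Data))
        {
            return;
        }
        const FVector GazeLocal =
            (Data.left.gaze_direction_normalized + Data.right.gaze_direction_normalized).GetSafeNormal();

        // Rotate the gaze into the HMD's frame and place this actor 100 units along it,
        // matching the sphere's X offset in the Blueprint setup.
        const FVector GazeWorld = HmdRotation.RotateVector(GazeLocal);
        SetActorLocation(HmdPosition + GazeWorld * 100.0f);
    }
};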
 13. Hi, Is there a way to start the eye calibration without using the controller and having to go through the menu? I'm planning to use it in an experiment and want to start it directly (ideally from Unreal 4.23, because that's what I'm using) without having to tell my participants which buttons to click all the time. In the SRanipal Unreal SDK document the function 'LauchEyeCalibration' is listed, which sounds promising, but it does not show up in the list of available functions in the Unreal Blueprint editor. How do I run this function, then? Thanks @Corvus @Daniel_Y @zzy
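One pattern worth trying is the same one used for GetEyeGazeDirectionsNormalized in the foveated-rendering write-up above: wrap the native call in your own Blueprint-callable function. The sketch below assumes the plugin's SRanipal_Eye interface forwards a LaunchEyeCalibration call to the runtime; the exact method name and parameters are an assumption, so check SRanipal_Eye.h for the real signature before using it.

// In a function library header (following the pattern from the foveated-rendering article):
UFUNCTION(BlueprintCallable, Category = "SRanipal|Eye")
static bool StartEyeCalibration();

// In the corresponding .cpp (illustrative implementation, not confirmed SDK API):
bool USRanipal_FunctionLibrary_Eye::StartEyeCalibration()
{
    // Hypothetical: assumes the plugin's eye interface exposes the SDK's LaunchEyeCalibration.
    return SRanipal_Eye::Instance()->LaunchEyeCalibration(nullptr);
}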
 14. I would like to check something about GetValidity() in the SRanipal Unreal SDK. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime 1.1.2.0, SRanipalSDK 1.1.0.1.) I have checked the source code related to GetValidity() in the SDK (see TEXT1 below) and it seems that GetValidity() is incorrect. If the validity passed to GetValidity() is SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY and its flag is set in eye_data_validata_bit_mask, the function should return TRUE; however, it returns FALSE, because SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY is defined as 0 in the SingleEyeDataValidity enum (TEXT1). Therefore, the source code should be corrected as shown in GetValidity() in TEXT2, or SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY should not be 0. What do you think?

     TEXT1 (this is SRanipal_Eyes_Enums.h):
     enum SingleEyeDataValidity {
         SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY,
         SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY,
         SINGLE_EYE_DATA_PUPIL_DIAMETER_VALIDITY,
         SINGLE_EYE_DATA_EYE_OPENNESS_VALIDITY,
         SINGLE_EYE_DATA_PUPIL_POSITION_IN_SENSOR_AREA_VALIDITY
     };

     typedef struct SingleEyeData {
         /** The bits containing all validity for this frame.*/
         uint64_t eye_data_validata_bit_mask;
         bool GetValidity(SingleEyeDataValidity validity) {
             return (eye_data_validata_bit_mask & (uint64)validity) > 0;
         }
     } SingleEyeData;

     TEXT2:
     bool GetValidity(SingleEyeDataValidity validity) {
         return (eye_data_validata_bit_mask & ((uint64)1 << (uint64)validity)) > 0;
     }

     Thank you. @Daniel_Y @zzy
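To make the reported behaviour concrete, here is a tiny self-contained reproduction (illustrative only, not SDK code): with the original expression the gaze-origin flag can never test true, because the enum value being masked is 0, while the proposed bit-shift tests the intended bit.

#include <cstdint>
#include <cstdio>

// Minimal reproduction of the issue; mirrors the first two enum values from TEXT1.
enum SingleEyeDataValidity { SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY, SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY };

int main() {
    // Suppose only the gaze-origin bit (bit 0) is set in the mask.
    uint64_t mask = 1ull << SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY;   // mask == 1

    // Original SDK expression: mask & 0 -> always false for the gaze-origin flag.
    bool original = (mask & (uint64_t)SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY) > 0;
    // Proposed TEXT2 expression: mask & (1 << 0) -> true, as intended.
    bool fixed = (mask & (1ull << (uint64_t)SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY)) > 0;

    printf("original: %d, fixed: %d\n", original, fixed);  // prints "original: 0, fixed: 1"
    return 0;
}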
 15. Hello. I want to calculate the gaze point in Unreal Engine. I have some questions about EyeData. (I'm using Unreal Engine 4.22.3, SRanipal_Runtime 1.1.2.0, SRanipalSDK 1.1.0.1.)
     1. Shouldn't gaze_origin_mm be used to calculate the gaze point? Why doesn't EyeFocusSample use gaze_origin_mm?
     2. Is eye_data_validata_bit_mask a value whose bits correspond to SingleEyeDataValidity? For example, if only gaze_origin_mm and gaze_direction_normalized are valid, is eye_data_validata_bit_mask ((1<<SINGLE_EYE_DATA_GAZE_ORIGIN_VALIDITY) | (1<<SINGLE_EYE_DATA_GAZE_DIRECTION_VALIDITY))? (SingleEyeData::GetValidity() seems to be wrong.)
     3. The convergence_distance_validity seems to always be FALSE. Is convergence_distance_mm supported yet?
     Thank you. @Corvus @Daniel_Y
 16. Hi, I'm new to this and I'm trying to set up SRanipal in Unreal so that I can get the eye tracking data. I have followed the manual and got to the point where I have to enable the plugin, but it does not show up in the list of plugins in Unreal. Any ideas what could be wrong? I tried copying the unzipped 'Plugins' folder into the project folder, and also tried copying the 'SRanipal' folder into the 'Plugins' folder of the game engine. Best, Maartje @Daniel_Y @zzy @Corvus
 17. Hello Everyone, I've been wondering if there is an easy way to basically wrap a model around the base skeleton of the Vive Hand Tracking. For example, the Leapmotion_Basehand_Rig_Left seems to have a skeleton that lines up with the Vive Hand Tracking spawn points (linked in the attachments), so one way I thought of doing this is to use these spawn points to just control the skeleton. However, I have not been able to do this in Unreal Engine. (Note: I'm not too experienced with UE4 animations, so there might be something basic I'm missing.) What I've come across so far from looking around is that I need to have states for the animations to work, but I want to be able to move the model around according to the spawn points if possible. Can someone help me with this, or point me in the right direction? I've looked at Control Rig and Animation Blending, and I've tried to just get the bone nodes in Blueprints and animate them there, but so far I'm not sure how I should approach this. Thanks! And if there's anything I can elaborate on, let me know!
 18. There is a bug in a clean install of the UE4 plugin. I get this error non-stop: "WVRSimulator: Error: Failed to load Simulator library." This prevents me from packaging or launching the game. @Tony PH Lin @Cotta