MariosBikos_HTC

Employee
  • Content Count: 34
  • Joined
  • Last visited

Community Reputation: 17 Good
1 Follower

About MariosBikos_HTC
  • Rank: Explorer
  • Birthday: August 12

Recent Profile Visitors
357 profile views
  1. Hey @PJninja, is this helpful? It contains instructions on how to use the Vive Tracker in Unreal Engine. Since you are using UE 4.23.2, you may also need the SteamVR Input Plugin: https://github.com/ValveSoftware/steamvr_unreal_plugin/tree/4.23
  2. Hi @Sara, are you referring to the front cameras of the Vive Pro Eye or to the integrated eye-tracking IR cameras?
  3. If you have issues with the calibration process when using the Droolon F1 kits, please check this forum post and try the recommended steps to solve the issue. Also, if you start the calibration process and can't move beyond the first stage (HMD adjustment), try the following:
  • Use a different USB port for the cable that connects the Linkbox to your PC.
  • Disconnect some of the other USB devices from your PC.
  • For the USB devices in Device Manager, make sure Power Management > "Allow the computer to turn off this device to save power" is unticked.
  4. We realise how important it is for the development community to have easy-to-use tools that improve content performance. We recently released Wave SDK 3.1.94 [Early Access] with several Experimental Features for developers who create content for standalone VR headsets in the Wave ecosystem, such as the Vive Focus Plus and Vive Focus. In this update we introduced changes to the Adaptive Quality feature, which can automatically adjust the rendering quality of your VR application according to the system workload in order to achieve better performance and improve battery life by up to 15%.

This blog post explains what Adaptive Quality is, why it is important and how to apply it to your own projects. We will provide an overview of the solution and its design/implementation, share a few tips on how to get started, and describe how it works in synergy with other Wave SDK features such as Dynamic Fixed Foveated Rendering and Dynamic Resolution for better results.

* Please note that Wave SDK 3.1.94 is an Early Access version that includes new features for developers to experiment with and provide feedback or suggest changes. These features are available only with a specific developer ROM update (Adaptive Quality requires ROM v3.07.623.336 for Focus Plus), but content developed with this Developer ROM can't be published to Viveport until we release a public ROM update (coming soon!). Please refer to this article for more information on how to get access to the Developer ROM and test Wave SDK 3.1.94.

Introduction

Standalone VR headsets may have all the necessary components to provide VR experiences but, unlike PC-based VR headsets, utilising the full power of their hardware requires an intricate balance for VR apps to run smoothly and with consistent performance. Heat generated by a headset working extra hard to render VR content can result in thermal throttling: the hardware detects the high temperatures and, when a predefined limit is crossed, lowers the clock speed of the CPU/GPU to prevent the system from overheating. When the temperature returns to normal, the system increases the CPU/GPU clock speed and performance bounces back again. Unfortunately, this cycle can repeat, leading to poor battery life and inconsistent performance because the system cannot get rid of the generated heat quickly. Developers can mitigate this issue by making sure their games perform at their best at all times, but this is not always possible.

Adaptive Quality

Adaptive Quality gives developers a way to balance the performance and power consumption of their VR applications in real time, automatically adjusting CPU/GPU performance according to the system workload. It also allows defining a set of strategies for how the application should respond to system changes to improve FPS when rendering performance is insufficient. Adaptive Quality can be combined with the Dynamic Resolution feature to adjust the image quality of the application as the system load changes, and with Fixed Foveated Rendering to dynamically lower the quality of the peripheral region when improving performance is essential. This results in better battery management and smooth frame rates, as developers have more control and can create and customise their own policies to dynamically handle hardware changes.
Especially for GPU fragment-bound apps, this leads to less throttling and a better experience for end users. Adaptive Quality v1.0 was first introduced in Wave SDK 3.1.1, supporting automatic CPU/GPU adjustment and workload-based system events. Starting with Wave SDK 3.1.94, we introduced new features and changes (v2.0), adding Dynamic Resolution and Fixed Foveated Rendering to the mix.

Auto CPU/GPU Adjustment

Standalone VR headsets are battery-powered, so making sure that power is not drained too quickly is important. With Adaptive Quality, we've made the management of CPU and GPU clock rates much simpler by making it almost entirely automatic. When Adaptive Quality is enabled, the system dynamically changes the CPU and GPU performance levels to maintain performance based on the system load: when the performance of your application is insufficient, CPU/GPU clock speeds increase to improve the FPS; likewise, if the application already runs at high FPS and the scene complexity is low, Adaptive Quality can scale the clock rates down to save battery power and prolong the headset's usage. Although we don't provide direct access to the maximum/minimum allowed clock speeds, Adaptive Quality configures those properties based on its knowledge of the current system load and decides whether the levels should be lowered or raised. Of course, when Adaptive Quality is disabled, developers can still manually increase/decrease the CPU/GPU performance levels based on practical demands according to this API.

System Events & Custom Policy

Although changing the clock rates can reduce power consumption or improve performance, that may not always be enough. Sometimes the VR application software itself needs to change to achieve stable performance. Adaptive Quality can be configured to broadcast events whenever the system workload changes. This mechanism is really useful because developers can create and customise their own policy to reduce GPU load and ensure consistent frame rates over a longer period of time. The application can react differently depending on those events, and developers choose whether and how to handle the changes and adjust scene complexity themselves. In practice, developers subscribe to performance events that recommend lowering or raising the rendering quality and respond by modifying the application's rendering settings accordingly, for example by enabling or disabling MSAA or other settings that help boost performance.

Dynamic Resolution

Dynamic Resolution works together with Adaptive Quality and adjusts the image quality of the application by changing the eye buffer size according to the events broadcast by Adaptive Quality (mentioned in the previous section). More specifically, the Resolution Scale is increased automatically when the received event indicates that the quality can be higher; similarly, when the event recommends lowering the quality of the running application, the resolution scale is decreased. To ensure that the Resolution Scale never decreases to the point where the application is unusable, Dynamic Resolution comes with built-in functionality that determines the lower bound of the Resolution Scale for different VR devices to maintain text readability. What's great about this feature is that the change in resolution scale introduces no extra latency. A native-SDK sketch of such a custom policy follows below.
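To make the custom-policy idea concrete, here is a minimal sketch of listening for the quality events described above from a Wave native app. This is a sketch under assumptions: the header paths follow the Wave native SDK layout, and SetMsaaEnabled is a hypothetical app-side helper standing in for whatever rendering settings your application exposes.

    // Sketch: a custom Adaptive Quality policy in a Wave native app.
    #include <wvr/wvr.h>        // assumed Wave native SDK header layout
    #include <wvr/wvr_events.h>

    void SetMsaaEnabled(bool enabled); // hypothetical app-side helper

    // Call once per frame, e.g. at the top of the render loop.
    void PollAdaptiveQualityEvents() {
        WVR_Event_t event;
        while (WVR_PollEventQueue(&event)) {
            switch (event.common.type) {
            case WVR_EventType_RecommendedQuality_Lower:
                // System is under load: cut GPU cost (e.g. disable MSAA).
                SetMsaaEnabled(false);
                break;
            case WVR_EventType_RecommendedQuality_Higher:
                // Headroom available: restore rendering quality.
                SetMsaaEnabled(true);
                break;
            default:
                break;
            }
        }
    }

Disabling MSAA is only one example of a policy; the same event hook could swap shader quality, particle counts or draw distance, depending on where your app's GPU cost lives.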
Dynamic Fixed Foveated Rendering

Foveated Rendering is a technique that exploits the anatomy of the human eye: applications can drop the graphics quality in the peripheral vision by lowering the resolution in that region while focusing the available processing power on a smaller area of the image (the foveated region). The term "foveated" derives from "fovea", the part of the retina that gives us sharp central vision (foveal vision) and lets us focus on the visual details of what is important in a narrow region right in front of us. Anything outside the fovea region belongs to the peripheral vision which, despite also being useful, mainly detects fast-moving content and colour, which is why it feels comparatively less detailed and blurry.

It's worth noting that there are two types of Foveated Rendering, and the terms are sometimes confused:
  • Fixed Foveated Rendering assumes that the foveated region is always at the centre of the user's field of view and that lower-resolution pixels should be rendered in the distortion region around the lens, where things are already not clearly visible.
  • Eye-Tracked (Dynamic) Foveated Rendering can be used with headsets that support eye-tracking modules (see Vive Pro Eye) to accurately place the foveated region based on gaze direction. It's called dynamic because, as the human eye moves, the foveated and peripheral regions change dynamically.

Adaptive Quality can be combined with Fixed Foveated Rendering to increase the performance of VR applications dynamically, according to system workload: Fixed Foveated Rendering can be automatically enabled by Adaptive Quality to further reduce the GPU load and improve performance whenever it's required.

Results

The benefits of Adaptive Quality become clearer through some examples. The graph below illustrates how Adaptive Quality helps deliver smooth and high frame rates with a native fragment-bound application. In green you can see the frame rate fluctuating when Adaptive Quality is disabled, with FPS going down as GPU workload increases; in pink, the stable results after enabling Adaptive Quality, running at 75 FPS. The x-axis indicates the time since the start of the application; the y-axis shows the FPS on the left and the GPU level on the right. The yellow labels show the fragment loading values, which increase every 5 seconds. Notice how the GPU clock rates go down when the FPS are high to save power (after 11 and 61 seconds).

Another graph below shows the results of a test using a GPU fragment-bound Unity application. When Adaptive Quality and Dynamic Resolution are both enabled, there is an increase of 13 FPS on average. In green you can see the case where Adaptive Quality is disabled, while in yellow you can see how Dynamic Resolution decreases the resolution scale to improve FPS and, when FPS are high enough, increases the resolution scale back up again.

We also tested Adaptive Quality with other applications, such as Viveport's Sureshot game and SPGM. As the results show, energy consumption improved by 10% and 15% respectively, indicating how useful this feature can be for extending the battery life of the VR headset.
How to use Adaptive Quality

Wave SDK 3.1.0 to 3.1.6: Adaptive Quality is not enabled by default. The Wave SDK provides an API function called WVR_EnableAdaptiveQuailty that needs to be called manually so that the CPU/GPU performance levels are adjusted automatically. The system events WVR_EventType_RecommendedQuality_Lower and WVR_EventType_RecommendedQuality_Higher are broadcast based on system workload and can be monitored to take action accordingly; see the System Event documentation to learn how to listen for these events.

Wave SDK 3.1.94 or later: Adaptive Quality is enabled by default, and system events are broadcast to change rendering quality when needed. One or more strategies can be used to adjust display quality and improve the FPS when performance is insufficient (WVR_QualityStrategy_Default, WVR_QualityStrategy_SendQualityEvent, WVR_QualityStrategy_AutoFoveation). A minimal native-SDK sketch follows at the end of this section.

Unity

The Wave SDK package for Unity supports all the features of Adaptive Quality, including the new features introduced with Adaptive Quality 2.0 (Dynamic Resolution and Dynamic Fixed Foveated Rendering), and it's really easy to use. The WaveVR_AdaptiveQuality script can be used to enable the Adaptive Quality feature. Starting from WaveVR 3.1.94, this component comes pre-attached to the WaveVRAdaptiveQuality GameObject of the WaveVR prefab, so WaveVR_AdaptiveQuality is enabled by default. As you can see in the screenshot below, the WaveVRAdaptiveQuality game object is part of the WaveVR game object that is required to build VR applications for the Wave ecosystem. There are two scripts attached to it:
  • WaveVR_AdaptiveQuality: When this script is enabled, automatic CPU/GPU clock adjustment takes place, and developers can tick the boxes under the Rendering Performance Improve Strategy section to define which strategies should be used (e.g. if I want system events to be broadcast and Fixed Foveated Rendering to be enabled, I tick both boxes).
  • WaveVR_Dynamic_Resolution: This script is responsible for the Dynamic Resolution feature, which adjusts the resolution scale of the VR application according to workload. A list of Resolution Scale values can be defined; these are used to adjust the resolution scale whenever Adaptive Quality triggers events. The Text Size slider defines the smallest text size used in the application, so that Dynamic Resolution never makes text unreadable.

Unreal Engine

Adaptive Quality can also be used with the Wave plugin in Unreal Engine. Two functions are provided: one to enable the Adaptive Quality feature and one to query whether it is enabled. Although Auto CPU/GPU Adjustment is automatically supported when Adaptive Quality is enabled, Dynamic Resolution and Dynamic Fixed Foveated Rendering can't be used with this plugin at the moment. There will be more updates on this soon; Wave 3.2 is expected to be released soon, adding support for system events in Unreal Engine.
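To make the native-SDK path concrete, here is a minimal sketch of enabling the feature. Assumptions: the header paths follow the Wave native SDK layout, WVR_EnableAdaptiveQuailty is the SDK's own spelling of the symbol, and the two-argument form is the 3.1.94+ signature.

    // Sketch: enabling Adaptive Quality from a Wave native app.
    #include <cstdint>
    #include <wvr/wvr.h>        // assumed Wave native SDK header layout
    #include <wvr/wvr_system.h>

    void EnableAdaptiveQualityWithStrategies() {
        // Wave SDK 3.1.94+: combine one or more strategies.
        uint32_t strategies = WVR_QualityStrategy_SendQualityEvent |
                              WVR_QualityStrategy_AutoFoveation;
        WVR_EnableAdaptiveQuailty(true, strategies);

        // Wave SDK 3.1.0-3.1.6: the call takes only the enable flag:
        // WVR_EnableAdaptiveQuailty(true);
    }

With WVR_QualityStrategy_SendQualityEvent set, the RecommendedQuality events shown in the earlier sketch will be delivered to your event loop.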
Summary Table

Best Practices & Tips

Now that you know more about Adaptive Quality, here are some tips and advice:
  • The rendering improvements from WaveVR AdaptiveQuality have limits. It is still highly recommended to optimise your app as much as possible first; check our Mobile VR performance optimisation tips.
  • Enabling WaveVR AdaptiveQuality can help lightweight VR apps, such as photo or video playback apps, run longer.
  • Enabling WaveVR AdaptiveQuality with WVR_QualityStrategy_SendQualityEvent and WVR_QualityStrategy_AutoFoveation can improve the rendering quality by up to 15% if the rendering bottleneck is GPU fragment processing.
  • Always build optimised versions of the application for distribution. Even if a debug build performs well, it draws more power and heats up the device more than a release build. Power management is a crucial consideration for Android VR development.
  • The Wave SDK force-disables Adaptive Quality during map loading to increase performance and restores the Adaptive Quality status after the map is loaded.

Useful Links

To implement Adaptive Quality in your own application, check the documentation pages below:
  • Wave Native SDK
  • Unity Integration
  • Unreal Engine Integration
Also see:
  • System Event & how to listen to events
  • Foveated Rendering
  • Dynamic Resolution

We gave a talk about Adaptive Quality and the new features introduced in the latest Wave SDK during the Virtual Vive Ecosystem Conference back in March 2020. You can watch the presentation below.

What do you think? Feel free to try this feature and share feedback from your tests in our forums. You can also find the complete list of new features for each Wave SDK update in our release notes.
  5. Hi @Withcat, glad that you managed to solve the issue. I don't think that there is a way at the moment to do this automatically because it depends on the mesh that you load.
  6. Hi @Withcat, what version of the SDK are you using, and what version of Unreal Engine? Also, can you share your SteamVR software version? If you can share your custom mesh, we can try to reproduce the issue locally; if we hit the same issue, we will be able to fix it.
  7. The Wave SDK for Unreal Engine is now also available on GitHub: https://github.com/ViveSoftware/VIVE-Wave-SDK-Unreal

INTRODUCTION

So far, the Wave SDK for developers using Unreal Engine has been available only on the Vive Developer website. We decided to release the WaveVR plugin for Unreal Engine as a public GitHub repository. This allows developers to report bugs or suggest enhancements using GitHub Issues, giving us feedback from the developer community. Developers can also create pull requests to suggest bug fixes. The Vive team will review the pull requests and follow up with the developers who created them, but the actual merge will temporarily take place internally rather than directly on GitHub; the repository will then be updated to include those bug fixes.

STRUCTURE

The GitHub repository contains a different branch for each version of Unreal Engine, so developers don't need to pick a specific Wave version, only the version of Unreal Engine they are using. This should make integrating the WaveVR plugin into Unreal Engine more intuitive. The repository comes with a full Unreal Engine sample project (plugin.uproject), and the WaveVR plugin is already pre-installed in the Plugins folder. This means that if you want to use the WaveVR plugin in your own project, you can simply copy the WaveVR folder from the Plugins folder to your own project's Plugins folder. Only official version releases are pushed to GitHub; developers will still be able to access older Wave versions using tags & releases.
  8. Hi @Liv Tech Company Ltd, can you share more information about what you have tried so far (e.g. Blueprint screenshots or a sample project)? Also, what version of the Wave SDK are you using, and what version of Unreal Engine?
  9. We're inviting you to participate in the "Droolon Cup" eye-tracking VR content development competition, where more than $80,000 worth of prizes await! It is a contest for VR developers sponsored by 7invensun, co-organized by HTC, with technical support from NVIDIA. In this contest you are expected to combine an HTC VR headset (Vive Cosmos, Vive Pro, Vive Focus Plus, etc.) with the Vive Eye Tracking SDK (SRanipal) to develop VR eye-tracking applications. 7invensun is the company behind the Droolon F1, the eye-tracking modular kit compatible with all HTC headsets, and will loan Droolon F1 kits to selected teams. More details: https://www.7invensun.com/hd Registration deadline: 31 May 2020
  10. Hi @Nermeen, this developer jam was held in London in January 2020 so you can no longer contribute to it.
  11. Yes, you can. This link has all the info you need: https://www.vive.com/us/support/vive-pro-hmd/category_howto/activating-the-dual-camera.html Also, the SRWorks SDK can be used with the Vive Pro to create XR apps: https://developer.vive.com/resources/knowledgebase/intro-vive-srworks-sdk/
  12. UPDATE: The issue has been addressed with the release of Beta 1.0.12.2. Here is how to switch to the BETA stream.
--------------
There is currently an issue where Cosmos Elite users get a crash when running applications built with Unreal Engine 4.24 or 4.25. We are tracking the issue and will report back whenever there is an official fix. In the meantime, you can apply a temporary workaround, as reported here: https://answers.unrealengine.com/questions/953996/does-the-unreal-engine-support-the-cosmos-elite.html
The workaround is quite simple, but it can't be considered a final fix because it may impact performance for users with low-end GPUs.
For game/app end users: go to %USERPROFILE%\AppData\Local\{ApplicationName}\Saved\Config\WindowsNoEditor\ and open (or create, if not present) the file Engine.ini, then add the following lines at the end of the file:
    [/Script/Engine.RendererSettings]
    vr.HiddenAreaMask=false
For developers: do the same thing by editing the file DefaultEngine.ini (in the project's Config folder) and restart Unreal Engine.
  13. Hi @MDV Games, we are aware that UE 4.24 has deprecated the MotionController input, and we are working on a different way of supporting Unreal Engine's input. For now, you can follow the instructions about using the WaveVR Blueprint API for input HERE and use the BP nodes IsInputButtonPressed, IsInputButtonTouched and GetInputButtonAxis. Also, for your question about getting access to Wave SDK 3.1.94 (which does indeed support hand tracking), please check this post; it describes the process.
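If it helps while the Blueprint path is being reworked, those BP nodes map roughly onto the Wave native device API. A minimal sketch, assuming the standard Wave native calls; the exact device/input enum names here are assumptions, so verify them against your SDK's wvr_device.h:

    // Sketch: reading controller input via the Wave native API.
    #include <wvr/wvr.h>        // assumed Wave native SDK header layout
    #include <wvr/wvr_device.h>

    void ReadRightControllerInput() {
        // Pressed / touched state of the right controller's trigger.
        bool pressed = WVR_GetInputButtonState(
            WVR_DeviceType_Controller_Right, WVR_InputId_Alias1_Trigger);
        bool touched = WVR_GetInputTouchState(
            WVR_DeviceType_Controller_Right, WVR_InputId_Alias1_Trigger);

        // Analog axis of the touchpad (x/y roughly in [-1, 1]).
        WVR_Axis_t axis = WVR_GetInputAnalogAxis(
            WVR_DeviceType_Controller_Right, WVR_InputId_Alias1_Touchpad);

        (void)pressed; (void)touched; (void)axis; // feed into app logic
    }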
  14. Hi all, we recently released an Early Access version of Wave SDK 3.1.94. This version comes with several new Experimental Features for content developers, and one of them is Direct Preview for Unreal Engine.

While creating content for the Vive Focus/Focus Plus, developers need to test and tweak their project to make sure that everything works properly. This process is often time-consuming, as developers repeatedly build, deploy and install APKs, spending time waiting during the development stage. That's why we introduced the Direct Preview feature, which lets you skip the building process and directly preview your content on your AIO HMD via Wi-Fi. You can rapidly preview and iterate on your scene using Unreal's VR Preview Play mode while Direct Preview streams the rendered frames to your Vive AIO device. Head pose, controller input, gestures and similar input are sent from the device to the computer, so you can effectively preview the app without going through a time-consuming build-and-deploy process. Here is a video showing the steps: DirectPReview_UE4 Video.mp4

INSTRUCTIONS

You can find detailed instructions HERE about Direct Preview. Here's what you need to do:
  1. Integrate the WaveVR plugin of the Wave SDK into your Unreal Engine project according to the instructions here.
  2. Connect the HMD to your PC/laptop with a USB cable and turn on the HMD. Make sure that the proximity sensor of the HMD is always covered to keep the HMD awake (otherwise it will go to sleep).
  3. Find the IP of your HMD using the adb command "adb shell ifconfig".
  4. Copy and paste the IP into the WaveVR Project Settings. Also make sure that the Connect Type is set to Wi-Fi.
  5. Install the Direct Preview APK on your HMD using the WaveVR menu option "Install Device APK". This automatically installs wvr_plugins_directpreview_agent_unreal.apk, which lives in the {YourProjectName}\Plugins\WaveVR\Prebuilt\DirectPreview\deviceAPK folder. Verify that the APK was installed successfully by checking the library of installed apps on your HMD.
  6. Start the DPServer using the WaveVR tab menu option "Start DPServer". A console window will open, showing logs of the server being set up.
  7. Now that the server is up and running, connect the HMD to it by starting the APK we installed earlier: run the "Start Device APK" command from the Unreal menu tab. This automatically starts the APK on your HMD (since the HMD is connected to your PC). Once the APK starts, the HMD shows a "Connecting..." screen, and the dpServer console window logs that a client has connected to the server. You can now disconnect the headset from the PC, as the streaming takes place via Wi-Fi.
  8. Start "Play in VR Preview" from the Unreal Engine menu. This starts a preview of your VR application and, if you enabled "Enable Preview Image" in your WaveVR Project Settings, you should see the same result rendered/streamed on your HMD wirelessly.

That's it! If you now move the headset, the VR Preview window updates accordingly. You can always stop the VR Preview, modify your scene layout or work on your project, and then start the VR Preview again to quickly see the updated results on your HMD. This way you don't have to deploy an APK and wait for things to compile just to preview a simple change, saving you precious development time.
ISSUES/NOTES
  • Make sure that your PC/laptop and your HMD are connected to the same network.
  • If what you see on your HMD during Direct Preview looks blurry, you may need to adjust the image size sent to the HMD by changing the Unreal Engine window size in Editor Preferences > Level Editor - Play in Unreal Editor. For example, if your Unreal Editor VR Preview window is too small, the image sent to the HMD has to be upscaled, which causes pixelation.
  • At the moment, your Unreal Engine project folder must be on your Windows C drive (not an external hard drive), otherwise Direct Preview may not work at all (we are working on a fix for this issue).
  • If only the left or right eye view is rendered on the HMD during Direct Preview mode, or neither is rendered at all, that's because Direct Preview needs high bandwidth for streaming and frames are being lost. We provide an update frequency option in the WaveVR Project Settings so you can reduce the FPS to match your bandwidth. Restarting both the APK and the dpServer.exe application can also help with the loss of rendering. Keep in mind that this FPS option controls the number of frames sent from the dpServer (PC) to the APK (HMD).
  • If you can't see the dpServer.exe window after trying to start it from the WaveVR menu option button, you can always start it manually by running {YourProjectName}\WaveVR\Prebuilt\DirectPreview\dpServer.exe. Similarly, if the Direct Preview APK is not installed automatically after clicking the "Install Device APK" option, you can always install it manually with the "adb install" command; the APK lives inside the plugin at {YourProjectName}\Plugins\WaveVR\Prebuilt\DirectPreview\deviceAPK\wvr_plugins_directpreview_agent_unreal.apk.
  • If you start the DPServer and the window opens and closes immediately, there is probably an error. Try running the .exe manually from cmd; if the log complains about NVIDIA drivers, update to the latest NVIDIA drivers, and restart your PC if you already have the latest drivers.
  • Remember that you only need to start the dpServer and the Direct Preview APK once; they keep running in the background. Of course, feel free to restart them if you notice that something went wrong.

Please give it a go and let us know your thoughts in the comments.
  15. The HTC Vive Tracker allows you not only to track objects in VR (getting the tracker pose in real time) but also to use the Pogo pins to simulate input buttons (Grip/Trigger/Trackpad/Menu) as if you were using a Vive controller. In this post I am going to show you how to do both using Unreal Engine 4.24. We will cover two scenarios:

Scenario 1: We only use the position and orientation of the Vive Tracker in the engine, without adding any input command events for the Pogo pins. In this case, two Vive controllers handle the user input while the tracker's pose is updated in the engine according to the physical device.

Scenario 2: We use the Vive Tracker both for its pose and for input commands. At the moment there is a limitation with SteamVR Input, so you can't get input simultaneously from two Vive controllers and a Vive Tracker. You will therefore need to use one Vive controller and a Vive Tracker, or just a Vive Tracker, depending on your application's use case.

In both cases we need a Vive Tracker. If you haven't used a Vive Tracker with SteamVR before, you need to pair it with SteamVR so that it is recognised as a tracking device in the SteamVR console. To do that, go to Devices → Pair Controller, and a window will pop up asking you to pair an HTC Vive controller. Since we want to pair a different type of device (a tracker), click the button "I want to pair a different type of Controller" and select the HTC Vive Tracker from the available options. After that, press and hold the Vive Tracker's system button (where the Vive logo is) until the device is paired. You will then be able to see the Vive Tracker in your SteamVR console.

Scenario 1 (POSE TRACKING ONLY)

Tracker Pose.mp4

So let's start with our Unreal Engine project for the first scenario. If you'd rather not read instructions, here is a video instead. First of all, we use a Blueprint Pawn named "VRPawn" and add two MotionController components, one for each of the Vive controllers. For the right controller we set the Motion Source to "Right" and for the left controller to "Left". We also tick the box that automatically renders the device model for us in the application. We then add a MotionController component for the Vive Tracker; in this case we need to set the Motion Source to "Special 1", as this is the motion source used for Vive Trackers.

If you play your project in VR Preview, you will notice that the pose of the Vive Tracker is not updated. That's because SteamVR hasn't yet been set up to feed the Vive Tracker's pose to the Special 1 motion source. Here's how to do this: right-click on the Vive Tracker icon and select "Manage Vive Trackers". The SteamVR Settings window opens; click Manage Vive Trackers again. You will then see the currently active tracker and can pick a role for it. For Scenario 1 we use the "Camera" option, since we only want to use the Vive Tracker for its pose.

So the Vive Tracker role is now set to Camera, and the Unreal Engine project's motion controller for our tracker expects a Special 1 input source. But how do we connect the two? We need to open the SteamVR controller bindings menu by clicking Devices → Controller Settings and then selecting "Show Old Binding UI". This window allows us to define the input bindings for our UE4 project.
Make sure that you have played the VR Preview at least once so that the project appears in the available options, as the menu shows the most recently played applications. Now click on the Current Controller button (usually set to the Vive controller by default) and you will see all the available options; select "Vive Tracker on Camera" to update the current controller. Click the Edit button for the current binding, or create a new binding if you can't see that option. Next, click the "Edit Action Poses" button, and a new popup window will appear. That's where you need to assign the right-hand raw pose to Special 1. That's it! Close the window and play the project in VR Preview again, and you will notice that the Vive Tracker pose is updated properly while the application runs. You can now attach any component, e.g. a static mesh, and make it follow the Vive Tracker pose.

SCENARIO 2 (POGO PINS INPUT)

TrackersPOGO_UE424.mp4

In this scenario we use one Vive controller (as the user's right controller) and one Vive Tracker, but the Vive Tracker is also able to send input to the UE4 project via the Pogo pins. Using the same simple UE4 project, create a new Input Action in Unreal Engine's Project Settings Input menu. This action event is triggered every time a controller command is sent from the Pogo pins, depending on the SteamVR bindings. In this case I named the action "TrackerAction" and added the Vive Trigger as key input, just to make sure that the action will be available in the SteamVR binding menu and can then be reassigned to one of the Vive Tracker's Pogo pins.

The only way to get input from the Vive Tracker's Pogo pins in this scenario is to change the tracker role to "Held in Hand" and select Left/Right, depending on how we plan to use the Vive Tracker. Now let's go back to the SteamVR Input bindings UI and select "Vive Tracker in Hand"; in this case you need to "Create New Binding" for this type of controller input. We can now bind the TrackerAction we created in the UE4 project to one of the available key bindings (Power/Trigger/Grip/Menu/Thumb). Since the tracker role was set to Left Hand, you need to pick from the left-hand menu options. In this case, I added TrackerAction as a Trigger input button.

That's all! If you now use a cable to connect pins 2 and 4 together (Ground + Trigger button), the Trigger button key input command is sent to SteamVR, and SteamVR triggers the BP event we created earlier in our UE4 project, printing "hello". Keep in mind that there is a limitation at the moment, since OpenVR won't allow you to use input from the Pogo pins while two Vive controllers are already in use. Also, in Scenario 2 you need to make sure that SteamVR has only one controller (not both controllers active) and one tracker activated, and that the tracked controller is actually the right controller. To figure this out easily, turn on both controllers and the tracker, then turn off the controllers one by one until Unreal tracks exactly one tracker and one controller at the same time. For reference, a minimal C++ version of the Scenario 1 pawn setup follows below.
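The pawn from Scenario 1 can also be set up in C++ instead of Blueprint. A minimal sketch, assuming UE 4.24 with the HeadMountedDisplay module enabled in your Build.cs; the "Special_1" motion source FName is an assumption based on the "Special 1" entry in the editor dropdown, so verify it against your project:

    // VRPawn.h - sketch of a pawn with two controllers and a tracker.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Pawn.h"
    #include "Components/SceneComponent.h"
    #include "MotionControllerComponent.h"
    #include "VRPawn.generated.h"

    UCLASS()
    class AVRPawn : public APawn
    {
        GENERATED_BODY()

    public:
        AVRPawn()
        {
            RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

            LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
            LeftController->SetupAttachment(RootComponent);
            LeftController->MotionSource = FName(TEXT("Left"));
            LeftController->bDisplayDeviceModel = true; // auto-render the device model

            RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
            RightController->SetupAttachment(RootComponent);
            RightController->MotionSource = FName(TEXT("Right"));
            RightController->bDisplayDeviceModel = true;

            Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
            Tracker->SetupAttachment(RootComponent);
            // Motion source for the Vive Tracker ("Special 1" in the editor UI);
            // the exact FName is an assumption - check your Motion Source dropdown.
            Tracker->MotionSource = FName(TEXT("Special_1"));
        }

        UPROPERTY(VisibleAnywhere) UMotionControllerComponent* LeftController;
        UPROPERTY(VisibleAnywhere) UMotionControllerComponent* RightController;
        UPROPERTY(VisibleAnywhere) UMotionControllerComponent* Tracker;
    };

As in the Blueprint version, the SteamVR role and binding steps above are still required before the tracker component receives a pose.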