Blog Entries posted by dario

  1. dario
    Getting Started with VIVE Wave™ for Unity Developers (updated!)
    First, download the Wave SDK:
    Legacy: use this download if you are not yet on Unity's XR Management system (available in Unity 2019.4 and later), e.g. if you are still on Unity 2018:
     https://developer.vive.com/resources/vive-wave/sdk/320/
    Note: Porting to the VIVE Wave platform 
    General porting guides from Daydream (mobile) or from the VIVE to Wave are available here: https://hub.vive.com/storage/app/doc/en-us/PortingGuide.html  Please note that the quick guides below focus on Unity scenes and code; when porting across platforms and devices, also take graphics considerations (depending on CPU/GPU/RAM, etc.) into account.
     
    If your existing application uses the SteamVR APIs directly, note that most of the Wave APIs have a one-to-one correspondence to SteamVR. So if you have a third-party framework that wraps SteamVR, you should also be able to support the Wave SDK by mapping the APIs as shown in the VIVE porting guide. When porting from other devices, also consider toolkits such as the Unity XR Interaction Toolkit (in preview) for Unity XR plugins like Wave 3.2+, or VIU (Vive Input Utility), which supports both Unity XR Management and the legacy path.
     
    A Quick Start Guide for developing in Unity: 
     
    The following are the steps for setting up a scene with the Wave SDK (legacy); also see the alternative below it, which uses the VIU toolkit along with the Wave SDK for cross-platform support on either the legacy or the new Unity XR Management path.
    1) For legacy Unity (pre Wave 3.2): launch Unity, create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html)  Note: for the Wave XR plugin (Unity 2019.4), use the Package Manager; you can also skip Android Studio and use Unity's built-in Android support.
    2) For legacy support: import wavevr.unitypackage (Assets->Import Package->Custom Package...)
    3) For legacy support: from the Project Assets window, drag and drop the WaveVR and ControllerLoader prefabs into your Scene Hierarchy window to create objects in your scene (delete or disable the existing Camera object; there is one already included in WaveVR).  For Wave XR plugin support, you can use the Unity XR APIs just as with any other XR plugin (see the sketch after these steps).
    4) For legacy: duplicate the ControllerLoader in your scene (or drag another one in) and set its Type to the left controller in the Inspector window. At this point it is up to you how to handle the second controller's inputs (or simply handle them the same as the first).  For the Wave XR plugin, see the samples included with the packages.
     
    5) From File->Build Settings..., select Build and Run (make sure the device is attached via USB and that the Developer settings on the device allow USB debugging).  VIU (more below) can use a simulator when developing for all platforms.
     
    Note: if at any time you are prompted by a WaveVR plugin popup window to accept preferred settings, simply accept unless you have a good reason not to.  You can safely dismiss the AndroidManifest-related popup for now, until you are ready to publish on Viveport (it indicates 3DOF vs. 6DOF support, or both).
    At this point you should be able to see your empty scene, with your controller, on your Wave device!
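     
    For the Wave XR plugin path, controller input goes through Unity's standard XR APIs rather than Wave-specific calls. Below is a minimal sketch of polling the right-hand trigger, assuming the Wave XR plugin is enabled under XR Management (the class name TriggerLogger is just illustrative):
     
        // Minimal sketch: poll the right-hand controller through Unity's XR InputDevices API.
        // Only standard UnityEngine.XR calls are used; no Wave-specific code is required
        // once the Wave XR plugin is active in XR Management.
        using UnityEngine;
        using UnityEngine.XR;

        public class TriggerLogger : MonoBehaviour
        {
            void Update()
            {
                // Fetch the device currently mapped to the right-hand XR node.
                InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
                if (!rightHand.isValid)
                    return;

                // CommonUsages.triggerButton reports a digital trigger press on most controllers.
                if (rightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed)
                {
                    Debug.Log("Right trigger pressed");
                }
            }
        }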
     
    Alternative Quick Start Using the VIU (Vive Input Utility) Unity plugin: 
     
    There is also an additional Unity plugin for developing VR projects that can target multiple platforms and devices. It is highly recommended, especially for new projects or for projects that don't require extensive use of the Wave SDK APIs (although you can access both APIs at once).
    The Vive Input Utility: this is a Unity plugin that can support Vive, Vive Pro, Rift, Daydream, Go, Quest and the Wave SDK (e.g. Focus and Focus Plus), in addition to Unity's XR APIs, which in turn can support Windows MR and more. It is an abstraction that wraps other SDKs, like the Wave SDK, creating a common code base for many platforms. It's available on the Unity Asset Store (search for VIU) or at https://github.com/ViveSoftware
     
    Steps to create the same application but using the VIU plugin: 
     
    1) Launch Unity and create a New Project, then import VIU (Vive Input Utility) from the Unity Asset Store or the Package Manager (or GitHub)
    2) Drag and drop the ViveCameraRig (or the ViveRig for additional features) into your scene and remove the existing Camera object (there is a camera already included in ViveRig) 
    3) Build and Run 
    VIU note: since these prefabs also support other platforms, you already get two-controller support (in addition to falling back to a single controller). The ViveRig adds controller support for teleporting, grabbing and toggling controller models, and it can easily be modified in the Inspector when ViveRig->ViveControllers is selected in the scene. Controller input is read through VIU's device-agnostic API, as in the sketch after these notes.
    Note: VIU will step through obtaining the latest Wave 4.x packages via the Package Manager when you select Wave as the target in VIU Settings (in Preferences).  These settings will wrap selecting the target in XR Management settings.
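     
    Because VIU abstracts the underlying SDKs, the same input code runs on Wave devices, SteamVR hardware or the in-editor simulator. A minimal sketch using VIU's ViveInput API (the class name is illustrative):
     
        // Minimal sketch: read input through VIU's device-agnostic ViveInput API.
        // The same call works on Wave devices, SteamVR hardware or the VIU simulator.
        using HTC.UnityPlugin.Vive;
        using UnityEngine;

        public class ViuTriggerLogger : MonoBehaviour
        {
            void Update()
            {
                // HandRole abstracts away which physical controller is the "right hand".
                if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
                {
                    Debug.Log("Right trigger pressed");
                }
            }
        }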
     
    Support is provided at the official developer forum for the Wave SDK: 
    http://community.viveport.com/t5/Vive-Wave-SDK/bd-p/vive-wave-sdk
    and VIU:
    https://forum.vive.com/forum/72-vive-input-utility/
  2. dario
    Today we are announcing to developers an early access release of the Vive Hand Tracking SDK for the Vive, Vive Pro and the Vive Focus (Wave platform). This SDK provides the ability to track your hands and recognize gestures, and, on the Vive and Vive Pro, to track your fingers as well (21-point tracking).
     
    For more info please attend the sessions at GDC on Vive Developer Day, Monday March 18.

    Or just try out the SDK available now here: https://developer.vive.com/resources/
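     
    As a rough taste of how the data might be consumed in Unity, here is a hedged sketch: GestureProvider, GestureResult and their gesture/points members are assumptions about the plugin's API made for illustration only, so check the package's documentation and samples for the real surface.
     
        // Hedged sketch only: GestureProvider, GestureResult, .gesture and .points are
        // assumed names used to illustrate reading gesture and 21-point skeleton data;
        // refer to the actual SDK package for the real API.
        using UnityEngine;
        using ViveHandTracking;   // assumed namespace of the Hand Tracking Unity plugin

        public class HandLogger : MonoBehaviour
        {
            void Update()
            {
                GestureResult right = GestureProvider.RightHand;   // assumed per-hand accessor
                if (right == null)
                    return;

                Debug.Log("Right hand gesture: " + right.gesture);

                // On Vive and Vive Pro the SDK reports 21 skeletal points per hand.
                if (right.points != null)
                    Debug.Log("Tracked points: " + right.points.Length);   // assumed Vector3[]
            }
        }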
     

  3. dario
    HTC VIVE 3DSP SDK
    HTC VIVE 3DSP is an audio SDK that provides applications with spatial audio, a key factor for an immersive VR experience. With the HTC VIVE 3DSP SDK, spatial perception is simulated by features such as recorded and refined head-related transfer functions, higher-order ambisonic simulation of sound direction, room audio simulation, a background noise floor, real-world acoustic distance behavior, geometric and raycast occlusion, Hi-Res audio support, and more.
     
    There are many factors that influence human perception of audio spatialization, such as interaural time difference, interaural level difference, body factors (pinna, head, shoulders and torso), environment factors (room reflections and reverberation), the distance from the sound source to the user, and obstacle occlusion. Based on these factors, HTC VIVE 3DSP generates immersive and realistic audio perception with the following key features:
    • Higher Order Ambisonics (HOA) with very low computing power.
    • Head-Related Transfer Function (HRTF) based on refined real-world modeling (horizontally and vertically), resulting in a better algorithm that is applied to all sound filters and effects.
    • Room Audio simulates the reflection and reverberation of a real space.
    • Hi-Res Audio source files and playback.
    • Distance Model based on real-world modeling.
    • Geometric Occlusion uses no Unity collider; the covered area is calculated by the SDK itself.
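     
    As a rough illustration of how a spatialized source is typically wired up in Unity, here is a sketch. The standard AudioSource settings shown are ordinary Unity API; the commented-out Vive3DSPAudioSource and Vive3DSPAudioRoom component names are assumptions about the plugin (in practice the plugin's components and the spatializer selection in Edit > Project Settings > Audio are usually configured in the Inspector).
     
        // Sketch of spatializing a source in Unity. The AudioSource calls are standard
        // Unity API; the commented component names below are assumptions about the 3DSP plugin.
        using UnityEngine;

        public class Spatialized3DSPSource : MonoBehaviour
        {
            public AudioClip clip;

            void Start()
            {
                // A normal Unity AudioSource carries the clip...
                var source = gameObject.AddComponent<AudioSource>();
                source.clip = clip;
                source.spatialize = true;    // hand the source to the active spatializer plugin
                source.spatialBlend = 1.0f;  // fully 3D
                source.loop = true;
                source.Play();

                // ...while the plugin's components (assumed names) add HRTF, room and
                // occlusion effects on top:
                // gameObject.AddComponent<Vive3DSPAudioSource>();
                // roomObject.AddComponent<Vive3DSPAudioRoom>();
            }
        }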
    Higher Order Ambisonics (HOA)
     

    Ambisonics is a technology that uses a full-sphere surround sound technique to simulate spatial sound. The 3rd-order ambisonics model is implemented in the HTC VIVE 3DSP SDK.
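     
    For context on what "higher order" buys: a full-sphere ambisonic signal of order n carries (n + 1)^2 channels, so 3rd order corresponds to 16 channels of directional information versus 4 channels for 1st order. A trivial helper makes the relationship concrete:
     
        // Full-sphere ambisonics of order n uses (n + 1)^2 channels,
        // so the 3rd-order support mentioned above corresponds to 16 channels.
        public static class Ambisonics
        {
            public static int ChannelCount(int order) => (order + 1) * (order + 1);
            // ChannelCount(1) == 4, ChannelCount(3) == 16
        }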
     
    Room Audio

    This feature simulates room audio with early reflections, late reverberation, background noise, environment materials and so on.
     
    Distance Model


    Sound level decreases as sound travels through the real world, but the amount of decrease differs under different conditions. Several decay models are therefore provided in the HTC VIVE 3DSP SDK.
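     
    To make "decay model" concrete, here is a generic sketch of one common attenuation curve, the clamped inverse-distance model often used as a textbook example; it is for illustration only and is not necessarily the exact set of curves the SDK ships.
     
        // Generic illustration only: a clamped inverse-distance decay model.
        // Gain is 1 at the reference distance and falls off with range, scaled by a rolloff factor.
        using UnityEngine;

        public static class DistanceAttenuation
        {
            public static float InverseDistanceGain(float distance, float refDistance = 1f, float rolloff = 1f)
            {
                float d = Mathf.Max(distance, refDistance);   // no boost inside the reference radius
                return refDistance / (refDistance + rolloff * (d - refDistance));
            }
            // Example: InverseDistanceGain(4f) ≈ 0.25, i.e. roughly -12 dB at 4 m with the defaults.
        }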
     
    Occlusion

    The occlusion effect is used to accurately simulate what happens to sound when it encounters an obstacle in its transmission path. Both mono and binaural occlusion modes can be set up in the HTC VIVE 3DSP SDK.
     
     


    Geometric Occlusion: geometric occlusion calculates the covering ability using analytical geometry techniques, so there is no need to use a Unity collider.
    Raycast Occlusion: the covering ability of an obstacle is calculated by casting many rays into space and counting how many of them are blocked.  The VIVE 3DSP SDK supports room effect, room reverberation and reflection, and acoustic occlusion. It also includes spatial effect optimization for the VIVE Pro headphones; however, the original VIVE and other HMDs and headphones are also supported.
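     
    The raycast approach can be pictured with plain Unity physics calls. The sketch below is a conceptual illustration of the idea (cast rays toward the source, count the blocked ones), not the SDK's implementation:
     
        // Conceptual sketch of raycast occlusion (not SDK code): cast a fan of rays from the
        // listener toward points around the sound source and count how many are blocked by
        // scene geometry; the blocked fraction can then drive an occlusion or low-pass amount.
        using UnityEngine;

        public static class RaycastOcclusion
        {
            public static float OcclusionFactor(Vector3 listener, Vector3 source,
                                                float sourceRadius = 0.3f, int rayCount = 16)
            {
                int blocked = 0;
                for (int i = 0; i < rayCount; i++)
                {
                    // Sample points on a sphere around the source so partial cover is captured.
                    Vector3 target = source + Random.onUnitSphere * sourceRadius;
                    Vector3 dir = target - listener;
                    if (Physics.Raycast(listener, dir.normalized, dir.magnitude))
                        blocked++;
                }
                return (float)blocked / rayCount;   // 0 = fully audible, 1 = fully occluded
            }
        }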
     
    For more information and to request a preview of the upcoming ambisonic decoder, please visit the VIVE Audio SDKs community forum: http://community.viveport.com/t5/Vive-Audio-SDKs/gp-p/devgroup3
     
     



     
  4. dario
    VIVE SRWorks SDK
     
    With the launch of VIVE Pro, developers will now have access to the stereo front-facing cameras to create new experiences that mix the see-through stereo camera view with their virtual worlds. This will enable developers to perform 3D perception and depth sensing with the stereo RGB sensors, opening new worlds for more creative, interactive experiences.
     
    In addition to the updated OpenVR camera APIs, which can now handle more than the mono camera of the original VIVE, the VIVE Software team is also providing developers with the VIVE SRWorks SDK. With this SDK you can access more than just the raw camera images:
     
    • Depth Spatial Mapping (static and dynamic meshes)
    • Placing virtual objects in the foreground or background
    • Live interactions with virtual objects and simple hand interactions
    These features are provided by three service modules: a Depth module, a See-through module and a 3D Reconstruction module, allowing developers to focus on the content.
     
    The SDK includes support for native development with plugins for Unity and Unreal.
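     
    As one concrete example of the kind of interaction these modules enable, the sketch below places a virtual prop on a real surface. It assumes you have already wrapped the reconstructed scene mesh in a MeshCollider; how that mesh is obtained is SDK-specific and not shown, and the pointer and prefab fields are placeholders.
     
        // Hedged sketch: assumes the 3D reconstruction output has been given a MeshCollider.
        // Standard Unity raycasting then lets you drop virtual props onto real surfaces.
        using UnityEngine;

        public class PlaceOnRealSurface : MonoBehaviour
        {
            public Transform pointer;        // e.g. a controller transform
            public GameObject propPrefab;    // virtual object to place

            void Update()
            {
                if (!Input.GetMouseButtonDown(0))   // stand-in for a controller trigger press
                    return;

                // Ray from the pointer; hits the reconstructed-world collider if one exists.
                if (Physics.Raycast(pointer.position, pointer.forward, out RaycastHit hit, 5f))
                {
                    // Align the prop with the real surface normal at the hit point.
                    Instantiate(propPrefab, hit.point, Quaternion.LookRotation(hit.normal));
                }
            }
        }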
     
    The following videos illustrate some of the features:
     
    Here's a portal for passing between the real and virtual worlds
     

    Note: The project code for the portal example above is included in the VIVE SRWorks SDK Unity package.
     
    Here's an example from developer Jonathan Schenker of Alvios using a screen filter effect to mix the realities, and here's another example from developer Vladimir Storm using the 3D Reconstruction module.
     
     
     
    VIVE Audio
    We are also announcing two VIVE audio SDKs, available for Unity, with support for UE4 coming soon.
     
    - VIVE 3DSP SDK
    The VIVE 3D Sound Perception SDK provides a Unity-compatible audio spatialization plugin with the following features:
    • Higher Order Ambisonics
    • HRTFs based on refined real-world modeling (horizontally and vertically)
    • Support for Hi-Res audio source files and playback
    • Acoustic distance effect with real-world modeling
    The VIVE 3DSP plugin supports room effect, room reverberation and reflection, and acoustic occlusion tuned for the VIVE Pro; however, the VIVE and other HMDs and headphones are also supported.
     
    - VIVE Pro Audio Mode
    Since the VIVE Pro has dual microphones with support for alert and conversation modes, APIs are provided so that you can toggle between the three audio modes. This provides applications with the ability to toggle between listening to foreground or background audio or a mix of both. Additionally, a USB Type-C high power mode setting (on/off) is also available.
     
     
    These SDKs are initially available as early access (beta) releases; you can find the downloads and join the developer support forums at http://developer.vive.com/resources
     
     