dario

Verified Members
  • Posts: 450
  • Joined
  • Last visited

Everything posted by dario

  1. Try Unity 2018.2 and see if creating a new project avoids any DLL loading issues.
  2. Yes, VIU will support future devices. As an open-source project on GitHub, there shouldn't be any concern over long-term support. As for performance, we ran performance tests early on and saw no degradation; it essentially wraps the underlying SDKs (e.g. Wave SDK, Unity XR APIs, SteamVR Unity plugin) just as you would when using those SDKs directly. We just opened a dedicated forum for VIU - so please post questions here: https://community.viveport.com/t5/Vive-Input-Utility/gp-p/ViveInputUtility
  3. Can you share the Android manifest snippet? Normally, adding it to the manifest prevents the dialog from appearing. Note to all: this is more of a reminder than a build error - you can safely ignore (close) it while developing. The entry is used by the store, so it is very important to add it before publishing so that your application gets listed in the appropriate sections. The recommendation is to do your best to support both 3DOF and 6DOF controller(s) for wider reach.
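For reference, a sketch of the DoF declaration in the manifest's application section. The key names (`com.htc.vr.content.NumDoFController` / `NumDoFHmd`) and the combined `"3,6DoF"` value are recalled from the Wave documentation, so double-check them against the current docs before publishing:

```xml
<application>
  <!-- Declare which controller DoF modes the app supports: "3DoF", "6DoF", or both -->
  <meta-data android:name="com.htc.vr.content.NumDoFController" android:value="3,6DoF" />
  <!-- Likewise for the headset tracking modes the app can handle -->
  <meta-data android:name="com.htc.vr.content.NumDoFHmd" android:value="3,6DoF" />
</application>
```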
  4. That does not work with the Experience demos, which require the SteamVR Unity plugin. The core package doesn't require the SteamVR Unity plugin.
  5. Please consider using the VIU plugin to save yourself from managing a different code base for each platform! It lets you support e.g. the Vive (or Rift) and switch to mobile platforms (like the Wave SDK) by simply changing the build target! More information in our developer blog (getting started entry): https://community.viveport.com/t5/Developer-Blog/bg-p/devblog
  6. Hi Tobias, this isn't specific to wireless; it happens whenever you lose tracking. It freezes because the camera feed is dependent on tracking (head pose). Notice that even in the SteamVR settings you can't preview the cameras until tracking works.
  7. Getting Started with VIVE Wave™ for Unity Developers (updated!)

First download the Wave SDK. Legacy: if you are not yet on the XR Management system (Unity 2019.4 and later), e.g. if still on Unity 2018: https://developer.vive.com/resources/vive-wave/sdk/320/

Note on porting to the VIVE Wave platform: general porting guides from Daydream (mobile) or from the VIVE to Wave are available here: https://hub.vive.com/storage/app/doc/en-us/PortingGuide.html Please note the quick guides below focus on Unity scenes and code; when porting across platforms and devices, also keep graphics considerations in mind (depending on CPU/GPU/RAM etc.). If your existing application used the SteamVR APIs directly, note that most of the Wave APIs have a one-to-one correspondence to SteamVR, so if you have a third-party framework that wraps SteamVR, you should also be able to support the Wave SDK by mapping the APIs as shown in the VIVE porting guide. If porting from other devices, consider toolkits like Unity XR Interaction (in preview) for Unity XR plugins like Wave 3.2+, or VIU (Vive Input Utility), which supports both Unity XR Management and legacy.

A Quick Start Guide for developing in Unity: The following are the steps for setting up a scene with the Wave SDK (legacy); also see the alternative below it, which uses the VIU toolkit along with the Wave SDK for cross-platform support with either legacy or the new Unity XR Management support.

1) For legacy Unity (pre Wave 3.2): Launch Unity, create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html) Note: for the Wave XR plugin (Unity 2019.4), use the Package Manager; you can also skip Android Studio and use the built-in Android support.

2) For legacy support: Import wavevr.unitypackage (Assets->Import Package->Custom Package...)

3) For legacy support: From the Project Assets window, drag and drop the WaveVR and ControllerLoader prefabs into your Scene Hierarchy window to create objects in your scene (delete or disable the existing Camera object; there's one already included in WaveVR). For Wave XR plugin support, you can use the Unity XR APIs as you would with any other XR plugin.

4) For legacy: Duplicate the ControllerLoader in your scene (or drag another one in) and select its Type as the left controller in the Inspector window. At this point it's up to you how to handle a second controller's inputs (or simply handle it the same as the first). For the Wave XR plugin, see the samples included with the packages.

5) From File->Build Settings..., select Build and Run (make sure the device is attached via USB and the Developer settings on the device allow USB debugging). VIU (more below) can use a simulator when developing for all platforms.

Note: if at any time you get prompted by a WaveVR plugin popup window to accept preferred settings, simply accept unless you have a good reason not to. You can safely dismiss the AndroidManifest-related popup for now, until you are ready to publish on Viveport (this is for indicating 3DOF vs 6DOF or both). At this point you should be able to see your empty scene with your controller on your Wave device!

Alternative Quick Start Using the VIU (Vive Input Utility) Unity plugin: There is also an additional Unity plugin for developing VR projects that can target multiple platforms and devices; it is highly recommended, especially for new projects or projects that don't require extensive use of the Wave SDK APIs (although you can access both APIs at once). The Vive Input Utility is a Unity plugin that can support Vive, Vive Pro, Rift, Daydream, Go, Quest, and the Wave SDK (e.g. Focus and Focus Plus), in addition to Unity's XR APIs, which in turn can support Windows MR and more. It is an abstraction that wraps other SDKs like the Wave SDK, creating a common code base for many platforms. It's available on the Unity Asset Store (search for VIU) or at https://github.com/ViveSoftware

Steps to create the same application but using the VIU plugin:

1) Launch Unity, create a New Project, and import VIU (Vive Input Utility) from the Unity Asset Store or Package Manager (or GitHub).

2) Drag and drop the ViveCameraRig (or the ViveRig for additional features) into your scene and remove the existing Camera object (there is a camera already included in ViveRig).

3) Build and Run.

VIU Note: Since these prefabs also support other platforms, you already get two-controller support (in addition to falling back to a single controller). The ViveRig adds controller support for teleporting, grabbing, and toggling controller models, and can be easily modified in the Inspector when ViveRig->ViveControllers is selected in the scene. Note: VIU will step through obtaining the latest Wave 4.x packages via the Package Manager when you select Wave as the target in VIU Settings (in Preferences). These settings wrap selecting the target in the XR Management settings.

Support is provided at the official developer forum for the Wave SDK: http://community.viveport.com/t5/Vive-Wave-SDK/bd-p/vive-wave-sdk and VIU: https://forum.vive.com/forum/72-vive-input-utility/
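To give a taste of the cross-platform approach, here is a minimal input sketch using VIU's device-agnostic input class. `ViveInput`, `HandRole`, and `ControllerButton` are part of the plugin; the script name is ours for illustration. The same code runs on Vive, Rift, or a Wave device once the build target is switched:

```csharp
using HTC.UnityPlugin.Vive;
using UnityEngine;

// Hypothetical example script: polls the right-hand trigger through VIU's
// device-agnostic input API instead of any single SDK's button mapping.
public class TriggerLogger : MonoBehaviour
{
    void Update()
    {
        // HandRole and ControllerButton abstract over the per-SDK mappings.
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            Debug.Log("Right trigger pressed");
        }
    }
}
```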
  8. It's currently available - please see the update at the top of this thread.
  9. Some notes based on the release notes (found in the docs download - note that it's 7 downloads now): Added options to initialize individual modules in the Unity Inspector (check out the new AI module). Added a new AI Vision module that recognizes ceiling, floor, wall, chair, table, and bed, so applications can understand their surroundings and seamlessly mix the real and the digital in a natural style of interaction. Added 3D segmentation with bounding box and annotation. Added a new Chaperone sample: a human-detection chaperone running in the background as an overlay. Added a Tile Drawer sample: place tiles on a specified plane and see the effect of depth occlusion. Enhanced the depth collider mesh with a hole-filling mode (enabled by default). Please see the complete release notes and known issue(s), as well as all the latest downloads, at: https://developer.vive.com/resources/knowledgebase/vive-srworks-sdk/
  10. Thanks for reporting - we'll look into it right away as we're preparing a new update. Please indicate which version of Unity and, if possible, which version of the NVIDIA driver, and attach the log file (in the working folder) from when it happened.
  11. Hi, so are you saying you previously used it and now the tracker is stuck as a controller? If so, you would need to install the tracker firmware to restore it (found in the SteamVR installation folder). Removing this temporary tool is for the best, to avoid users needlessly changing their trackers the hard way - when you can now simply map any tracker as a controller using the new SteamVR input system, or vice versa. Again, this tool was only meant to be used for a couple of months after Tracker 1.0 came out, as support for the trackers has been well established ever since, and with the new input system it's even more flexible, removing any need to use this role-changer tool, even for end users. Lastly, if you're a developer using Unity, you can also use the device role mapping found in our Vive Input Utility Unity plugin on the Unity Asset Store, which has been available since before the new SteamVR Input system.
  12. Take a look at this example (there are two ways of occluding there: tracking the hand, and using depth): https://github.com/vivesoftware Best, Dario
  13. In Unity, you can find the Android manifest in the Android folder under the Plugins folder (Assets/Plugins/Android).
  14. They appear at runtime. To set them in a script, see the example I posted.
  15. Yes, the PC (Vive HMD) can be the server (host) as long as the Focus is on the same LAN. Check the Unity Asset Store for plugins that may make it easier for you.
  16. For an example that shows how to set these settings in code check this hand interaction and occlusion example: https://github.com/ViveSoftware/ViveSRWorksHand
  17. With the latest OTA update (version 1.41.1400.1, released on July 20, 2018) you may have noticed the new kiosk setting under Settings. Primarily for enterprise development (and also useful for arcades and public demos), you can select any installed app to be the only app that runs upon boot (it doesn't need to be a launcher). You can exit kiosk mode from the power button menu, and there's also an option to secure it with a 5-digit passcode. For disabling controller pairing when you launch your app, the previously posted method (using intents) has been deprecated; the new way is to add the following to your Android manifest's application section: <meta-data android:name="no_controller_pairing" android:value="true" /> UPDATE: an even newer way, which avoids the manifest, is to use the following API: wvr.Interop.WVR_SetInteractionMode(wvr.WVR_InteractionMode.WVR_InteractionMode_Gaze);
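Put together, a minimal sketch of calling that API from a Unity script. This assumes the legacy `wvr` C# bindings are in the project; the class name is ours for illustration:

```csharp
using UnityEngine;
using wvr;

// Hypothetical example script: switch the Wave runtime to gaze-only
// interaction at startup, which also skips the controller-pairing flow.
// Attach to any object in your first scene.
public class GazeOnlyMode : MonoBehaviour
{
    void Start()
    {
        // Same call as quoted above; a controller interaction mode value
        // restores the default behavior if you need to switch back later.
        Interop.WVR_SetInteractionMode(WVR_InteractionMode.WVR_InteractionMode_Gaze);
    }
}
```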
  18. ViveSR_DualCameraImageCapture.EnableDepthProcess(true);
  19. Well, if you want to extend your audience, let us know how we can help address and support your concerns, especially as Viveport will be a good way to promote and sell your Vive Pro front-facing-camera experiences.
  20. There should be, as the underlying native APIs are the same - so it depends on whether you're OK with looking at and calling native code. We will be looking into providing similar solutions for UE4 in future updates.
  21. Please refer to the best practices here: http://community.viveport.com/t5/Vive-SRWorks-SDK/VIVE-Pro-SRWorks-SDK-best-practices-so-far/gpm-p/15727 The resolution shouldn't come as a surprise (it's the same as the original Vive), which is why you should consider thinking beyond just "AR" experiences - this isn't an AR HMD. Think of new use cases for the "see-through reality" option you have here. Remember that you don't have to enable the cameras to use them for pre-scanned room objects or real-time depth detection (which doesn't require high-res cameras). The frame-rate adjustment issue is known and we're working with Valve to address it, but it will not affect the given camera resolution anyway.
  22. Yes and no: if you check "Run Depth Mesh Collider", virtual objects will be able to collide with the dynamic meshes (which can include your hands), but not with just your hands exclusively. We'll be sharing code to detect the hands soon enough, but depending on your use case, you can start with this solution.
  23. If you're new to 3DOF controllers altogether (and who can blame you if you're coming from the Vive), you're probably new to VR on mobile platforms, e.g. Gear and Daydream. One way to start is to see what is already out on the markets; specifically, if you have a Focus, try as many titles as you can on Viveport. For development techniques, check out the following guidelines from Google regarding IK arm models, which help approximate 6DOF movement: http://developers.google.com/VR/elements/arm-model For picking things up, an easy implementation would be to use a laser beam to bring objects to you (perhaps using the trackpad for finer control). Again, try as many apps out there that use 3DOF controllers; there's a whole audience already familiar with them - in fact, the 6DOF HMD will be newer to this audience than 3DOF controllers.
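The "laser beam" pickup above can be sketched roughly as follows. This is a hypothetical, SDK-agnostic illustration: `controller` and the `Fire1` input are placeholders you would wire to your SDK's controller pose and trigger (Wave or VIU), not part of any Vive API:

```csharp
using UnityEngine;

// Hypothetical sketch: raycast from the controller and reel the hit
// rigidbody toward it while the user holds a button.
public class LaserGrabber : MonoBehaviour
{
    public Transform controller;   // 3DOF controller pose (orientation-driven)
    public float pullSpeed = 2f;   // metres per second toward the controller
    Rigidbody grabbed;

    void Update()
    {
        bool grabHeld = Input.GetButton("Fire1"); // placeholder input binding
        if (grabHeld && grabbed == null)
        {
            // Find an object along the laser.
            if (Physics.Raycast(controller.position, controller.forward, out RaycastHit hit, 10f))
                grabbed = hit.rigidbody;
        }
        else if (!grabHeld)
        {
            grabbed = null; // release
        }

        if (grabbed != null)
        {
            // Pull the object in toward the controller.
            Vector3 toController = controller.position - grabbed.position;
            grabbed.velocity = toController.normalized * pullSpeed;
        }
    }
}
```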
  24. Hi Eric, I think you meant: is there a setRotation on the pass-through views? Technically you can set that by rotating the cameras, but I wouldn't recommend it. By definition, a pass-through view is (ideally) always correct regardless of HMD pose or orientation, and rotating it will cause disorientation unless that's the intent. Is this a seated experience with a rotating chair, where you want to fool the viewer into thinking they're facing a different location? I would recommend scanning the room if you want to arbitrarily rotate it, since the pass-through isn't 360 degrees anyway.
  25. Which GPU driver and version, and which version and build of Windows?