Showing results for tags 'unity'.



Found 36 results

  1. The most recent Wave SDK is version 3.2.0 (Beta), released August 11, 2020: https://developer.vive.com/resources/vive-wave/

     [Important Note] This Beta version includes Experimental Features that require a different ROM version than 3.07.623.x. (Update, Aug 13: see REQUEST ROM ACCESS at the bottom of this post.) We'd like to hear your feedback on how to improve the design interface, performance, user experience, and more. Some of the Experimental Features (specifically hand tracking and gestures) only work with a new Developer ROM, so do not submit content with these features to the Viveport store for publishing (yet). More information to follow soon on when these upgraded ROMs will be publicly available.

     Major new features include:
     • Unity XR Plug-in Framework support
     • Hand and gesture tracking. Please note these features are experimental and are likely to be updated in our next SDK release.
     • Passthrough support via new APIs.
     • Adaptive Quality and Foveated Rendering, now enabled by default. This can be modified in the WaveVR settings.

     Please review the Release Notes for more information about the new features and for Important Changes: https://hub.vive.com/storage/docs/en-us/ReleaseNote.html#release-3-2-0 And leave any feedback or questions on the forums so that we can get back to you and the community.

     Wave Unity and Unreal SDK recommended versions:
     • Unity 2019.4.3 or newer - see https://hub.vive.com/storage/app/doc/en-us/UnitySdk.html
     • Unreal 4.24 or 4.25 - see https://hub.vive.com/storage/app/doc/en-us/UnrealPlugin/UnrealPlugin.html

     REQUEST ROM ACCESS
     In order to use the experimental features found in Wave SDK 3.2.0 you'll need to upgrade to the latest ROM (go to Settings > System Update). Make sure that your device is upgraded to the latest public ROM before you request access to the Upgraded ROM.

     Latest public ROM:
     • Focus Plus: 3.07.623.3
     • Focus (dev kit): 2.04.1400.2

     Upgraded ROM version info:
     • Focus Plus: 4.01.623.320
     • Focus (dev kit): 3.11.1400.320

     Please write to vivefocus_global@htc.com and provide your name, company, device type (Focus or Focus Plus), HMD serial number, and current ROM version. Once your request is processed, you will receive a FOTA update to the latest ROM version for your device. Expect a two-business-day turnaround time.
  2. Hi, everyone. Today I started making VR content for the Vive Focus with Unity. I can build now, but this Windows popup appears every time. How can I solve this?
  3. In my project, it is very important to use separate cameras for each eye. Any camera with the Target Eye set to Both or Left appears to work as intended, but when it is set to Right, the result on the Android HMD is like a left-eye camera: only the left part of the display renders, and the right side does not render. My project has successfully used per-eye cameras on other platforms; this is my first time implementing on the Wave platform. I can't see how to fix this issue, which is probably not a concern for most developers, but for me it is critical. I am using Wave XR Plugin 1.0.0 (com.htc.upm.wave.xrsdk) on Unity 2019.4.8 LTS. @Tony PH Lin
  4. I'm trying to display a UI element at the position of the gaze collision. I found out that just using Camera.main.WorldToScreenPoint(collision.position, Camera.MonoOrStereoscopicEye.Left); does not fully work for the "LeftEye" display: there seems to be some screen cropping going on (about 15% on each side). Does anybody know a way to reliably obtain a screen point from a world point in SteamVR in Unity, please? @Daniel_Y @Corvus
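For what it's worth, the ~15% crop described above is consistent with each eye using an off-center (asymmetric) projection: mapping a world point through the symmetric main-camera projection lands it in the wrong part of the per-eye viewport, so the correct screen point has to come from that eye's own projection matrix (Unity exposes it via Camera.GetStereoProjectionMatrix). A minimal sketch of the math, in Python for clarity; the frustum numbers are made-up stand-ins, not real HMD values:

```python
# Illustrative only: project a camera-space point through a 4x4
# projection matrix to [0, 1] viewport coordinates. With an off-center
# per-eye frustum, a point straight ahead does NOT land at u = 0.5,
# which is why reusing a symmetric projection looks "cropped".

def project_to_viewport(point_cam, proj):
    """point_cam: (x, y, z) in camera space (negative z is in front).
    proj: 4x4 row-major projection matrix. Returns (u, v) in [0, 1]."""
    vec = (*point_cam, 1.0)
    clip = [sum(m * v for m, v in zip(row, vec)) for row in proj]
    ndc_x, ndc_y = clip[0] / clip[3], clip[1] / clip[3]
    return (ndc_x + 1.0) / 2.0, (ndc_y + 1.0) / 2.0

# Made-up left-eye-style off-center frustum (near plane = 0.1).
n, lft, rgt, bot, top = 0.1, -0.06, 0.10, -0.11, 0.11
left_proj = [
    [2 * n / (rgt - lft), 0, (rgt + lft) / (rgt - lft), 0],
    [0, 2 * n / (top - bot), (top + bot) / (top - bot), 0],
    [0, 0, -1.0, -0.2],   # far-plane terms simplified
    [0, 0, -1.0, 0],
]
# A point straight ahead lands at u ≈ 0.375, left of viewport centre.
print(project_to_viewport((0.0, 0.0, -2.0), left_proj))
```

The practical upshot, if this diagnosis is right, is that WorldToScreenPoint on the main (symmetric) camera can't give per-eye screen points; projecting through the per-eye matrix can.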
  5. Hello, I am trying to use the HTC Vive Pro in Unity with Tobii and I am getting this error: "Assets\TobiiXR\External\Providers\Vive\ViveProvider.cs(8,7): error CS0246: The type or namespace name 'ViveSR' could not be found (are you missing a using directive or an assembly reference?)" when I enable "Support Vive Pro Eye" in the "Standalone Player Settings" section. I am using Unity 2019.4.7f1, but I have also tried newer versions of Unity and still get the same error. I have redone all the installation steps in case something went wrong, but eye tracking is working well, as I was able to calibrate it in the SteamVR Home. Does anyone know what could be happening? Thanks!
  6. Hi, As the title implies, I would like to use Unity 2019 (version 2019.3.8f1) for my project, an application which would use hand tracking on the Vive Focus Plus. However, a few seconds after I start the application on the headset, it suddenly freezes and crashes. The same code works perfectly on a 2018 version of Unity. I would like to know if anyone has encountered these issues with the Vive Focus Plus and hand tracking on Unity 2019, and what steps you took if you did. Have a good day!
  7. Hi, I'm using SRWorks SDK version 0.9.0.3 and getting the following error in the ViveSR_Experience demo scene in Unity 2019.2.5f1 when hitting play: "Runtime Error! Program: This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information." Unity quits when I hit the OK button. I tried following the guide "VIVE Getting Started with SRWorks Experience in Unity", but the error occurs anyway. I'm using the Vive Pro Eye, if that matters. The camera has been activated and successfully tested in SteamVR 1.11.13, and the SRWorks Runtime has been installed as well. Any help is much appreciated.
  8. Hello, I am trying to create a game in Unity that involves the use of Vive trackers. However, we cannot let others run the Unity project and can only give them a build/executable of the game. There are 3 Vive trackers needed (1 for the hip and 2 for the feet). Is there a way to have a popup at the start of the game that forces users to turn on the Vive trackers and assign them to certain game objects in the scene? Thank you! @chengnay
  9. I just got WaveVR for Unity, on Unity 2019.2.9, and every time I start up the demo scene provided with it, Unity crashes. Any help would be appreciated.
  10. I am creating a VR project for the Vive Focus with Unity. When I run WaveVR Attributes, the following screen appears. This screen is also displayed when I execute a build. Which manifest file should I modify?
  11. Hi there, I am attempting an AR project using the Vive Pro Eye and SRWorks. I am unsure how to extract a Texture2D from the front cameras to pass on to 'OpenCV for Unity' for ArUco marker detection. After looking through the SRWorks SDK, I came across ViveSR_DualCameraImageCapture.cs, which is probably the right script to use. I have tried getting the Texture2D from ViveSR_DualCameraImageCapture.GetUndistortedTexture(), but without success. I am seeking any advice on how to use ViveSR_DualCameraImageCapture.cs, and/or whether anyone has had success at ArUco marker detection using Vive SRWorks. Many thanks!
  12. Hello, I am trying to integrate a "passthrough" feature that emulates the passthrough accessed by double-tapping the power button on the Vive Focus Plus, so I can use the image to set up the virtual environment for my users. However, we're running into a problem: when the application is launched, if the headset enters power saving mode and the application is paused, the camera no longer works or updates. The Unity version is 2018.3.0f2 and the WaveSDK version is 3.1.94. If anyone has more stability with camera usage in their application, I would love to know how you achieved it.
  13. I'm working on a Vive Focus Plus project using Unity. I'm using Wave SDK v3.1.1 (Beta) and Unity 2018.3.10. I've been trying to get single pass stereo rendering to work properly for a while and I keep running into the same issue. When I turn on SPS using the WaveVR settings dialog (see below), I see both passes rendered side by side in the left eye of the Focus, and nothing in the right eye. It's as if what should be in the right eye had been shoved over into the left eye. Here are all the settings I think are relevant: When I don't enable the last WaveVR settings option (the Single Pass Stereo support one), the application runs fine. Both eyes render properly, but it's clearly using multipass rendering. How can I fix Single Pass Stereo rendering, and what am I doing wrong?
  14. Hi. Are there plans to make a Wave plugin for the new Unity XR Platform? If I'm targeting an app at SteamVR, Oculus, WMR, and Wave (Vive Focus), what should I use? VIVE Input Utility (VIU)? Thanks @C.T. @Tony PH Lin @chengnay
  15. @VibrantNebula @JustinVive @Dario and team: It would be really great if you could publish a Best Practices Guide on Player/Build/Preferences settings in Unity. It would be VERY helpful for understanding how Wave SDK behavior might be impacted by the version of Unity you are running. For instance, a better understanding of how Unity 2019.x player settings impact the performance of our Wave SDK 3.1.6 based app on a Vive Focus/Plus would be EXTREMELY helpful. Thoughts? Here's ours, if it's helpful: we are running Unity 2019.3.6f1 + Wave SDK 3.1.6, and we develop for BOTH the Vive Focus and Vive Focus Plus using the latest build of Visual Studio 2019 Enterprise as our editor. @Tony PH Lin @Cotta
  16. Hi, I'm currently trying to submit our first VR app release for the Pico G2 4K headset. The application is built in Unity and currently only uses the Pico SDK (so neither the Cardboard, Daydream, OpenVR, Vive, nor Wave SDK). The application runs fine when side-loaded from Unity to the HMD; that's how we currently supply it to customer businesses of ours. In order to make it publicly findable, and to update it regularly, I want to make it available on Viveport. I'm running into a couple of problems here... I'm unsure whether to select "Vive/OpenVR" or "Mobile VR" content. Later on in the submission process, I'm also required to list the supported headsets, but the Pico G2 4K is not an option. Also, is it required to use one of the SDKs mentioned above to submit our app to Viveport? Once uploaded and available, will it be possible for the installed application (built directly from Unity) to replace (i.e. update) the existing application, as long as the bundle IDs match? Could someone guide me through this process? Thank you very much in advance!
  17. Hello, I am a software engineer at NASA's Johnson Space Center, and I've got some questions regarding the Vive's software interface. The objective of my team is to provide a virtual reality teleoperation interface to the Valkyrie robot, using two cameras to provide stereoscopic vision to somebody using the Manus VR gloves and HTC Vive Trackers to manipulate the robot's arms, so that they can see what they're doing. We've been using Unity to render the incoming camera images onto planes that are positioned in front of the cameras, whose resulting views are then put into the Vive screens through the black-boxy SteamVR array of scripts. However, there exist YouTube videos and other records of people being able to put incoming images into the VIVE screens directly without resorting to the roundabout methods I'm using now. I would like to know if any of you fine folks know how to access the VIVE's two screens and render JPEG images onto them. Please don't hesitate to ask for clarifications about this question! @Jad
  18. First, I'm not very good at English, so bear with me. I'm currently working on the sample scene in the Hand Tracking SDK in Unity. I'm using the XR settings for Google Cardboard, which doesn't support Skeleton mode (I suppose I'm in the 2D point mode). Can I render the skeleton model with the 21 points returned by GestureInterface.cs (with a fake z coordinate)? I know it will look weird, but I would really like that skeleton to show.
  19. I'm currently working on the sample scene in Vive Hand Tracking in Unity. Is there a way to render the hand skeleton with the 2D-mode points returned (with a fake z factor) from the Gesture Interface script?
  20. Hey, I'm trying to get SRWorks working in Unity so I can access the front cameras on an HTC Vive Pro. After I import the SRWorks package, open the sample scene, and hit play, my Unity editor stops responding. Occasionally when I reopen the project there'll be an error suggesting there was a memory overflow (not to mention my memory rocketing to 96% lol). I'm running this on Unity 2019.2.10f1. Has anyone seen this problem before, or does anyone know how I can solve this and get SRWorks working? Thank you @Daniel_Y @Corvus
  21. Hello everyone, I'm working on a simple VR shooter game using Wave SDK 3.1.4 and Unity 2019.2.3. The gameplay is rather simple: the player is in an open world scene with some enemy droids around, and presses the touchpad button to shoot. The bullets, which are rays, come from two guns on both sides of the player, so the player uses their head for aiming and the controller button to shoot; the controller's orientation and direction don't matter (photo attached). The problem I'm having is that the controller button doesn't work unless the player is holding the controller up, facing forward, but I want them to be able to shoot no matter how they're holding it. I was wondering if there's any way in Unity to use the controller only for button presses and disregard its facing direction/orientation? I have an InputModuleManager and I've set the Raycast Mode to Fixed and the Beam Length to 0.01, so at least we don't see the beam, but the original issue remains. I'd appreciate any tips. Thanks a lot.
  22. I have a Unity application that needs to record pupil dilation as an indicator of user stress levels. For this to be meaningful, it needs to be normalized relative to scene luminosity. I don't see any obvious way to measure overall luminosity short of adding up all the pixel values and dividing by the number of pixels. Is there a system parameter that provides this value, or an API call that calculates it?
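Not an answer on the API side, but the "sum the pixels" step the post describes is cheap to sketch. A minimal version of that averaging, in Python for clarity, using Rec. 709 luminance weights; the function name and the assumption that the frame is readable as RGB values in [0, 1] are mine:

```python
# A sketch of the averaging described above: mean perceptual luminance
# of an RGB frame, weighting channels per Rec. 709 (0.2126 R + 0.7152 G
# + 0.0722 B). In Unity you would read the frame via Texture2D.ReadPixels
# (or a downscaled RenderTexture) rather than iterate a Python list.

def mean_luminance(pixels):
    """pixels: iterable of (r, g, b) tuples, channels in [0, 1].
    Returns the average relative luminance, also in [0, 1]."""
    total = count = 0
    for r, g, b in pixels:
        total += 0.2126 * r + 0.7152 * g + 0.0722 * b
        count += 1
    return total / count if count else 0.0

frame = [(1.0, 1.0, 1.0), (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(mean_luminance(frame))  # white + black + pure green
```

A common GPU-side shortcut for the same average is to blit the frame into a tiny (even 1x1) RenderTexture and read back the single pixel, letting the downsampling hardware do the summing.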
  23. How do I get the current timestamp from the eye tracker in Unity? @Corvus @Daniel_Y @Andy.YC_Wang
  24. Specifically, I need to know where my participants will be looking at a precise time, so the video and the samples from the EyeData object must be perfectly aligned in time. I am also wondering what the equation is to calculate where the participant is looking from the left and right gaze_origins and gaze_directions, or whether there is a better way to compute where the participant is looking in VR from those four sets of values. @Corvus @Daniel_Y
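On the geometry half of that question: I can't speak for an official SRanipal combined-gaze field, but one common way to turn the two per-eye rays into a single 3D gaze point is to find where the rays pass closest to each other and take the midpoint of that shortest segment. A Python sketch of that math; the function and variable names are mine, and it assumes both origins and directions are expressed in the same coordinate frame:

```python
# Closest-approach midpoint of two gaze rays o + t*d. o_l/d_l and
# o_r/d_r stand in for the per-eye origin/direction pairs from EyeData.
# Standard skew-lines formula; directions need not be unit length.

def gaze_convergence(o_l, d_l, o_r, d_r):
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def along(o, d, t): return tuple(x + t * y for x, y in zip(o, d))
    w = sub(o_l, o_r)
    a, b, c = dot(d_l, d_l), dot(d_l, d_r), dot(d_r, d_r)
    d, e = dot(d_l, w), dot(d_r, w)
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # near-parallel rays: no crossing,
        t_l, t_r = 0.0, e / c         # fall back to projecting o_l
    else:
        t_l = (b * e - c * d) / denom
        t_r = (a * e - b * d) / denom
    p_l, p_r = along(o_l, d_l, t_l), along(o_r, d_r, t_r)
    return tuple((x + y) / 2.0 for x, y in zip(p_l, p_r))

# Eyes 6 cm apart, both aimed at a point 1 m ahead: the recovered
# convergence point comes out at (0, 0, 1).
print(gaze_convergence((-0.03, 0, 0), (0.03, 0, 1),
                       (0.03, 0, 0), (-0.03, 0, 1)))
```

Note that with noisy tracking the two rays almost never intersect exactly, which is why the midpoint of the closest-approach segment (rather than a true intersection) is the usual choice.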
  25. I am investigating HMDs for a game soon to be in development, and I was hoping someone could fill me in on best practices for the Focus Plus units. Is there a functional difference between the developer kit version and the consumer model of the Focus Plus? This unit appears to be the old version of the Focus. Is there a dev kit for the Plus model? The units would be running a single application at any given time so I presume something akin to kiosk mode would be used, or are apps only intended to be loaded via Viveport? Thanks @Tony PH Lin @Cotta @JustinVive