Showing results for tags 'unity'.



Found 14 results

  1. I'm working on a Vive Focus Plus project in Unity, using Wave SDK v3.1.1 (Beta) and Unity 2018.3.10. I've been trying to get single-pass stereo rendering to work properly for a while and I keep running into the same issue. When I turn on SPS using the WaveVR settings dialog (see below), both passes are rendered side by side in the left eye of the Focus, and nothing appears in the right eye. It's as if what should be in the right eye had been shoved over into the left eye. Here are all the settings I think are relevant: When I don't enable the last WaveVR settings option (the Single Pass Stereo support one), the application runs fine. Both eyes render properly, but it's clearly using multi-pass rendering. How can I fix single-pass stereo rendering, and what am I doing wrong?
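A first diagnostic step (a minimal sketch; it assumes Unity 2018.3+, where UnityEngine.XR.XRSettings reports the stereo rendering mode the runtime actually selected, which can differ from the mode requested in the settings if the device rejects it):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: log at runtime which stereo rendering path Unity actually uses.
// Attach to any GameObject in the first scene and check the device log.
public class StereoPathProbe : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Eye texture: " + XRSettings.eyeTextureWidth + "x" + XRSettings.eyeTextureHeight);
        Debug.Log("Active stereo rendering mode: " + XRSettings.stereoRenderingMode);
    }
}
```

If the log reports a single-pass mode while the symptom persists, the problem is more likely in the WaveVR plugin's texture submission than in Unity's render path.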
  2. Hello! I am developing with Unity 2017.4.34f1 and the Vive Pro Eye. In the project I'm developing, scene transitions occur frequently, and each time an error ([SRanipal] Initial Eye: DEVICE_NOT_FOUND) occurs and the eye-tracking function stops responding. Please help if you know the cause of or a solution to this problem. Thank you! @Daniel_Y @zzy
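One thing worth trying (a hedged sketch, assuming the standard SRanipal Unity SDK where a per-scene SRanipal_Eye_Framework component drives initialization): keep a single framework object alive across scene loads instead of letting every scene tear down and re-initialize the eye tracker, since repeated re-initialization is a plausible trigger for DEVICE_NOT_FOUND.

```csharp
using UnityEngine;

// Sketch: attach to the same GameObject as SRanipal_Eye_Framework in the
// first scene, and remove the framework component from all other scenes,
// so the eye tracker is initialized exactly once for the app's lifetime.
public class PersistentEyeFramework : MonoBehaviour
{
    void Awake()
    {
        DontDestroyOnLoad(gameObject);
    }
}
```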
  3. Hey, I'm trying to get SRWorks working in Unity so I can access the front cameras on an HTC Vive Pro. After I import the SRWorks package, open the sample scene, and hit play, my Unity editor stops responding. Occasionally when I reopen the project there's an error suggesting a memory overflow (not to mention my memory usage rocketing to 96%, lol). I'm running Unity 2019.2.10f1. Has anyone seen this problem before, or does anyone know how I can solve it and get SRWorks working? Thank you @Daniel_Y @Corvus
  4. I am using an older version of Unity and the Wave SDK: Unity 2017.4.1 with Wave SDK 2.0.37. I converted the old project file to Unity 2018.4.11 and updated Wave SDK 2.0.37, but I get some errors. What is the simplest migration procedure? @Tony PH Lin
  5. Dear Vive forum, I am having a hell of a time trying to get my Vive Focus Plus to work with WaveVR in Unity 2019.1.0f2. Could we get an updated document on how to set up WaveVR, specifically on linking to our SDK and JDK? https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html The guide says that Android SDK 7.1.1 (API Level 25) or higher is required, as well as Android SDK Tools version 25 or higher. Since I am starting off fresh and am also new to Unity and Android development, I had neither NVIDIA CodeWorks nor Android Studio. When I download Android Studio, I go to the Android SDK section and select Android 7.1.1 (Nougat) (API Level 25), and I select Oreo as well, just because the picture shows it selected. Moving on to SDK tools, I choose Android SDK Tools (version 26.1.1), and from there it seems really self-explanatory so far. I complete all the build settings and move on to the External Tools section to set the Android SDK and JDK paths. I do not have the paths shown in the image in Vive's documentation. How do I know where my SDK and JDK locations are? When I Google "NVPACK," NVIDIA CodeWorks comes up. Do I also need to download that so I have those directories and tools? When I make an Android Studio project, I can go to "Project Structure" to see my SDK and JDK locations, but is that just for my newly created Android Studio project, or is it where the SDK and JDK live for use with everything else (including Unity)? In short, where are the Android SDK and JDK locations I need to link in the External Tools section of Unity's Preferences? I think this is what is causing me to not see the WaveVR simulator, to get "Connection Timeouts" when trying to PIE, and to have build failures. Any help would be appreciated. Thanks, Tyler @Tony PH Lin @Cotta
  6. Is there some way to get an image of the user's eye(s) and show it in a Unity app? @zzy @Corvus
  7. I'm working on an asymmetric two-player SteamVR experience in which one player uses an HMD while the other controls their experience with a mouse and a screen. The non-VR user is shown a complicated UI whose central frame is supposed to show the VR user's perspective. I know I can do this by: 1. Using a UI canvas in 'Screen Space - Overlay' mode and overlaying the UI on top of the regular non-VR display. This way, the regular VR feed shows through the parts of the UI that I leave transparent. However, that's not what I want: I want to see the VR user's full view in a scaled frame, without losing the peripheral elements that the UI would cover in this case. 2. Using an additional camera that is parented to the Head object and renders to a RenderTexture, which is then displayed in a RawImage UI element. This works, but it is very costly to do in good quality. 3. Using a post-process effect to blit the VR user's feed into a RenderTexture. This works too, but the image is fish-eye distorted (normally corrected later by the HMD's lens). I could use a shader to undistort it, which is currently my best bet, but I'm wondering: isn't there a simpler and less expensive way? For example, do you know a way to access the final texture displayed as the VR user's stream on 2D screens? What's behind my UI in option 1 is all I need, if I can only scale it to show fully in a UI element.
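For reference, option 2 above can be sketched like this (a minimal sketch; the field names, the 1280x720 resolution, and the RawImage wiring are illustrative assumptions, not a recommendation of this option over the others):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of option 2: a second camera parented under the HMD Head object
// renders the VR user's view into a RenderTexture, which a RawImage then
// displays inside the non-VR player's UI. The resolution chosen trades
// rendering cost against image quality.
public class VrViewMirror : MonoBehaviour
{
    public Camera mirrorCamera; // child of the Head object
    public RawImage uiTarget;   // central frame in the non-VR user's canvas

    void Start()
    {
        var rt = new RenderTexture(1280, 720, 24);
        mirrorCamera.stereoTargetEye = StereoTargetEyeMask.None; // render as a plain mono camera
        mirrorCamera.targetTexture = rt;
        uiTarget.texture = rt;
    }
}
```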
  8. The most recent Wave SDK is version 3.14, released October 16, 2019: https://developer.vive.com/resources/knowledgebase/wave-sdk/ The release notes and the latest development guides are available at https://hub.vive.com/en-US/profile/documents

     Release Notes, Wave SDK 3.14

     SDK Plugin Recommended Versions:
       • Unity 2017.4 LTS - see https://hub.vive.com/storage/app/doc/en-us/UnitySdk.html
       • Unreal 4.22 - see https://hub.vive.com/storage/app/doc/en-us/UnrealPlugin/UnrealPlugin.html

     SDK
     Changes
       • Improved the performance of system recording for device makers.
     Bug fixes
       • Fixed bugs in Adaptive Quality.
       • Fixed input device mapping failure when the input device does not support volume up/down. wave_server.apk must be updated to resolve this issue.
       • Fixed bugs which might present the right-eye image to both eyes when applying a partial texture with different left/right eye UV values.
     Known Issues
       (none listed)

     Plugin kit
     SDK Bug fixes
       • Enhanced device config data (disableVirtualDpad) so that this configuration works even if you did not use setVirtualDpadSupport to enable disableVirtualDpad.
     Unity Changes
       • Removed emulation switching in the DirectPreview options.
       • Logs from the DirectPreview server are no longer exported to a file.
       • Set the default Auto Graphics API to false and the Graphics APIs to OpenGLES3.
       • Improved DirectPreview stability.
     Unity Bug fixes
       • Fixed the foveation function, which was not working in IL2CPP builds.
     Unreal Bug fixes
       • Fixed "The battery did not show when the controller is in left-handed mode".
       • Fixed a Z-fighting problem by following the NearClippingPlane value in Project Settings.
       • Fixed Unreal presenting the right-eye image to both eyes when the MultiView feature is disabled.
     Unreal Known Issues
       • IME is not supported in the current version.

     See the Release Notes for more information on the Unity and Unreal Engine plugins as well as former SDK versions: https://hub.vive.com/en-US/profile/documents
  9. @Corvus Is it possible to GET calibration data as a byte array or JSON (or any other data format) and SET it for different users via a Unity app?
  10. I'm trying to show two kinds of feeds (an external camera/webcam, or the front camera on the HMD) in the lenses. I found the "WebCamDevice" class (https://docs.unity3d.com/2020.1/Documentation/ScriptReference/WebCamDevice.html) and also a project that uses it on Oculus VR, and I've been imitating that project to study how to use it. I thought the front camera, being connected to the PC, might be captured as a webcam, but it isn't. How can I project video from the front camera as a texture on an object? Is WebCamTexture the right approach? Here is an example of what I want to do: the two Main Cameras represent the two lenses for stereoscopy, and the two quads are screens for the video (from a webcam or the front camera). I'm a beginner in VR and Unity, so I may be thinking about this wrong. I'd appreciate your wisdom and know-how! Thanks, developers! Here is the code I found:

-------------------------------------------------------------------------------------------------------------------------------------------------

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class WebcamTexture : MonoBehaviour
{
    WebCamTexture webcamTexture;
    public int webcamNumber;
    float cameraAspect;
    private float margin = 0f;
    public float scaleFactor = 1f;
    public bool rotatePlane = false;
    public bool fitting = false;
    //Transform cameraOVR;
    //public static Vector3 cameraPos;

    // Use this for initialization
    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        webcamTexture = new WebCamTexture();
        if (devices.Length > 0)
        {
            // Pick the requested device and feed it into this object's material.
            webcamTexture.deviceName = devices[webcamNumber].name;
            Renderer renderer = GetComponent<Renderer>();
            renderer.material.mainTexture = webcamTexture;
            webcamTexture.Play();
        }
        if (rotatePlane) transform.Rotate(Vector3.forward, 180);
        if (fitting) FitScreen();
        // camera position
        //cameraOVR = GameObject.Find("OVRCameraController") as Transform;
    }

    // Scale the plane so the webcam image fills the parent camera's view.
    void FitScreen()
    {
        Camera cam = transform.parent.GetComponent<Camera>();
        float height = cam.orthographicSize * 2.0f;
        float width = height * Screen.width / Screen.height;
        float fix = 0;
        if (width > height) fix = width + margin;
        if (width < height) fix = height + margin;
        transform.localScale = new Vector3((fix / scaleFactor) * 4 / 3, fix / scaleFactor, 0.1f);
    }

    // Update is called once per frame
    void Update()
    {
        //cameraPos = cameraOVR.position;
        //print(cameraPos);
    }
}

@Corvus @Daniel_Y
  11. Hello! I'm following the getting-started guide: https://vr.tobii.com/sdk/develop/unity/getting-started/vive-pro-eye/ I'm stuck on step 5. I set the standalone provider to "Tobii or HTC" and get these errors: Assets\TobiiXR\External\Providers\TobiiHTC\TobiiHTCProvider.cs(20,44): error CS0117: 'SRanipal_Eye' does not contain a definition for 'IsViveProEye' Assets\TobiiXR\External\Providers\HTC\HTCProvider.cs(56,54): error CS0117: 'Convergence' does not contain a definition for 'CalculateDistance' I use Unity 2019.1.14f1. Any suggestions?
  12. The SDK contains methods to check the eyes' blinking state. Is there some way to track each eye's position separately? For now, the result is combined. @Corvus
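If it helps, per-eye gaze data is exposed in the SRanipal Unity SDK (a hedged sketch; exact method signatures vary between SRanipal SDK versions, so verify against the version you have installed):

```csharp
using UnityEngine;
using ViveSR.anipal.Eye;

// Sketch: query the left and right gaze rays separately via GazeIndex
// instead of the combined result. EyeData also carries per-eye openness
// and pupil values if you need more than the gaze direction.
public class PerEyeGaze : MonoBehaviour
{
    void Update()
    {
        if (SRanipal_Eye.GetGazeRay(GazeIndex.LEFT, out Ray leftRay))
            Debug.Log("Left gaze: " + leftRay.origin + " -> " + leftRay.direction);
        if (SRanipal_Eye.GetGazeRay(GazeIndex.RIGHT, out Ray rightRay))
            Debug.Log("Right gaze: " + rightRay.origin + " -> " + rightRay.direction);
    }
}
```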
  13. Hi everyone, last week I bought an HTC Vive Focus Plus with the Vive Enterprise Advantage for Focus Plus. To test it, I downloaded the WaveVR SDK with sample scenes into Unity, but I cannot build or export the example to my HTC Vive Focus Plus. Can you help me solve this problem, please? @Cotta @Tony PH Lin
  14. Hi, everyone. Today I started making VR content for the Vive Focus in Unity. I can build now, but this window pops up every time. How can I solve this?