Search the Community

Showing results for tags 'unity'.

  1. Unity version: 2020.1.17f1. URP: 8.3.1, Single Pass. Vive Wave XR plugin: I've attempted all of them - 4.0.0, r28, r39 and 4.1.0-preview 5.1. Vive system version: 3.0.999.124. Wave SDK version: 4.1.0-u4.

     Everything now renders pink because shaders have been removed. Deployment to the device caused no issues until Monday afternoon (I had probably deployed the .apk to the device over 20 times without issue, and shaders/materials rendered properly). To my knowledge, I made no changes to any graphics settings, shader stripping, etc.

     2021/07/21 15:16:38.725 10355 10393 Error Unity Unable to find libGfxWXRUnity
     2021/07/21 15:16:44.781 10355 10393 Debug Unity Shader 'Universal Render Pipeline/Particles/Lit': fallback shader 'Universal Render Pipeline/Particles/SimpleLit' not found
     2021/07/21 15:16:44.781 10355 10393 Debug Unity WARNING: Shader
     2021/07/21 15:16:44.781 10355 10393 Debug Unity Unsupported: 'Universal Render Pipeline/Particles/Lit' - All subshaders removed
     2021/07/21 15:16:44.781 10355 10393 Debug Unity WARNING: Shader
     2021/07/21 15:16:44.781 10355 10393 Debug Unity Did you use #pragma only_renderers and omit this platform?
     2021/07/21 15:16:44.781 10355 10393 Debug Unity WARNING: Shader
     2021/07/21 15:16:44.781 10355 10393 Debug Unity If subshaders removal was intentional, you may have forgotten turning Fallback off?
     2021/07/21 15:16:44.781 10355 10393 Debug Unity WARNING: Shader
     2021/07/21 15:16:44.781 10355 10393 Debug Unity Unsupported: 'Universal Render Pipeline/Particles/Lit' - All subshaders removed
     2021/07/21 15:16:44.781 10355 10393 Debug Unity WARNING: Shader

     Unfortunately, the file upload is not working. Graphics settings screenshots are linked below:
     forward renderer: https://www.dropbox.com/s/kpervnhrciptb5b/forward_renderer.png?dl=0
     graphics: https://www.dropbox.com/s/nt39gci0cjfog8r/graphicssettings.png?dl=0
     urp: https://www.dropbox.com/s/4lpwx04quj98jlp/urp-renderer.png?dl=0
     wave sdk: https://www.dropbox.com/s/ys6c6yjpibwslu4/wavesdk_settings.png?dl=0
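     Not from the original post: a minimal diagnostic sketch, assuming the pink materials come from the URP shaders named in the logcat output being stripped from the Android build. It only checks at runtime whether those shaders are still present; if they are missing, adding them to Project Settings > Graphics > Always Included Shaders (or a Shader Variant Collection) is the usual way to force them into the build.

         using UnityEngine;

         // Logs whether the URP shaders mentioned in the logcat excerpt above
         // survived the build. Attach to any GameObject in the first scene.
         public class ShaderStrippingCheck : MonoBehaviour
         {
             // Shader names taken from the warnings in the log above.
             private static readonly string[] ShaderNames =
             {
                 "Universal Render Pipeline/Particles/Lit",
                 "Universal Render Pipeline/Particles/SimpleLit"
             };

             void Start()
             {
                 foreach (var name in ShaderNames)
                 {
                     Shader shader = Shader.Find(name);
                     Debug.Log(shader != null
                         ? "Shader present in build: " + name
                         : "Shader missing (stripped?): " + name);
                 }
             }
         }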
  2. Hi all, I'm trying to integrate the Wave VR SDK 4.1.0 for Unity using the Vive Wave XR Plugin. I'm using Unity 2021.1.12f and was following this guide: https://hub.vive.com/storage/docs/en-us/UnityXR/UnityXRGettingStart.html

     My problem is that after I add the scoped registry (both manually and using the VIVE Registry Tool), I get these errors in Unity:

     [Package Manager Window] Cannot perform upm operation: Unable to perform online search: Request [GET https://npm-registry.vive.com/com.htc.upm.preview.a.b.c] failed because it lacks valid authentication credentials [Unknown]. UnityEditor.EditorApplication:Internal_CallUpdateFunctions ()
     [Package Manager Window] Error searching for packages. UnityEditor.EditorApplication:Internal_CallUpdateFunctions ()

     I searched for what this means and found this topic: https://forum.unity.com/threads/npm-registry-authentication.836308/ but it requires logging in with credentials. So here are my questions: Is it normal that I have to do this authorization manually, or is there some other way? Where can I get this login and password? Do I have to create some developer account and use that? Thanks in advance for all the help. @Tony PH Lin
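     Not from the original post: for reference, a sketch of how the scoped registry usually ends up in Packages/manifest.json when following the linked guide. The registry name and the "com.htc.upm" scope are assumptions here and should be checked against the guide; this block sits next to the existing "dependencies" block, and only packages whose names fall under the listed scopes are searched through this registry.

         {
           "scopedRegistries": [
             {
               "name": "VIVE",
               "url": "https://npm-registry.vive.com",
               "scopes": [
                 "com.htc.upm"
               ]
             }
           ]
         }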
  3. Hi, I am using SRanipal to collect and process gaze data, and at several points in the process I have to use gaze rays, which I get with the provided method SRanipal_Eye_v2.GetGazeRay. Although this method returns the origin of the ray, I have seen that several classes in the SDK use Camera.main.transform.position or Camera.main.transform.position - Camera.main.transform.up * 0.05f instead. I tried comparing both, and the results are very similar but not exactly the same (in this example I simply get the gaze ray for the combined eyes, raycast it in my scene, and display the hits). I am therefore wondering which value should be used for optimal precision, as this is used in scientific research and such an offset has a large impact on the results. I look forward to hearing back from you. Regards, Vivien
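     Not from the original post: a minimal sketch of casting the combined gaze ray, assuming (as the SDK samples do) that GetGazeRay returns the origin and direction in the camera's local space, so both must be transformed into world space before raycasting. Which origin is more precise for research use is exactly the open question above.

         using UnityEngine;
         using ViveSR.anipal.Eye;

         // Casts the combined gaze ray in world space and logs what it hits.
         public class GazeRaycaster : MonoBehaviour
         {
             void Update()
             {
                 if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out Vector3 localOrigin, out Vector3 localDirection))
                 {
                     Transform cam = Camera.main.transform;
                     Vector3 worldOrigin = cam.TransformPoint(localOrigin);
                     Vector3 worldDirection = cam.TransformDirection(localDirection);

                     if (Physics.Raycast(worldOrigin, worldDirection, out RaycastHit hit, 100f))
                     {
                         Debug.Log("Gaze hit " + hit.collider.name + " at " + hit.point);
                     }
                 }
             }
         }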
  4. Vive Cosmos OpenXR Feature package for Unity. Unity has released a preview of their OpenXR plugin, available with Unity 2020.2: https://forum.unity.com/threads/unity-support-for-openxr-in-preview.1023613/

     After installing the Unity OpenXR plugin (see link above), you can add the HTC Vive Cosmos Controller Support Feature to the OpenXR Features as shown. You can download the HTC Vive Cosmos Controller package preview as a tarball file here: com.htc.upm.vive.openxr.controllers-0.1.0-preview.4.tgz

     UPDATE (preview 4): Add common usage "gripButton" for grip-pressed input. Add common usage "triggerButton" for trigger-pressed input.

     Use the Package Manager to install it as a tarball, or, if you run into any issue, unzip it, use the "from disk" option and point to the json file. Unity includes several samples with their OpenXR package, so be sure to install the Controller Sample. This sample visually displays the controller's inputs - almost all of them, because the Cosmos controller has an additional "bumper" button next to the trigger button.

     As shown, you can run in the editor, connecting to the OpenXR PC runtime. In this case it's the "default" Vive Cosmos OpenXR runtime. To set it as the default, select it as the default when you first run the runtime from the Vive Console; this sets the default Windows registry values. To install the Vive Cosmos OpenXR runtime please see: https://forum.vive.com/forum/92-vive-openxr/

     As you may have noticed above, I've added an indicator for the bumper button status (green when pressed) for both left- and right-hand controllers in this sample. You can add them with the following steps.
     1) Once you've enabled the feature (first screenshot above), edit the Controller sample to include the additional indicator by duplicating the Trigger Press visual element, renaming it "Bumper Press", and moving it slightly to the right of Trigger Press (remember to do likewise for RightHand).
     2) To edit the action binding for this new indicator, first double-click the Action Reference field of the Action to Button ISX script component in the inspector as shown above. Once the Input Actions window opens, add the new actions and bind them to the appropriate binding path that the Vive Cosmos Controllers feature provides, and while this window is open also add one for the right-hand controller.

     And with that you can start using all the Vive Cosmos controller inputs in your OpenXR applications! Build available here.
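     Not from the original post: a minimal sketch of reading the two common usages added in preview 4 ("gripButton" and "triggerButton") through the Input System from script, without going through the sample's action asset. The exact binding paths are assumptions and should match what the Cosmos controller feature shows in the binding picker; the bumper's path is not shown here because it comes from the feature's own interaction profile.

         using UnityEngine;
         using UnityEngine.InputSystem;

         // Reads grip-pressed and trigger-pressed through the common usages.
         public class CosmosButtonReader : MonoBehaviour
         {
             private InputAction gripPressed;
             private InputAction triggerPressed;

             void OnEnable()
             {
                 gripPressed = new InputAction(type: InputActionType.Button,
                     binding: "<XRController>{RightHand}/{gripButton}");
                 triggerPressed = new InputAction(type: InputActionType.Button,
                     binding: "<XRController>{RightHand}/{triggerButton}");
                 gripPressed.Enable();
                 triggerPressed.Enable();
             }

             void Update()
             {
                 if (gripPressed.triggered) Debug.Log("Grip pressed");
                 if (triggerPressed.triggered) Debug.Log("Trigger pressed");
             }

             void OnDisable()
             {
                 gripPressed.Disable();
                 triggerPressed.Disable();
             }
         }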
  5. I'm trying to display a UI element at the position of the gaze collision. I found out that just using Camera.main.WorldToScreenPoint(collision.position, Camera.MonoOrStereoscopicEye.Left); does not fully work for the "LeftEye" display - there seems to be some screen cropping going on (about 15% on each side). Does anybody have a way to reliably obtain a screen point from a world point in SteamVR in Unity, please? @Daniel_Y @Corvus
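     Not from the original post: a sketch of projecting a world point manually with the left eye's stereo view/projection matrices instead of WorldToScreenPoint. Note the result is in pixels of the eye render texture, which SteamVR typically makes larger than the visible area (and masks its edges), so some apparent "cropping" relative to the headset display is expected; mapping onto visible-display coordinates is an assumption you would still need to calibrate.

         using UnityEngine;

         // Projects a world-space point into left-eye render-target pixel coordinates.
         public static class StereoProjection
         {
             public static Vector3 WorldToLeftEyePoint(Camera cam, Vector3 worldPos)
             {
                 Matrix4x4 view = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Left);
                 Matrix4x4 proj = cam.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left);

                 Vector4 clip = proj * view * new Vector4(worldPos.x, worldPos.y, worldPos.z, 1f);
                 if (Mathf.Approximately(clip.w, 0f)) return Vector3.zero;

                 // Perspective divide gives normalized device coordinates in [-1, 1].
                 Vector3 ndc = new Vector3(clip.x, clip.y, clip.z) / clip.w;

                 // Map NDC to pixel coordinates of the eye texture.
                 float x = (ndc.x * 0.5f + 0.5f) * cam.pixelWidth;
                 float y = (ndc.y * 0.5f + 0.5f) * cam.pixelHeight;
                 return new Vector3(x, y, clip.w);
             }
         }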
  6. I'm working with Unity 2019.4 and the Vive Pro Eye on the Universal Render Pipeline (URP). Is there any way to enable foveated rendering in this configuration? The published asset only works on the old (built-in) pipeline. Thanks! @Corvus @Tony PH Lin
  7. Hi, I've been building an app for the Vive Focus Plus. Occasionally, when running the app, tracking just doesn't start. I can see the controllers working, but the headset is basically stuck at the floor (looking like the (0, 0, 0) position). I'm using the XR Interaction Toolkit plugin with XR Plug-in Management to build for Wave. I have to quit, put the headset to sleep, and then put it back on again for it to reinitialize. Any thoughts about what could be causing this? Thanks.
  8. How do I get the current timestamp from the eye tracker in Unity? @Corvus @Daniel_Y @Andy.YC_Wang
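     Not from the original post: a minimal sketch, assuming the SRanipal Unity SDK's EyeData_v2 struct is used - it carries a timestamp field (reported by the runtime in milliseconds) and a frame_sequence counter alongside the gaze data.

         using UnityEngine;
         using ViveSR.anipal.Eye;

         // Reads one EyeData_v2 sample per frame and logs the device timestamp.
         public class EyeTimestampLogger : MonoBehaviour
         {
             private EyeData_v2 eyeData = new EyeData_v2();

             void Update()
             {
                 ViveSR.Error error = SRanipal_Eye_API.GetEyeData_v2(ref eyeData);
                 if (error == ViveSR.Error.WORK)
                 {
                     Debug.Log("frame " + eyeData.frame_sequence + ", timestamp " + eyeData.timestamp + " ms");
                 }
             }
         }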
  9. Hello everyone, I'm writing this post to share the issues I ran into over the past couple of weeks and the conclusions my company and I reached to fix them. First, I'd like to specify that we work on a VR solution involving three trackers (two of them with pogo pin inputs) and a Leap Motion, in Unity. We recently ran into a series of problems:
     - First, we couldn't connect some trackers in our app, even though they were connected just fine in SteamVR.
     - After I fixed that, we had "random" trouble with the orientation of trackers (apparently a (90, 0, 0) rotation).

     We read many posts describing the same problem, but I figured out a piece of information I never found on the forums, so I thought I might as well share it.

     So here's the good stuff: as you can see, there is currently a problem with the orientation of trackers marked as "Held in Hand". It is not a problem if you don't need to retrieve inputs from the pogo pins, as you can simply mark them with any other role and never see the issue again. We also read that, supposedly, you can retrieve pogo pin inputs from any role since SteamVR 1.16.8, but it's not working for us (we are currently on SteamVR 1.16.10). This was more a workaround than a fix, a way to avoid running into the orientation problems. But let's face it, it's not a real, solid fix (small reminder: some people actually use the HTC Vive and trackers for serious business reasons, and our work and income can be directly impacted if SteamVR isn't working).

     It also turns out that our app had this little line of code that checked whether the connected device was a tracker:

         if (deviceClass == ETrackedDeviceClass.GenericTracker)

     Here's the spicy stuff: it is this line that caused our first issue, the tracker connection problem. And the problem appeared ONLY when the tracker was set to "Held in Hand". You see where I'm going? It appears, for some reason, that SteamVR tells Unity that the last connected device is a normal controller. That shouldn't be a problem, since normal controllers and tracker pogo pins share the same input identification, so your custom controller's buttons should work as intended. But you know what trackers and controllers do NOT share? Yes, their orientation in the virtual world.

     In conclusion, we think (and I can't stress this enough, since it's a random problem, so you can never be sure whether you fixed it or are just extremely lucky) that if your tracker is set to "Held in Hand", SteamVR will tell Unity that you just connected a normal controller (even though in the SteamVR window you see a tracker), so Unity will display it with the orientation of a controller, which is different from that of a tracker (why? nobody knows; they could very well share the same orientation, but you know...). So our fix for now is this:

         if (deviceClass == ETrackedDeviceClass.GenericTracker || deviceClass == ETrackedDeviceClass.Controller)
         {
             if (deviceClass == ETrackedDeviceClass.Controller)
             {
                 // Change the tracked object's orientation with a (90, 0, 0) rotation
             }
         }

     I hope this will help some of you, and that the Vive developers can fix this problem soon. Sincerely, Syko
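     Not from the original post: a sketch of what the workaround above could look like in full, assuming the OpenVR C# bindings (Valve.VR) and that the tracked pose is written to the object earlier in the same frame; the (90, 0, 0) correction value is taken directly from the post and may need adjusting per setup.

         using UnityEngine;
         using Valve.VR;

         // If SteamVR reports a "Held in Hand" tracker as a Controller, apply the
         // pitch correction described above to the object driven by that device.
         public class TrackerOrientationFix : MonoBehaviour
         {
             public uint deviceIndex;        // OpenVR index of the tracker
             public Transform trackedObject; // object driven by that device's pose

             private static readonly Quaternion controllerToTrackerFix = Quaternion.Euler(90f, 0f, 0f);

             void LateUpdate()
             {
                 if (OpenVR.System == null || trackedObject == null) return;

                 ETrackedDeviceClass deviceClass = OpenVR.System.GetTrackedDeviceClass(deviceIndex);
                 bool isTrackerLike = deviceClass == ETrackedDeviceClass.GenericTracker
                                   || deviceClass == ETrackedDeviceClass.Controller;
                 if (!isTrackerLike) return;

                 if (deviceClass == ETrackedDeviceClass.Controller)
                 {
                     // Runs after the pose driver has updated the transform this frame.
                     trackedObject.localRotation = trackedObject.localRotation * controllerToTrackerFix;
                 }
             }
         }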
  10. Hi everyone, I'm struggling to reach the specified sampling rate of 120 Hz (8.33 ms) on a VIVE Pro Eye HMD. We use the SRanipal eye callback SRanipal_Eye_v2.WrapperRegisterEyeDataCallback() in a script derived from MonoBehaviour. The registered callback is only called every 14-16 ms, which leads to approximately 62 Hz - way below the targeted 120 Hz.

     I think the PC specs are quite decent and should allow for 120 Hz sampling:
     - Windows 10 Pro
     - Intel i7-10750H (specs can be found here)
     - 32 GB RAM
     - GeForce RTX 2070 with Max-Q Design

     The following tool versions are used:
     - SRanipal SDK & Runtime 1.3.1.1
     - Unity 2019.4.18f1

     Please note that I am aware of these threads and articles, but did not find an explanation/solution that fits my case:
     - Getting VerboseData at the fastest rate possible. - Vive Eye Tracking SDK - Community Forum
     - Assessing Saccadic Eye Movements With Head-Mounted Display Virtual Reality Technology (nih.gov)

     Many thanks already, Scot
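     Not from the original post: a sketch of the callback registration pattern along the lines of the SRanipal samples, included to rule out per-frame polling as the cause. The callback runs on SRanipal's own thread, so its rate is set by the runtime and headset rather than Unity's frame rate; if this pattern still yields ~62 Hz, the limit is on the runtime side.

         using System.Runtime.InteropServices;
         using UnityEngine;
         using ViveSR.anipal.Eye;

         // Registers the eye-data callback so samples arrive on SRanipal's thread.
         public class EyeCallbackRegistration : MonoBehaviour
         {
             private static EyeData_v2 eyeData = new EyeData_v2();
             private static bool callbackRegistered = false;

             void Update()
             {
                 if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING) return;

                 if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && !callbackRegistered)
                 {
                     SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
                         Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                     callbackRegistered = true;
                 }
             }

             // Runs on a non-Unity thread; avoid touching Unity objects here.
             private static void EyeCallback(ref EyeData_v2 eye_data)
             {
                 eyeData = eye_data;
                 // eye_data.timestamp can be used to measure the actual sampling interval.
             }
         }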
  11. Hello, I'm using the new Input System with the Oculus XR Plugin and the OpenXR Plugin. I set XR Plug-in Management to use OpenXR and to have HTC Vive, Oculus Touch and Valve Index inputs. After saving an input action and pressing Play in the editor, the Vive Controller disappears from the Input Actions. The actions already linked are still there, but I can't add new OpenXR input types. I don't know if this is a known issue. Is there a fix for this?

     Unity version: 2020.2.2f1

     manifest.json of the project:

         {
           "dependencies": {
             "com.unity.2d.sprite": "1.0.0",
             "com.unity.analytics": "3.5.3",
             "com.unity.ide.visualstudio": "2.0.5",
             "com.unity.ide.vscode": "1.2.3",
             "com.unity.inputsystem": "1.1.0-preview.2",
             "com.unity.mobile.android-logcat": "1.1.1",
             "com.unity.nuget.newtonsoft-json": "2.0.0",
             "com.unity.probuilder": "4.4.0",
             "com.unity.progrids": "3.0.3-preview.6",
             "com.unity.settings-manager": "1.0.3",
             "com.unity.terrain-tools": "3.0.2-preview.3",
             "com.unity.test-framework": "1.1.22",
             "com.unity.textmeshpro": "3.0.3",
             "com.unity.timeline": "1.4.6",
             "com.unity.ugui": "1.0.0",
             "com.unity.xr.interaction.toolkit": "1.0.0-pre.2",
             "com.unity.xr.legacyinputhelpers": "2.1.7",
             "com.unity.xr.management": "4.0.0-pre.3",
             "com.unity.xr.oculus": "1.7.0",
             "com.unity.xr.openxr": "0.1.2-preview.2",
             "com.unity.modules.ai": "1.0.0",
             "com.unity.modules.androidjni": "1.0.0",
             "com.unity.modules.animation": "1.0.0",
             "com.unity.modules.assetbundle": "1.0.0",
             "com.unity.modules.audio": "1.0.0",
             "com.unity.modules.cloth": "1.0.0",
             "com.unity.modules.director": "1.0.0",
             "com.unity.modules.imageconversion": "1.0.0",
             "com.unity.modules.imgui": "1.0.0",
             "com.unity.modules.jsonserialize": "1.0.0",
             "com.unity.modules.particlesystem": "1.0.0",
             "com.unity.modules.physics": "1.0.0",
             "com.unity.modules.physics2d": "1.0.0",
             "com.unity.modules.screencapture": "1.0.0",
             "com.unity.modules.terrain": "1.0.0",
             "com.unity.modules.terrainphysics": "1.0.0",
             "com.unity.modules.tilemap": "1.0.0",
             "com.unity.modules.ui": "1.0.0",
             "com.unity.modules.uielements": "1.0.0",
             "com.unity.modules.umbra": "1.0.0",
             "com.unity.modules.unityanalytics": "1.0.0",
             "com.unity.modules.unitywebrequest": "1.0.0",
             "com.unity.modules.unitywebrequestassetbundle": "1.0.0",
             "com.unity.modules.unitywebrequestaudio": "1.0.0",
             "com.unity.modules.unitywebrequesttexture": "1.0.0",
             "com.unity.modules.unitywebrequestwww": "1.0.0",
             "com.unity.modules.vehicles": "1.0.0",
             "com.unity.modules.video": "1.0.0",
             "com.unity.modules.vr": "1.0.0",
             "com.unity.modules.wind": "1.0.0",
             "com.unity.modules.xr": "1.0.0"
           }
         }
  12. Greetings. I'm currently having an issue where my app crashes whenever my client uses it on his side. I tested it and cast from my side, and there are no crashes every time. But when my client uses it, it crashes, especially while casting via Windows Connect on his PC with a wireless display adapter. My client's PC hardware specs are: Lenovo P330, 8th-gen i7, 16 GB memory, Leadtek P400 display card with 2 GB memory. If anyone has had this kind of experience, please share your solutions. Thank you.
  13. Is there some way to get the image of the user's eye(s) and show it in a Unity app? @zzy @Corvus
  14. Hi! I hope you can see my inquiry. I'm using Unity 2018.4.30f, SRWorks 0.9.7.1, and SteamVR 1.15.19. I followed the steps as @Tony PH Lin mentioned: 1. import SRWorks into Unity; 2. restart Unity; 3. import SteamVR; 4. import the SRexample; 5. run the demo. But it still does not work. Here is the problem:

     NullReferenceException: Object reference not set to an instance of an object
     Vive.Plugin.SR.Experience.ViveSR_Experience_Initialization.CheckCurrentHMDDevice (System.Action done) (at Assets/ViveSR_Experience/Scripts/ViveSR_Experience_Initialization.cs:90)
     Vive.Plugin.SR.Experience.ViveSR_Experience_Initialization.<CheckBasicStatus>b__3_1 () (at Assets/ViveSR_Experience/Scripts/ViveSR_Experience_Initialization.cs:24)
     ViveSR_Experience_ActionSequence+<StartActionSequence>d__8.MoveNext () (at Assets/ViveSR_Experience/Scripts/ChairSegmentation/ViveSR_Experience_ActionSequence.cs:47)
     UnityEngine.SetupCoroutine.InvokeMoveNext (System.Collections.IEnumerator enumerator, System.IntPtr returnValueAddress) (at C:/buildslave/unity/build/Runtime/Export/Coroutines.cs:17)

     I want to know what happened and how to solve it. Waiting for a reply. @Daniel_Y @Tony PH Lin
  15. Hello, I'm using the Vive Pro Eye headset with Unity and the SRWorks SDK for an augmented reality project. I would like to get the video stream seen through the front-facing cameras and then apply some processing technique to it, to present a modified view of the real world in real time. Do you know where I can get the video stream and how I can use it? I have already read the documentation provided with SRWorks and the previous topics, but I didn't find a solution. Thank you!
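     Not from the original post: a minimal sketch of the processing side only, assuming you already have the camera texture from SRWorks (the hypothetical cameraTexture reference below); how to obtain that texture per frame depends on the SRWorks image-capture API and isn't shown here.

         using UnityEngine;

         // Applies a processing material to an incoming camera texture each frame and
         // exposes the processed result for display (e.g. on a quad in front of the user).
         public class PassthroughProcessor : MonoBehaviour
         {
             public Texture cameraTexture;       // hypothetical: the SRWorks camera texture
             public Material processingMaterial; // any image-effect shader (edge detection, etc.)
             public RenderTexture processed;

             void Update()
             {
                 if (cameraTexture == null) return;

                 if (processed == null)
                     processed = new RenderTexture(cameraTexture.width, cameraTexture.height, 0);

                 // Runs processingMaterial's shader over the camera image on the GPU.
                 Graphics.Blit(cameraTexture, processed, processingMaterial);
             }
         }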
  16. Hello, I am a software engineer at NASA's Johnson Space Center, and I have some questions regarding the Vive's software interface. The objective of my team is to provide a virtual reality teleoperation interface for the Valkyrie robot, using two cameras to provide stereoscopic vision to somebody using the Manus VR gloves and HTC Vive Trackers to manipulate the robot's arms, so that they can see what they're doing. We've been using Unity to render the incoming camera images onto planes positioned in front of the cameras, whose resulting views are then put onto the Vive screens through the black-boxy SteamVR array of scripts. However, there are YouTube videos and other records of people putting incoming images onto the VIVE screens directly, without resorting to the roundabout method I'm using now. I would like to know if any of you fine folks know how to access the VIVE's two screens and render JPEG images onto them. Please don't hesitate to ask for clarifications about this question! @Jad
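     Not from the original post: a small sketch of the Unity-side JPEG decoding, assuming frames arrive as raw JPEG byte arrays. Displaying the decoded textures still goes through per-eye cameras or a compositor overlay rather than writing to the panels directly, which is the part the question is really asking about.

         using UnityEngine;

         // Decodes an incoming JPEG frame into a texture that per-eye quads (or an
         // OpenVR overlay) can display. One instance per camera/eye.
         public class JpegFrameReceiver : MonoBehaviour
         {
             public Renderer targetQuad;   // quad parented in front of one eye camera
             private Texture2D frameTexture;

             public void OnJpegReceived(byte[] jpegBytes)
             {
                 if (frameTexture == null)
                     frameTexture = new Texture2D(2, 2, TextureFormat.RGB24, false);

                 // LoadImage decodes the JPEG and resizes the texture to match it.
                 if (frameTexture.LoadImage(jpegBytes))
                     targetQuad.material.mainTexture = frameTexture;
             }
         }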
  17. Hi, I just (auto-)updated to SRanipal 1.3.0.9 and it seems that EyeCalibration.exe is no longer working properly. The application does launch but does not continue once the loading message has disappeared. I tried to launch EyeCalibration from within Unity (the EyeSample scene from the latest SRanipal Unity Plugin package) and directly (as administrator). I tried this on two systems without any results. I did notice that SRanipal has moved from C:\Program Files (x86)\VIVE\SRanipal\ to C:\Program Files\VIVE\SRanipal\, which suggests that the contents have been rebuilt for x64. Has anyone else experienced this, or does anyone know if this is a known bug? A solution/workaround would be greatly appreciated. @Corvus Thanks!
  18. Hello! I'm working on a Vive Focus Plus project using Unity 2018.3. We're trying to build a screen-sharing feature into our VR application. Currently, we do this by getting the active render texture off the main camera and using Graphics.Blit to copy its contents to a different, equally sized render texture. That all happens in a Camera.onPostRender callback. Later, we stream the contents of that second texture to the receiver. I'd like the second texture (the one that gets copied to) to be smaller than the main camera texture so we can save on texture memory, bandwidth, frame time, etc. However, if I try to blit the main camera texture to a smaller texture, the smaller texture always ends up black. I've tried changing the filter modes on both textures and got similar results. Is blitting to a different-sized texture simply not supported on the Vive Focus, or am I missing something?
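     Not from the original post: a sketch of the downscaled copy described above; the sizes and format below are assumptions. Graphics.Blit itself handles differently sized textures (it simply stretches the source over the destination), so the black output is more likely about the source - for example, if the eye buffer is multisampled or a texture array under single-pass rendering it may not be directly blittable on the device, and resolving into a non-MSAA intermediate first is one thing to try.

         using UnityEngine;

         // Copies the main camera's output into a smaller render texture after each
         // frame, for streaming; mirrors the onPostRender approach described above.
         public class DownscaledFrameCopier : MonoBehaviour
         {
             public int width = 640;
             public int height = 360;
             public RenderTexture smallTexture;

             void OnEnable()
             {
                 smallTexture = new RenderTexture(width, height, 0, RenderTextureFormat.ARGB32);
                 smallTexture.Create();
                 Camera.onPostRender += CopyFrame;
             }

             void OnDisable()
             {
                 Camera.onPostRender -= CopyFrame;
             }

             private void CopyFrame(Camera cam)
             {
                 if (cam != Camera.main || cam.activeTexture == null) return;

                 // Blit scales the source to the destination size; the two textures
                 // do not need to match in dimensions.
                 Graphics.Blit(cam.activeTexture, smallTexture);
             }
         }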
  19. Hi all, how do I automatically turn the ObjModelViveController and the line renderer on and off when the Laser Beam Behavior hits a target? Or turn the model on in-game when the user clicks a button on the controller? I'm working with the Vive Focus Plus. Thanks for the help!
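     Not from the original post: a generic Unity sketch of the toggling logic only. The Wave-specific pieces (ObjModelViveController, the Laser Beam Behavior and the Focus Plus button events) have their own components, so the controller transform, model object and beam references below are placeholders you would wire up to them.

         using UnityEngine;

         // Shows the controller model and beam only while a forward raycast from
         // the controller hits something.
         public class BeamTargetToggle : MonoBehaviour
         {
             public Transform controller;       // pose of the Focus Plus controller
             public GameObject controllerModel; // e.g. the object holding ObjModelViveController
             public LineRenderer beam;
             public float maxDistance = 10f;

             void Update()
             {
                 bool hitSomething = Physics.Raycast(controller.position, controller.forward,
                                                     out RaycastHit hit, maxDistance);

                 controllerModel.SetActive(hitSomething);
                 beam.enabled = hitSomething;

                 if (hitSomething)
                 {
                     beam.SetPosition(0, controller.position);
                     beam.SetPosition(1, hit.point);
                 }
             }
         }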
  20. I have imported the SRanipal SDK (SRanipal_SDK_1.1.0.1) into my Unity project. The program works well and can get eye-tracking information in the Unity Editor (Unity 2019.2.15). However, the program can't get that information from the built file (exe); it seems the built file can't synchronize with SR_Runtime (v1.1.2.0). A new project that only imports the SDK gives the same result. Is it necessary to change something in Unity's settings, in particular when I build the program? @Corvus
  21. The most recent Wave SDK is version 3.2 (released August 11, 2020): https://developer.vive.com/resources/vive-wave/

     [Important Note] This version includes experimental features that require the latest public ROM: version 4.14.623.1 for Focus Plus, or 3.13.623.1 for Focus.

     Major new features include:
     - Unity XR Plug-in Framework support
     - Hand and Gesture Tracking. Please note these features are experimental and are likely to be updated in our next SDK release.
     - Passthrough support via new APIs
     - Adaptive Quality and Foveated Rendering are now enabled by default. This can be modified in the WaveVR settings.

     Please review the Release Notes for more information about the great new features and for important changes: https://hub.vive.com/storage/docs/en-us/ReleaseNote.html#release-3-2-0 And leave any feedback or questions on the forums so that we can get back to you and the community.

     Wave Unity and Unreal SDK recommended versions:
     - Unity 2019.4.3 or newer - see https://hub.vive.com/storage/app/doc/en-us/UnitySdk.html
     - Unreal 4.24 or 4.25 - see https://hub.vive.com/storage/app/doc/en-us/UnrealPlugin/UnrealPlugin.html
  22. Hi everyone. Today I started making VR content for the Vive Focus in Unity. I can build now, but this popup window appears every time. How can I solve this?
  23. In my project, it is very important to use separate cameras for each eye. Any camera with Target Eye set to Both or Left appears to work as intended, but when it is set to Right, the result on the Android HMD is like a left-eye camera: only the left part of the display renders, and the right side does not render. My project has successfully used per-eye cameras on other platforms before; this is my first time implementing the Wave platform. I can't see how to fix this issue, which is probably not a concern for most developers, but for me it is critical. I am using Wave XR Plugin 1.0.0 (com.htc.upm.wave.xrsdk) on Unity 2019.4.8 LTS. @Tony PH Lin
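     Not from the original post: for reference, the standard per-eye camera configuration being described, using Camera.stereoTargetEye. Whether the Wave XR plugin honors StereoTargetEyeMask.Right correctly is exactly the issue reported above, so this documents the setup rather than a fix.

         using UnityEngine;

         // Minimal per-eye camera setup: each camera renders to one eye only.
         public class PerEyeCameraSetup : MonoBehaviour
         {
             public Camera leftEyeCamera;
             public Camera rightEyeCamera;

             void Start()
             {
                 leftEyeCamera.stereoTargetEye = StereoTargetEyeMask.Left;
                 rightEyeCamera.stereoTargetEye = StereoTargetEyeMask.Right;

                 // Optional: render different layers per eye.
                 // leftEyeCamera.cullingMask  = LayerMask.GetMask("Default", "LeftOnly");
                 // rightEyeCamera.cullingMask = LayerMask.GetMask("Default", "RightOnly");
             }
         }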
  24. Hello, I am trying to use the HTC Vive Pro in Unity with Tobii and I am getting this error: "Assets\TobiiXR\External\Providers\Vive\ViveProvider.cs(8,7): error CS0246: The type or namespace name 'ViveSR' could not be found (are you missing a using directive or an assembly reference?)" when I enable "Support Vive Pro Eye" in the "Standalone Player Settings" section. I am using Unity 2019.4.7f1, but I have also tried newer versions of Unity and I still get the same error. I have redone all the installation steps just in case something went wrong, but eye tracking is working well, as I could calibrate it in SteamVR Home. Does anyone know what could be happening? Thanks!
  25. Hi, as the title implies, I would like to use Unity 2019 (version 2019.3.8f1) for my project, an application that uses hand tracking on the Vive Focus Plus. However, a few seconds after I start the application on the headset, it suddenly freezes and crashes. The same code works perfectly on a 2018 version of Unity. I would like to know if anyone has encountered these issues with the Vive Focus Plus and hand tracking on Unity 2019, and what steps you took if you did. Have a good day!