
Activity

Showing topics in Vive News and Announcements, Vive Community Guidelines and FAQs, General Vive Discussion, Vive Technical Support, Developer Blog, Developer Discussion, Developer Support, Viveport SDKs and Downloads, Vive Wave SDK, Vive SRWorks SDK, Vive Audio SDKs, Vive Input Utility, Vive Hand Tracking SDK, Vive Cosmos Developer FAQs and VIVE Eye and Facial Tracking SDK posted in the last 365 days.


  1. Yesterday
  2. We were updating our UPM registry this week to ensure that the latest versions were selected as the default. From my best understanding, the times roughly correlate with when this work was happening. Thank you for your patience 🙂
  3. I have set up everything on a colleague's machine and it works perfectly for them. I completely uninstalled and reinstalled Unity and so on, but afterwards I still get the same issue. I'm just re-importing the Library folder now to see whether that fixes it; otherwise I must assume that something is very wrong with my machine, but I have no idea what. It still refuses to compile shader variants only when deploying with Wave XR checked in the XR Plug-in Management section of the settings. This could be related to the following issue, as it seems pretty similar:
  4. @Dario could you possibly elaborate on what you mean by read-only materials in Packages?
  5. My requirement is that when users use the HTC Vive Pro Eye to watch panoramic videos, I record the timestamp, the video playback time, and the eye position data. The official SRanipal SDK C++ source code solved most of my problems, but I don't know how to record the playback time of the video while the user watches it. In addition, there is a struct holding eye information in the source code, and I am very curious about its timestamp field, which indicates the time when the frame was captured, in milliseconds.
        // The code is in SRanipal_EyeData_v1.h, which is officially provided.
        namespace ViveSR {
            namespace anipal {
                /** Eye module */
                namespace Eye {
                    /** @struct EyeData
                     *  A struct containing all data listed below.
                     */
                    struct EyeData {
                        bool no_user;
                        int frame_sequence;        /*!< The frame sequence. */
                        int timestamp;             /*!< The time when the frame was captured, in milliseconds. */
                        VerboseData verbose_data;
                    };
                }
            }
        }
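     A minimal sketch of one way to line those values up once per frame, assuming the Unity SRanipal wrapper is in use; the component name, CSV layout and field path below follow the official samples rather than anything in this thread. Unity's VideoPlayer.time gives the current playback position in seconds, and EyeData.timestamp is the capture time the runtime reports.

        using System.IO;
        using UnityEngine;
        using UnityEngine.Video;
        using ViveSR.anipal.Eye;   // SRanipal Unity wrapper; EyeData mirrors SRanipal_EyeData_v1.h

        // Hypothetical logger: once per frame, write wall-clock time, video playback time and eye data to a CSV.
        public class GazeVideoLogger : MonoBehaviour
        {
            public VideoPlayer videoPlayer;            // the panoramic video the user is watching
            private StreamWriter writer;
            private EyeData eyeData = new EyeData();

            void Start()
            {
                writer = new StreamWriter(Path.Combine(Application.persistentDataPath, "gaze_log.csv"));
                writer.WriteLine("system_time_ms,video_time_s,eye_timestamp_ms,gaze_dir_x,gaze_dir_y,gaze_dir_z");
            }

            void Update()
            {
                // VideoPlayer.time is the current playback position in seconds (standard Unity API).
                double videoTime = videoPlayer != null ? videoPlayer.time : -1.0;

                // Assumption: GetEyeData fills the same EyeData struct the native header defines.
                if (SRanipal_Eye_API.GetEyeData(ref eyeData) == ViveSR.Error.WORK)
                {
                    // Field path follows the SDK's EyeData samples; adjust if your struct version differs.
                    Vector3 gazeDir = eyeData.verbose_data.combined.eye_data.gaze_direction_normalized;
                    long systemMs = System.DateTimeOffset.Now.ToUnixTimeMilliseconds();
                    writer.WriteLine($"{systemMs},{videoTime},{eyeData.timestamp},{gazeDir.x},{gazeDir.y},{gazeDir.z}");
                }
            }

            void OnDestroy()
            {
                writer?.Close();
            }
        }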
  6. Hi everyone! I have one question about the Vive Tracker. Currently, I would like to detect the user's posture when they run while wearing the HMD, and I would like to use the Vive Tracker to detect the user's movements. Since the HTC Vive Pro Eye is too heavy for the user, we tried to use the Oculus Quest 2 or the Vive Focus Plus. Is there any way to use the Oculus Quest 2 and the Vive Tracker at the same time? I can't seem to use the Vive Tracker with the XR Interaction Toolkit. Also, is it possible to use the Vive Focus Plus and the Vive Tracker at the same time? I'd like to avoid running VR content for long periods of time while wearing the Vive Pro Eye.
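     As a quick check of whether a tracker is even visible to Unity under a given runtime, a small sketch using the generic UnityEngine.XR input API (the class name is made up; nothing here is specific to the Vive Tracker, it simply lists whatever devices the active XR plug-in exposes). If the tracker never appears in this list, the XR Interaction Toolkit cannot see it either.

        using System.Collections.Generic;
        using UnityEngine;
        using UnityEngine.XR;

        // Lists every device the active XR plug-in exposes, with its characteristics and current position.
        public class XRDeviceLister : MonoBehaviour
        {
            void Update()
            {
                var devices = new List<InputDevice>();
                InputDevices.GetDevices(devices);

                foreach (var device in devices)
                {
                    device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position);
                    Debug.Log($"{device.name} | characteristics: {device.characteristics} | position: {position}");
                }
            }
        }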
  7. Last week
  8. Good afternoon, I'm trying to identify the differences between the above two packages to see which fits our needs better for Unity development. I've explored both a little, but I'm hoping there is a document or reference highlighting the significant differences. We intend to use the Focus 3 as our target headset, along with the most recent supported Unity LTS. Thanks
  9. Hey @zzy I've tried with 2020.3.11f1 as I already had it installed and had the same result. My phone is a Xiaomi P20 Pro in case that matters and also tried with a Samsung A50, both of which are on the ARCore supported list. I've recreated the project and attached it together with two logs from the P20Pro. One where there is no XR Provider set and one where ARCore is set. I see that there is a camera error when transitioning from reconfiguring to capturing. AnotherARProject.rar Thank you for the help. ARCoreXRProvider.txt NoXRPluginProvider.txt
  10. This is for Wi-Fi Direct Preview. It is working in a similar sense to USB debugging so far. URP is not enabled; the current version is 2020.3.18f, though I have tested and had matching results on 2019.4.30f.
  11. I am working on an application that is fairly heavy and cannot quite keep up with the native 90 FPS refresh rate of the Vive Focus 3. As a result, it drops to the next harmonic: 45 FPS. The app could keep up a 60 FPS rate, maybe even 72 FPS. So I'm wondering whether there's any way to tweak the Focus 3 firmware to change the native refresh rate down to one of those values?
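     For reference, the jump straight from 90 to 45 FPS happens because a missed frame budget falls back to an integer fraction of the panel rate, and the budgets work out as below. This is only a sketch of the arithmetic (the class name is made up); whether the Focus 3 firmware actually exposes a 72 Hz or 60 Hz mode is exactly the open question above.

        using UnityEngine;

        // Logs the per-frame time budget for each candidate refresh rate and flags frames
        // that miss the 90 Hz budget (the Focus 3 panel rate mentioned in the post).
        public class FrameBudgetProbe : MonoBehaviour
        {
            static readonly float[] candidateRatesHz = { 90f, 72f, 60f, 45f };

            void Start()
            {
                foreach (float hz in candidateRatesHz)
                {
                    // 90 Hz -> 11.1 ms, 72 Hz -> 13.9 ms, 60 Hz -> 16.7 ms, 45 Hz -> 22.2 ms
                    Debug.Log($"{hz} Hz allows {1000f / hz:F1} ms per frame");
                }
            }

            void Update()
            {
                float frameMs = Time.unscaledDeltaTime * 1000f;
                if (frameMs > 1000f / 90f)
                {
                    Debug.Log($"Missed the 90 Hz budget: {frameMs:F1} ms this frame");
                }
            }
        }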
  12. Hi all, I'm having issues getting the Wave SDK to work with Unity. Details are as follows: when I run the application on the Focus 3, all materials are pink. The shaders have not been found 😭
        I'm using Unity version 2020.3.17f1 (LTS) with:
        "com.unity.xr.management": "4.0.7",
        "com.htc.upm.wave.essence": "4.1.1-r.3.1",
        "com.unity.render-pipelines.universal": "10.6.0"
        Basically the issue/repro steps are as follows:
        - Create a new URP project, use the default scene in the build.
        - Switch to the Android platform, add OpenGL, remove Vulkan (may take some time).
        - Try and build an .apk; observe that shader variants are compiled for the build (cancel before they all compile).
        - Package Manager -> install com.unity.xr.management.
        - Try and build an .apk; observe that shader variants are compiled for the build (cancel before they all compile).
        - Add the following to the packages manifest, enable Preview Packages and install com.htc.upm.wave.essence via the Package Manager:
          "scopedRegistries": [
            {
              "name": "VIVE",
              "url": "https://npm-registry.vive.com/",
              "scopes": [ "com.htc.upm" ]
            }
          ]
        - Visit Project Settings, XR Plug-in Management (at the bottom), and enable WaveXR under the Android plug-in providers.
        - Agree to all settings suggested by the WaveXRPlayerSettingsConfigDialogue (found via Project Settings, Wave XR, player settings configure dialogue button).
        - Perhaps convert the main camera to an XR rig; not sure how important that is.
        - Try and build an .apk; observe that shader variants are NOT compiled for the build. If you run this build on the device, you should observe that all materials are pink.
        - Return to Project Settings, XR Plug-in Management (at the bottom), disable WaveXR and enable Oculus under the Android plug-in providers.
        - Try and build an .apk; observe that shader variants are compiled for the build (cancel before they all compile).
        For me it seems that URP shaders just don't get compiled for some reason when using the Wave SDK. They do when using the Oculus SDK. I have no idea why. I tried adding a shader variant collection tracking the URP Lit shader variants etc., shoved it in a Resources folder and even referenced it in a scene. Still no shader variant compilation during the build. Any suggestions would be much appreciated!
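     One way to narrow down whether the URP variants are being stripped at build time or never requested at all is an editor-side build callback. A minimal sketch assuming Unity 2020 LTS (the class name and log format are made up, and the script must live in an Editor folder):

        using System.Collections.Generic;
        using UnityEditor.Build;
        using UnityEditor.Rendering;
        using UnityEngine;

        // Logs every shader/pass and the number of variants handed to the compiler during a player build.
        // If "Universal Render Pipeline/Lit" never appears (or appears with 0 variants) only when WaveXR
        // is the active plug-in provider, something in that configuration is stripping the variants.
        class ShaderBuildLogger : IPreprocessShaders
        {
            public int callbackOrder => 0;

            public void OnProcessShader(Shader shader, ShaderSnippetData snippet, IList<ShaderCompilerData> data)
            {
                Debug.Log($"[ShaderBuildLogger] {shader.name} | pass: {snippet.passName} | variants: {data.Count}");
            }
        }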
  13. Hi @jwatson, May we clarify whether you use USB or Wi-Fi for Direct Preview? If it's over USB, then it's a non-issue. In addition, may we know whether your environment is Unity 2020 and whether you have URP enabled? Thanks.
  14. Were you ever able to find any solution, whether by making SRanipal 1.3.2 work better or by finding 1.3.1 or 1.3.3?
  15. Hi @hunteer11743, We haven't tried to use a full-body mesh with our SDK before. Can you please share more detail about the crash, i.e. the crash code and stack trace from Visual Studio?
  16. Hi @RRob Can you please share the full adb log? I would suspect that the hand tracking plugin doesn't detect ARCore and is therefore getting the camera stream through the NDK camera rather than the ARCore APIs. Although you may have done this already, please make sure you follow the instructions in the ARFoundation sample to replace the "AR Session" and "AR Session Origin" game objects with the latest versions. Can you please also try the Unity LTS versions? I have previously tested on 2019.4 and 2020.3, which should work fine. I'll check if I can reproduce this on my side.
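     As a complementary check on the Unity side, a small sketch (class name made up) using ARFoundation's documented availability flow to confirm whether ARCore is actually detected on the phone before the hand tracking plugin falls back to the NDK camera path:

        using System.Collections;
        using UnityEngine;
        using UnityEngine.XR.ARFoundation;

        // Logs whether ARCore is supported/installed on this device, following the ARFoundation
        // pattern of yielding on CheckAvailability()/Install() and then reading ARSession.state.
        public class ARCoreAvailabilityCheck : MonoBehaviour
        {
            IEnumerator Start()
            {
                yield return ARSession.CheckAvailability();
                Debug.Log($"ARSession state after availability check: {ARSession.state}");

                if (ARSession.state == ARSessionState.NeedsInstall)
                {
                    yield return ARSession.Install();
                }

                Debug.Log($"Final ARSession state: {ARSession.state}");
            }
        }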
  17. Cross posting this for visibility.
  18. Having same issue. "Cannot perform upm operation: Request [GET https://npm-registry.vive.com/-/v1/search?text=com.htc.upm&from=0&size=250] failed with status code [502] [NotFound]"
  19. Currently getting com.htc.upm.wave.essence: Request [GET https://npm-registry.vive.com/com.htc.upm.wave.essence] failed with status code [502] from the package manager
  20. Hi guys, Great work on this hand detection SDK. I'm working in Unreal with a Vive Pro 2, and I have a MetaHuman pawn that was originally used for interaction in game. I would like to switch out the VR controllers for this hand detection plugin. Currently I get a crash whenever I try to switch out the skeletal mesh of the rigged hand for my mesh, but this is probably because the bone hierarchies don't match. My question is: is it possible to use the hand detection on a skeletal mesh that has the full body, where the respective hand bone names match exactly what the rigged hand skeleton had? To elaborate, I'd have a full skeleton structure including both hands, and I want to map the hand detection onto this skeleton. Let me know if you have any suggestions, or if you have done it already and I missed it in the documentation. Best regards, hunter
  21. You're right, this looks like a mistake. Does anyone know where SDK v1.3.2 can be found online?
  22. One thing I am curious about: has anyone found the installer for 1.3.3, or is it only available in SDK format? I would assume 1.3.3 has a fix that reduces the CPU requirement.
  23. Hey, sorry for the late response. I couldn't get onto a computer until now and the phone wouldn't let me log in. I am using an i7-9700K. I sent a message to Vive Support and will update here when they respond. Hoping they have an answer for us. I appreciate your workaround; unfortunately, from the sound of it, it will not help me out very much if the eye tracking lags. I am using this to test programs, software, and a video game I have been working on. It definitely isn't being caused by the Vive Facial Tracker, since with eye tracking disabled I can track my mouth with the facial tracker at 4% usage. Eye tracking with the facial tracker is around 40 to 45% for me, and without the facial tracker it is about 35% to 40%. Like others have said, even if I do not have the headset on my head, or any programs using it, it still drains 40% with the eye tracker. And based on what testamundo presented, 1.3.1 works just fine at about 8%. I think there is something wrong with the 1.3.2 version itself. Neither you nor I should have any problems even with our weaker processors. I am surprised that after so many months nobody else has mentioned an issue, though.
  24. I was able to get rid of my errors by removing essence and samples. Thanks
  25. Good afternoon, I am trying to set up Direct Preview on my Focus 3 headset. I have successfully (I believe) installed the device APK and started the server. When following the steps on the Direct Preview guide page, the server begins to respond to headset packets, and when started remotely, the headset can control the camera rotation and position within Unity. The problem is that the Connecting... text never goes away and there is no streaming from Unity to the headset. I've tested this in a variety of configurations (plugged in, etc.) and have been getting the same results. Any guidance would be appreciated. Headset - Focus 3, Wave SDK - 4.11. We are also attempting to use hand tracking, if that helps.
  26. For me, the solution was adding the Android SDK tools to my system PATH variable, i.e. adding entries like C:\Program Files\Unity\Hub\Editor\2020.3.18f1\Editor\Data\PlaybackEngines\AndroidPlayer\SDK\XXX for tools and platform-tools.