 

Activity

Showing topics in Vive News and Announcements, Vive Community Guidelines and FAQs, General Vive Discussion, Vive Technical Support, Developer Blog, Developer Discussion, Developer Support, Viveport SDKs and Downloads, Vive Wave SDK, Vive SRWorks SDK, Vive Audio SDKs, Vive Input Utility, Vive Hand Tracking SDK, Vive Cosmos Developer FAQs and Vive Eye Tracking SDK posted in the last 365 days.


  1. Past hour
  2. Thank you for your help @zzy -- the whole process you outlined is flawless! I was previously not using the SteamVR plugin (instead relying on Unity's built-in OpenVR support, which was rather limited), so I was trying to get Unity's own WebCamTexture to work for providing the camera feed. As it turns out, WebCamTexture had a bunch of problems (most important of which is that it exclusively locks the camera). Thanks for pointing me towards the SteamVR TrackedCamera API!
  3. Yesterday
  4. The only thing I don't like about the Cosmos Elite is the fact that the headphones hover next to my ears. They work and do their job just fine, but I'm used to the feel of the wireless gaming headset that I've been using. Is it possible to use my wireless gaming headset along with my Cosmos Elite? The reason I'm bringing this up is that the wireless kit for the Cosmos Elite will be delivered to my house in a few days, making my VR headset wireless, and I was curious whether I can use the Cosmos Elite together with my wireless headset. And yes, I know that there's already a mic built into my VR headset; I just prefer the feel of headphones covering my ears instead of just hanging next to them. Here's the link to the wireless headset I have: https://www.bestbuy.com/site/steelseries-arctis-7-wireless-dts-headphone-gaming-headset-for-pc-and-playstation-4-black/6285956.p?skuId=6285956
  5. A patent-pending application was filed for providing exams to educational institutions and employers to qualify students and staff locally and remotely using VR/AR. The system and method guarantee the identity of the test taker and the security of the exam material, prevent cheating, and require no human intervention in the process. Educational institutions and industry have expressed great interest in the project, especially in this pandemic environment. The industry seems focused on games and education, but no one is providing a way for people to take certified or approved exams in VR/AR. If you would like to know more or receive a copy of the patent documentation, kindly contact me. skwatch@notionpatch.com
  6. Why does SRanipal force you to update? There should be an option to disable auto-update, as this can potentially break things at the last minute for important projects. Why is there no changelog? Why is in-game eye calibration broken in the latest version? Why are there no archives of older versions? Why is there no bug tracker? Why is there eye data v2 rather than just an update to v1? Now that tongue tracking has been included it's still called v2 rather than v3 - why the inconsistency? Why does pupil data etc. not work on v2 when TobiiXR is installed, but it does work on v1? Why does expression data only work in Unreal but not in Unity? Why does combined pupil data not work? It's such a simple calculation. And if you think it's redundant, then just delete the useless property. The same goes for convergence distance. None of this is documented; you're just supposed to figure it out on your own or through the community forums. You can get both v1 and v2 data regardless of which version is selected, so what does selecting a version actually do? Why is the system button broken? Why is the SteamVR settings window sometimes blank? This happens on multiple systems, it is not an isolated case. Some of these may be excusable, but in my opinion there seriously needs to be a change in the way things are run here. @Tony PH Lin
  7. ViveSR_Log.zip attached. The issue appears in Unity 2019.4.10f1 with SteamVR 1.14.16, SRanipal Runtime 1.3.0.9, and camera version 2.41.0-942e3e4.
  8. Hi @DrAdamStreck, Could you provide us with the log for further analysis? Find SR_runtime in the Windows system tray and right-click the icon to select "pack log". Thanks.
  9. It is not working on my second day; the first day was fine. I do not know why. It is not stable enough for business purposes. Is there any other tutorial, please? @MariosBikos_HTC
  10. Once we start the calibration process from Unity using SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero), the environment changes to the grey calibration screen and displays "Loading" with the loading animation. This disappears after about 5 seconds, and only the grey background remains. Previously, we would see an animation advising us on the correct positioning of the headset. If we try to restart our application once the calibration has been started, our application freezes. This issue appeared last week and seems to have started after the SRanipal SW auto-updated. We have reproduced it on two different PCs.
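For context, this is roughly how we launch it from Unity (a minimal sketch; the wrapper class and method names are ours, only the SRanipal_Eye_API call is the one quoted above):

    using System;
    using UnityEngine;
    using ViveSR.anipal.Eye;  // SRanipal Unity SDK namespace

    // Minimal sketch: wired to a UI button, this opens the SRanipal eye
    // calibration overlay in the headset.
    public class EyeCalibrationLauncher : MonoBehaviour
    {
        public void StartCalibration()
        {
            // IntPtr.Zero: no parent window handle is passed.
            SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);
            Debug.Log("Eye calibration launched");
        }
    }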
  11. In my case it didn't prompt me, since SR_Runtime wasn't running yet. It updated automatically when I launched SR_Runtime, which is why I needed the firewall.
  12. Hi @ericvids After a quick test, I cannot reproduce this problem locally. My steps are listed below; please let me know if I missed something. Unity version is 2018.4.19f, Vive Hand Tracking SDK 0.9.3, SteamVR plugin 2.6.1 (latest from the Asset Store), SteamVR version 1.14.16.
1. I used the SteamVR_TestTrackedCamera example scene as a starting point, since it already has a camera texture. Please first make sure this scene alone works fine.
2. Attach the GestureProvider script to the [CameraRig]/Camera game object.
3. For display purposes, add the LeftHandRenderer and RightHandRenderer prefabs to the scene. The prefabs are inside the Assets/ViveHandTracking/Prefab folder.
4. Play the scene; you can now see both the hand skeleton and the camera texture.
Tested on both Vive and Vive Pro.
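If it helps, the core of that example scene boils down to something like the sketch below (a rough, illustrative version built on the SteamVR Unity plugin's SteamVR_TrackedCamera helper; the class and field names here are made up, only the helper calls come from the plugin):

    using UnityEngine;
    using Valve.VR;

    // Rough sketch: show the headset's front camera feed on a material,
    // similar to what SteamVR_TestTrackedCamera does.
    public class TrackedCameraFeed : MonoBehaviour
    {
        public Material targetMaterial;  // material that should display the camera image
        private SteamVR_TrackedCamera.VideoStreamTexture videoSource;

        void OnEnable()
        {
            videoSource = SteamVR_TrackedCamera.Source(true);  // true = undistorted
            videoSource.Acquire();       // start (or share) the camera stream
        }

        void Update()
        {
            Texture2D frame = videoSource.texture;  // latest frame; null until ready
            if (frame != null)
                targetMaterial.mainTexture = frame;
        }

        void OnDisable()
        {
            videoSource.Release();       // release the stream when done
        }
    }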
  13. I just reinstalled VIVE_SRanipalInstaller_1.1.2.0 and everything seems to be in working order again. Do note that every time SR_Runtime is run it will prompt to update. Just cancel and everything should be fine. I would love to have a status update on the issues with 1.3.0.9 @Corvus
  14. This is showing in the headset, so I have no way to show you; it says "connection timeout". @chengnay @Tony PH Lin
  15. Could you paste the actual error log? And is it showing in the headset or in the command prompt?
  16. BTW, I tried to get this to work on another computer, and after updating the Nvidia drivers there and starting the server I'm still getting a timeout error. Could there be something wrong with the headset?
  17. @Corvus Thank you! I had chosen x86... so I changed the architecture to x86_64. In a new project my program now works well and gets the information in the built file. (Stripping was disabled from the beginning.)
  18. Last week
  19. Thanks @Dario What I want is for the grab to occur at the start of the scene. Some background: I have a tennis application that is used for training (not a game), and the athlete can choose from a number of drills. It starts with a selection scene where the athlete selects whether they are right- or left-handed and whether they use the controller in their hand or the tracker on the racket. Then they select the drill they want to do. Typically during a practice session they will do several drills. Each drill is a separate Unity scene, so at the start of each scene the racket should be attached automatically to the correct controller or to the tracker, at the correct position and rotation. You're right, the stickiness itself is not related to the collision detection. But if I use the Transform option the collision is not detected. And I do need the stickiness to avoid the racket being released on the very next frame after attaching, because the trigger is no longer pressed. This is what happened when I tried the SteamVR Interaction System earlier this year (https://steamcommunity.com/app/250820/discussions/7/3148519094654061096/). I created a new topic for the Following Duration question:
  20. Is it possible to make the Following Duration shorter than 0.02 (seconds, I assume)? I see a remark in the code that it depends on the update rate, so if I set fixedDeltaTime to, say, 0.001 (1 ms), can I then get a following duration of 0.001 as well? (I know I need to add both the [SteamVR] and [ViveInputUtility] prefabs to the scene and uncheck the 'Lock physics update rate to render frequency' option in both, to avoid falling back to 90 fps at runtime.) The remark I referred to is in RigidPose.SetRigidBodyVelocity(), which is called from GrabbableBase.OnGrabRigidBody(). I understand that you need some time to calculate the velocity and that more time means greater accuracy. But more time also means more lag; the attached object seems to hang from the controller on an elastic band rather than being securely fixed. So my ideal solution would be to specify this in frames rather than seconds. Then as a developer I can choose how much accuracy I want by selecting the number of frames, and I can influence the lag by setting the frame rate (fixedDeltaTime): a higher frame rate (lower fixedDeltaTime) means less lag.
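Concretely, the fixedDeltaTime change I'm talking about is just the stock Unity physics-rate setting (minimal sketch; the component name is only illustrative):

    using UnityEngine;

    // Minimal sketch: run FixedUpdate at 1000 Hz (1 ms steps) so the window
    // used for the velocity calculation spans less wall-clock time and the
    // grabbed object lags the controller less.
    public class PhysicsRateConfig : MonoBehaviour
    {
        void Awake()
        {
            Time.fixedDeltaTime = 0.001f;  // 1 ms physics step
        }
    }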
  21. Hi @ericvids From my previous experience, I think cameras from SteamVR can be accessed by multiple callers. A quick example: when hand tracking is running, you can still see the camera preview in the Camera tab of the SteamVR settings. As for how to access the camera frames/textures, you can refer to the OpenVR sample code here: https://github.com/ValveSoftware/openvr/tree/master/samples/tracked_camera_openvr_sample I'll check next week to see if I can reproduce this on my machine.
  22. So I got some advice from someone else at Vive that seemed to help actually start the server, but I'm still getting timeout errors. And yes, I checked the IP of the device, reinstalled, and went through the process you outlined. The advice I got was to go into the Nvidia Control Panel and make sure the NVIDIA GPU was selected (mine was on auto-select, as you can see in the attached image). I needed to re-enable the Intel card to get into the Nvidia Control Panel. This is the log I get now, so you can see Nvidia is in the [0] location and the server actually seems to start. After I start the device apk, should it say connected? Is there some time limit within which I need to start Unity while it is trying to connect? I've been trying different things but nothing works 😞 It seems so close now that the server is starting.
The server config:.\config\serverSetting.setting is loaded!
The HmdList is loaded!
1
2
OnPipeInfo : Pipe Server Started
Pipe accept begin.
[2020-10-16 14:43:08.312 -7 4152 DEBUG logger] [Main] Set Log level as DEBUG
[2020-10-16 14:43:08.318 -7 4152 INFO logger] [Main]dpServer Version: 0.4.5.3
[2020-10-16 14:43:08.318 -7 4152 INFO logger] [Main]Init() Config from json: H265:false, Bitrate:23000000, GOP:100, EncodeHQ true, AllIFrame false , ForceToSendTexture false, WaitDecodeDoneTimeOut 40 ms, fps 75,LogToFile:false,RTPPayloadSize:1400
[2020-10-16 14:43:08.324 -7 4152 INFO logger] [Main] GetGpuType() There are 3 adapter
[2020-10-16 14:43:08.325 -7 4152 INFO logger] [Main] GetGpuType() The GPU [0] is Nvidia!
[2020-10-16 14:43:08.325 -7 4152 DEBUG logger] OnPipeInfo(): Pipe Server Started
[2020-10-16 14:43:08.466 -7 4152 INFO logger] [CServerIns]openServer() url:rtsp://192.168.137.1:6555/live
[2020-10-16 14:43:08.469 -7 13668 INFO logger] GetMixFormat() ----- from audio end point -----, wFormatTag:65534
[2020-10-16 14:43:08.469 -7 13668 INFO logger] nChannels:2
[2020-10-16 14:43:08.470 -7 13668 INFO logger] nSamplesPerSec:48000
[2020-10-16 14:43:08.470 -7 13668 INFO logger] nAvgBytesPerSec:192000
[2020-10-16 14:43:08.470 -7 13668 INFO logger] nBlockAlign:4
[2020-10-16 14:43:08.470 -7 13668 INFO logger] wBitsPerSample:16
[2020-10-16 14:43:08.470 -7 13668 INFO logger] cbSize:22
[2020-10-16 14:43:08.470 -7 13668 INFO logger] wValidBitsPerSample:16
[2020-10-16 14:43:08.470 -7 13668 INFO logger] wSamplesPerBlock (valid if wBitsPerSample ==0):16
[2020-10-16 14:43:08.470 -7 13668 INFO logger] GetMixFormat : PCM data
[2020-10-16 14:43:08.476 -7 13668 INFO logger] GetBufferSize :48000
  23. I'm doing some testing with Unreal 4.25 and SRanipal 1.3.0.9. I've calibrated the eye tracker, imported the SDK, opened the EyeSample level and gazed at the dartboard sample objects. All works great. However, when I move away from the level origin (i.e. 0,0,0), even just a few steps from my desk, I notice that the eye tracking on the dartboards is very offset, to the point that the gaze is unusable. This is corrected when I step back toward the level origin. Is this just an issue with the dartboard examples? I don't recall this issue from previous versions of SRanipal, but I don't have the old installers to test. Anyone else noticing this behaviour? Thanks @MariosBikos_HTC
  24. Hi jboss, Since the Duration question is more about frame rate, yes, I think it should be in a separate thread if you don't mind. So am I to understand that you don't want the stickiness to occur on grab (with a button press) but on collision? And is the dominant hand assumed to be the one grabbing, or is it specified elsewhere first? I also didn't understand why StickyGrabbable would behave differently from Grabbable regarding the racket going through the ball - that's more about the collider's material options and the RigidBody collision options.
  25. I wasn't able to reply earlier due to the limited number of daily posts on this community, which is obnoxious. I started my weekend early and didn't have a chance to try it out yet. I'm going to try to revert to 1.1.2.0 on Monday. I'll post my findings when I'm ready.
  26. The Hand Tracking SDK requires the headset's back-facing camera to be enabled in SteamVR settings. However, this stops Unity from being able to access the same device camera as a WebCamTexture when I import the Unity plugin into my project. Is there a workaround to get camera frames as textures, either through the plugin itself (since obviously the plugin initialized it) or some other means, perhaps my own Unity plugin? (I'm ok with having to write C++ code, I just don't know what camera API to use to be compatible with the Hand Tracking SDK.) Purpose: I want to simulate an Augmented Reality setup with hand tracking, mimicking a (very crude) HoloLens. I'm using the original HTC Vive.
  27. @Asish What issues did you need to solve? I am also getting around 60/70 Hz output. I added a timestamp to the log and there are 15 or 16 milliseconds between the entries in the csv. It makes no difference whether I run the scene in the editor or in a build. I have set the fixed Timestep in Unity to 0.008. Also, the gaze ray does not work (which looks logical, as it is only set in StartRecord, so that would mean only once, right?).
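For what it's worth, if the logging is basically per-frame polling like the sketch below (placeholder names; the GetGazeRay call is the one from the SRanipal Unity samples), then rows can only be one rendered frame apart (about 15-16 ms at 60-70 fps), no matter how small the fixed Timestep is:

    using System.IO;
    using UnityEngine;
    using ViveSR.anipal.Eye;  // SRanipal Unity SDK namespace

    // Minimal sketch: write one gaze sample per rendered frame to a CSV.
    public class GazeCsvLogger : MonoBehaviour
    {
        private StreamWriter writer;

        void Start()
        {
            writer = new StreamWriter(Path.Combine(Application.persistentDataPath, "gaze_log.csv"));
            writer.WriteLine("time_ms,dir_x,dir_y,dir_z");
        }

        void Update()
        {
            // Polling here runs once per frame, so the sample rate is the frame rate.
            if (SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
            {
                writer.WriteLine(string.Format("{0:F1},{1:F4},{2:F4},{3:F4}",
                    Time.realtimeSinceStartup * 1000f, direction.x, direction.y, direction.z));
            }
        }

        void OnDestroy()
        {
            if (writer != null) writer.Dispose();
        }
    }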
  28. Could you try pinging the IP address again? Is it still the same? Usually I do the following steps:
1. Update the IP address in the Direct Preview config
2. Install the device apk
3. Start the streaming server
4. Start the device apk
5. Play in the Unity Editor
If this fails, stop the device apk and the streaming server and try again from step 3. NOTE: If it connects successfully, the streaming server will keep printing lots of logs. One more thing: when you stop the device apk, make sure you did not just exit by pressing the system button on the controller. It is better to re-install the device apk to ensure it has completely exited.