Showing results for tags 'pro eye'.

Found 8 results

  1. Hi! I've been trying to get the Vive Pro Eye to work for a while now but am running into issues. In particular, I am unable to get the SRanipal robot tray icon to turn its eyes from orange to green, and I've run out of ideas. I've been following the Getting Started guide, but Step 3 says to "Make sure the VIVE SRanipal SDK works before going to the next step," and I am not entirely certain what that means in practice. It seems to compress a lot (potentially a whole other Getting Started guide) into that one short sentence. I've downloaded and unzipped the SDK, then downloaded and installed Unity and imported the package into the sample scene, as instructed in the SDK guide. I tried pressing Play, but nothing really happens, so I can't tell whether it's actually working or the sample is just very plain (see the gaze-logging sketch after this list for one way to check). I've also had a bear of a time calibrating the Pro Eye: out of more than fifty attempts over several days, I've succeeded only once, somewhere around the 30th try. Every other time, after following the dot, I get a "Calibration Failed" message. It would be helpful to know what exactly is causing the failure, since the routine gets very frustrating when I don't know why it's failing. I've cleaned the lenses, etc., but to no avail. I'm wondering whether I'm missing something, whether the Getting Started guide needs some serious editing, or whether the technology is really ready for use. Probably a bit of each... Thanks in advance for any help!
  2. Hi everyone, I'm struggling to reach the specified sampling rate of 120 Hz (8.33 ms) on a VIVE Pro Eye HMD. We use the SRanipal eye callback SRanipal_Eye_v2.WrapperRegisterEyeDataCallback() in a script derived from MonoBehaviour. The registered callback is only invoked every 14-16 ms, which works out to roughly 62 Hz, well below the targeted 120 Hz. I think the PC specs are decent enough to allow 120 Hz sampling: Windows 10 Pro, Intel i7-10750H (specs can be found here), 32 GB RAM, GeForce RTX 2070 with Max-Q Design. The following tool versions are used: SRanipal SDK & Runtime 1.3.1.1, Unity 2019.4.18f1. Please note that I am aware of these threads and articles, but did not find an explanation/solution that fits my case: "Getting VerboseData at the fastest rate possible." - Vive Eye Tracking SDK - Community Forum, and "Assessing Saccadic Eye Movements With Head-Mounted Display Virtual Reality Technology" (nih.gov). A callback-registration sketch that measures the actual sample rate is included after this list. Many thanks in advance, Scot
  3. Hi - I would like to entirely eliminate the head-motion input on the HTC Vive Pro Eye, so that if the user moves their head, the display does not change and keeps the same view/screen. I know this is a bizarre request (it is for an eye-tracking study with no external variables coming into play), but any help would be appreciated on where to look in the SDK (see the camera-locking sketch after this list). Thank you so much!
  4. Hello, I would like to ask whether it is possible to adjust (in code or in configuration) the speed of the moving point during eye calibration on the HTC Vive Pro Eye. We are using the HTC Vive Pro Eye in a clinical center for visual tests, and some visually impaired patients are not able to follow the moving point because it moves too fast for them. Any help would be appreciated. @Jad @Corvus @Daniel_Y
  5. Hello. Because of my room layout, I mounted the VIVE Base Stations (2.0, four units) upside down, so the LEDs of the base stations now face the floor. After the SteamVR room calibration, only two of the base stations appear in the correct place in VR. The others seem to be connected to SteamVR, but they are not in place in VR. Do I have to mount the base stations upright?
  6. I have a Unity application that needs to record pupil dilation as an indicator of user stress levels. For this to be meaningful, it needs to be normalized relative to scene luminosity. I don't see any obvious way to measure overall luminosity short of adding up all the pixel values and dividing by the number of pixels. Is there a system parameter that provides this value, or an API call that calculates it? (A downsampling-based luminance sketch is included after this list.)
  7. I'm trying to show two kinds of video feeds (an external camera/webcam, or the front camera on the HMD) on the lenses. I found the "WebCamDevice" class - https://docs.unity3d.com/2020.1/Documentation/ScriptReference/WebCamDevice.html - and also a project that uses it with Oculus VR, so I imitated that project to learn how to use it. I assumed the HMD's front camera is connected to the PC and would therefore be enumerated as a webcam, but it isn't. How can I project a texture onto an object using the video from the front camera? Is WebCamTexture the right thing to use? Here is an example of what I want to do: the two Main Cameras are the two lenses for stereoscopy, and the two quads are the screens for the webcam or front-camera feed. The code I found is attached below. I'm a beginner in VR and Unity, so I may be thinking about this the wrong way. I hope to benefit from your wisdom and know-how - thanks, developers!
     -------------------------------------------------------------------------------------------------------------------------------------------------
     using System.Collections;
     using System.Collections.Generic;
     using UnityEngine;

     public class WebcamTexture : MonoBehaviour
     {
         WebCamTexture webcamTexture;        // texture fed by the selected webcam
         public int webcamNumber;            // index into WebCamTexture.devices
         float cameraAspect;
         private float margin = 0f;
         public float scaleFactor = 1f;
         public bool rotatePlane = false;
         public bool fitting = false;
         //Transform cameraOVR;
         //public static Vector3 cameraPos;

         // Use this for initialization
         void Start()
         {
             WebCamDevice[] devices = WebCamTexture.devices;
             webcamTexture = new WebCamTexture();
             if (devices.Length > 0)
             {
                 // Select the requested device and stream it onto this object's material.
                 webcamTexture.deviceName = devices[webcamNumber].name;
                 Renderer renderer = GetComponent<Renderer>();
                 renderer.material.mainTexture = webcamTexture;
                 webcamTexture.Play();
             }
             if (rotatePlane) transform.Rotate(Vector3.forward, 180);
             if (fitting) FitScreen();

             // camera position
             //cameraOVR = GameObject.Find("OVRCameraController") as Transform;
         }

         // Scales the quad so it fills the parent orthographic camera's view.
         void FitScreen()
         {
             Camera cam = transform.parent.GetComponent<Camera>();
             float height = cam.orthographicSize * 2.0f;
             float width = height * Screen.width / Screen.height;
             float fix = 0;
             if (width > height) fix = width + margin;
             if (width < height) fix = height + margin;
             transform.localScale = new Vector3((fix / scaleFactor) * 4 / 3, fix / scaleFactor, 0.1f);
         }

         // Update is called once per frame
         void Update()
         {
             //cameraPos = cameraOVR.position;
             //print(cameraPos);
         }
     }
     @Corvus @Daniel_Y
  8. Cory, from HTC VIVE, will conduct a free workshop on eye-tracking development using HTC VIVE's SRanipal SDK. Topics will include eye tracking, foveated rendering, variable-rate shading, and using eye tracking for avatar animations. If you are interested in using eye tracking or foveated rendering in your VR content, come to learn, network, ask questions, and try a demo! This workshop is free and open to the public; you will not need a SIGGRAPH badge to attend. RSVP for your free ticket here. This workshop is in partnership with LA Unity 3D Meetup, XRLA Meetup, and Two Bit Circus. There's going to be a strong and passionate community of XR developers attending, and it'll be a great opportunity to connect and network. Location: Two Bit Circus - 634 Mateo St, Los Angeles, California. Date: Tuesday, 7/30/2019. Time: 6:30pm - 9:00pm. Hope to see you there! Check back here on our developer blog for more workshop dates in the future. - Global Developer Relations Team
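
Regarding post 1 above: one way to check that the SRanipal SDK is actually delivering data, beyond watching the sample scene, is to poll the combined gaze ray each frame and log it. This is only a minimal sketch, assuming the SDK's polling API (SRanipal_Eye_v2.GetGazeRay) and the SRanipal_Eye_Framework component are present in the scene; the script name and log format are illustrative, not part of the SDK.

    using UnityEngine;
    using ViveSR.anipal.Eye;

    // Illustrative verification script: attach to any GameObject in a scene that
    // also contains the SRanipal_Eye_Framework component from the SDK package.
    public class GazeLogger : MonoBehaviour
    {
        void Update()
        {
            // Only query once the SRanipal framework reports it is running.
            if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING)
                return;

            // Combined gaze ray, expressed in the HMD's local space.
            if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out Vector3 origin, out Vector3 direction))
            {
                Debug.Log($"Gaze origin: {origin}, direction: {direction}");
            }
        }
    }

If the console prints direction vectors that change as the eyes move, the SDK and runtime are talking to each other; if the framework status never reaches WORKING, the problem is upstream in the runtime (the orange robot icon), not in the Unity project.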
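
Regarding post 2 above: the sketch below follows the registration pattern from the SDK's own sample scripts. The callback fires on SRanipal's worker thread, independently of Unity's render loop, so any rate measured from Update() or with Time.deltaTime will only ever show the frame rate; timestamping inside the callback itself gives the true sampling rate. This is a hedged sketch, not the poster's code - the class name and the interval logging are illustrative.

    using System.Runtime.InteropServices;
    using UnityEngine;
    using ViveSR.anipal.Eye;

    // Registers the native eye-data callback and measures the interval between
    // consecutive samples on the callback thread, not in Update().
    public class EyeCallbackRate : MonoBehaviour
    {
        private static EyeData_v2 eyeData = new EyeData_v2();
        private static readonly System.Diagnostics.Stopwatch clock = System.Diagnostics.Stopwatch.StartNew();
        private static long lastMs;
        private static volatile float lastIntervalMs;   // written by the callback thread
        private bool callbackRegistered = false;

        void Update()
        {
            if (SRanipal_Eye_Framework.Status != SRanipal_Eye_Framework.FrameworkStatus.WORKING)
                return;

            // Register once; the runtime then invokes EyeCallback at its own rate.
            if (SRanipal_Eye_Framework.Instance.EnableEyeDataCallback && !callbackRegistered)
            {
                SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(
                    Marshal.GetFunctionPointerForDelegate((SRanipal_Eye_v2.CallbackBasic)EyeCallback));
                callbackRegistered = true;
            }

            Debug.Log($"Last callback interval: {lastIntervalMs} ms");   // ~8.3 ms expected at 120 Hz
        }

        // Runs on SRanipal's worker thread; do not touch Unity objects here.
        private static void EyeCallback(ref EyeData_v2 data)
        {
            eyeData = data;
            long now = clock.ElapsedMilliseconds;
            lastIntervalMs = now - lastMs;
            lastMs = now;
        }
    }

If the interval measured inside the callback is still around 16 ms, the limitation lies in the runtime or hardware configuration rather than in the Unity script.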
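
Regarding post 3 above: there is no SRanipal setting for this, but on the Unity side one option is to stop the engine from applying the tracked head pose to the camera. The sketch below uses Unity's XRDevice.DisableAutoXRCameraTracking (available in the 2019.x API); note that the SteamVR compositor may still reproject the image slightly with head motion, so this freezes the application camera rather than every pixel on the panel. A minimal sketch, assuming the standard XR camera setup:

    using UnityEngine;
    using UnityEngine.XR;

    // Attach to the XR camera: prevents head motion from changing the rendered view,
    // while SRanipal eye tracking continues to run normally.
    [RequireComponent(typeof(Camera))]
    public class LockHmdView : MonoBehaviour
    {
        void Start()
        {
            // Stop Unity from writing the HMD pose into this camera's transform.
            XRDevice.DisableAutoXRCameraTracking(GetComponent<Camera>(), true);

            // The camera now stays wherever it is placed in the scene.
            transform.localPosition = Vector3.zero;
            transform.localRotation = Quaternion.identity;
        }
    }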
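
Regarding post 6 above: I am not aware of a built-in SteamVR or SRanipal parameter that reports scene luminance, so the usual workaround is exactly the averaging described, done cheaply by downsampling the camera image to a tiny render texture and reading back only those few pixels. A rough sketch under that assumption follows; the 16x16 target size, the Rec. 709 luma weights, and the class name are arbitrary illustrative choices.

    using UnityEngine;

    // Estimates average scene luminance by rendering the camera into a small
    // RenderTexture and averaging the pixels on the CPU.
    public class SceneLuminance : MonoBehaviour
    {
        public Camera sourceCamera;     // assign the VR eye camera (or a mirror camera)
        private RenderTexture smallRT;
        private Texture2D readback;

        void Start()
        {
            smallRT = new RenderTexture(16, 16, 16, RenderTextureFormat.ARGB32);
            readback = new Texture2D(16, 16, TextureFormat.RGBA32, false);
        }

        public float AverageLuminance()
        {
            // Render the camera once into the 16x16 target, then restore its output.
            RenderTexture previous = sourceCamera.targetTexture;
            sourceCamera.targetTexture = smallRT;
            sourceCamera.Render();
            sourceCamera.targetTexture = previous;

            // Read the small target back to the CPU.
            RenderTexture.active = smallRT;
            readback.ReadPixels(new Rect(0, 0, 16, 16), 0, 0);
            readback.Apply();
            RenderTexture.active = null;

            // Average per-pixel luma (Rec. 709 weights); result is in 0..1.
            float sum = 0f;
            Color[] pixels = readback.GetPixels();
            foreach (Color c in pixels)
                sum += 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
            return sum / pixels.Length;
        }
    }

Because Camera.Render() re-renders the scene, calling this every frame roughly doubles that camera's cost; sampling a few times per second is usually enough for normalizing pupil data.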