Search the Community

Showing results for tags 'pro eye'.




Found 5 results

  1. Hello, I would like to ask whether it is possible to adjust, in code or in configuration, the speed of the moving point during eye calibration on the HTC Vive Pro Eye. We use the HTC Vive Pro Eye in a clinical center for visual tests, and some visually impaired patients cannot follow the moving point because it moves too fast for them. Any help would be appreciated. (See the calibration sketch after these results.) @Jad @Corvus @Daniel_Y
  2. Hello. Because of my room layout, I mounted my VIVE Base Stations (2.0, four units) upside-down, so their LEDs now face the floor. After SteamVR room calibration, only two of the base stations appear in the correct position in VR; the others show as connected in SteamVR, but they are not positioned correctly in VR. Do the base stations have to be mounted upright?
  3. I have a Unity application that needs to record pupil dilation as an indicator of user stress levels. For the measurement to be meaningful, it must be normalized against scene luminosity. I don't see an obvious way to measure overall luminosity short of summing all the pixel values and dividing by the pixel count. Is there a system parameter that provides this value, or an API call that calculates it? (See the luminance sketch after these results.)
  4. I'm trying to show two kinds of video feeds (an external webcam, and the front camera on the HMD) on the lenses. I found the WebCamDevice class (https://docs.unity3d.com/2020.1/Documentation/ScriptReference/WebCamDevice.html) and a project that uses it on Oculus VR, and I imitated that project to learn how it works. I assumed that because the front camera is connected to the PC, it would be captured as a webcam, but it isn't. How can I project video from the front camera as a texture onto an object? Is WebCamTexture the right approach? Here is an example of what I want to do: the two Main Cameras are the two lenses for stereoscopy, and the two quads are the screens for the video (from the webcam or the front camera). I'm a beginner in VR and Unity, so I may be thinking about this wrong; I'd appreciate your wisdom and know-how. Thanks, developers! The code I found is below. @Corvus @Daniel_Y

     using System.Collections;
     using System.Collections.Generic;
     using UnityEngine;

     public class WebcamTexture : MonoBehaviour
     {
         WebCamTexture webcamTexture;
         public int webcamNumber;          // index into WebCamTexture.devices
         private float margin = 0f;
         public float scaleFactor = 1f;
         public bool rotatePlane = false;
         public bool fitting = false;

         void Start()
         {
             // Enumerate the webcams the OS exposes and start the selected one.
             WebCamDevice[] devices = WebCamTexture.devices;
             webcamTexture = new WebCamTexture();
             if (webcamNumber < devices.Length)
             {
                 webcamTexture.deviceName = devices[webcamNumber].name;
                 Renderer renderer = GetComponent<Renderer>();
                 renderer.material.mainTexture = webcamTexture;
                 webcamTexture.Play();
             }
             if (rotatePlane)
                 transform.Rotate(Vector3.forward, 180);
             if (fitting)
                 FitScreen();
         }

         void FitScreen()
         {
             // Scale the quad to fill the parent camera's view
             // (assumes an orthographic camera).
             Camera cam = transform.parent.GetComponent<Camera>();
             float height = cam.orthographicSize * 2.0f;
             float width = height * Screen.width / Screen.height;
             float fix = 0;
             if (width > height) fix = width + margin;
             if (width < height) fix = height + margin;
             transform.localScale = new Vector3((fix / scaleFactor) * 4 / 3, fix / scaleFactor, 0.1f);
         }
     }
  5. Cory, from HTC VIVE, will conduct a free workshop on eye tracking development using HTC VIVE's SRanipal SDK. Topics will include eye tracking, foveated rendering, variable-rate shading, and using eye tracking for avatar animations. If you are interested in using eye tracking or foveated rendering in your VR content, come to learn, network, ask questions, and try a demo! The workshop is free and open to the public; you will not need a SIGGRAPH badge to attend. RSVP for your free ticket here. This workshop is in partnership with LA Unity 3D Meetup, XRLA Meetup, and Two Bit Circus. A strong, passionate community of XR developers will be attending, and it will be a great opportunity to connect and network. Location: Two Bit Circus, 634 Mateo St, Los Angeles, California. Date: Tuesday, 7/30/2019. Time: 6:30pm - 9:00pm. Hope to see you there! Check back here on our developer blog for more workshop dates in the future. - Global Developer Relations Team
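
Regarding question 1: as far as I can tell, the SRanipal runtime does not document any setting for the calibration point's speed. Below is a minimal sketch, assuming the SRanipal Unity SDK (ViveSR.anipal.Eye namespace), showing how calibration is launched from application code; the observation that the call takes no timing parameters is an inference about the public API, not a confirmed statement from HTC.

    using System;
    using UnityEngine;
    using ViveSR.anipal.Eye;

    public class CalibrationLauncher : MonoBehaviour
    {
        // Launches the Vive Pro Eye calibration UI. LaunchEyeCalibration
        // takes only a window handle (IntPtr.Zero here); there is no speed,
        // duration, or pacing argument, which suggests the moving point's
        // speed cannot be changed from application code.
        public void StartCalibration()
        {
            int result = SRanipal_Eye_API.LaunchEyeCalibration(IntPtr.Zero);
            Debug.Log("LaunchEyeCalibration returned: " + result);
        }
    }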
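
Regarding question 3: I don't know of a built-in Unity parameter that reports overall scene luminosity, but a common approach is to downsample the rendered frame and average the readback on the CPU. Below is a minimal sketch under that assumption; it targets the built-in render pipeline (OnRenderImage), and the SceneLuminanceProbe name, the 32x32 readback size, and the Rec. 709 luma weights are all illustrative choices, not part of any Vive SDK.

    using UnityEngine;

    // Approximates the mean luminance of the rendered frame by blitting it
    // into a small RenderTexture and averaging the pixels on the CPU.
    // Attach to the rendering camera. The bilinear downsample makes this an
    // approximation, not an exact per-pixel average.
    public class SceneLuminanceProbe : MonoBehaviour
    {
        const int Size = 32;
        Texture2D readback;

        public float MeanLuminance { get; private set; }

        void Awake()
        {
            readback = new Texture2D(Size, Size, TextureFormat.RGB24, false);
        }

        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // Downsample the frame to Size x Size.
            RenderTexture small = RenderTexture.GetTemporary(Size, Size, 0);
            Graphics.Blit(source, small);

            // Read the small texture back to the CPU. No Apply() is needed
            // because the pixels are only read CPU-side.
            RenderTexture prev = RenderTexture.active;
            RenderTexture.active = small;
            readback.ReadPixels(new Rect(0, 0, Size, Size), 0, 0);
            RenderTexture.active = prev;
            RenderTexture.ReleaseTemporary(small);

            // Average Rec. 709 luma over all pixels.
            Color[] pixels = readback.GetPixels();
            float sum = 0f;
            foreach (Color c in pixels)
                sum += 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
            MeanLuminance = sum / pixels.Length;

            // Pass the frame through unchanged.
            Graphics.Blit(source, destination);
        }
    }

Note that ReadPixels stalls the pipeline, so for production use you would probably sample every few frames or switch to AsyncGPUReadback rather than reading back every frame.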