Annabell

Members
  • Content Count: 20
  • Joined
  • Last visited

Community Reputation: 0 Neutral

About Annabell
  • Rank: Explorer

  1. @Dario Yes, I already looked into this example, but there I have to define which objects can be focused. I just want to get the point of focus, independent of whether the person is focusing an object or just looking into empty space. So I had the following idea: I can receive the eye data via SRanipal_Eye_API.GetEyeData(ref data), which gives me the data of the left and right eyes as well as the combined data. In the next step I calculate the focused point as the intersection of the two gaze lines of the left and right eye (if the minimal distance between them is 0, they intersect):

     lineLeft = gazeOrigin_left + l * normalizedGazeDirectionVector_left
     lineRight = gazeOrigin_right + k * normalizedGazeDirectionVector_right

     where:

     gazeOrigin_left == data.verbose_data.left.gaze_origin_mm
     normalizedGazeDirectionVector_left == data.verbose_data.left.gaze_direction_normalized
     gazeOrigin_right == data.verbose_data.right.gaze_origin_mm
     normalizedGazeDirectionVector_right == data.verbose_data.right.gaze_direction_normalized

     Unfortunately, the two lines do not have an intersection point, and I do not understand why. Does anyone have an idea why there is no intersection point? @Corvus
  2. I am trying to compute the focused point. I have multiple ideas: 1) Calculate the intersection of the line given by data.verbose_data.left.gaze_origin_mm and data.verbose_data.left.gaze_direction_normalized (left eye line) with the line given by data.verbose_data.right.gaze_origin_mm and data.verbose_data.right.gaze_direction_normalized (right eye line). Unfortunately, this does not return an intersection point because the lines do not cross each other. 2) I saw that the Tobii XR SDK offers a function "GetEyeTrackingData" (https://vr.tobii.com/sdk/develop/unity/documentation/usage-examples/). Unfortunately, the class TobiiXR has no function GetEyeTrackingData, so I am not able to use it. Does anyone have any ideas?
  3. @chengnay Yes, I am looking for the actions shown in your screenshot (InteractUI, Teleport, GrabPinch, GrabGrip, ...)
  4. @chengnay Thanks for the tip. I already found another option in the Asset Store.
  5. @chengnay Okay, this seems to be a way to figure out whether a button was pressed, but is it also possible to find out which action belongs to that button?
  6. @VibrantNebula Yes, exactly: the developer has to define actions and map them to one or multiple buttons. You can then trigger those actions, but only if you know exactly what those actions are. For example, in Unity it is possible to get all existing GameObjects via "UnityEngine.Object.FindObjectsOfType<GameObject>();", so you do not have to tell the code which GameObjects exist. With actions, I only figured out that you have to know the name of each action to trigger it; I did not find a way to enumerate the actions and their corresponding names via a function. @chengnay @zzy @Jad @Corvus
  7. I figured out a way: I used WebSockets for bidirectional communication. I had to set up a WebSocket server and the frontend, so in the end I have two clients (the frontend and the Unity application) and the server.
  8. I would like to know if there is a way to get all possible actions of the controllers (HTC Vive Pro Eye) in a Unity application programmatically in C#. Do you know any method to do so? @Corvus @chengnay
  9. I would like to know if there is a way to get all possible actions of the controllers (HTC Vive Pro Eye) in a Unity application programmatically in C#. Do you know any method to do so?
  10. The goal is to set some variables for a Unity application from within a web browser. The communication needs to be bidirectional: first we need to get some information from Unity (which GameObjects/actions/... exist) for some pop-up menus, and secondly we have to fill out the "form" on the website with the information that should be sent to the Unity application in order to start the scene. Does anyone have any ideas? I already tried to work with this tutorial, but unfortunately Network.peerType is no longer available in Unity version 2019.2.9f1. @VibrantNebula
  11. I want to have something like a start canvas in the VR world. More precisely, before starting the "real application", the player has to type in his name and email address. I am using an HTC Vive Pro Eye, so the idea would be to have a virtual keyboard which the user can control via the controllers. Are there any similar projects/ideas on how to do this? Among the GameObjects I only found the input field, which unfortunately does not appear in the VR world, only on the computer screen, and is not visible in the headset. @chengnay
  12. Within Unity I can only download the SteamVR plugin, not the OpenVR plugin. On top of that, IVRSystem seems to be C++, while within Unity you usually use C#. @Corvus
  13. I am using Unity @VibrantNebula
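The missing intersection point in posts 1 and 2 has a geometric explanation: two lines in 3D space are almost always skew, so two noisy gaze rays will essentially never meet exactly. A common workaround is to compute the pair of closest points on the two lines and use their midpoint as the gaze point. Below is a minimal sketch in plain Python (no Unity or SRanipal dependency); the parameters p1/u and p2/v stand in for the left and right eye's gaze_origin_mm and gaze_direction_normalized fields quoted in the posts:

```python
import math

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

def closest_gaze_point(p1, u, p2, v):
    """Midpoint of the shortest segment between lines p1 + s*u and p2 + t*v.

    Returns (midpoint, gap) where gap is the residual distance between the
    two lines; a gap of 0 means the rays truly intersect, which real eye
    data almost never achieves.
    """
    w0 = (p1[0] - p2[0], p1[1] - p2[1], p1[2] - p2[2])
    a, b, c = dot(u, u), dot(u, v), dot(v, v)
    d, e = dot(u, w0), dot(v, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        # Near-parallel gaze rays: no unique closest pair; project p1 onto
        # the second line as a fallback.
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    q1 = tuple(p1[i] + s * u[i] for i in range(3))   # closest point on line 1
    q2 = tuple(p2[i] + t * v[i] for i in range(3))   # closest point on line 2
    gap = math.dist(q1, q2)
    mid = tuple((q1[i] + q2[i]) / 2 for i in range(3))
    return mid, gap
```

In the Unity/SRanipal case the same formula would be applied per frame to the left and right verbose_data rays; the gap value doubles as a plausibility check (a large gap suggests a bad sample, e.g. a blink).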
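The setup from posts 7 and 10 (a server relaying messages between two clients, the web frontend and the Unity application) can be illustrated with a minimal sketch. This uses plain TCP sockets from the Python standard library rather than WebSockets so it stays self-contained; the real setup would use a WebSocket server with a browser client and a Unity WebSocket client library, but the two-client relay pattern is the same:

```python
import socket
import threading

def start_relay():
    """Start a server that accepts two clients and forwards bytes both ways."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))     # bind to an ephemeral port
    srv.listen(2)

    def run():
        a, _ = srv.accept()        # first client (e.g. the web frontend)
        b, _ = srv.accept()        # second client (e.g. the Unity app)

        def pump(src, dst):        # copy bytes one way until the peer closes
            while True:
                data = src.recv(1024)
                if not data:
                    break
                dst.sendall(data)

        threading.Thread(target=pump, args=(a, b), daemon=True).start()
        threading.Thread(target=pump, args=(b, a), daemon=True).start()

    threading.Thread(target=run, daemon=True).start()
    return srv.getsockname()[1]    # port the two clients connect to

# Demo: the "frontend" pushes a setting to "Unity", and Unity answers back,
# showing that either side can initiate a message.
port = start_relay()
frontend = socket.create_connection(("127.0.0.1", port))
unity = socket.create_connection(("127.0.0.1", port))
frontend.sendall(b"set playerName=Annabell")
unity_received = unity.recv(1024)
unity.sendall(b"scene started")
frontend_received = frontend.recv(1024)
```

Because both directions are pumped independently, Unity can first report its available GameObjects/actions to the frontend, and the frontend can later send the filled-out form values back, matching the flow described in post 10.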