
Attention research 2D visualization - help!


99percento


Hello, I need to display 2D images on a VIVE Pro Eye for a research project on attention. The program is built in Python (partly PsychoPy), and the stimuli are mainly generated in pyglet.
It would be perfect if I could use the VIVE just like a secondary monitor (timing matters a lot). At present, I have only managed to get MS Windows to detect it as a secondary monitor whose horizontal resolution is the sum of the left-eye and right-eye displays. As a result, any image is shown on the left eye if it falls in the left half of this "virtual" monitor, OR on the right eye if it falls in the right half.
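Given that extended-monitor behaviour (left half feeds the left eye, right half the right eye), showing the same 2D stimulus to both eyes amounts to drawing it twice, once per half, offset by one eye's panel width. A minimal sketch of just the position math, assuming the combined desktop width is twice the per-eye width (the Vive Pro Eye panels are 1440 px wide per eye, so 2880 px in total):

```python
def eye_positions(x, y, eye_width):
    """Return ((left_x, left_y), (right_x, right_y)) for a stimulus
    placed at (x, y) within a single eye's view, on a side-by-side
    extended display whose left half feeds the left eye.

    The right-eye copy is the same position shifted by one panel
    width, so both eyes see the stimulus at the same retinal spot.
    """
    return (x, y), (x + eye_width, y)


# Example: a stimulus 400 px from the left, 300 px from the top of
# each eye's view, on 1440 px wide panels.
left, right = eye_positions(400, 300, eye_width=1440)
print(left, right)  # (400, 300) (1840, 300)
```

In pyglet you would then blit the same sprite or texture at both returned positions each frame; this gives a flat, lens-uncorrected duplicate in each eye.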

I would need the same 2D image shown to both eyes, ideally with the proper pre-processing in place to maintain perceptual linearity given the lens distortion. I have read some white papers, and I know that every VIVE even carries device-specific correction parameters in its firmware.
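For what that pre-processing involves: HMD lenses introduce pincushion distortion, so renderers apply an inverse barrel pre-warp before display; the compositor normally does this using the per-device parameters. As an illustration only, here is a simple polynomial radial model with made-up coefficients `k1` and `k2` (the real per-headset values are NOT these; they would have to come from the runtime, e.g. OpenVR's `computeDistortion`):

```python
def prewarp(x, y, k1=-0.22, k2=-0.02):
    """Barrel pre-distortion of a point (x, y) in normalized
    coordinates with the lens centre at (0, 0).

    k1, k2 are illustrative coefficients, not the Vive's firmware
    values. Points away from the centre are pulled inward (barrel),
    so the lens's outward pincushion distortion cancels it.
    """
    r2 = x * x + y * y                      # squared radial distance
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # radial polynomial
    return x * scale, y * scale


# The centre is unchanged; an off-centre point moves toward the centre.
print(prewarp(0.0, 0.0))
print(prewarp(0.5, 0.0))
```

If you drive the headset as a plain extended monitor, none of this is applied for you, which is the main argument for eventually going through OpenVR despite the learning curve.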

Is there a setting I am missing that tells the VIVE to internally process the video feed and render it as 2D to both eyes? (It would make sense for such a mode to exist.)
Should I try to force a side-by-side 3D output? That would not take lens distortion into account.
There are libraries like OpenVR and OpenHMD, but at present they're far out of my reach (I'm a noob).

Thank you so much for any suggestion!

Francesco

@Jad @Corvus @Daniel_Y
