Our all-star interns have been brewing up some cool devices this summer at 3DR. Three words: Oculus Rift FPV.
We have the Oculus communicating through the Mission Planner as a joystick input, driving the pan/tilt servo gimbal on our Skyhunter fixed-wing aircraft. Video is piped to the ground via standard 5.8GHz TX/RX gear, with the 3DR OSD kit in line. The end result is one for which you should stay firmly seated; even without the video stretched all the way to the periphery, the experience is incredibly immersive.
Our current setup is based on monocular video, but stay tuned for more from Project Warg, including stereoscopic video, new airframes, and closed-loop aircraft control with the Oculus!
... here's a nice Oculus Rift FPV video. Believe it or not, Oculus Rift + FPV will be huge
Thanks for the info Alex.
I was afraid code was involved. I'm trying a hardware solution.
I'm in the process of splitting the image and displaying two camera images and streaming it via 5.8ghz transmitter. The image splitter is from a 2 camera security monitor box.
Using a composite to HDMI converter to the OR. I know the image will be crap but it's one step closer for mankind!
@Choppy We are using something called vJoy, which is a virtual joystick emulator. We had to write a little bit of code to convert the USB input from the Oculus Rift into a game controller input that Mission Planner could understand. I am going to be posting a design doc for the whole thing soon, along with all the code and everything you would need to set this up. You will be able to take a look and poke around in it for yourself.
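The 3DR code hasn't been posted yet, so here's just a rough sketch of the kind of glue code described above: map a head-yaw angle onto a vJoy axis. The axis range (1 to 32768) is vJoy's usual default, but the yaw limits and the pyvjoy calls in the comments are my assumptions, not the team's actual implementation.

```python
def yaw_to_axis(yaw_deg, lo=-90.0, hi=90.0):
    """Map a head yaw angle (degrees) to a vJoy axis value (1..32768)."""
    yaw = max(lo, min(hi, yaw_deg))       # clamp to the tracked range
    frac = (yaw - lo) / (hi - lo)         # 0.0 at lo, 1.0 at hi
    return int(round(1 + frac * 32767))

# Feeding it to the virtual joystick (needs the vJoy driver on Windows):
# import pyvjoy
# j = pyvjoy.VJoyDevice(1)
# j.set_axis(pyvjoy.HID_USAGE_X, yaw_to_axis(current_yaw))
```

Mission Planner then sees the virtual device as an ordinary game controller in its joystick setup screen.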
If I may ask, how did you get the APM Mission Planner to recognize the head tracking via the USB? When I bring up the config page it shows a blank field for the joystick.
Or if you want to do the DIY method, you need to simply add 2 resistors. You need to add 2x 1.2k resistors inline on the outer leads of the servo's potentiometer. As a result the servo is tricked into thinking it needs to turn an extra 60-70 degrees to meet the same resistance.
I use that method for all my FPV camera servos.
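A back-of-the-envelope check of why the resistor trick above works: the servo drives until the wiper voltage fraction matches its target, and with extra resistance on both outer leads the wiper has to travel further to hit the same fraction. The 5 kΩ pot value and 120° stock range below are assumptions (common values, not confirmed in the comment), and this ignores mechanical end-stops.

```python
def extended_travel(pot_ohms=5000.0, series_ohms=1200.0, stock_range_deg=120.0):
    """Estimate servo travel after adding series resistors to both pot leads."""
    # Travel scales by the ratio of total resistance to pot resistance.
    scale = (pot_ohms + 2 * series_ohms) / pot_ohms
    return stock_range_deg * scale
```

With 2x 1.2 kΩ on a 5 kΩ pot the scale factor is 1.48, turning a 120° servo into roughly a 178° one, i.e. the extra 60-70 degrees mentioned above.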
@John; I am using the rift to control an Axis Q6035-E PTZ IP camera.
This camera has absolute positioning so it is fairly easy to use with the rift.
It's a proof of concept for my boss who is showing it at a trade show in Canberra tomorrow.
So far I have it working well enough for him to demo it at the show.
The code I wrote for the rift sends PTZ commands via HTTP calls to the NVR (which I also wrote).
The code displays the live camera feed from the NVR in the Rift headset, and as you turn your head the camera follows accordingly. The latency is not too bad but can certainly be improved upon.
So far it is good enough for my boss to demo at the show tomorrow.
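For reference, absolute pan/tilt on an Axis PTZ camera is typically driven by a VAPIX-style HTTP call like the one sketched below. The host, credentials, and exact parameters are my assumptions, and Lloyd's setup routes the commands through his NVR rather than hitting the camera directly.

```python
from urllib.parse import urlencode

def ptz_url(host, pan_deg, tilt_deg):
    """Build an Axis VAPIX-style absolute pan/tilt command URL (degrees)."""
    query = urlencode({"pan": round(pan_deg, 1), "tilt": round(tilt_deg, 1)})
    return "http://{}/axis-cgi/com/ptz.cgi?{}".format(host, query)

# Actually sending it requires a reachable camera and credentials:
# import urllib.request
# urllib.request.urlopen(ptz_url("192.168.1.90", yaw, pitch), timeout=1)
```

Since the camera accepts absolute positions, the Rift's yaw/pitch can be passed through almost directly, which is why Lloyd notes it was "fairly easy" to hook up.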
I'd love to get the specifics of the setup you're using. The hardware we're using is pretty simple; we've got a CMOS camera on a pan-tilt gimbal.
Our approach is to configure the Oculus gyro as a joystick and then to use that through the APM Mission Planner. One of the issues we've been encountering with this approach is the PWM range of the joystick input hasn't been enough to take advantage of the full range of the servos we're using. Any thoughts on how we could get around this?
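One common workaround (not necessarily what 3DR ended up doing) is to rescale the joystick axis in software so its output spans the full RC PWM range before it reaches the servo. A sketch, assuming the usual 1000-2000 µs servo range and a signed axis input:

```python
def axis_to_pwm(axis, axis_min=-1000, axis_max=1000, pwm_min=1000, pwm_max=2000):
    """Rescale a joystick axis value to span the full servo PWM range (µs)."""
    axis = max(axis_min, min(axis_max, axis))      # clamp to axis limits
    frac = (axis - axis_min) / (axis_max - axis_min)
    return int(round(pwm_min + frac * (pwm_max - pwm_min)))
```

Widening `pwm_min`/`pwm_max` toward the servo's true endpoints (if it tolerates them) recovers the lost travel; Mission Planner's joystick setup screen also exposes per-channel settings that may achieve something similar without code.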
@John; Actually I found some example code from the snes emulator that I could use.
I am displaying video from an IP camera in the Oculus and using the sensor to control the PTZ on the camera.
Would be interesting to see your code anyway if you can find it.
@Lloyd; I'll see if I can grab the code we have to send your way.
As far as video is concerned, we are getting accurate video in the Oculus, though not yet fullscreen. We also have a USB screencap dongle to pull camera feed to display on the laptop as well.
More video to come!