Our all-star interns have been brewing up some cool devices this summer at 3DR. Three words: Oculus Rift FPV.
We have the Oculus communicating through the Mission Planner as a joystick input, driving the pan/tilt servo gimbal on our Skyhunter fixed-wing aircraft. Video is piped to the ground via standard 5.8GHz TX/RX gear, with the 3DR OSD kit in line. The end result is one for which you should stay firmly seated; even without the video stretched all the way to the periphery, the experience is incredibly immersive.
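For the curious, the head-angle-to-gimbal mapping can be sketched roughly like this. This is a hypothetical sketch, not our actual Mission Planner joystick code; the servo channel numbers, PWM endpoints, and MAVLink usage are all assumptions:

```python
# Hypothetical sketch of head-angle -> gimbal-servo mapping; channel numbers,
# PWM endpoints, and the MAVLink call are assumptions, not the real setup.

PWM_MIN_US, PWM_MAX_US = 1000, 2000  # typical RC servo pulse widths

def angle_to_pwm(angle_deg, angle_min=-90.0, angle_max=90.0):
    """Linearly map a head angle (degrees) to a servo pulse width (us)."""
    angle_deg = max(angle_min, min(angle_max, angle_deg))  # clamp to range
    frac = (angle_deg - angle_min) / (angle_max - angle_min)
    return int(round(PWM_MIN_US + frac * (PWM_MAX_US - PWM_MIN_US)))

def drive_gimbal(master, pan_deg, tilt_deg, pan_servo=5, tilt_servo=6):
    """Command pan/tilt servos over MAVLink (requires pymavlink)."""
    from pymavlink import mavutil  # deferred so the mapping is testable offline
    for servo, angle in ((pan_servo, pan_deg), (tilt_servo, tilt_deg)):
        master.mav.command_long_send(
            master.target_system, master.target_component,
            mavutil.mavlink.MAV_CMD_DO_SET_SERVO, 0,
            servo, angle_to_pwm(angle), 0, 0, 0, 0, 0)
```

At 0 degrees the servo sits at mid-travel (1500 us); the endpoints clamp so a fast head turn can't command an out-of-range pulse.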
Our current setup is based on monocular video, but stay tuned for more from Project Warg, including stereoscopic video, new airframes, and closed-loop aircraft control with the Oculus!
Comments
Any chance of sharing the Oculus rift code?
That video doesn't seem to show what should normally be seen on a rift display.
Are you just doing head tracking or is the video display correct in the rift?
Thanks
@Daniel, not sure what tips are useful... We used 50 mm focal length lenses; they were just loupe lenses. The IPD was 64 mm, which roughly matched our eyes, i.e. the precise centre of each half was 64 mm apart. The software we used to stitch the images together was Tridef. I'm not sure if it does a live feed, though, just a conversion.
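If it helps anyone copying this optics setup: with the screen sitting at the lens's focal plane, the per-eye field of view of a simple magnifier is roughly FOV = 2·arctan(w / 2f), where w is the width of one half-screen. The numbers below are illustrative, not measured from the actual rig:

```python
import math

def magnifier_fov_deg(half_screen_width_mm, focal_length_mm):
    """Approximate angular FOV when the screen sits at the lens focal
    plane: FOV = 2 * arctan(w / (2 f))."""
    return math.degrees(
        2 * math.atan(half_screen_width_mm / (2 * focal_length_mm)))

# Illustrative: a 64 mm-wide half-screen behind a 50 mm loupe lens.
print(round(magnifier_fov_deg(64, 50), 1))  # ~65.2 degrees per eye
```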
Jeroen--The video latency is pretty low; I couldn't put a number to it, but I'd say it's not noticeable.
Mark--The Oculus resolution is quite low, but the new HD Oculus should hopefully look a bit better. I agree, stitching the images together in the air makes a lot of sense if you're interested in doing 3D. I think that makes a lot of sense for a copter, but I don't think you'll see much of a 3D effect from a plane at 100m. We're definitely going to give it a shot, though!
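A quick back-of-envelope number behind that claim: with a 64 mm stereo baseline, the vergence angle falls off roughly as baseline over distance, so depth cues are strong at copter scale and essentially gone at plane altitude. The distances below are illustrative:

```python
import math

def vergence_deg(baseline_m, distance_m):
    """Angle subtended by the stereo baseline at a given viewing distance."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

print(round(vergence_deg(0.064, 2.0), 2))    # copter at 2 m: ~1.83 deg
print(round(vergence_deg(0.064, 100.0), 3))  # plane at 100 m: ~0.037 deg
```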
More video is in the works! We're currently testing out a few new features and working on reducing latency so that you can actually steer a plane or copter rather than just a gimbal.
I was hoping for a little more in the video...
Mark - since you seem to know a lot about a DIY OR, can you explain the process? My goal is to make an FPV mask that uses a quality 5" FPV monitor. Any tips on it, like how to use lenses?
Had some fun with the OR this weekend. This technology will definitely be the future for FPV. Far more immersive than goggles (including some top-notch Zeiss ones). Unfortunately the OR screen resolution is quite low and rather disappointing, so to make a DIY high-res OR, we ripped the OR apart and hacked in a new, very high-res screen. It's quite straightforward; it's just two images stitched together into a pano. Besides being wonderfully light and comfortable to wear, the IMU in the OR is unbelievably realistic: no lag, no drift, no glitches, it's perfectly 1:1.
IMO, the solution is to stitch the two images together in the air. Unfortunately, we don't even get HD as things stand today, and sending a full OR image is going to require much higher data rates again. Assuming you want 3D, and you do :), it is brilliant.
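In case it helps to picture it: the "stitch into a pano" step is just packing the left and right eye views side by side into the Rift's single split panel. A minimal numpy sketch of that packing, where the names, panel size, and the crude nearest-neighbour resize are assumptions, and real code would also pre-warp each eye for the lens distortion:

```python
import numpy as np

def resize_nearest(img, h, w):
    """Tiny nearest-neighbour resize so the sketch has no OpenCV dependency."""
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return img[ys][:, xs]

def side_by_side(left, right, out_width=1280, out_height=800):
    """Pack left/right eye frames into one Rift-style split frame; each eye
    gets half the panel width (lens pre-warp not shown)."""
    half = out_width // 2
    frame = np.zeros((out_height, out_width, 3), dtype=np.uint8)
    frame[:, :half] = resize_nearest(left, out_height, half)
    frame[:, half:] = resize_nearest(right, out_height, half)
    return frame
```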
So the laptop does the translation from RCA video to HDMI stereo video for the Rift; what about the latency of the video signal?
I see a little latency from the mavlink connection. That will tend to give you motion sickness. Head tracking needs to be on the 50 Hz RC link. I used a 3dr mpu3. I should put the code up. It works quite well: it handles all the PPM in/out while it's doing the DCM stuff, plus a CLI for setup/calibration and some mode buttons. You need to hack in a 5 V PPM signal if your trainer port is full voltage, as my Hitec is. It adds 3 extra channels to a cheap 6-channel Tx.
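Generating that PPM stream is roughly this shape. This is a sketch with typical RC timings, not the actual tracker firmware; pulse polarity and widths vary between radios:

```python
def build_ppm_frame(channels_us, frame_len_us=20000, pulse_us=400):
    """Lay out one PPM frame as (level, duration_us) segments: a short
    separator pulse before each channel's timing period, then a sync gap
    filling out the 20 ms frame. Timings are typical RC values, not taken
    from the original head-tracker code."""
    segments, used = [], 0
    for ch in channels_us:
        segments.append((0, pulse_us))       # separator pulse
        segments.append((1, ch - pulse_us))  # channel period encodes value
        used += ch
    segments.append((0, pulse_us))           # closing separator
    segments.append((1, frame_len_us - used - pulse_us))  # sync gap
    return segments

# 6 Tx channels plus the 3 head-tracker channels, all centred at 1500 us:
frame = build_ppm_frame([1500] * 9)
```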
Great! That brings me to another idea: the days of 8-bit Arduino FCs like the APM 2.x are numbered, but they could be reused on the ground as drift-free head trackers, for antenna tracking, and to supply the possibility of differential GPS and air pressure reference (for planes). Cables on the ground could be reduced by telemetry (Bluetooth etc.).
Yes! I was just talking about this. Awesome job. Let me know if you need any test pilots. This is what I got a Rift for!