3DR Labs :: Oculus Rift FPV with head tracking

Our all-star interns have been brewing up some cool devices this summer at 3DR. Three words: Oculus Rift FPV.  

We have the Oculus communicating through the Mission Planner as a joystick input, driving the pan/tilt servo gimbal on our Skyhunter fixed-wing aircraft. Video is piped to the ground via standard 5.8GHz TX/RX gear, with the 3DR OSD kit in line. The end result is one for which you should stay firmly seated; even without the video stretched all the way to the periphery, the experience is incredibly immersive.
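The post doesn't publish the interns' actual mapping code, but the head-tracking-to-gimbal step boils down to turning a head angle into a servo pulse width. A minimal sketch of that translation might look like this (the function name, angle range, and PWM limits are illustrative assumptions, not 3DR's implementation):

```python
def angle_to_pwm(angle_deg, angle_range=90.0, center_us=1500, span_us=500):
    """Map a head angle in degrees (+/- angle_range) to a servo PWM pulse
    width in microseconds (center_us +/- span_us), clamped at the limits."""
    frac = max(-1.0, min(1.0, angle_deg / angle_range))
    return int(round(center_us + frac * span_us))

# Centered head -> 1500 us; 45 degrees of pan -> 1750 us;
# anything past +/-90 degrees is clamped to the servo endpoints.
pan_pwm = angle_to_pwm(45.0)
tilt_pwm = angle_to_pwm(-120.0)
```

The same two-line clamp-and-scale shows up whether the output goes through Mission Planner's joystick emulation or straight to an RC override.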

Our current setup is based on monocular video, but stay tuned for more from Project Warg, including stereoscopic video, new airframes, and closed-loop aircraft control with the Oculus!


Comment by Mike Bristol on July 30, 2013 at 4:27pm

Yes!  I was just talking about this.  Awesome job.  Let me know if you need any test pilots.  This is what I got a Rift for!

Comment by Crashpilot1000 on July 30, 2013 at 4:29pm

Great! That brings me to another idea! The days of 8-bit Arduino FCs like the APM 2.x are numbered, but they could be reused on the ground as a drift-free head tracker, steering antenna movement and enabling differential GPS and air pressure (for planes). Cables on the ground could be reduced with telemetry (Bluetooth, etc.)

Comment by Greg Fletcher on July 30, 2013 at 7:41pm

I see a little latency from the MAVLink connection. That will tend to give you motion sickness. Head tracking needs to be on the 50 Hz RC link. I used a 3DR MPU3. I should put the code up; it works quite well. It handles all the PPM in/out while doing the DCM math, plus a CLI for setup/calibration and some mode buttons. You need to hack in a 5 V PPM signal if your trainer port runs at full voltage, as my Hitec does. It adds three extra channels to a cheap 6-channel Tx.
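The 50 Hz budget Greg mentions is why channel counts matter: one PPM frame at 50 Hz is 20 ms, and every channel pulse plus the sync gap must fit inside it. A hedged sketch of that frame layout (the 900-2100 us range and minimum sync gap are typical values, not taken from his code):

```python
def build_ppm_frame(channels_us, frame_us=20000):
    """Lay out one PPM frame: each channel slot is its pulse width in
    microseconds, and a trailing sync gap pads the frame to frame_us
    (20000 us = 50 Hz)."""
    if min(channels_us) < 900 or max(channels_us) > 2100:
        raise ValueError("channel value outside typical PPM range")
    sync_us = frame_us - sum(channels_us)
    if sync_us < 3000:
        raise ValueError("not enough room left for the sync gap")
    return list(channels_us) + [sync_us]

# 6 Tx channels plus 3 head-tracker channels, all centered:
frame = build_ppm_frame([1500] * 9)
```

With nine channels near their maximum pulse width the sync gap gets tight, which is why some radios stretch the frame beyond 20 ms once extra channels are added.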

Comment by Jeroen van de Mortel on July 30, 2013 at 10:52pm

So the laptop does the translation from RCA video to HDMI stereo video for the Rift; what about the latency of the video signal?

Comment by Mark on July 31, 2013 at 4:54am

Had some fun with the OR this weekend. This technology will definitely be the future of FPV, far more immersive than goggles (including some top-notch Zeiss ones). Unfortunately, the stock OR screen resolution is quite low and rather disappointing, so to make a DIY high-res OR we ripped the OR apart and hacked in a new, very high-resolution screen. It's quite straightforward: it's just two images stitched together into a panorama. Besides being wonderfully light and comfortable to wear, the IMU in the OR is unbelievably realistic: no lag, no drift, no glitches, perfectly 1:1.

IMO, the solution is to stitch the two images together in the air. Unfortunately, we don't get HD as things stand today, and sending an OR-resolution image is going to require much higher data rates again. Assuming you want 3D (and you do :) ), it is brilliant.
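The "two images stitched into a panorama" step Mark describes is just a side-by-side concatenation of the left and right camera frames before the Rift's lens warp is applied. A minimal sketch (function name is mine; real pipelines would add the barrel-distortion correction):

```python
import numpy as np

def stitch_side_by_side(left, right):
    """Place the left and right camera frames next to each other, the plain
    side-by-side layout a stereo HMD expects (lens warp not handled here)."""
    if left.shape != right.shape:
        raise ValueError("left/right frames must have the same shape")
    return np.hstack((left, right))
```

Doing this on the aircraft, as Mark suggests, means only one combined video stream has to cross the downlink instead of two.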


Comment by Daniel Allen on July 31, 2013 at 10:52am

I was hoping for a little more in the video...

Mark - since you seem to know a lot about DIY OR builds, can you explain the process? My goal is to make an FPV mask that uses a quality 5" FPV monitor. Any tips, like how to use the lenses?


3D Robotics
Comment by Brandon Basso on July 31, 2013 at 2:58pm

Jeroen--The video latency is pretty low; I couldn't put a number on it, but I'd say it's not noticeable.

Mark--The Oculus resolution is quite low, but the new HD Oculus should hopefully look a bit better.  I agree, stitching the images together in the air makes a lot of sense if you're interested in doing 3D.  I think that makes a lot of sense for a copter, but I don't think you'll see much of a 3D effect from a plane at 100m.  We're definitely going to give it a shot, though!

More video is in the works!  We're currently testing out a few new features and working on reducing latency so that you can actually steer a plane or copter rather than just a gimbal.

Comment by Mark on August 1, 2013 at 6:01am

@Daniel, not sure which tips would be useful... We used 50mm focal length lenses; they were just loupe lenses. The IPD was 64mm, roughly matching our eyes, i.e. the precise centre of each half-image was 64mm apart.  The software we used to stitch the images together was TriDef. I'm not sure whether it does a live feed, though, or just conversion.
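For Daniel's 5" monitor question, the geometry Mark describes follows the standard magnifier setup: with the screen at the lens focal plane the image sits at infinity, and the horizontal field of view per eye is roughly 2*atan((w/2)/f). A small sketch under that assumption (the 55 mm half-width below is an illustrative guess for half of a 5" widescreen panel, not a measured value):

```python
import math

def eye_fov_deg(screen_width_mm, focal_length_mm):
    """Approximate horizontal FOV per eye when that eye's screen half sits
    at the lens focal plane (image at infinity): FOV = 2 * atan((w/2) / f)."""
    return math.degrees(2 * math.atan((screen_width_mm / 2) / focal_length_mm))

# With 50 mm loupes and a ~55 mm-wide half-screen per eye:
fov = eye_fov_deg(55, 50)  # roughly 58 degrees per eye
```

Shorter focal lengths widen the FOV but magnify the pixel grid, which is why the stock OR's low resolution is so visible.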

Comment by Lloyd Breckenridge on August 9, 2013 at 7:36am

Any chance of sharing the Oculus rift code?

That video doesn't seem to show what should normally be seen on a rift display.

Are you just doing head tracking or is the video display correct in the rift?

Thanks

Comment by John Boyer on August 10, 2013 at 4:21pm

@Lloyd: I'll see if I can grab the code we have and send it your way.

As far as video is concerned, we are getting accurate video in the Oculus, though not yet fullscreen. We also have a USB screen-capture dongle to pull the camera feed onto the laptop display as well.

More video to come!

