OpenFPV Update 2 - Roadmap for 2015, Oculus Rift Support, 5GHz

I have worked hard in my spare time to improve OpenFPV. I got a lot of emails from interested people around the world and can't wait to share the latest results with you. Your support is amazing, and my goal is to bring this project to life as soon as I can. Special thanks to everyone who donated a few bucks to the project; every dollar helps me a lot.

A quick overview of the last month:

  • MinIMU-9 v2 Gyro, Accelerometer, and Compass test integration
  • Further latency optimisation
  • 160° Wide Angle Fish Eye Camera
  • 5 GHz transmission tests
  • Oculus Rift DK2 Support (experimental)
  • 720p at 40 FPS Tests
  • Source code re-structure
  • Optimised video rendering


What went well:

  • The gyro integration works pretty well so far
  • The telemetry overlay works well with the gyro
  • The gyro data transmission works even if the video feed is broken (separated data channels)
  • The latency is now stable over hours and pretty promising
  • The wide-angle camera does its job well, nothing else to say
  • The new 5 GHz (diversity) adapter gave me really good results and is affordable (€20 each)
  • The Oculus Rift was mind-blowing; the image is so big that I can't describe it
  • The source code is now better structured and modularised
  • 720p (1280x720) works well on the ground, not yet tested in the air
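The "separated data channels" point above can be sketched in a few lines: telemetry travels on its own UDP socket, independent of the video channel, so a stalled or corrupted video stream cannot block attitude updates. The port numbers and JSON payload here are my own illustration, not OpenFPV's actual wire format.

```python
import json
import socket

# Hypothetical port layout for illustration; OpenFPV's real channel
# assignment is not published in this post.
VIDEO_PORT = 5600
TELEMETRY_PORT = 5601

def send_telemetry(sock, addr, gyro):
    # Gyro samples go out on their own datagram socket, independent
    # of the video channel, so one can fail without the other.
    sock.sendto(json.dumps(gyro).encode("utf-8"), addr)

# Minimal loopback demonstration of the telemetry channel:
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", TELEMETRY_PORT))
rx.settimeout(2.0)

tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_telemetry(tx, ("127.0.0.1", TELEMETRY_PORT),
               {"roll": 1.5, "pitch": -0.2, "yaw": 90.0})

data, _ = rx.recvfrom(1024)
print(json.loads(data)["roll"])  # -> 1.5

tx.close()
rx.close()
```

Because each datagram is self-contained, losing a few of them only drops individual samples rather than stalling the whole link, which is why UDP fits this kind of data.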


Current issues:

  • The CPU load is too high on the ground station (20% of 400%, 640x480 H.264, 2.3 GHz i7, macOS)
  • No hardware decoding for the video stream yet
  • No telemetry data for the Oculus at the moment
  • No fragment shader for the Oculus Rift distortion yet
  • Bad weather in Germany

The Roadmap for 2015:

  1. Between January and February I will release first access to the project. This version should be stable enough for testing.
  2. Between May and June I will release an updated version with improvements from the community.
  3. I target August 2015 for the first "release"*.

* What "release" means:

  • Easy to install
  • Documentation
  • Download a ready-to-fly image for the Raspberry Pi
  • Maybe sell a complete setup online (Amazon)

(Please note that all dates are subject to change, depending on issues, bugs, and available time.)

Some pictures from the lab:

Me holding a prototype with 5 GHz diversity and a 160° fish-eye camera. I can even write messages on my phone through the Oculus live feed with this setup. I can't show you what it looks like, but everyone who has tested it in my lab was amazed.

Screenshot from the Oculus mode. *Note the missing fragment shader for the Rift distortion; this is not 3D (but possible in the future; for now you will see a huge 2D screen in front of you).

The prototype with the attached 5 GHz adapter, watching birds.

I have no in-flight footage yet; the main reason is the bad weather here. But I will do some recordings and take pictures once the weather is a bit better and I have enough time and a working drone/plane*, you have my word.

* My last plane crashed and destroyed one OpenFPV setup during testing


- First release for developers between January and February 2015

- First "consumer" release in summer 2015 (maybe later)

If you have any questions, please leave a comment below or write me:




Comment by bocorps on December 9, 2014 at 4:52pm

One more question: do you use the TCP/IP or UDP protocol?

I didn't see whether you use gstreamer to pipe data.

Comment by Caesar Winebrenner on December 9, 2014 at 5:26pm

Please be sure to consider the Samsung VR as a supported device. Oculus will have a much smaller client footprint initially.

Comment by Tilman Griesel on December 9, 2014 at 6:19pm

@bocorps I do not know how much it draws, but I'll try to figure it out. The Rocket M5 looks pretty robust and professional, not a bad decision. The cheaper hardware always feels cheap, even the CLS one. As mentioned in an earlier comment, the CLS is a bit wobbly in my USB port (MacBook Pro) and does not feel as good as my Alfa 2.4 GHz adapter; anyway, it works well with a short USB cable. I only use UDP, because most of the data does not have to be reliable. But I plan to create an architecture that can use TCP as well. In OpenFPV you will have a place where you can put a "module" on the Tx and Rx side. There you can request a connection, which will be negotiated between Tx and Rx. I am still working out how to do the software architecture right. Currently it supports Python; maybe others will follow ... but I am digressing :), thanks for the questions.
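The Tx/Rx "module" idea described in the comment above could look roughly like this. Every name here is invented for illustration; the real OpenFPV API was not public at the time of this post.

```python
# Hedged sketch of the negotiated-channel idea: each module registers
# on both ends and asks a broker for a transport. UDP is the default
# because most FPV data does not need to be reliable; a module that
# needs reliability (e.g. settings) can request TCP instead.

class ChannelBroker:
    """Keeps track of which transport each module negotiated."""

    def __init__(self):
        self.modules = {}

    def register(self, name, transport="udp"):
        if transport not in ("udp", "tcp"):
            raise ValueError("unknown transport: %s" % transport)
        self.modules[name] = transport
        return transport

broker = ChannelBroker()
broker.register("telemetry")        # lossy is fine for gyro data
broker.register("settings", "tcp")  # config changes must arrive
print(broker.modules)  # {'telemetry': 'udp', 'settings': 'tcp'}
```

The design choice mirrors what the comment says: defaulting to UDP keeps latency low for streaming data, while the negotiation step leaves room for reliable channels where a module actually needs them.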

@Caesar Winebrenner I am considering other VR headsets, but the Samsung VR (which, AFAIK, is made together with Oculus) depends on a Galaxy phone/phablet. Here I ran into a problem: the current environment doesn't support phones and other mobile devices, due to its combination of video feed(s) and HTML. It will be possible to get the feed working on phones, but overlays (telemetry etc.) as they are built in OpenFPV will be a serious problem. I hope to overcome this, but currently the target device is a notebook/netbook, not mobile devices. There are other solutions (I think pod is his username) which merge the telemetry data into the stream; that approach is much better for this case. All other VR headsets, if they work in a similar way to the Oculus, should work with OpenFPV.

For everyone who wants to know why supporting VR headsets is not an easy task, here is a short overview of the problems. Number one is chromatic aberration: the lenses of VR headsets shift the colors away from the center of each eye. To eliminate this effect, OpenFPV uses shaders, but they are not completely done yet, and the chromatic aberration differs with every camera, image size, VR headset, and so on.
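The per-channel correction described above is commonly done as a radial rescale: red is sampled slightly inward and blue slightly outward of green, which cancels the color fringing the lens introduces. Here is a CPU-side sketch of that math; all coefficients are made-up placeholders, since, as the comment says, the real values differ per camera, image size, and headset.

```python
def radial_scale(r2, coeffs=(1.0, 0.006, 0.0)):
    """Barrel-distortion scale 1 + c1*r^2 + c2*r^4 at squared radius r2."""
    c0, c1, c2 = coeffs
    return c0 + c1 * r2 + c2 * r2 * r2

def sample_coords(u, v, center=(0.5, 0.5), channel_gain=(0.996, 1.0, 1.014)):
    """Per-channel (u, v) sample positions for one output pixel.

    Red is pulled slightly toward the lens center and blue pushed
    slightly outward; a fragment shader would do exactly this per
    pixel on the GPU. The gains are illustrative placeholders.
    """
    du, dv = u - center[0], v - center[1]
    base = radial_scale(du * du + dv * dv)
    return [(center[0] + du * base * g, center[1] + dv * base * g)
            for g in channel_gain]

# At the image center all three channels coincide; off-center they split:
r, g, b = sample_coords(0.9, 0.5)
print(r[0] < g[0] < b[0])  # True
```

This also shows why the correction is headset-specific: both the distortion polynomial and the per-channel gains are properties of the particular lens and display geometry.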

Number two is the incredibly large image size in the Oculus. It is so big that it makes you dizzy at the beginning. There should be options to adjust the image size to fit the user's needs. I am working really hard to get this stuff solved.

Comment by benbojangles on December 11, 2014 at 4:38am

Hello, please can you tell me: is this basically mounting an IP camera/wifi camera on a vehicle and sending the signal to a laptop that has an Oculus Rift headset connected?

I'm a little bit confused.

How do you plan on solving the latency problem? And how much signal range do you hope to achieve (is it standard wifi range)?

Thanks and good luck! 

Comment by Tilman Griesel on December 11, 2014 at 7:11am

Thanks @benbojangles. Basically it is a single-board computer (like the Raspberry Pi) with a camera attached. This sends the signal down to the ground over standard wifi or HSDPA. On the ground you currently need a laptop, yes. What makes it different from other projects is that you can easily add/remove on-screen data with standard HTML5 and a bit of JS if you like.

I try to get the latency as low as possible by optimising all settings (encoder, decoder, framerate, resolution, etc.), but it should already be possible to fly a drone with the current results (latency around 120 ms). The wifi range depends on the antennas used, the frequency (2.4 or 5 GHz), the Tx/Rx power, and the environment. I have received video feeds from 3000 metres line of sight, but around 500 metres seems more realistic. Did this info help you?
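For a sense of where those ~120 ms go, here is a rough, purely illustrative budget. Only the 120 ms total comes from the comment above; the individual figures are my own assumptions about a typical wifi H.264 chain, not OpenFPV measurements.

```python
# Illustrative end-to-end latency budget for a digital FPV link.
# Individual figures are assumptions; only the ~120 ms total is
# taken from the measurement quoted above.
budget_ms = {
    "camera capture":  25,  # roughly one frame at ~40 fps plus readout
    "H.264 encode":    20,  # hardware encoder on the Pi
    "wifi transfer":   10,  # UDP over a short link
    "H.264 decode":    25,  # software decode on the laptop
    "render/display":  40,  # compositing, vsync, panel latency
}
print(sum(budget_ms.values()))  # -> 120
```

A breakdown like this makes the optimisation targets in the comment concrete: hardware decoding and render-path work attack the two largest assumed items, while encoder and resolution settings trade image quality against the capture/encode share.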

Comment by benbojangles on December 11, 2014 at 8:22am

Thanks for the info. Will the OSD data come over wifi as well?

Comment by benbojangles on December 11, 2014 at 8:24am

Also, have you considered two Raspberry Pis: one with a camera, one with a monitor?

Comment by bocorps on December 12, 2014 at 4:33am

Hi Tilman, thanks for the info. I hope we can find a way to get rid of the flimsy Pi camera :-)

Comment by Bill Bonney on December 12, 2014 at 8:36am

Where's the code? This project is called OpenFPV, but I find nothing (actually, all "fork on GitHub" links on the site lead back to this post on DIY Drones).

I really think that if you are asking for donations and describing the project as "open source", the code should be "published early and often".

For example here's a more open FPV project than "Open FPV"

Comment by Tilman Griesel on December 13, 2014 at 10:05am

@benbojangles Yes, the OSD data will come via wifi too, but it is not merged into the video feed. And yes, I have considered that too and will keep it in mind. For the current state it is more practical with a notebook.

@Bill Bonney As mentioned in the roadmap (openfpv_roadmap_2015.png), the first milestone includes GitHub access. A lot of people have access to the private repo at the moment, but I am currently restructuring a lot of things and won't support early dev versions. After I finish my work, everyone will have access and support.

