Hey guys,

a long time has passed since the last update from OpenFPV.

Thanks to the community for the great feedback, and special thanks to everyone around the world who sent me feedback mails, joined the project, or donated money. I really appreciate your interest in the project.

It has been very quiet around the project for the last few months, and the reason is very simple:

I do not want to ship crappy unusable software.

My vision for OpenFPV:

OpenFPV should be a simple, intuitive application for all desktop platforms: easily extensible, stable, and well designed. And easy extensibility was the biggest challenge. Do you really want to wrestle with C++? Do you want to build your GUI with Qt?

Imagine you could create a dedicated data channel on the TX side with Python and receive the data on the RX side with simple JavaScript. Lay out your UI with HTML5 elements. If you want to go deeper, why not edit the GStreamer pipeline yourself, without recompiling the whole application? Create your telemetry module in no time.
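
To make the data-channel idea concrete, here is a minimal sketch of what a TX-side sender could look like. This is purely illustrative: the port, the `encode_frame`/`decode_frame` helpers, and the JSON-over-UDP framing are my own assumptions, not the OpenFPV API. The same JSON datagrams could just as easily be parsed by JavaScript on the RX side.

```python
# Hypothetical sketch of a dedicated data channel over plain UDP.
# All names and the wire format are assumptions, not OpenFPV's actual API.
import json
import socket

TELEMETRY_PORT = 5600  # assumed port for this example


def encode_frame(channel, payload):
    """Pack a named data channel and its payload into a JSON datagram."""
    return json.dumps({"channel": channel, "data": payload}).encode("utf-8")


def decode_frame(raw):
    """Unpack a datagram back into (channel, payload)."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["channel"], msg["data"]


if __name__ == "__main__":
    # Loopback demo: RX socket listens, TX socket sends one telemetry frame.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", TELEMETRY_PORT))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.sendto(encode_frame("telemetry", {"alt": 42.0}),
              ("127.0.0.1", TELEMETRY_PORT))
    channel, data = decode_frame(rx.recvfrom(1024)[0])
    print(channel, data)  # telemetry {'alt': 42.0}
```

Because the frames are plain JSON, the browser side only needs `JSON.parse` on whatever relay (e.g. a WebSocket bridge) forwards the datagrams.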

The trouble:

It was insanely difficult to get the video played back inside the WebKit browser engine without adding delay. It is not possible with HTML5 video, Flash, or WebRTC; the biggest challenge is the buffer settings. Even Flash cannot display the video fast enough for serious FPV flying, and it is the most stable technology for live streams inside a browser. My OpenFPV workspace directory is full of prototypes, ideas, and more prototypes. After months of fighting with myself and my vision, I came to a dead simple idea: convert the H.264 stream to a format like MPEG-1 and decode it with JavaScript.

Maybe you think: "What the ... ?"

Don't worry, that was my first thought too. I started to write an MPEG-1 decoder in JavaScript until I stumbled upon a little-known project on GitHub which does exactly this job. After integrating all the GStreamer pipelines, transcoding, sockets, WebSockets, and this library, I displayed the first output inside the browser: low CPU load, good quality, and best of all, no noteworthy additional lag. Sometimes the simple ideas are harder than the complex ones.
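
For readers who want to experiment with the same approach, here is a hedged sketch of what such a transcoding pipeline could look like: receive H.264, decode it, re-encode as MPEG-1 video in an MPEG-TS container, and hand it to a TCP sink that a WebSocket relay can read. The ports, caps, bitrate, and element choices are my assumptions, not the actual OpenFPV pipeline.

```python
# Illustrative GStreamer pipeline for the H.264 -> MPEG-1 transcode idea.
# Ports, caps, and element parameters are assumptions, not OpenFPV's setup.
import shlex

PIPELINE = (
    "gst-launch-1.0 "
    'udpsrc port=5600 caps="application/x-rtp,media=video,encoding-name=H264" '
    "! rtph264depay ! avdec_h264 ! videoconvert "
    "! avenc_mpeg1video bitrate=1500000 "        # MPEG-1 video, JS-decodable
    "! mpegtsmux "                               # wrap in MPEG-TS
    "! tcpserversink host=127.0.0.1 port=8082"   # a WebSocket relay reads this
)


def pipeline_args(pipeline=PIPELINE):
    """Split the pipeline string into argv form for subprocess.run()."""
    return shlex.split(pipeline)


print(pipeline_args()[0])  # gst-launch-1.0
```

The browser side would then feed the MPEG-TS bytes from the WebSocket into a JavaScript MPEG-1 decoder and paint the frames onto a canvas.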


From this iron bird I created the first prototype, and I decided to give you guys a little update about the project. I am excited to hear your thoughts and ideas.

- Tilman



  • If you're going to put telemetry, control and A/V through the same data stream, implementing quality of service (QoS) should be part of the initial design.  Control takes the highest priority, telemetry second and A/V is left with best effort.  There should be plenty of open source code available to handle the packet delivery portion with low latency in mind.
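
The priority scheme described in this comment (control first, telemetry second, A/V best-effort) can be sketched with a simple priority queue. This is a minimal illustration of the QoS idea, not code from any of the projects discussed; the class and constant names are invented for the example.

```python
# Hedged sketch of link-level QoS: control frames preempt telemetry,
# and A/V is sent only when nothing more important is queued.
import heapq
import itertools

CONTROL, TELEMETRY, VIDEO = 0, 1, 2  # lower number = higher priority


class QoSQueue:
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker keeps FIFO order per class

    def push(self, priority, packet):
        heapq.heappush(self._heap, (priority, next(self._seq), packet))

    def pop(self):
        """Return the highest-priority packet queued for transmission."""
        return heapq.heappop(self._heap)[2]


q = QoSQueue()
q.push(VIDEO, b"frame-0")
q.push(TELEMETRY, b"gps-fix")
q.push(CONTROL, b"throttle")
print(q.pop())  # b'throttle' -- control goes out first
```

A real link would also need rate limiting and drop policies for stale video frames, but the ordering logic is the core of the idea.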

  • @mP1

    Still will use an APM for RC / flight control. Pi only used for video, although it can get the mavlink command through serial port as well.

  • @Randy - thanks for that.  I have the 4 cores doing recognition of the photos we are taking so I don't have much left and that may be the problem.  I used multiprocessing to do this and it works well.

    We aren't trying to fly FPV so it does not really matter too much.

  • To add: we have used a BeagleBone Black for our testing on video, latency, and development. The setup was originally sourced from a forum or blog post found here that we continued. The C920 had limitations on its encoded output, which limited the file size at the best capture setting. We also found a latency decrease when using a hardware decoder.

    Using the keying as you suggest, it could then be coordinated with a secondary signal from the telemetry radio: the base station could compare the time-data stamps and also provide a video latency number. Going to a digital system that is agnostic to the transmission, while using the base station to coordinate all of these signals, really could be amazing.
  • @Tilman I will lend a hand with the coding for the keying etc. I have a few ideas, after sleeping on it, that I think could be possible. I'm also thinking about the input video signal and have some further ideas on how to integrate multiple cams to create a seamless multi-view display.

    On the point of WiFi 2.4 GHz using channel 13: it is feasible. However, we have dropped video down to 1.3 GHz and used notch filters for simultaneous 2.4 GHz RC control and bidirectional WiFi data transmission; due to the congestion it has been a bit unreliable.

    If the keying of data frames works, telemetry could be sent via two sources: one on the video channel and one on either a bidirectional UHF or 2.4 GHz compatible system. That would give redundancy, or the option of a very simple setup.

    I see a lot of possibilities. PM me and we can discuss what you may need from my side.
  • @Marius

    How are you going to read the PWM from your RX on the Pi?

  • Developer

    I agree with JAB's suggestion that the best approach is to combine the telemetry and video into one feed and then intelligently prioritize the controls over the video.  Easier said than done though!

    @Stephen, in my tests I saw lag of about 1.2 seconds if I only captured the image from the camera 5 times per second.  If instead the Python/OpenCV program running on the Odroid captured the image 15 times per second, the lag dropped almost to zero.  So in short: pull the image faster from the camera and the lag will go away.  The issue of course is that if your small Linux computer isn't very fast, it may not be able to capture the image and process it that quickly.  For the balloon finder I got around that problem by making use of 2 of the Odroid's cores (it has 4 in total) using Python's "multiprocessing".  The image/video capture runs in a separate process all on its own, so it can run much faster.
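
The capture pattern Randy describes can be sketched as below. This is a hedged illustration, not the balloon-finder code: `grab_frame()` stands in for a real camera read (e.g. `cv2.VideoCapture.read()`), and the depth-1 queue ensures the vision process always sees the newest frame instead of a stale, laggy one.

```python
# Hedged sketch: run image capture in its own process so the main (vision)
# process always reads a fresh frame. grab_frame() is a placeholder for a
# real camera call such as cv2.VideoCapture.read().
import multiprocessing as mp
import time


def grab_frame(i):
    """Placeholder for the actual camera capture call."""
    return f"frame-{i}"


def capture_loop(q):
    """Capture ~15 frames per second, keeping only the newest frame queued."""
    i = 0
    while True:
        frame = grab_frame(i)
        if q.full():
            try:
                q.get_nowait()  # drop the stale frame instead of blocking
            except Exception:
                pass
        q.put(frame)
        i += 1
        time.sleep(1 / 15)


if __name__ == "__main__":
    q = mp.Queue(maxsize=1)  # depth-1 queue -> consumer never sees old frames
    p = mp.Process(target=capture_loop, args=(q,), daemon=True)
    p.start()
    print(q.get())  # the vision code consumes frames at its own pace
    p.terminate()
```

The key design choice is dropping stale frames rather than buffering them: buffering is exactly what reintroduces the lag described above.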

  • @John - we have watched the RSSI and there is no difference with the WiFi gear on.  We have flown out to 2 km with no real problems.  It is not luck: channel 13 is 2.472 GHz, and I set it to 802.11g only with 20 MHz bandwidth.

    2.4 GHz radio control starts at about 2.405 GHz, leaving about 50 or 60 MHz to hop around in.


    The idea is to use quality gear so that you can control things.  I use MikroTik, however Ubiquiti gear can do the same.  I also use the NV2 TDMA protocol for the link.  We are having some problems at the moment with range, however I have had it out to 4 km.  We tried it again over different terrain and had some problems with the link - we need to work on it a bit.

  • Developer

    @Stephen, then you have been incredibly lucky.

    In my experience (and that of most FPV'ers), mixing radio and video on 2.4 GHz works on the bench but has severely reduced range. The biggest challenges are getting enough separation between the video transmitter and the RC receiver, and telling the RC system to work specifically at the opposite end of the spectrum from the video transmitter. Also, many of the cheaper video transmitters emit a lot of noise outside the band they are supposed to work on, further compounding the problem.

    Btw, the ideal solution to the 2.4 GHz video vs 2.4 GHz radio control issue is to implement the radio control signals into the video stream carrier. This way you only have one 2.4 GHz signal to worry about. If this is designed properly using QoS, your radio control signal should continue to work (have full range) long after the video is lost.

    But sadly there are no cheap commercial solutions for that yet..

    The second-best (arguably the best for long range) and simpler solution is to move control over to 433 MHz as already mentioned. But then you have a conflict with telemetry (most of Europe cannot use 900 MHz) if you are using that..

  • Randy - what are you using to drive the C920?  I am using Derek Molloy's tutorial and have 2 seconds of lag.  What do you mean by driving it at 15 Hz?
