Hey guys,

A long time has passed since the last update from OpenFPV.

Thanks to the community for the great feedback, and special thanks to all the people around the world who wrote me feedback mails, joined the project or donated money. I really appreciate your interest in the project.

It has been very quiet around the project over the last months, and the reason is very simple:

I do not want to ship crappy, unusable software.

My vision for OpenFPV:

OpenFPV should be a simple, intuitive application for all desktop platforms: easily extendable, stable, and well designed. Easy extendability was the biggest challenge. Do you want to wrestle with C++? Do you want to build your GUI with Qt?

Imagine you could create a dedicated data channel on the TX side with Python and receive the data on the RX side with simple JavaScript. Lay out your UI with HTML5 elements. If you want to go deeper, why not edit the GStreamer pipeline yourself, without recompiling the whole application? Create your telemetry module in no time.
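To make the data-channel idea concrete, here is a minimal sketch of what a TX-side message could look like. The envelope format, the channel name, and the helper names are invented for illustration; OpenFPV's actual protocol may differ.

```python
import json
import time

def pack_telemetry(channel, payload):
    """Pack a message for a hypothetical OpenFPV data channel.

    The envelope (channel name + timestamp + payload) is an assumption
    for illustration, not the actual OpenFPV wire format.
    """
    return json.dumps({
        "channel": channel,
        "ts": time.time(),
        "payload": payload,
    }).encode("utf-8")

def unpack_telemetry(raw):
    """Decode a message on the RX side (in OpenFPV this would be JavaScript)."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["channel"], msg["payload"]

# TX side: send battery telemetry on its own channel
frame = pack_telemetry("battery", {"voltage": 11.1, "current": 4.2})

# RX side: route the message to the matching UI element
channel, payload = unpack_telemetry(frame)
```

On the RX side the same JSON would be parsed with one `JSON.parse` call in JavaScript, which is what makes a web-based UI attractive for this.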

The trouble:

It was insanely difficult to get the video played back inside the WebKit browser engine without adding delay. It is not possible with HTML5 video, Flash, or WebRTC; the biggest challenges are the buffer settings. Even Flash (the most stable technology for live streams inside a browser) cannot display the video fast enough for serious FPV flying. My OpenFPV workspace directory is full of prototypes, ideas, and more prototypes. After months of fighting with myself and my vision, I came to a dead simple idea: convert the H.264 stream to a format like MPEG-1 and decode it with JavaScript.
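As a rough sketch of that idea, a transcoding pipeline could look like the following gst-launch command. The element names come from the standard GStreamer plugin sets (rtph264depay from gst-plugins-good, avdec_h264 and avenc_mpeg1video from gst-libav, mpegtsmux from gst-plugins-bad); the ports and bitrate are example values, not OpenFPV's actual configuration.

```shell
# Hypothetical sketch: receive an RTP/H.264 stream, transcode it to MPEG-1,
# and serve it over TCP for a WebSocket relay feeding a JS decoder.
gst-launch-1.0 -v \
  udpsrc port=5600 caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! avdec_h264 ! videoconvert \
  ! avenc_mpeg1video bitrate=2000000 \
  ! mpegtsmux ! tcpserversink host=127.0.0.1 port=8082
```

A small server-side process would then read from the TCP port and push the MPEG-TS chunks over a WebSocket to the browser.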

Maybe you think: "What the ... ?"

Don't worry, that was my first thought too. I started to write an MPEG-1 decoder in JavaScript until I stumbled upon an unpopular project on GitHub which does exactly this job. After integrating all the GStreamer pipelines, transcoding, sockets, web sockets and this library, I displayed the first output inside the browser, with low CPU load and good quality. And the best thing: without any noteworthy additional lag. Sometimes the simple ideas are more difficult than the complex ones.


From this Iron Bird I created the first prototype, and I decided to give you guys a little update about the project. I am excited to hear your thoughts and ideas.

- Tilman



  • Developer

    Receiving and decoding an H.264 stream using GStreamer on a regular PC should add no more than 1 frame (one monitor refresh) of latency when all buffers and timestamp sync are turned off.

    The actual frame decoding, even using the software decoder (avdec_h264), usually takes only about 1-2 ms on an average PC, and the color space conversion from YUV to RGB is best done on the GPU when displaying the resulting image.

    To get smooth playback you should then add a one-frame buffer at the display sink (a ping-pong buffer) to compensate for the monitor and camera not being synchronized.

    So using a 60 Hz monitor as reference, you should not have more than 33 ms (2 frames) of latency in the decoder.

    Low latency and efficient compression in the encoder are MUCH harder. Another problem is that most cameras have a lot of internal latency (2-3 frames) just processing the raw sensor data before delivering it to the encoder.
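    A receiver configured along these lines might look like this sketch. The ports, caps, and sink choice are example values; latency=0 disables the jitter buffer, and sync=false at the sink disables timestamp synchronization:

    ```shell
    # Hypothetical minimal-latency receiver per the advice above:
    # no jitter buffer, no timestamp sync at the display sink.
    gst-launch-1.0 \
      udpsrc port=5600 caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
      ! rtpjitterbuffer latency=0 \
      ! rtph264depay ! avdec_h264 \
      ! videoconvert ! glimagesink sync=false
    ```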

  • I finished the new playback component today. Now the browser playback is as fast as GStreamer's autovideosink. I will make a new blog post about this achievement with a better demonstration of the latency.

  • Hey guys, I am currently working on removing the MPEG part completely. My first tests are done and the results are promising. The CPU load is very high because of a "bug" in the Node framework, but I will solve it. The current browser playback latency is about 140 ms, and I will optimise it further until I am at 116 ms. The Raspberry Pi H.264 encoder needs about 70-80 ms; I will optimise this in the next weeks. Maybe I can get it under 100 ms, but I am not sure yet. Thank you again for all the feedback, infos, ideas and links.
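    For reference, a Pi-side capture sketch along these lines could pipe raspivid's hardware encoder into GStreamer for RTP streaming. The resolution, frame rate, bitrate, and ground-station address below are example values, not the settings used in OpenFPV:

    ```shell
    # Hypothetical Raspberry Pi TX side: hardware H.264 encode with raspivid,
    # inline SPS/PPS headers (-ih) and baseline profile (-pf) for fast decode.
    raspivid -n -t 0 -w 1280 -h 720 -fps 48 -b 2000000 -ih -pf baseline -o - \
      | gst-launch-1.0 fdsrc \
          ! h264parse ! rtph264pay config-interval=1 pt=96 \
          ! udpsink host=192.168.1.10 port=5600
    ```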

  • (And which hardware were you using?)
  • How did you manage to change the hardware encoder settings to reduce the buffer?
  • It depends on the H.264 encoder hardware; several add almost nothing in latency. What we found was that most were caching too many frames, before display, before sending from the camera, or both.

    Once we limited the cache to just one frame and used good hardware, we could get it really low: close to, and maybe 20 ms behind, analog.
  • Hi! I don't understand the problem 100%. Is the bottleneck in the browser because H.264 decompression is slow there?

    In my case a little lag came from the camera side, but it was not significant; it was a cheap, low-quality camera using MJPEG. My HTML5 telemetry was an embedded iframe:


  • H.264 uses multiple frames to perform encoding, therefore introducing delay. That's why people see lower latency with faster frame rates. A good webpage on this is (it even has instructions to modify x264 for low-latency encoding)


    The BeagleBone has more processing power (that's why it is being used to develop it into an autopilot), but it does not have a direct camera connection, relying on external hardware and a USB connection. The Raspberry Pi has less processing power, but the direct camera connection makes it the better FPV platform (in my opinion).

    To save the H.264-to-MPEG-1 transcoding delay, would you be able to use UV4L to encode directly to MPEG-1 from the camera?


    Streaming MAVLink telemetry is also pretty easy with the Pi.
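    For the low-latency x264 settings mentioned above, a software-encoder sketch could look like this. tune=zerolatency disables lookahead and B-frames; the device, bitrate, and destination are example values:

    ```shell
    # Hypothetical software-encode TX pipeline using x264's
    # zerolatency tuning (no frame lookahead, no B-frames).
    gst-launch-1.0 v4l2src device=/dev/video0 \
      ! videoconvert \
      ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=2000 key-int-max=30 \
      ! rtph264pay config-interval=1 pt=96 \
      ! udpsink host=127.0.0.1 port=5600
    ```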


  • Developer

    @Tearig, that simulated view is incredible!  Wow.  I'm speechless.

    Did you have to adjust the timing of the simulation on the right to make it line up?  I'm just wondering how much lag the simulator introduces.
