Beginnings of Raspberry Pi FPV software


Inspired and informed by the Raspberry Pi-based FPV project started by Philip, I'm putting together some software that will run a RasPi FPV rig, complete with an OSD based on data pulled from a Naza GPS module tap and an MCP3008 analog to digital converter for voltage and current.

It's based on the GStreamer media library, and will be designed to work with two Raspberry Pis – one on the airframe itself, and one as the ground station, with an Oculus Rift for the display.

The software's not yet functional, being still under heavy development, but I thought I'd make it available in case anyone else wants to join in the development, as my time's a bit limited.

Here's the GitHub project:

The software comes in two parts: a binary for the TX side, carried on the airframe, and a binary for the RX side, running on the ground station. The TX side is split into a number of modules: a main entry point that sets up GStreamer video transmission, plus a telemetry gatherer and transmitter. The RX side has its own entry point that sets up the receive functionality, along with a telemetry receiver and a renderer to display the OSD data.
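For orientation, the division of labour corresponds roughly to a pair of pipelines like the following. This is only a sketch, not the project's actual code: the host address, port, and payload type are placeholder assumptions, and the real binaries build their pipelines programmatically rather than via gst-launch-1.0.

```shell
# TX side (airframe): take the camera's H.264 stream and packetise it
# as RTP over UDP. Host and port are placeholders.
raspivid -n -w 1280 -h 720 -fps 30 -t 0 -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! rtph264pay ! \
    udpsink host=192.168.0.10 port=9000

# RX side (ground station): depacketise, decode, display.
gst-launch-1.0 udpsrc port=9000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! \
  rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```

On a ground-station Pi you would substitute a hardware decoder element for the software avdec_h264 decoder.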

The TX side is almost complete, missing only an integration of the Naza GPS tap module and a bit of spit-and-polish.

I've hit some trouble on the RX side: GStreamer's pipeline for overlay rendering with Cairo turns out to be very inefficient, prohibitively so for the Raspberry Pi's processor. Instead, I'm investigating an EGL implementation that creates the EGL display context for GStreamer and then performs the OSD rendering in hardware. In addition, I need to find a way to apply a pixel shader (or whatever works) to the GStreamer video to split it into left and right sections for display on the Oculus Rift. This work is at a very early stage; I'm using the SubtitleRenderer.h class from omxplayer for guidance, along with the eglglessink GStreamer plugin (which seems to have been replaced by the new glimagesink module, which I've not had a chance to look into yet).

I plan to get some more work done over the next weeks, but if anyone wants to take a crack, please do – and do get in touch so we can coordinate.

Edit, Sun, May 11: Some more work done this weekend - I've got it to the point where it's building and running with the custom EGL display context. Just need to write the rendering code in egl_telemetry_renderer.c and find a way to split the display into left and right halves in gstreamer_renderer.c now.
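The left/right split can at least be prototyped in software with stock GStreamer elements. The sketch below is untested and CPU-bound (videomixer runs on the processor, which is exactly what the Pi can't afford), so it only demonstrates the geometry, not the hardware path the project needs; videotestsrc stands in for the decoded video.

```shell
# Duplicate one 640x720 source into two panes placed side by side,
# producing a 1280x720 frame with identical left and right halves.
# CPU-only proof of concept; the real goal is a GL/EGL shader.
gst-launch-1.0 \
  videomixer name=m sink_0::xpos=0 sink_1::xpos=640 ! \
    videoconvert ! autovideosink \
  videotestsrc ! video/x-raw,width=640,height=720 ! tee name=t \
  t. ! queue ! m.sink_0 \
  t. ! queue ! m.sink_1
```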



  • About the Rocket M5, I think it is too big and heavy to put on a drone (at least at the sizes I am dealing with). Of course, when you just replace a cable with two Rockets, the latency will not increase dramatically. But if you use some WiFi USB dongle, then it starts to matter. I met a guy who is making similar video links with MikroTik and Nstreme on aerobatic aeroplanes. But again, the smallest MikroTik is the size of a Rocket M5.

  • I achieved this over cable.

    But for WiFi I use Ubiquiti Rocket M5s; these add ~2-10ms of latency. No big deal.

  • @ChristianL: cool times. I think I usually run the latest possible updates, but what you write about the fps could be true. Anyway, did you achieve it over WiFi or cable? If WiFi, can you tell us more about your configuration (WiFi drivers)?

  • Same for me: I tried to compile GStreamer 1.3.90 but got stuck. It takes forever on the Pi.

    I am currently running 1.2.4-1; these packages are available in the testing section of the official Raspbian repo.

  • Nice one, Christian!

    I've been flat out with a big product launch, unfortunately, and just have had no spare brainpower left over. Getting to the end of that though, happily. I've gotta package up the new version of GStreamer (or wait around and hope someone else does it!), then I can finish building the project, using the new GL elements.

  • You guys might be interested in how to decrease latency to ~110-130ms, at least for 720p.

    Do a

    apt-get update

    apt-get upgrade

    The result is that we now have a set of modes as follows:

    • 2592×1944 1-15fps, video or stills mode, Full sensor full FOV, default stills capture
    • 1920×1080 1-30fps, video mode, 1080p30 cropped
    • 1296×972 1-42fps, video mode, 4:3 aspect binned full FOV. Used for stills preview in raspistill.
    • 1296×730 1-49fps, video mode, 16:9 aspect, binned, full FOV (width), used for 720p
    • 640×480 42.1-60fps, video mode, up to VGAp60 binned
    • 640×480 60.1-90fps, video mode, up to VGAp90 binned

    Now you can go for higher fps rates:

    raspivid -n -w 1280 -h 720 -b 6500000 -fps 49 -vf -hf -t 0 -pf high -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay ! udpsink host= port=9000

    My conclusion: the latency is limited by a certain number of frames that the compressor needs to do its job.

    Feed it those frames in a shorter time (i.e. at a higher fps) and you lower the latency.



    I would now like to go for 1080i; its half-fields have a rate of 60fps.

    Unfortunately this mode is not available.

    POD: How is your stuff evolving?
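    That conclusion can be sanity-checked with simple arithmetic: if the encoder holds back roughly four frames before emitting output (an assumed figure, not a measurement), the buffering delay is frames divided by fps.

    ```shell
    # Encoder buffering delay = frames / fps.
    # frames=4 is an assumed pipeline depth, not a measured value.
    frames=4
    for fps in 30 49; do
      awk -v f="$fps" -v n="$frames" \
        'BEGIN { printf "%d fps -> %.0f ms of encoder buffering\n", f, n / f * 1000 }'
    done
    # 30 fps -> 133 ms, 49 fps -> 82 ms: higher fps, lower latency.
    ```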

  • Hi, this is exactly what I was thinking of doing; I was just not sure if anyone had done it yet!

    It is awesome that you have gotten it semi-working.

    I will install gstreamer on my RPI and test it out. First I will buy an RPI camera this week.

    How does your telemetry and overlay work?

    I am following you on Github for any more advancements you make.

    The Logitech C920 seems like a good option for reducing power requirements on the TX side.

    I will download the software on my RPI and try it out. 

    Sorry, I am not a programmer so I can't help you there, but I am fully knowledgeable in the hardware area :)

  • @Tilman I haven't tested 5GHz yet, but I definitely am going to. Currently I am playing with the RasPi set up as an AP, so there is no need for an additional router in the field, only a client notebook or tablet.

    As for the C920, I have only seen that setup working with a BeagleBone, but I think there can be a simple cabling workaround to overcome it.

  • @Jonek, here's an interesting thread about fail-over using GStreamer:

    Should be doable

    GStreamer-devel - udpsrc input failover
