3+ km HD FPV system using commodity hardware


Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/

Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WIFI" projects.

The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets. What advantages does this give?

- No association: A receiver always receives data as long as it is in range

- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project allows an asymmetrical link (-> different antenna types for RX and TX)

- Error tolerant: Normal WIFI throws away erroneous frames even though they could have contained usable data. My project uses all the data it gets.
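For the curious: injection mode boils down to handing the kernel a raw 802.11 frame, radiotap header included, on a monitor-mode interface. Here is a minimal Python sketch of that idea (wifibroadcast itself is written in C; the interface name, addresses, and header layout here are illustrative, not its actual wire format):

```python
import socket
import struct

def build_injection_frame(payload: bytes) -> bytes:
    """Build a raw 802.11 data frame with a minimal radiotap header.

    No association, no ACKs: any receiver in monitor mode just sniffs
    whatever arrives.
    """
    # Minimal radiotap header: version 0, pad 0, length 8, no present flags
    radiotap = struct.pack("<BBHI", 0, 0, 8, 0)
    # 802.11 data frame: frame control, duration, addr1-3, sequence control
    fc = 0x0008                          # type = data, subtype = data
    bcast = b"\xff" * 6                  # broadcast: nobody sends an ACK
    src = b"\x13\x22\x33\x44\x55\x66"    # arbitrary transmitter address
    dot11 = struct.pack("<HH6s6s6sH", fc, 0, bcast, src, bcast, 0)
    return radiotap + dot11 + payload

def inject(iface: str, payload: bytes) -> None:
    """Send one frame via an AF_PACKET raw socket (needs root and a
    monitor-mode interface, e.g. one set up with iw/airmon-ng)."""
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)
    s.bind((iface, 0))
    s.send(build_injection_frame(payload))
    s.close()
```

Because the destination is the broadcast address, the normal ACK/retransmit machinery never kicks in, which is exactly what gives the one-way, analog-like behaviour.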

For FPV usage this means:

- No stalling image feeds as with the other WIFI FPV projects

- No risk of disassociation (which equals blindness)

- Graceful degradation of the camera image instead of stalling (or worse: disassociation) as you fly out of range
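The graceful degradation falls out of the packetization: the TX side slices the H.264 byte stream into small numbered packets, and the RX side renders whatever packets arrive, damaged or missing ones simply showing up as image artifacts. A toy sketch of that chunking (packet size and header layout are made up for illustration, not wifibroadcast's actual format):

```python
import struct

PACKET_SIZE = 1024  # payload bytes per injected frame (illustrative)

def chunk_stream(data: bytes, seq_start: int = 0):
    """Slice a video byte stream into sequence-numbered packets."""
    packets = []
    seq = seq_start
    for off in range(0, len(data), PACKET_SIZE):
        header = struct.pack("<I", seq)  # 32-bit sequence number
        packets.append(header + data[off:off + PACKET_SIZE])
        seq += 1
    return packets

def reassemble(packets):
    """Concatenate payloads in sequence order. Missing packets just
    leave a gap: the decoder shows artifacts instead of stalling."""
    ordered = sorted(packets, key=lambda p: struct.unpack("<I", p[:4])[0])
    return b"".join(p[4:] for p in ordered)
```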

The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders I just give them my tablet so they can join the FPV fun :)

Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough power for some more kilometers, but my line of sight was limited to 3 km...

In the end, what does it cost? Not much. You just need:

2x Raspberry A+

2x 8€ wifi dongles

1x Raspberry camera

1x Some kind of cheap display

Happy to hear your thoughts/rebuild reports :)

See you,



        • Yep, just changed the main loop (amongst other things) to write to two cards.  I've seen it work where, at range (through a few walls in my house), one channel is clearer than the other and thus continues to give me a clean picture. :)
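The two-card diversity described above boils down to: listen on both adapters and keep the first good copy of each sequence number, so a packet lost on one channel can still arrive on the other. A minimal sketch of that merge (packets assumed to carry sequence numbers, as in the chunking sketch earlier):

```python
def merge_diversity(streams):
    """Merge (seq, payload) packet streams from several receive cards,
    deduplicating by sequence number; whichever card heard a given
    packet first wins."""
    seen = {}
    for stream in streams:
        for seq, payload in stream:
            if seq not in seen:
                seen[seq] = payload
    # Return payloads in sequence order; missing numbers are simply gaps
    return [seen[s] for s in sorted(seen)]
```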

    • Perhaps separate processes are a good idea (especially to make use of multiple cores, although I think low latency has the highest priority here), but I don't see much advantage in using a network interface as the IPC mechanism here.  Conceptually something simple like a socket or a pipe is a better match.

      • I think it depends on how you want to use it.

        Right now it uses inter-process communication; with the tun/tap solution it would be network communication, which is more versatile.

        The idea materialized when I wasn't sure whether befinitiv would implement FEC inside wifibroadcast, because it would allow people to use udpcast or whatever other network application they like. One could also build a transparent wireless bridge with this and connect an Ethernet cam or any other Ethernet device to it.

        For me personally, the approach with FEC inside wifibroadcast is fine as I don't need any other functionality and it is simpler.

  • Can someone confirm whether you are able to run MAVLink telemetry through the Raspberry Pi while also sending the HD video using this method? http://dev.ardupilot.com/wiki/companion-computers/raspberry-pi-via-...

      I think it'll work using that method of splitting the pipe into two, so video can go down one and telemetry down the other (in malkauns's links above).  One caveat is that the pipe is one-way only, so the ground station won't be able to send commands to the flight controller, and this may upset the mission planner (I'm not sure).  As a minimum I think you'll need to set the flight controller's SRX_xxx parameters (i.e. SR0_, SR1_) to "1" (Hz) so that it sends out telemetry data proactively (normally Copter waits for the ground station to request a datastream).
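The splitting described here can be thought of as tagging each packet with a stream id so one one-way downlink carries both video and MAVLink telemetry, and the ground side routes each packet to the right consumer. A toy mux/demux sketch (the one-byte stream id is made-up framing for illustration, not wifibroadcast's wire format):

```python
import struct

VIDEO, TELEMETRY = 0, 1  # illustrative stream ids

def mux(stream_id: int, payload: bytes) -> bytes:
    """Prefix a payload with a one-byte stream id so video and
    telemetry can share a single one-way link."""
    return struct.pack("<B", stream_id) + payload

def demux(packet: bytes):
    """Split a received packet back into (stream id, payload)."""
    return packet[0], packet[1:]
```

On the ground you would hand `TELEMETRY` payloads to the GCS and `VIDEO` payloads to the decoder; nothing flows back up, matching the one-way pipe caveat above.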

      • Thanks for the replies Malkauns and Randy.

        It's a shame that it doesn't do two-way comms, but I understand that that isn't what this project is about.

          • It can do two-way comms, although the throughput will obviously be less than optimal.  You can run tx and rx on both ends.  Worked for me.

          • Has anybody tried/succeeded in running two-way comms?

            A very useful application of this would be GCS communication.  Another would be being able to fire a command from the ground to restart tx video when it fails (which it currently does; then you have to land and restart).

            • yessss it works.
              • Ooh, with a GCS like mission planner or apmplanner2 talking both ways?  Could you give example commands for how to do this?


