3+km HD FPV system using commodity hardware


Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/

Basically, it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WiFi dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WiFi" projects.

The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WiFi packets. What advantages does this give?

- No association: A receiver always receives data as long as it is in range

- Unidirectional data flow: Normal WiFi uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)

- Error tolerant: Normal WiFi throws away erroneous frames even though they could have contained usable data. My project uses all the data it receives.
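For context, injection mode requires putting the card into monitor mode first. A minimal sketch of that setup, assuming an ath9k_htc-based dongle that shows up as wlan0 (interface name and channel are illustrative):

```shell
# Take the interface down before changing its type
sudo ip link set wlan0 down
# Switch the card into monitor mode (no extra capture flags)
sudo iw dev wlan0 set monitor none
sudo ip link set wlan0 up
# Tune to a fixed channel; TX and RX sides must use the same one
sudo iw dev wlan0 set channel 13
```

In monitor mode the card neither associates nor sends ACKs, which is exactly what enables the unidirectional, association-free link described above.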

For FPV usage this means:

- No stalling image feeds as with the other WIFI FPV projects

- No risk of disassociation (which amounts to blindness)

- Graceful degradation of the camera image instead of stalling (or worse: disassociation) when you get out of range

The project is still in beta but already usable. On the TX and RX sides you can use any Linux machine you like; I use Raspberry Pis on both sides, which works just fine. I have also ported the whole stack to Android. If there are bystanders, I just hand them my tablet so they can join the FPV fun :)

Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough link margin for a few more km, but my line of sight was limited to 3 km...

In the end, what does it cost? Not much. You just need:

2x Raspberry A+

2x 8€ wifi dongles

1x Raspberry camera

1x Some kind of cheap display
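As a rough sketch of how the pieces fit together, assuming the tx/rx tools from the wifibroadcast repository and a monitor-mode interface wlan0 (exact options may differ between versions):

```shell
# Transmitter (Pi with camera): pipe raw H.264 from the camera
# straight into the packet injector
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -o - | sudo ./tx wlan0

# Receiver (ground-side Pi): capture the injected packets and feed
# the Pi's hardware H.264 decoder demo player
sudo ./rx wlan0 | /opt/vc/src/hello_pi/hello_video/hello_video.bin
```

Because the stream is one-way, the receiver side can be duplicated at will; any number of monitor-mode receivers in range will pick up the same video.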

Happy to hear your thoughts/rebuild reports :)

See you,




            • The quality should be pretty much the same unless you are using an entirely different screen/laptop. Or you may be using a higher bitrate on the Pi side when working with the Windows system (the -b option).

              • Pritam, I am using the command you uploaded. It works at low res (480x320) but pixelates / freezes at higher resolutions.

                One question please: do I have to do some compiling / make, or is it enough to just download GStreamer? Also, what is this good for:

                gst-rpicamsrc is a GStreamer wrapper around the raspivid/raspistill
                as in : github.com/thaytan/gst-rpicamsrc

                Thank you for your time
                • This is the first time I have heard of gst-rpicamsrc. I need to look at its capabilities.

                  My guesses for the pixelation:

                  1. The link is not up to the mark for the bitrate and resolution specified.

                  2. In that command I am using just enough bitrate for 720p; if your link has good bandwidth, you may have better luck increasing it. I actually record to the SD card on the Pi as well, and always pull a clean copy from the Pi later.

                  A lower resolution with high fps will always be more resilient than a higher-res stream.

                  I find 320p is good enough for FPV. I am working on a way to record 720p on the Raspberry Pi while streaming only 320p or so over the network, controlling it all by sending a few HTTP requests to a server running on the Pi with UDP auto-discovery. I will upload something crude by next weekend, mostly an initial PoC.

                  Downloading the GStreamer packages is enough.

                  • The original command I gave you has a very low bitrate; I think I missed a zero. It is only 2 Mbps, and I think I meant to write 20 Mbps. A value of 10 Mbps should be a decent compromise between quality and bandwidth.
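To illustrate the point about the -b option: raspivid takes the bitrate in bits per second, so a missing zero turns 20 Mbps into 2 Mbps. A sketch of a 720p pipeline at 10 Mbps (host, port, and the receiver-side sink are illustrative; adapt them to your own setup):

```shell
# Sender: 720p30 at 10 Mbps (-b is in bits/s: 10000000 = 10 Mbps)
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 10000000 -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=192.168.1.10 port=5600

# Receiver: depayload the RTP stream and decode
gst-launch-1.0 udpsrc port=5600 \
  caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! \
  rtph264depay ! avdec_h264 ! autovideosink sync=false
```

sync=false on the receiver trades A/V-clock smoothness for lower latency, which is usually the right trade for FPV.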

                  • Pritam, thank you very much for your reply

                    Fnoop, thank you also

                  • Hi, gst-rpicamsrc looks like a great idea initially, but unfortunately it increases latency slightly (vs. using piped raspivid), and I also found that if the video stream stalls for any reason, it stalls the camera too and cannot be restarted. Piped raspivid isolates the camera from GStreamer and doesn't stall it: the camera keeps going, and GStreamer recovers whenever the network stream recovers.

      • thanks a lot Pritam, will try tomorrow

  • This could be an interesting WiFi stick for wifibroadcast. Relatively small, two antenna connectors (inside the case), 2.4 GHz/5 GHz support, and an Atheros AR7010/AR9280 chipset which works with the same ath9k_htc driver as AR9271 sticks like the TP-Link TL-WN722N.


    It seems to be the same as this one:


    This camera also seems interesting: a dual-lens cam that puts out two MJPEG streams:


    • Nice find, will add that to the cart

    • Any specific tutorial I should follow to get the TL-WN722N working on an RPi 2B as an AP?

      I have tried the Alfa AWUS036NHR and it was a disaster. I worked on it for around 10 hours and finally got it doing something: the AP showed up, but the TX power was super low (~1 mW) and the connection dropped after 2 minutes. I tried pretty much every power setting, but it did not make any difference. That's why I got two TL-WN722Ns to experiment with: first with the raspicam interface over normal WiFi, and later with wifibroadcast as shown in this topic.


