3+km HD FPV system using commodity hardware

Hi

Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/

Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WIFI" projects.

The difference is that I use the cards in injection mode. This makes it possible to send and receive arbitrary WIFI packets (a minimal code sketch follows after the list below). What advantages does this give?

- No association: a receiver always receives data as long as it is in range

- Unidirectional data flow: normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)

- Error tolerance: normal WIFI throws away erroneous frames even though they could have contained usable data. My project uses every bit of data it gets.
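For the curious, this is roughly what "injection mode" looks like in code: a minimal sketch using libpcap against a card already switched to monitor mode. The interface name, addresses, and header bytes here are illustrative only, not the actual wifibroadcast frame format:

```c
#include <pcap.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    char errbuf[PCAP_ERRBUF_SIZE];
    /* open the monitor-mode interface; "wlan0" is just an example name */
    pcap_t *p = pcap_open_live("wlan0", 800, 1, 20, errbuf);
    if (!p) { fprintf(stderr, "pcap_open_live: %s\n", errbuf); return 1; }

    /* minimal radiotap header (version 0, length 8, no present flags) */
    static const uint8_t radiotap[] = {0x00, 0x00, 0x08, 0x00,
                                       0x00, 0x00, 0x00, 0x00};
    /* bare-bones IEEE 802.11 data frame header with made-up addresses */
    static const uint8_t ieee80211[] = {
        0x08, 0x00,                         /* frame control: data */
        0x00, 0x00,                         /* duration */
        0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* addr1 (receiver) */
        0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* addr2 (transmitter) */
        0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* addr3 */
        0x00, 0x00                          /* sequence control */
    };

    uint8_t frame[2048];
    const char payload[] = "video data would go here";
    size_t off = 0;
    memcpy(frame + off, radiotap,  sizeof radiotap);  off += sizeof radiotap;
    memcpy(frame + off, ieee80211, sizeof ieee80211); off += sizeof ieee80211;
    memcpy(frame + off, payload,   sizeof payload);   off += sizeof payload;

    /* no association, no ACKs: the frame simply goes out on the air */
    if (pcap_inject(p, frame, off) == -1)
        pcap_perror(p, "pcap_inject");

    pcap_close(p);
    return 0;
}
```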

For FPV usage this means:

- No stalling video feed as with the other WIFI FPV projects

- No risk of disassociation (which equals blindness)

- Graceful degradation of the camera image instead of stalling (or worse: disassociation) when you are getting out of range

The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders, I just hand them my tablet so they can join the FPV fun :)

Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough signal for a few more km, but my line of sight was limited to 3 km...

In the end, what does it cost? Not much. You just need:

2x Raspberry A+

2x 8€ wifi dongles

1x Raspberry camera

1x Some kind of cheap display

Happy to hear your thoughts/rebuild reports :)

See you,

befinitiv.


Replies

          • Developer

            befinitiv,

            Tridge (arduplane lead) also thinks it may be a software issue on the encoding side.  If it might help to chat with him, you'll find him on our mumble server.  Tridge is pretty awesome at helping solve all kinds of problems.

          • What resolution and bitrate are you using to get these numbers?  Also, what effect do they have on the latency results?

            • 1280x720 @ 700kbps in the 2 fps case, 3Mbps otherwise.

              I've found where in the pipeline the last frame gets stuck. Raspivid uses fwrite to output the h264 data to stdout. Since fwrite is buffered, not all data received from the encoder is written out immediately. Therefore I modified raspivid to use a plain "write" call to get rid of the buffering.
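              To make that change concrete, the gist of it looks like this (an illustration of buffered vs. unbuffered output, not the actual raspivid patch):

```c
#include <stdio.h>
#include <unistd.h>

/* before: stdio buffers the data, so the last NALU of a frame can sit
 * in the FILE buffer instead of reaching the pipe */
static void emit_buffered(const void *buf, size_t len) {
    fwrite(buf, 1, len, stdout);
}

/* after: write(2) hands the data to the kernel immediately,
 * so the frame leaves the process as soon as the encoder delivers it */
static void emit_unbuffered(const void *buf, size_t len) {
    write(STDOUT_FILENO, buf, len);
}
```

              An alternative with the same effect would be setvbuf(stdout, NULL, _IONBF, 0), or an fflush(stdout) after each frame.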

              Another issue is the conversion of a stream of bytes into NALUs. Afaik the only way to determine the length of the current NALU is to wait for the start header (0x00000001) of the next NALU. But if you do that, you are already one frame too late. Apparently raspivid (with my change from fwrite to write) always outputs complete NALUs in a single write. Therefore I made a hack that appends the NALU start header at the end of the current write call and removes the start header at the beginning of the next write (and so on).
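              In code, my reading of that re-framing hack looks roughly like this (assuming one complete NALU per write() chunk, as described above; a paraphrase of the idea, not the actual patch):

```c
#include <stdint.h>
#include <string.h>
#include <unistd.h>

/* Annex B start code that normally prefixes each NALU */
static const uint8_t START_CODE[4] = {0x00, 0x00, 0x00, 0x01};

/* Strip the leading start code from this chunk and append one instead.
 * The byte stream on the wire stays identical, but a NALU is now
 * complete the moment its own chunk arrives, instead of one frame later
 * when the next chunk's start code shows up.  One start code has to be
 * written once at stream start so the very first NALU still parses. */
static int reframe_nalu(int fd, const uint8_t *chunk, size_t len) {
    if (len >= 4 && memcmp(chunk, START_CODE, 4) == 0) {
        chunk += 4;
        len   -= 4;
    }
    if (write(fd, chunk, len) < 0) return -1;
    return write(fd, START_CODE, sizeof START_CODE) < 0 ? -1 : 0;
}
```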

              My changes allowed me to repeat the encoding latency tests without any frame being stuck in the pipeline. This time I worked directly on the Raspi with the camera to really get the encoding latency (without transmission and decoding). I attached a screen to the Raspi with a running timestamp on it. In parallel I captured video data from my modified raspivid and timestamped each NALU. For 1280x720, 700kbps, 2 fps, every second frame a keyframe, I measured the encode latency to be somewhere between 70 and 80 ms. This of course also includes the latency of my screen, so the actual latency should be a few ms below that.
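              The timestamping side needs nothing fancy. A sketch of such a measurement tool, reading the encoder output from stdin (this assumes the one-chunk-per-NALU behaviour above; strictly speaking, pipes don't guarantee a 1:1 mapping between write() and read() chunks):

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>

int main(void) {
    uint8_t buf[1 << 16];
    ssize_t n;
    /* log a monotonic timestamp for each chunk arriving on stdin */
    while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        fprintf(stderr, "%ld.%09ld: %zd bytes\n",
                (long)ts.tv_sec, (long)ts.tv_nsec, n);
    }
    return 0;
}
```

              Run it e.g. as raspivid -t 0 -o - | ./timestamp 2> log.txt (flags illustrative).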

              While my hacks helped to reduce latency a bit (roughly by the duration of one frame; at 48 fps that's about 20 ms), it wasn't a big step forward. But at least we now have an understanding of which part of the system causes what latency.

              • Developer

                Yes, like the others have said, that's really great work.  Really detailed analysis and fixes.  No pressure, but once you've checked in your changes, I'm happy to repeat some latency tests.

                My TP-Link receivers and antennas have arrived now so I hope to do another range test perhaps next weekend.

                I've also been trying to reproduce Jaime Machuca's work to get a webcam working but so far the wifi transmitter seems to go offline the moment the webcam starts up.  I originally thought it was a power issue but I don't think so anymore.  Anyway, Jaime's going to get an RPI2 and see if he can figure it out.

              • Great work!! Really appreciate the effort! Even a 10-20 ms reduction is worth it.
              • cool, great work!

                • Using the same technique I also timestamped capture and encode. This was just a quick hack so no guarantees for correctness. But:

                  Encoding a 1280x720 frame from memory to memory takes 10 ms (hello_encode)

                  Capturing a frame using raspistillYUV takes ~150 ms

                  This doesn't quite add up, and I do not have enough insight into the raspicam modules to explain the discrepancy. However, it is very interesting to see that most of the time is most likely spent in frame capture, not in encoding.

                  • True, but the preparatory tasks from raspistill are excluded from my measurements. I measured the time from capture (which happens after the preparatory tasks) to arrival in memory.

                  • raspistill is known to be slow. The reason is that it starts up the camera and performs some mandatory tasks just to capture one frame; most of that is sensor calibration, like exposure.  Comparing frame latency between raspistill and raspivid is therefore not very fair.

                  • Have you taken a look at the rpicamsrc project --> https://github.com/thaytan/gst-rpicamsrc

                    It offers the same functionality as raspivid, but can be used with gstreamer without using a pipe. Some people say that using pipes also increases latency because pipes have buffers.

                    I'm using it, but I have not compared it to raspivid in terms of latency.
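                    In case anyone wants to try it from C rather than via gst-launch, a minimal sketch (the bitrate value is only an example, and fakesink is a placeholder for whatever you actually feed the video into):

```c
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    /* rpicamsrc replaces the raspivid-plus-pipe step: the camera is an
     * ordinary source element, with no stdio buffering in between */
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "rpicamsrc bitrate=3000000 ! h264parse ! fakesink", &err);
    if (!pipeline) {
        g_printerr("pipeline error: %s\n", err->message);
        g_error_free(err);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* run until an error or end-of-stream arrives on the bus */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg) gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```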
