Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...
Basically, it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WIFI" projects.
The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets. What advantages does this give?
- No association: A receiver always receives data as long as it is in range
- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)
- Error tolerant: Normal WIFI throws away erroneous frames although they could have contained usable data. My project uses all the data it gets.
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of the camera image instead of stalling (or worse: disassociation) as you get out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders I just give them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3km without any antenna tracking stuff. At that distance there was still enough power for some more km, but my line of sight was limited to 3km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
Happy to hear your thoughts/rebuild reports :)
Could you please post the command line?
Here it is:
/usr/bin/sudo /sbin/ifconfig wlan0 down
/usr/bin/sudo /sbin/iw dev wlan0 set monitor otherbss fcsfail
/usr/bin/sudo /sbin/ifconfig wlan0 up
/usr/bin/sudo /sbin/iw reg set BO
/usr/bin/sudo /sbin/iwconfig wlan0 rate 18M fixed
/usr/bin/sudo /sbin/iwconfig wlan0 channel 149
sudo ./rx -b 1 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
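For comparison, the matching TX side would pipe a raw H.264 stream into the tx tool. A sketch only, assuming ./tx reads the stream on stdin (as the webcam example further down suggests); the raspivid values here are illustrative starting points, and the -r/-f parameters are copied from that webcam example, not canonical settings:

```shell
# Sketch of the TX side: raspivid streaming raw H.264 to stdout,
# piped into ./tx. Resolution, framerate and bitrate values are
# illustrative; -r 2 -f 1024 mirrors the webcam example in this thread.
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -o - | sudo ./tx -r 2 -f 1024 wlan0
```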
The RPi currently supports only one sensor model. Look it up and see if you can find a sensor module with OIS.
FYI: I get noticeably lower latency with:
gst-launch-0.10 -v fdsrc ! h264parse ! ffdec_h264 ! xvimagesink sync=false
I did some bench tests to see how some of the parameters affected latency. I'm using TommyLarsen's images, but I think what they do is consistent with befinitiv's startup scripts.
Below is a table showing the "average lag" (of 3 ~ 5 samples, see far right column) and the parameters changed (changes from previous row shown in green).
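Averaging a handful of stopwatch samples is easy enough to script. A minimal sketch (the sample values below are made up purely to show the arithmetic, not readings from the table):

```shell
# Average a few stopwatch lag readings (in milliseconds) with awk.
# The values are made-up examples, not actual measurements.
samples="120 135 128 142 131"
avg=$(echo "$samples" | awk '{ s = 0; for (i = 1; i <= NF; i++) s += $i; printf "%.1f", s / NF }')
echo "average lag: ${avg} ms"
```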
Thanks for the results! How exactly did you measure the latency? Using the stopwatch+screenshot approach?
Currently I'm diving into the main sources of latency. So far I've found that there are most likely no frames "stuck" in the Raspi encoder (in the sense of the encoder having a fixed latency of N frames until the first result appears), which is already a good sign. Next step is that I'll timestamp the NALUs (containing an image with a readable timestamp) upon reception to get the actual coding+transfer latency. Since this excludes the decode latency, it should help to decide whether it makes more sense to optimize the encode or the decode side.
Yes, I used estopwatch.net and then used a camera to take a picture of the original and the FPV screen. I started by taking just 3 pictures for each test, but for the last 5 rows or so I took 5 pictures per test. Sometimes the camera catches one or the other screen just as it's updating and the numbers can't be read.
By the way Jaime Machuca tells me he's managed to get it working on an odroid using a webcam. I'm attempting to reproduce with an RPi + webcam. He says this worked and I'll verify very soon:
gst-launch-1.0 -v -e uvch264src initial-bitrate=5000000 average-bitrate=5000000 iframe-period=1000 device=/dev/video0 name=src auto-start=true src.vidsrc ! queue ! video/x-h264,width=800,height=448,framerate=30/1 ! h264parse ! fdsink | ./tx -r 2 -f 1024 wlan0
That detailed investigation sounds really good.
Looks like the error is happening right at the start of the pipeline; perhaps try rx -b 8? Also make sure you have the latest version of rx compiled.
I've had very little time for fun lately, but I tried another range test yesterday. I set a waypoint mission for 1.5km straight out over some fields (and a hill with trees) and then back again, and set it off. I started to get a bit of break-up around 1km but it was still perfectly usable, then I lost it at 1.2km when the by-now-little dot disappeared behind the treeline and the hill. I also lost RC signal at this point, so after a few buttock-clenching minutes it was a huge relief to see the tiny dot reappear over the treeline. Thanks Randy for awesome autonomous flight code :)
I'd say the 1.2km was probably nearing the edge of the usable range, but I'll do another test at higher altitude sometime to avoid the treeline horizon.
I looked into this model, but it's sold in very few places, it's a bit big, and it needs dual antennas. At least to start with I liked the smaller, simpler implementation of the Awus051NH, which is more widely available. If it really does have a 1W amplifier, then for those who need the extra range it could be great :)
It would be really great if you can get to grips with where the latency actually sits in the system. Traditional digital WIFI approaches to FPV typically use gstreamer-to-gstreamer over an associated link, and quite often latency builds up (I think usually on the tx side within gstreamer) if the available sending bandwidth drops or if there are delays due to packet loss or link disassociation. Sometimes the link can start off with almost no perceptible latency but then over time lag by a second or more. Since using this wifibroadcast technique I haven't seen this at all, presumably because there is no gstreamer at the tx side. I also used to get stream stalls using rpicamsrc, but didn't get them using raspivid through a pipe. Raspivid seems to be a good source, and they've added lots of tweaking options that can be very useful.
At the rx side, if using hardware decoding, there are few bottlenecks where data can back up, and there are ways of reducing pockets of potential latency (reducing ring buffers, tcp buffers, disabling sync, etc.). I believe that with enough trial, testing and tuning (like Randy's great work testing different configurations), we can come up with sets of settings that reduce latency to very good levels. Being able to narrow down where the latency sits in the system would be a great help in targeting efforts.
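As a concrete example of those rx-side tweaks, a receive pipeline with buffering kept to a minimum might look like the sketch below. The queue element and its max-size properties are standard GStreamer; the specific values are assumptions to experiment with, not tested settings:

```shell
# Sketch of a low-buffering RX pipeline: sync=false disables clock
# sync at the sink, and the queue is capped at a single buffer so
# frames cannot pile up between parse and decode. Values are
# illustrative starting points, not tuned settings.
sudo ./rx -b 1 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! \
    queue max-size-buffers=1 max-size-bytes=0 max-size-time=0 ! \
    avdec_h264 ! xvimagesink sync=false
```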