Hi
Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/
Basically, it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not one of the many "I took a Raspberry and transmitted my video over WIFI" projects.
The difference is that I use the cards in injection mode. This makes it possible to send and receive arbitrary WIFI packets (a minimal injection sketch follows the lists below). What advantages does this give?
- No association: A receiver always receives data as long as it is in range
- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. With my project you can have an asymmetrical link (-> different antenna types for RX and TX)
- Error tolerant: Normal WIFI throws away erroneous frames even though they could have contained usable data. My project uses all the data it receives.
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of the camera image instead of stalling (or worse: disassociation) when you are going out of range
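To make the injection idea a bit more concrete, here is a minimal sketch (not the actual wifibroadcast code; the interface name, addresses, and payload are placeholders) of pushing a single raw 802.11 frame through a monitor-mode card with libpcap:

/* Minimal injection sketch: send one raw 802.11 data frame through a
 * monitor-mode interface with libpcap. "wlan0" and all addresses/payload
 * are placeholders. Build with: gcc -o inject inject.c -lpcap */
#include <pcap.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Smallest possible radiotap header: version 0, length 8, no fields set. */
static const uint8_t radiotap_hdr[] = {
    0x00, 0x00, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00
};

/* Plain 802.11 data frame header; a monitor-mode receiver just filters on
 * these addresses instead of associating with an access point. */
static const uint8_t ieee80211_hdr[] = {
    0x08, 0x00,                          /* frame control: data frame */
    0x00, 0x00,                          /* duration */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66,  /* addr1 */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66,  /* addr2 */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66,  /* addr3 */
    0x00, 0x00                           /* sequence control */
};

int main(void)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    /* "wlan0" must already be in monitor mode. */
    pcap_t *p = pcap_open_live("wlan0", 800, 1, 20, errbuf);
    if (!p) {
        fprintf(stderr, "pcap_open_live: %s\n", errbuf);
        return 1;
    }

    uint8_t frame[1500];
    size_t off = 0;
    memcpy(frame + off, radiotap_hdr, sizeof(radiotap_hdr));
    off += sizeof(radiotap_hdr);
    memcpy(frame + off, ieee80211_hdr, sizeof(ieee80211_hdr));
    off += sizeof(ieee80211_hdr);
    const char payload[] = "video data would go here";
    memcpy(frame + off, payload, sizeof(payload));
    off += sizeof(payload);

    /* No association and no ACKs: the frame is simply put on the air and
     * anyone listening on the channel can pick it up. */
    if (pcap_inject(p, frame, off) == -1)
        fprintf(stderr, "pcap_inject: %s\n", pcap_geterr(p));

    pcap_close(p);
    return 0;
}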
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberrys on both sides, which works just fine. I have also ported the whole stack to Android. If I have bystanders, I just hand them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3km without any antenna tracking. At that distance there was still enough power for some more km, but my line of sight was limited to 3km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
Happy to hear your thoughts/rebuild reports :)
See you,
befinitiv.
Replies
@Nicolas
DM368 dev kits are as readily available as the RPi --> http://www.mouser.com/new/leopardimaging/leopardimaging368/
There would be no problem implementing wifibroadcast on a DM36x. There is no open source hardware video encoder on the market and I don't think there will be one, at least not in the near future.
@Fnoop Dogg
I think the RPi camera is good enough. If you mount good quality optics on the sensor, it should be at least as good as a gopro 2.
If I made a gopro-style camera with quality optics based on the Raspberry, would you buy it? I could even integrate the wifi module into the case.
I would definitely pay for an RPi camera that could be more easily mounted on a gimbal. I think the major issue is the ribbon cable: it's too short, it doesn't bend well in at least one direction, and the cable itself also seems to produce interference that affects the GPS.
For my wifibroadcast setup I use the NAVIO+ board, which uses an RPi2, so it's just the camera + cable part that causes me trouble.
Depending on the price, I'd definitely be interested too, at least to try it. Even putting the camera into a gopro-sized case such that it would be balanced in a gopro gimbal would be a great start. The problem is that the Raspberry CSI currently only supports a single CMOS camera, and that's not really good enough for decent quality video/photos. It might do for FPV but not for AP or research, and the quality is limited to 1080p/30 for video and 5MP for stills. But I'd be very happy to be proved wrong.
For the cable, I made a cable from a kit and posted the ebay link earlier back in the thread somewhere. It's shielded, round, and as long as you want and I've had no interference problems with it. It's a pain to make, although the ebay seller will make them for you for a bit extra (well worth it! I spent a day and a half swearing and cursing with limited success).
Yeah, there are different adaptors available for mounting quality lenses. Some people have even managed to mount really big zoom lenses on it: http://www.truetex.com/raspberry_pi_13.jpg
http://www.truetex.com/raspberrypi
http://www.ebay.com/itm/RaspCAM-2-8TFT-LCD-display-TSP-camera-modul...
http://www.ebay.com/itm/Raspberry-Pi-Camera-Board-w-M12x0-5-mount-L...
Here are some interesting links about how to change the medium access strategy with the ath9k_htc driver. A patched driver is already included which allows changing things nicely via debugfs.
https://github.com/vanhoefm/modwifi
http://people.cs.kuleuven.be/~mathy.vanhoef/papers/acsac2014.pdf
http://ath9k-devel.ath9k.narkive.com/2mGfRO52/disabling-physical-sp...
Looks like it just needs to be compiled and then virtual and physical carrier sensing can be disabled completely with:
echo 1 > /sys/kernel/debug/ieee80211/phy*/ath9k_htc/registers/force_channel_idle
echo 1 > /sys/kernel/debug/ieee80211/phy*/ath9k_htc/registers/ignore_virt_cs
With carrier sensing disabled, the problem of latency building up because of queued packets on the TX side when the medium is congested will probably disappear. In general, the traffic will also be less "jittery", which should give a smoother stream. To take it even further, one could also alter interframe and backoff timings to get more usable bandwidth.
Keep in mind that when doing this you are changing the way WiFi is designed to work, in a way that is very unfair to other users of the spectrum. Basically, the card will occupy the channel almost all the time without leaving time for other users to transmit frames, similar to analog video transmission (which is the reason why legal TX power limits are usually lower for analog devices, so you might want to decrease TX power).
Hey befinitiv, I have seen that you have started implementing the FEC from udpcast. Thanks, thanks, thanks.
Now I feel bad, because I haven't had time to dig into the buildroot stuff further. But I will do that sometime :)
But I have looked into the other options in the meantime, namely this one I wrote about earlier:
"– Make some kind of virtual wrapper-interface using tun/tap or something. So that udpcast speaks to the wrapper-interface that behaves like a normal network interface. The wrapper-interface will then speak to the atheros monitor-mode interface. Also not sure if possible at all and how much effort."
I'd say it is actually possible, and probably the cleaner solution, because this way we can use any normal network application we'd like and don't need to touch wifibroadcast for new functionality. The drawback, of course, is that it's harder to control buffers and latency because there is more stuff in the chain.
And I had another idea regarding the TX side. One could use two wifi cards on two different channels on the TX side to increase bandwidth and resiliency against noise/interference and other wifi networks. If the FEC/interleaving happens before the packets are sent out on the physical interfaces, that would mean we have an interleaved and FEC-protected data stream running over two different frequencies (see the rough striping sketch at the end of this post). I'd say the probability of both links suffering heavy interference (that the FEC/interleaving can't deal with) at the same time should be around zero, giving a very, very stable link even in the worst conditions.
I have attached a diagram of how it could work. The tun0 interface would be a "normal" (i.e. non monitor mode) interface with an IP configured.
Here is a tutorial about tun/tap programming in C:
http://backreference.org/2010/03/26/tuntap-interface-tutorial/
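For reference, the core of that tutorial boils down to a single ioctl. A minimal sketch, assuming a Linux system with the tun module available; the device name "wbc0" is arbitrary:

/* Minimal tun allocation along the lines of the backreference.org tutorial.
 * read()/write() on the returned fd then moves raw IP packets, which the
 * wrapper could encapsulate into 802.11 frames for injection. */
#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/if.h>
#include <linux/if_tun.h>

int tun_alloc(const char *dev)   /* e.g. tun_alloc("wbc0") */
{
    struct ifreq ifr;
    int fd = open("/dev/net/tun", O_RDWR);
    if (fd < 0)
        return -1;

    memset(&ifr, 0, sizeof(ifr));
    ifr.ifr_flags = IFF_TUN | IFF_NO_PI;   /* raw IP packets, no extra header */
    strncpy(ifr.ifr_name, dev, IFNAMSIZ - 1);

    if (ioctl(fd, TUNSETIFF, &ifr) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}

After that you would bring the interface up and give it an IP address, and udpcast (or any other application) could use it like a normal network interface while the wrapper shovels the packets to and from the monitor-mode card.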
Raspi-Video-Solution.png
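And to make the two-card idea concrete, here is a rough sketch of the TX-side striping. Pure illustration: the interface names, the simple round-robin policy, and the placeholder frame are all assumptions, and the FEC/interleaving would run before this step:

/* Round-robin striping of already-FEC-encoded packets over two
 * monitor-mode cards tuned to different channels. Illustration only.
 * Build with: gcc -o stripe stripe.c -lpcap */
#include <pcap.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static pcap_t *open_tx(const char *ifname)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    /* The interface must already be in monitor mode on its own channel. */
    pcap_t *p = pcap_open_live(ifname, 800, 1, 20, errbuf);
    if (!p) {
        fprintf(stderr, "%s: %s\n", ifname, errbuf);
        exit(1);
    }
    return p;
}

int main(void)
{
    /* "wlan0"/"wlan1" are placeholders for the two injection cards. */
    pcap_t *tx[2] = { open_tx("wlan0"), open_tx("wlan1") };

    /* Placeholder frame: a real one would carry the radiotap + 802.11
     * headers (as in the injection sketch above) plus FEC-encoded video. */
    unsigned char frame[64];
    memset(frame, 0, sizeof(frame));
    frame[2] = 0x08;  /* radiotap header length = 8, rest left zeroed */

    /* Packet i goes out on card i % 2; if one channel is jammed, every
     * second packet (plus the FEC parity) still arrives via the other. */
    for (unsigned long i = 0; i < 8; i++) {
        if (pcap_inject(tx[i % 2], frame, sizeof(frame)) == -1)
            fprintf(stderr, "inject failed on card %lu: %s\n",
                    i % 2, pcap_geterr(tx[i % 2]));
    }

    pcap_close(tx[0]);
    pcap_close(tx[1]);
    return 0;
}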
I've tried the two wifi cards idea and it definitely works!
Oh cool, I must have missed that. How did you do it, patched the tx application?