Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...
Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles this is not one of the many "I took a raspberry and transmitted my video over WIFI"-projects.
The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets. What advantages does this give?
- No association: A receiver always receives data as long as it is in range
- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project allows an asymmetrical link (-> different antenna types for RX and TX)
- Error tolerance: Normal WIFI throws away erroneous frames even though they could still contain usable data. My project uses all the data it receives.
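For anyone curious what "injection mode" means in practice, here is a rough sketch of the kind of card setup involved. The interface name and channel are placeholders, and the exact flags your driver accepts may differ (the project's own scripts handle this); this is just to illustrate the idea:

```shell
# Take the dongle out of managed mode and put it into monitor mode,
# so arbitrary frames can be injected and received without association.
# "wlan0" and channel 13 are example values, not the project's defaults.
sudo ifconfig wlan0 down
# otherbss: also pass frames for other networks;
# fcsfail: also deliver frames with bad checksums (the "error tolerant" part)
sudo iw dev wlan0 set monitor otherbss fcsfail
sudo ifconfig wlan0 up
sudo iwconfig wlan0 channel 13
```

With both cards configured like this, TX can inject frames and RX simply captures everything on the channel, which is why no association (and no two-way channel) is needed.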
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of camera image instead of stalling (or worse: disassociation) when you are getting out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberrys on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders I just hand them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3km without any antenna tracking. At that distance there was still enough power for a few more km, but my line of sight was limited to 3km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
Happy to hear your thoughts/rebuild reports :)
Very interesting idea. Is there any hope of including telemetry in the same stream? (Duplex?)
Very very interesting,
btw did you measure the latency?
Yes, the software allows you to multiplex 256
Yes, and it is comparable to the normal WIFI setup. This gives you around 110ms for 1280x720 and 87ms for VGA (glass to glass). Not quite racing-compatible but pretty close :)
Very cool. I am definitely interested in this.
Whoops, my last reply somehow got truncated. You can multiplex 256 independent data streams. Yesterday I finished the telemetry hardware on my quad (which is just a USB-to-UART converter) and connected it to my Naze32. This way I could transmit FrSky telemetry data in parallel to the image stream. Doing this is really simple using the existing tools: "cat /dev/ttyUSB0 | ./tx -p 1 wlan0"
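Presumably the ground station side would demultiplex the same port number with the matching rx tool. A sketch under that assumption (the `rx` name and `-p` option mirror the tx example above; where the stream is sent is up to you):

```shell
# Hypothetical receive side: pull port 1 (the telemetry stream) off
# the monitor-mode interface and hand it to whatever consumes FrSky
# data -- here just logged to a file as an example.
./rx -p 1 wlan0 > telemetry.log
```

The video stream would run on a different port number in parallel, which is the whole point of the multiplexing.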
Hi, I haven't studied your webpage yet, but this is exactly what I was thinking about: use a powerful WIFI chip without actually using the WIFI and IP overhead. I work a lot with IP cameras, and the TI DaVinci-based ones like Hikvision's have a very nice picture at low bitrates. With little to no buffering on the receiving PC, latency is low. But using WIFI is a pain. Hikvision even made a 720p camera with a CCD, which has zero problems with jello.

I am wondering if I could load OpenWRT onto a cheap Atheros-chipset-based Mikrotik or Ubiquiti board, somehow use your software, and set the camera to multicast its frames. As far as I know multicast does not need ACKs, so it should work over a simplex data connection. Atheros is capable of packet injection and monitor mode. Everything is there, just set it up...
Very Very interesting and clever implementation! Bravo!
Thank you for sharing and in true diy spirit! This is great stuff!
Does it also work with an external USB camera, or is the video conversion overhead too high (both in time and CPU usage)?
Is it possible to stream more than one video at once?
Yes, a USB camera does work. Before I received my Raspberry camera I did my initial tests this way: https://www.youtube.com/watch?v=ew9LHAs2FGY But I don't have any numbers on latency or CPU consumption. I guess both are higher compared to the Raspberry cam.
Streaming several videos is theoretically possible. However, the compression of the video data might be your limitation here. I haven't tested compressing two video streams in parallel using the OMX API (I don't know enough about it to estimate whether this would work). But it should be easy to test. The necessary gstreamer command for converting a USB camera stream to H264 is found in this post: https://befinitiv.wordpress.com/2015/01/25/true-unidirectional-wifi... Just execute that line twice with different camera inputs and you have the answer :)
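To make the "run it twice" idea concrete, a parallel two-camera setup might look roughly like this. The device names, resolution, and encoder pipeline are assumptions for illustration (the real command is in the linked post), and whether the hardware keeps up with two encodes is exactly the open question:

```shell
# Untested sketch: capture and H264-encode two USB cameras in parallel,
# each feeding its own wifibroadcast port. /dev/video0 and /dev/video1
# are example devices; x264enc is a stand-in software encoder.
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert \
  ! x264enc tune=zerolatency ! fdsink | ./tx -p 0 wlan0 &
gst-launch-1.0 v4l2src device=/dev/video1 ! videoconvert \
  ! x264enc tune=zerolatency ! fdsink | ./tx -p 1 wlan0 &
```

On the receive side each port would be picked up by its own rx instance; if one pipeline runs but two stutter, the encoder is the bottleneck.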
Here you can find a fresh video shot 1 hour ago with my system (the video shows a recording from the ground station).
The video is sometimes corrupted since I had the receiver in my pocket. Occasionally my body blocked the line of sight. Also the transmission in this video is optimized for safety instead of image quality.