Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...
Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WIFI" projects.
The difference is that I use the cards in injection mode. This makes it possible to send and receive arbitrary WIFI packets. What advantages does this give?
- No association: A receiver always receives data as long as it is in range
- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)
- Error tolerance: Normal WIFI throws away erroneous frames even though they could have contained usable data. My project uses all the data it receives.
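That last point is what makes graceful degradation possible. As a toy sketch of the idea (this is only an illustration, not the project's actual coding scheme), packet-level erasure coding lets a receiver rebuild one lost packet in a block from the surviving ones plus a parity packet:

```python
# Toy packet-level erasure code: one XOR parity packet per block.
# Illustration only -- not wifibroadcast's actual error handling.

def make_parity(packets):
    """XOR all data packets of a block into a single parity packet."""
    parity = bytearray(len(packets[0]))
    for p in packets:
        for i, b in enumerate(p):
            parity[i] ^= b
    return bytes(parity)

def recover(received, parity):
    """Reconstruct a single missing packet (marked None) from the survivors."""
    missing = bytearray(parity)
    idx = None
    for i, p in enumerate(received):
        if p is None:
            idx = i
        else:
            for j, b in enumerate(p):
                missing[j] ^= b
    received[idx] = bytes(missing)
    return received

block = [b"pkt0data", b"pkt1data", b"pkt2data"]
parity = make_parity(block)
damaged = [block[0], None, block[2]]   # packet 1 lost in transit
print(recover(damaged, parity)[1])     # b'pkt1data'
```

The point is that a single corrupted or missing packet costs nothing; only when losses exceed the redundancy does the image degrade, and then it does so gradually instead of stalling.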
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of the camera image instead of stalling (or worse: disassociation) as you get out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like; I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders I just hand them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3km without any antenna-tracking gear. At that distance there was still enough power for some more km, but my line of sight was limited to 3km...
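For anyone who wants to sanity-check that range, a rough free-space link budget gets you in the right ballpark. All the numbers below (TX power, antenna gains, receiver sensitivity) are assumptions for illustration, not measurements from my setup:

```python
# Rough free-space link budget at 3 km on 2.4 GHz.
# Every figure here is an assumed round number, not a measured value.
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (Friis), 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

tx_power_dbm = 20      # assumed dongle output
tx_gain_dbi = 2        # assumed stock omni antenna
rx_gain_dbi = 2
sensitivity_dbm = -90  # assumed receiver sensitivity at a low data rate

rx_dbm = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(3000, 2.437e9)
print(f"received power at 3 km: {rx_dbm:.1f} dBm, "
      f"margin: {rx_dbm - sensitivity_dbm:.1f} dB")
```

A positive margin means the link should still close; every extra 6 dB of margin roughly doubles the free-space range, which fits the observation that there was power to spare at 3km.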
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
Happy to hear your thoughts/rebuild reports :)
Yes, the 300m were with the white omnis that came with the 722s.
That is amazing! Thanks a lot for the detailed results!
Very useful and time-saving to see the details of your setup. Are you managing to feed your GoPro into this, or are you using a Picam?
The ground-station-side OSD also sounds promising. I'm hoping to help with the development of a good-looking OSD (we can take advantage of colours, transparency, etc.) once I get the hardware. I'm also looking into piping the image to an Oculus Rift; there it would be best to display the OSD only in the central part of the image that both eyes can see, and doing this on the receiver side helps compared to an analog camera + MinimOSD + VTX. In fact, offsetting the two OSD overlays slightly could create an effect of the OSD floating in front of the video, which might turn out to be useful or annoying.
Just a noob question: wouldn't a dual-band adapter work better than a single-band TX and RX setup? If yes, which chipset is best?
It would be amazing if we could use a MIMO dual band adapter.
Hi, one noob here. I have a thought and don't know where to put it, so let's see what you think.
Cameras like the Mobius compress what they see and write it to the SD card. Why not tap into that channel of compressed data?
BTW, very interesting project. I'm starting to put things together: a combination of head tracker, goggles, and a model-tracking ground station for a bigger glider.
Is it possible to get and calculate the vector to the plane to steer the parabolic antenna?
Hey, exactly what I was planning to work on! I already have the system working with two CSL adapters. I still need to do some range tests, and maybe improve the latency, which is currently about 250ms. Good antennas should be part of the solution :)
Anyway, I could help you if you want!
QGroundControl might be a good starting point for the OSD part?
Great results!!! I've gotta get my hands on those CSL adaptors!! Please post some videos if you can. :)
Encoding for storage is a bit different from encoding for streaming: the compression is higher and less error-tolerant. The latency will also be much higher, depending on the camera's buffer size.
Storage compression and FPV don't play well together.
Take a look at the antenna tracker project --> http://diydrones.com/forum/topics/figuring-out-the-antenna-tracker
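For the pointing math itself: if the plane reports its GPS position (e.g. over a telemetry downlink), the ground station can compute where to aim the dish. A minimal sketch using a flat-earth approximation, which is fine for antenna pointing over a few km (all names and coordinates here are made up for illustration):

```python
# Compute azimuth/elevation from ground station to aircraft, given two GPS
# fixes. Uses an equirectangular (flat-earth) approximation -- adequate for
# antenna pointing at ranges of a few km. Values below are illustrative only.
import math

EARTH_R = 6_371_000.0  # mean Earth radius in metres

def aim_antenna(gs_lat, gs_lon, gs_alt, ac_lat, ac_lon, ac_alt):
    """Return (azimuth_deg, elevation_deg) from ground station to aircraft."""
    d_north = math.radians(ac_lat - gs_lat) * EARTH_R
    d_east = (math.radians(ac_lon - gs_lon) * EARTH_R
              * math.cos(math.radians(gs_lat)))
    d_up = ac_alt - gs_alt
    azimuth = math.degrees(math.atan2(d_east, d_north)) % 360.0
    ground_dist = math.hypot(d_north, d_east)
    elevation = math.degrees(math.atan2(d_up, ground_dist))
    return azimuth, elevation

# Aircraft roughly 1.1 km north-east of the ground station, 120 m above it:
az, el = aim_antenna(48.0000, 11.0000, 500.0, 48.0070, 11.0105, 620.0)
print(f"azimuth {az:.1f} deg, elevation {el:.1f} deg")
```

Feed the two angles to the tracker's pan/tilt servos and the dish follows the plane; for longer ranges you would switch to a proper great-circle bearing formula.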
My ready-made images on Dropbox run everything needed on boot
Thanks a lot for the idea and code :)
No, just using the Picam. I haven't seen anyone doing HDMI into the Raspberry Pi successfully/easily yet, unfortunately, although there are some projects working on it. When HDMI in works, it will open up a whole new world of possibilities :)