Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/
Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WIFI" projects.
The difference is that I use the cards in injection mode. This makes it possible to send and receive arbitrary WIFI packets. What advantages does this give?
- No association: A receiver always receives data as long as it is in range
- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)
- Error tolerant: Normal WIFI throws away erroneous frames even though they could contain usable data. My project uses all the data it receives.
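To make "injection mode" concrete: on a monitor-mode interface you build the radiotap and 802.11 headers yourself and hand the raw bytes to the card, with no association and no ACKs involved. This is not befinitiv's actual code (that lives in the repo); it is just a minimal Python sketch of the idea, with a placeholder MAC address and interface name:

```python
import socket
import struct

def build_injection_frame(payload: bytes, seq: int = 0) -> bytes:
    """Wrap a payload in minimal radiotap + IEEE 802.11 headers."""
    # Minimal radiotap header: version 0, pad 0, length 8, empty 'present' bitmap.
    radiotap = struct.pack('<BBHI', 0, 0, 8, 0)
    # 24-byte 802.11 data-frame header: frame control (type = data), duration,
    # three MAC addresses, sequence control. The address is an arbitrary
    # placeholder that effectively acts as a stream identifier.
    addr = b'\x13\x22\x33\x44\x55\x66'
    dot11 = struct.pack('<HH6s6s6sH', 0x0008, 0, addr, addr, addr, seq << 4)
    return radiotap + dot11 + payload

def send_frame(iface: str, frame: bytes) -> None:
    """Inject the frame on a Linux monitor-mode interface (needs root)."""
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)
    s.bind((iface, 0))  # e.g. 'wlan0', already set to monitor mode
    s.send(frame)
    s.close()
```

The receiver side is the mirror image: sniff everything on the interface and accept any frame whose address matches, regardless of association state or FCS errors.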
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of camera image instead of stalling (or worse: disassociation) when you are getting out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android, so if I have bystanders I just hand them my tablet and they can join the FPV fun :)
Using this system I was able to achieve a range of 3km without any antenna tracking. At that distance there was still enough power for some more km, but my line of sight was limited to 3km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
Happy to hear your thoughts/rebuild reports :)
Take a look at the "hello_font" program. It should be straight forward to adapt this for OSD (running on the ground station).
Things to do (on my plan when I have some spare time):
- Transmit telemetry over a second port via wifibroadcast to the ground
- Receive the telemetry via wifibroadcast and pipe the data into a modified hello_font program to display it
This should be much better than the "analog" way of drawing the OSD into the video:
- Video stays clean
- Data can be recorded and is machine readable
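As a sketch of how the two-port setup could look on the command line. The -p port flag and the hello_video path are assumptions based on the usual wifibroadcast/Raspbian setup, so check ./tx -h and your own paths before copying:

```shell
# Air side: video on port 0, serial telemetry on port 1 (paths/flags assumed).
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -o - | sudo ./tx -p 0 -r 2 -f 1024 wlan0 &
cat /dev/ttyUSB0 | sudo ./tx -p 1 -r 2 -f 256 wlan0

# Ground side: two rx instances, one feeding the video player, one the OSD.
sudo ./rx -p 0 wlan0 | /opt/vc/src/hello_pi/hello_video/hello_video.bin &
sudo ./rx -p 1 wlan0 | ./osd   # e.g. a modified hello_font that parses the telemetry
```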
Works as expected: https://youtu.be/EagDJrwleHQ :)
This one transmits the FrSky telemetry from the naze32 over wifibroadcast. The battery voltage is displayed on the ground station Pi as an overlay on top of the video.
Fantastic! If we get the OSD to work this would be an awesome FPV system at similar or even lower cost than camera+minimOSD+analog transmitter+....
Is anybody else thinking it needs to go on github.com?
befinitiv already has it online on bitbucket here: https://bitbucket.org/befi/wifibroadcast
For now he has very good info on his blog: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...
I don't know whether github is better than bitbucket. I think bitbucket also lets you add wikis to repos.
no probs. Threads like this always get a bit messy and the real information disappears...
Just wondering; is there a way to use WiFiBroadcast with gStreamer?
I mean, considering we could use a HD USB camera, it would be useful.
You mean like this?
gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=1280, height=720, framerate=30/1' ! omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink | sudo ./tx -r 2 -f 1024 wlan0
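For completeness, a possible ground-side counterpart (my guess, not something from this thread): pipe the rx output into a gstreamer decode/display pipeline instead of hello_video:

```shell
# Receive the H.264 stream and decode/display it with gstreamer.
# avdec_h264/autovideosink are generic desktop elements; on a Pi you would
# swap in the hardware decoder instead.
sudo ./rx wlan0 | gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! autovideosink sync=false
```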
I looked through the raspivid parameters and saw the annotation option, so I tried -a "testing", and it works: it sends the video with the text "testing" at the top/center of the image.
So if raspivid can put text on top of the video, it should be possible to change the code to make an OSD.