Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...
Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not just another "I took a Raspberry Pi and transmitted my video over WIFI" project.
The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets. What advantages does this give?
- No association: a receiver always receives data as long as it is in range
- Unidirectional data flow: normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)
- Error tolerance: normal WIFI throws away erroneous frames even though they could still contain usable data. My project uses all the data it receives.
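The points above depend on the dongle running in monitor mode rather than being associated with an AP. A typical setup sequence with the standard `iw` tool might look like this (the interface name and channel are placeholders, and which monitor flags are honored depends on the driver):

```shell
# Take the interface down before changing its type.
sudo ip link set wlan0 down
# otherbss: pass frames belonging to other BSSes;
# fcsfail: also deliver frames with a bad checksum, so partially
# corrupted data can still be used instead of being dropped.
sudo iw dev wlan0 set monitor otherbss fcsfail
sudo ip link set wlan0 up
# TX and RX must be tuned to the same channel.
sudo iw dev wlan0 set channel 13
```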
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which amounts to flying blind)
- Graceful degradation of the camera image instead of stalling (or worse, disassociation) as you move out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like; I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders, I just hand them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough signal for a few more km, but my line of sight was limited to 3 km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
Happy to hear your thoughts/rebuild reports :)
Having a bit of a bad time setting this up.
So I have a fresh install of Ubuntu running on my Mac through Parallels. I'm using your scripts. First I run the RX script and it sits there waiting at -
setting pipeline to paused...
pipeline is prerolling...
Then as soon as I run the TX script, the TX terminal gives me information about data packets sent, but the RX machine spits out...
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)1296, height=(int)730, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1000f27640028ac2b402882efc900f1226a01000528ee025cb0
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, width=(int)1296, height=(int)730, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1000f27640028ac2b402882efc900f1226a01000528ee025cb0
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1296, height=(int)730, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, framerate=(fraction)25/1
ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstFdSrc:fdsrc0:
streaming task paused, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = NULL
Freeing pipeline ...
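For reference, in befinitiv's write-up the GStreamer pipeline is not started on its own: the wifibroadcast rx tool writes the recovered H.264 byte stream to stdout, and gst-launch reads it on stdin via fdsrc. Run by itself, fdsrc has no valid H.264 on fd 0, which typically ends in exactly this not-negotiated error. A sketch of the intended chain (the rx binary name and the mon0 interface are assumptions based on the project's description; also note that a USB dongle passed through to a Parallels VM may not support monitor mode at all):

```shell
# RX side: rx captures the injected WIFI frames on a monitor-mode
# interface and pipes the H.264 stream into the decoder via stdin.
sudo ./rx mon0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! autovideosink sync=false
```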
Any specific tutorial I should follow to get the TL-WN722N working on an RPi 2B as an AP?
I have tried the Alfa AWUS036NHR and it was a disaster. I worked on it for around 10 hours and finally got it doing something: the AP showed up, but the TX power was super low (~1 mW) and the connection fell away after 2 minutes. I tried pretty much every power setting but it did not make any difference. That's why I got two TL-WN722N to experiment with. First with the raspicam interface over normal WIFI, and later wifibroadcast as shown in this topic.
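On the TL-WN722N-as-AP question: the usual route on Raspbian is hostapd with the nl80211 driver. A minimal configuration sketch (the SSID, channel, and passphrase are placeholders, not values from this thread):

```
# /etc/hostapd/hostapd.conf
interface=wlan0
driver=nl80211
ssid=fpv-test
hw_mode=g
channel=6
auth_algs=1
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=changeme123
```

Running `sudo hostapd /etc/hostapd/hostapd.conf` in the foreground shows driver errors directly, which helps when a chipset misbehaves like the NHR did.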
Nice find, will add that to the cart
That is a shame. The AWUS036H is superior in all respects though, with the exception of "N" support. I'd love to hear if anyone has tried that unit. It really is the best wifi unit on the market, so it would be a shame not to support it.
Hello all, I am trying befinitiv's system, but at the moment without the WIFI in monitor mode.
I am having problems getting the stream displayed on the receiving Raspberry Pi 2.
On the transmitter side (pi B+) I use the command
raspivid -n -t 0 -w 960 -h 720 -fps 30 -b 6000000 -o - | gst-launch-1.0 -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=192.168.192.x port=9000
and I can watch the stream in windows and android (as per Patrick Duffy tutorial and software)
On my Pi 2 (connected to a monitor via HDMI), after this terminal command:
gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! autovideosink sync=false
it stops at :
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
but I can see the ethernet/wifi LEDs flashing heavily on both TX and RX
Thanks a lot
on your pi2 client side the pipeline never receives the UDP stream: fdsrc reads from a file descriptor and has no port property. Since your sender payloads the video as RTP (rtph264pay), the receiver needs udpsrc with matching caps plus a depayloader. Your command should be
gst-launch-1.0 -v udpsrc port=9000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=false
thanks a lot Pritam, will try tomorrow
Please, will any of the professionals on this thread have a look at this?
It seems extremely interesting. This is the link:
"This is for the compute module only and requires two camera modules (obvious I know!). It is not possible to run it on a standard Pi as only one camera interface is exposed. Please don't ask."
Fnoop, I understand that, but what would be the problem in using the compute module? (apart from the money involved)
And thank you for your former reply.
You would also need the IO board for the compute module. The dev kit is $200.
Sorry for my lack of knowledge, but why would the original compute module motherboard not work?