Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...

Basically, it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not just another "I took a Raspberry Pi and transmitted my video over WIFI" project.

The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets. What advantages does this give?

- No association: a receiver always receives data as long as it is in range

- Unidirectional data flow: normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)

- Error tolerance: normal WIFI throws away erroneous frames even though they might still contain usable data. My project uses all the data it receives.

For FPV usage this means:

- No stalling image feeds as with the other WIFI FPV projects

- No risk of disassociation (which equals blindness)

- Graceful degradation of camera image instead of stalling (or worse: disassociation) when you are getting out of range

The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I have also ported the whole stack to Android; if there are bystanders, I just hand them my tablet so they can join the FPV fun :)
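
For anyone wanting to reproduce this, a typical invocation is sketched below. The rx command mirrors the one quoted later in this thread; the raspivid flags and the tx side are illustrative, and the -b/-r/-f (block/FEC/packet-size) values have to match between tx and rx:

```shell
# TX side (Raspberry Pi with camera module, dongle in monitor mode):
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -o - | \
    sudo ./tx -b 8 -r 4 -f 1024 wlan0

# RX side (any Linux machine; on a Pi you would pipe into hello_video
# instead of GStreamer):
sudo ./rx -b 8 -r 4 -f 1024 wlan0 | \
    gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
```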

Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough signal for several more km, but my line of sight was limited to 3 km...

In the end, what does it cost? Not much. You just need:

2x Raspberry A+

2x 8€ wifi dongles

1x Raspberry camera

1x Some kind of cheap display

Happy to hear your thoughts/rebuild reports :)

See you,



Replies to This Discussion

Hello Wolke,

How do you intend to transmit the stream using this E12 board?

The Auvidea E12 also runs Linux. Currently I hope it is possible to create a FIFO-based pipe from the H.264-encoded video data stream directly into netcat or socat. By default the E12 streams via RTSP and also HTTP, but those have drawbacks on the player side, so a netcat pipe into mplayer is probably the simpler and faster choice. Once I get the E12 I will test this.
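
A rough sketch of that FIFO/netcat idea. `e12-encoder` is a placeholder for whatever actually produces the raw H.264 stream on the board (that is the open question here), and netcat flags vary between implementations:

```shell
# E12 side: write the raw H.264 elementary stream into a FIFO and
# serve it over TCP (traditional netcat syntax; OpenBSD nc drops -p).
mkfifo /tmp/h264.fifo
e12-encoder --output /tmp/h264.fifo &   # placeholder command
nc -l -p 5000 < /tmp/h264.fifo

# Player side: pull the stream and pipe it straight into mplayer.
nc <e12-ip> 5000 | mplayer -demuxer h264es -
```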

Hi all!
Thanks to all of you for sharing your experience with this project.
Is it possible to connect a second camera (via USB), use the libusb API to grab its frames, and stream that video with GStreamer in parallel with the first video stream (from the RPi cam), and then wifibroadcast both video streams?


Not sure if you're still following/replying to this thread, but I'm wondering if the pipe between raspivid and TX, or the pipe between RX and hello_video, might contribute to lag or some jitter in the lag. I was just thinking that the stdin/stdout pipes are not very big (4 KB?), so maybe the sender could block and/or there is extra buffering there. You've probably already investigated this, but I just wanted to check.
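
For what it's worth, the default pipe capacity on Linux is 64 KiB rather than 4 KiB (4 KiB is PIPE_BUF, the atomic-write limit), so there is more slack than the 4k figure suggests; a full pipe will still block the writer and add latency, though. A small sketch to check the capacity yourself:

```shell
#!/bin/sh
# Probe how much data a pipe accepts before the writer would block:
# a reader opens the FIFO but never reads, and dd attempts a single
# 64 KiB write. On a default Linux pipe (64 KiB capacity) it succeeds.
fifo=$(mktemp -u)
mkfifo "$fifo"
sleep 2 < "$fifo" &    # reader that opens the FIFO but never reads
dd if=/dev/zero of="$fifo" bs=65536 count=1 2>/dev/null \
    && echo "64 KiB written without blocking"
wait
rm -f "$fifo"
```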

Hi. I looked at the spec sheet for the GW5100 and see that it has some analog inputs. I didn't look too much into it, but I would think that the driver has some user controls that would allow you to change hue/saturation/brightness etc. via v4l2-ctl.
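
If the driver does expose those hooks, checking is quick with v4l2-ctl (assuming the analog input shows up as /dev/video0; the control names depend entirely on the driver):

```shell
# Show which user controls the driver exposes, with ranges and defaults:
v4l2-ctl -d /dev/video0 --list-ctrls

# Set them if present:
v4l2-ctl -d /dev/video0 --set-ctrl=brightness=128
v4l2-ctl -d /dev/video0 --set-ctrl=saturation=64,hue=0
```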

I'm curious, what do you want from the encoder that isn't already provided? Maybe the analog-in driver they're using doesn't have these user hooks, in which case (yay open source) we can always add them ourselves. I'm very interested in a GW5510 for more than just video streaming, but I would have to see how developed their input drivers are (and really the feature list of their input capture device) for HDMI. I saw a price mentioned here of about $180-$190 for the GW5510, in which case I would attempt to buy it so I can mess around with the input stuff. For that price with full industrial parts on it, I think it's perfect for me. Everyone here wants an RPi, but really I think a more specialized board fits the bill *much* better.

Also, I'd run Yocto instead of OpenWrt. I just looked at their Yocto offerings and it seems like it has somewhat better wireless performance anyways.

Things have changed a bit since I posted here. The issue was with the gstreamer plugin. They are using an open source plugin now, which has more features. I have not looked at the new plugin yet though.

The problem with the h264 encoder before was that you could not set parameters like the I-frame frequency and some other parameters I can't recall at the moment. Those are important if you have packet loss and you don't want the whole image to be garbled.
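
On the Raspberry Pi side the equivalent knobs do exist in raspivid: -g sets the I-frame period and -ih inserts the SPS/PPS headers inline, so the decoder can resynchronize after packet loss. The values below are illustrative, and the tx options mirror the rx command quoted later in this thread:

```shell
# An I-frame every 30 frames (one per second at 30 fps), with inline
# SPS/PPS headers so decoding can restart mid-stream after losses:
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -g 30 -ih -o - | \
    sudo ./tx -b 8 -r 4 -f 1024 wlan0
```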

For video stuff you definitely have to go with Yocto. Ubuntu is also OK and probably much easier to use, but you need more flash than the default 256 MB to run it. OpenWrt is for network stuff only; it does not have any multimedia support.

Is it possible to run Wifibroadcast on a Linux machine running inside a VMware Workstation?

I have set up a VM running Ubuntu 14.04.4, updated it with the latest patches ("apt-get update && apt-get upgrade") and installed GStreamer ("apt-get install gstreamer1.0")

The WLAN interface (TP-LINK TL-WN722N) is set to monitor mode:
wlan0 IEEE 802.11bgn Mode:Monitor Frequency:2.472 GHz Tx-Power=20 dBm
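
For reference, the usual way to get a dongle into that state (channel 13 is 2.472 GHz); a generic sketch, not VMware-specific:

```shell
sudo ifconfig wlan0 down
sudo iwconfig wlan0 mode monitor
sudo iwconfig wlan0 channel 13   # 2.472 GHz
sudo ifconfig wlan0 up
```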

But when I run the following command, it always exits with the following error message:
ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.

osboxes@osboxes:~/wifibroadcast$ sudo ./rx -b 8 -r 4 -f 1024 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
DLT_IEEE802_11_RADIO Encap
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)1280, height=(int)720, parsed=(boolean)true, stream-format=(string)avc, alignment=
(string)au, codec_data=(buffer)01640028ffe1000e27640028ac2b402802dd00f1226a01000528ee025cb0
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, width=(int)1280, height=(int)720, parsed=(boolean)true, stream-format=(string)avc, alignment=
(string)au, codec_data=(buffer)01640028ffe1000e27640028ac2b402802dd00f1226a01000528ee025cb0
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1,
interlace-mode=(string)progressive, colorimetry=(string)bt709, framerate=(fraction)25/1
ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstFdSrc:fdsrc0:
streaming task paused, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = NULL
Freeing pipeline ...

The transmitter is working and I can receive the stream on a Raspberry PI loaded with Wifibroadcast RPI FPV image v0.4.

Any ideas?
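
Not a definitive answer, but one plausible culprit: the caps negotiate fine all the way to avdec_h264's source pad, and xvimagesink needs the XVideo extension, which X servers inside VMs often lack; that pattern matches a "not-negotiated" failure. A variant worth trying swaps in videoconvert and autovideosink:

```shell
sudo ./rx -b 8 -r 4 -f 1024 wlan0 | \
    gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false
```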


Has anyone used the Teradek Vidiu with a Linux machine running Ubuntu 14.04? I can't seem to find any drivers for the Vidiu.


Are you still active on this project ?

Yes, the community is growing.

I'm just about to order all the hardware.



© 2019   Created by Chris Anderson.