Hi
Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/
Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles, this is not just another "I took a Raspberry Pi and transmitted my video over WIFI" project.
The difference is that I use the cards in injection mode. This makes it possible to send and receive arbitrary WIFI packets. What advantages does this give?
- No association: a receiver always receives data as long as it is in range
- Unidirectional data flow: normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)
- Error tolerance: normal WIFI throws away erroneous frames even though they may still contain usable data. My project uses all the data it receives.
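For reference, putting a dongle into monitor mode on a stock Linux box usually looks something like the commands below. The interface name wlan0 is an assumption, and the exact steps depend on the driver, so treat this as a sketch rather than the project's official setup:

```shell
sudo ip link set wlan0 down         # the interface must be down to change its type
sudo iw dev wlan0 set type monitor  # raw 802.11 frames, no association needed
sudo ip link set wlan0 up
iw dev wlan0 info                   # should now report "type monitor"
```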
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of the camera image instead of stalling (or worse: disassociation) when you are going out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I have also ported the whole stack to Android. If there are bystanders, I just hand them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3 km without any antenna-tracking equipment. At that distance there was still enough signal for a few more km, but my line of sight was limited to 3 km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
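For orientation, an end-to-end invocation with those parts looks roughly like the following. tx and rx are the project's tools; the exact flags here are assumptions mirrored from an rx example quoted further down in this thread, so check the project page before copying:

```shell
# TX (Raspberry Pi with camera): raw H.264 from raspivid piped into tx
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -o - | sudo ./tx -b 8 -r 4 -f 1024 wlan0

# RX (any Linux box): rx output piped into a GStreamer H.264 decoder
sudo ./rx -b 8 -r 4 -f 1024 wlan0 | gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! autovideosink sync=false
```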
Happy to hear your thoughts/rebuild reports :)
See you,
befinitiv.
Replies
Yes, the community is growing up.
I have just ordered all the hardware.
@befinitiv
Are you still active on this project?
Has anyone used the Teradek Vidiu with a Linux machine running Ubuntu 14.04? I can't seem to find any drivers to run the Vidiu.
Is it possible to run Wifibroadcast on a Linux machine running inside VMware Workstation?
I have set up a VM running Ubuntu 14.04.4, updated it with the latest patches ("apt-get update && apt-get upgrade") and installed GStreamer ("apt-get install gstreamer1.0")
The WLAN interface (TP-LINK TL-WN722N) is set to monitor mode:
wlan0 IEEE 802.11bgn Mode:Monitor Frequency:2.472 GHz Tx-Power=20 dBm
But when I run the following command, it always exits with the following error message:
ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.
osboxes@osboxes:~/wifibroadcast$ sudo ./rx -b 8 -r 4 -f 1024 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
DLT_IEEE802_11_RADIO Encap
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)1280, height=(int)720, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1000e27640028ac2b402802dd00f1226a01000528ee025cb0
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, width=(int)1280, height=(int)720, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01640028ffe1000e27640028ac2b402802dd00f1226a01000528ee025cb0
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, framerate=(fraction)25/1
ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstFdSrc:fdsrc0:
streaming task paused, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = NULL
Freeing pipeline ...
osboxes@osboxes:~/wifibroadcast$
The transmitter is working and I can receive the stream on a Raspberry PI loaded with Wifibroadcast RPI FPV image v0.4.
Any ideas?
Thanks
Ronnie
Befinitiv,
Not sure if you're still following/replying on this thread, but I'm wondering if the pipe between raspivid and the TX, or the pipe between the RX and hello_video, might contribute to lag, or to jitter in the lag. I was just thinking that stdin/stdout pipes are not very big (4k?), so maybe the sender could block, and/or maybe there is extra buffering there too. You've probably already investigated this, but I just wanted to check.
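A side note on the 4k guess: since kernel 2.6.11 a Linux pipe buffers 64 KiB by default, and since 2.6.35 the capacity can be grown with fcntl(F_SETPIPE_SZ) up to a system-wide ceiling, so the pipe itself is unlikely to be a hard 4 KiB bottleneck; a slow consumer can of course still back up the producer once that buffer fills. The ceiling is visible on any Linux box:

```shell
# system-wide upper bound for fcntl(F_SETPIPE_SZ); the kernel default is 1 MiB
cat /proc/sys/fs/pipe-max-size
```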
Hi all!
Thanks to all of you for sharing your experience with this project.
Is it possible to connect a second camera (via USB), use the libusb API (to grab frames) and stream this video with GStreamer in parallel
with the first video stream (from the RPi cam), and then send both video streams over wifibroadcast?
The TP-Link is only installed for testing; I do not use wifibroadcast on my gimbal. Later I will need a bidirectional setup.
The Auvidea E12 also runs Linux. Currently I hope that it is possible to create a FIFO-based pipe to netcat or socat directly from the H.264-encoded video data stream. By default the E12 streams via RTSP and also via HTTP, but those have downsides on the player side. So a netcat pipe to mplayer is maybe the simpler but also faster choice. If I get the E12 I will test this.
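The FIFO idea above can be sketched as follows, with printf/cat standing in for the encoder and the consumer so the mechanics are visible; the netcat/mplayer lines in the comments are untested assumptions about how the real E12 setup might look:

```shell
# Real-world consumers of the FIFO might look like (assumptions, untested):
#   nc -l -p 5000 < /tmp/video.fifo           # serve the FIFO over TCP
#   mplayer -demuxer h264es /tmp/video.fifo   # or play the raw H.264 locally
fifo=$(mktemp -u)
mkfifo "$fifo"
printf 'raw-h264-bytes' > "$fifo" &   # stand-in for the E12's encoded output
cat "$fifo"                           # stand-in for the consumer
rm "$fifo"
```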
Hi again Wolke. So if I understand you correctly, it is only plugged in, not even working as normal WIFI?