3+km HD FPV system using commodity hardware

Hi

Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/

Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WiFi dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WiFi" projects.

The difference is that I use the cards in injection mode. This makes it possible to send and receive arbitrary WiFi packets. What advantages does this give?

- No association: A receiver always receives data as long as it is in range

- Unidirectional data flow: Normal WiFi uses acknowledgement frames and thus requires a two-way communication channel. My project allows an asymmetrical link (-> different antenna types for RX and TX)

- Error tolerance: Normal WiFi throws away erroneous frames even though they might still contain usable data. My project uses every bit of data it receives.
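To illustrate the injection idea: with a card in monitor mode you prepend a radiotap header to an arbitrary 802.11 data frame and write it straight to a raw socket, with no association and no ACKs. This is only a minimal sketch, not the project's actual frame format; the interface name, the MAC addresses, and the empty 8-byte radiotap header are illustrative assumptions.

```python
import socket
import struct

def build_injection_frame(payload: bytes) -> bytes:
    """Build a raw frame: minimal radiotap header + 802.11 data header + payload.
    Addresses and the 'empty' radiotap header are illustrative only."""
    # Minimal radiotap header: version 0, pad 0, total length 8, no present flags
    radiotap = struct.pack("<BBHI", 0, 0, 8, 0)
    # 802.11 data frame header: frame control (type=data), duration, addr1-3, sequence
    fc = struct.pack("<HH", 0x0008, 0)
    addr1 = b"\xff" * 6                      # broadcast receiver address
    addr2 = b"\x13\x22\x33\x44\x55\x66"      # arbitrary transmitter MAC
    addr3 = b"\x13\x22\x33\x44\x55\x66"
    seq = struct.pack("<H", 0)
    return radiotap + fc + addr1 + addr2 + addr3 + seq + payload

def inject(interface: str, frame: bytes) -> None:
    """Send the frame on a monitor-mode interface (Linux only, requires root)."""
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)
    s.bind((interface, 0))
    s.send(frame)
    s.close()

frame = build_injection_frame(b"hello fpv")
# inject("wlan0", frame)   # uncomment on a real monitor-mode interface
```

The receiver side works the same way in reverse: a raw socket on a monitor-mode interface sees every frame in range, whether or not it was addressed to you.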

For FPV usage this means:

- No stalling video feed as with the other WiFi FPV projects

- No risk of disassociation (which equals blindness)

- Graceful degradation of the camera image instead of stalling (or worse: disassociation) as you fly out of range

The project is still in beta but already usable. On both the TX and RX side you can use any Linux machine you like; I use Raspberry Pis on both ends, which works just fine. I have also ported the whole stack to Android. If there are bystanders, I just hand them my tablet so they can join the FPV fun :)

Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough link margin for a few more kilometers, but my line of sight was limited to 3 km...

In the end, what does it cost? Not much. You just need:

2x Raspberry Pi A+

2x 8€ WiFi dongles

1x Raspberry Pi camera

1x Some kind of cheap display

Happy to hear your thoughts/rebuild reports :)

See you,

befinitiv.


Replies

  • Yes, the community is growing.

    I just ordered all the hardware.

  • @befinitiv

    Are you still active on this project?

  • Has anyone used the Teradek Vidiu with a Linux machine running Ubuntu 14.04? I can't seem to find any drivers to run the Vidiu.

  • Is it possible to run Wifibroadcast on a Linux machine running inside VMware Workstation?

    I have set up a VM running Ubuntu 14.04.4, updated it with the latest patches and installed GStreamer ("apt-get install gstreamer1.0")

    The WLAN interface (TP-LINK TL-WN722N) is set to monitor mode:
    wlan0 IEEE 802.11bgn Mode:Monitor Frequency:2.472 GHz Tx-Power=20 dBm

    But when I run the following command, it always exits with the following error message:
    ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.


    osboxes@osboxes:~/wifibroadcast$ sudo ./rx -b 8 -r 4 -f 1024 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false
    DLT_IEEE802_11_RADIO Encap
    Setting pipeline to PAUSED ...
    Pipeline is PREROLLING ...
    /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)1280, height=(int)720, parsed=(boolean)true, stream-format=(string)avc, alignment=
    (string)au, codec_data=(buffer)01640028ffe1000e27640028ac2b402802dd00f1226a01000528ee025cb0
    /GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, width=(int)1280, height=(int)720, parsed=(boolean)true, stream-format=(string)avc, alignment=
    (string)au, codec_data=(buffer)01640028ffe1000e27640028ac2b402802dd00f1226a01000528ee025cb0
    /GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1,
    interlace-mode=(string)progressive, colorimetry=(string)bt709, framerate=(fraction)25/1
    ERROR: from element /GstPipeline:pipeline0/GstFdSrc:fdsrc0: Internal data flow error.
    Additional debug info:
    gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstFdSrc:fdsrc0:
    streaming task paused, reason not-negotiated (-4)
    ERROR: pipeline doesn't want to preroll.
    Setting pipeline to NULL ...
    /GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = NULL
    /GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = NULL
    /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = NULL
    Freeing pipeline ...
    osboxes@osboxes:~/wifibroadcast$

    The transmitter is working and I can receive the stream on a Raspberry Pi loaded with the Wifibroadcast RPI FPV image v0.4.


    Any ideas?

    Thanks
    Ronnie

  • Developer

    Befinitiv,

    Not sure if you're still following/replying on this thread, but I'm wondering if the pipe between raspivid and tx, or the pipe between rx and hello_video, might contribute to lag or some jitter in the lag. I was just thinking that the stdin/stdout pipes are not very big (4k?), so the sender could block and/or there may be buffering there too. You've probably already investigated this but just wanted to check.
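    For what it's worth, on recent Linux kernels the default pipe capacity is typically 64 KB rather than 4 KB, and it can be queried and enlarged via fcntl. A quick sketch (Linux only; the F_GETPIPE_SZ/F_SETPIPE_SZ constants are exposed by Python's fcntl module from 3.10 onward):

    ```python
    import fcntl
    import os

    # Create a plain pipe, just like the shell does for raspivid | tx
    r, w = os.pipe()

    # Ask the kernel for this pipe's buffer size (usually 65536 on modern Linux)
    size = fcntl.fcntl(w, fcntl.F_GETPIPE_SZ)
    print(f"default pipe buffer: {size} bytes")

    # Try to grow it, e.g. to 1 MiB, to reduce the chance of the writer blocking
    new_size = fcntl.fcntl(w, fcntl.F_SETPIPE_SZ, 1024 * 1024)
    print(f"enlarged pipe buffer: {new_size} bytes")

    os.close(r)
    os.close(w)
    ```

    A larger buffer only masks the problem, of course; if the reader consistently falls behind the writer, latency will still build up in the pipe.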

  • Hi all!
    Thanks to all of you for sharing your experience with this project.
    Would it be possible to connect a second camera (via USB), use the libusb API to grab frames, and stream that video via GStreamer in parallel
    with the first video stream (from the RPi camera), and then send both video streams over wifibroadcast?

  • hi paul,
    the TP-Link is only installed for testing. i do not use wifibroadcast on my gimbal; later i will need a bidirectional setup.
    • hi,
      the Auvidea E12 also runs Linux. currently i hope it is possible to create a FIFO-based pipe from the H.264-encoded video data stream directly to netcat or socat. by default the E12 streams via RTSP and also HTTP, but the drawbacks are on the player side. so a netcat pipe to mplayer may be the simpler and faster choice. if i get the E12 i will test this.
    • hi again Wolke, so if I understand you correctly, it is only plugged in, not even working as normal WiFi?

      • yes, it acts as a normal WiFi dongle. no driver needed. i am on kernel 3.4.1.
