3+km HD FPV system using commodity hardware

Hi

Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/

Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although it uses cheap WiFi dongles, this is not one of the many "I took a Raspberry and transmitted my video over WiFi" projects.

The difference is that I use the cards in injection mode, which makes it possible to send and receive arbitrary WiFi packets. What advantages does this give?

- No association: a receiver always receives data as long as it is in range

- Unidirectional data flow: normal WiFi uses acknowledgement frames and thus requires a two-way communication channel. My project allows an asymmetrical link (e.g. different antenna types for RX and TX)

- Error tolerance: normal WiFi throws away erroneous frames even though they could still contain usable data. My project uses all the data it receives.
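To give an idea of what "injection mode" means in practice, here is a minimal sketch of building a raw frame for injection. The header layout (radiotap + IEEE 802.11 data frame) is standard; the dummy addresses and the `mon0` interface name are illustrative assumptions, not the exact framing wifibroadcast uses:

```python
import struct

# Minimal radiotap header: version 0, pad 0, length 8, present flags 0.
# No per-packet options -- the driver fills in rate/power defaults.
RADIOTAP = struct.pack("<BBHI", 0, 0, 8, 0)

def build_frame(payload: bytes, seq: int) -> bytes:
    """Build a raw IEEE 802.11 data frame for injection.

    Addresses are dummy values here; a real system uses fixed
    addresses so the receiver can filter its own traffic.
    """
    fc = struct.pack("<HH", 0x0008, 0)         # frame control: data frame, duration 0
    addr = b"\x01\x02\x03\x04\x05\x06" * 3     # addr1/addr2/addr3 (dummy)
    seq_ctl = struct.pack("<H", (seq << 4) & 0xFFFF)
    return RADIOTAP + fc + addr + seq_ctl + payload

# To actually transmit, open a raw socket bound to a monitor-mode
# interface (requires root and a card whose driver supports injection):
#   import socket
#   s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)
#   s.bind(("mon0", 0))
#   s.send(build_frame(b"video chunk", 0))
```

The key point: because the frame is handed to the card raw, no association, authentication or ACK handshake is involved on either side.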

For FPV usage this means:

- No stalling image feed as with other WiFi FPV projects

- No risk of disassociation (which would mean flying blind)

- Graceful degradation of the camera image instead of stalling (or worse: disassociation) as you fly out of range

The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like; I use Raspberry Pis on both sides, which works just fine. I have also ported the whole stack to Android. If there are bystanders, I just hand them my tablet so they can join the FPV fun :)

Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough power for a few more kilometers, but my line of sight was limited to 3 km...
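For context, a rough link-budget check makes the 3 km figure plausible. The free-space path loss formula is standard; the TX power, antenna gains and receiver sensitivity below are illustrative assumptions, not measured values from my setup:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (standard formula)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Illustrative 2.4 GHz link budget at 3 km:
loss = fspl_db(3.0, 2437.0)       # channel 6: ~110 dB
tx_power_dbm = 20                 # assumed dongle output
antenna_gain_db = 2 + 2           # simple dipoles on both ends
rx_sensitivity_dbm = -90          # assumed for a low WiFi bitrate
margin = tx_power_dbm + antenna_gain_db - loss - rx_sensitivity_dbm
print(f"path loss: {loss:.1f} dB, link margin: {margin:.1f} dB")
```

A few dB of margin remain at 3 km under these assumptions, and since each doubling of distance only costs about 6 dB, "enough power for a few more kilometers" is consistent with the numbers.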

In the end, what does it cost? Not much. You just need:

2x Raspberry A+

2x 8 € WiFi dongles

1x Raspberry camera

1x Some kind of cheap display

Happy to hear your thoughts/rebuild reports :)

See you,

befinitiv.


Replies

  • Does anybody know if it would be possible to use a USB board cam like this, which outputs MJPEG, with the Raspi?

    http://www.aliexpress.com/item/2pieces-1080p-hd-cmos-ov2710-sensor-...

    Description says "Linux with UVC" and it can do the following framerates:

    Frame rates:

    1920 x 1080   MJPEG  30 fps   YUY2   6 fps
    1280 x 1024   MJPEG  30 fps   YUY2   6 fps
    1280 x  720   MJPEG  60 fps   YUY2   9 fps
    1024 x  768   MJPEG  30 fps   YUY2   9 fps
     800 x  600   MJPEG  60 fps   YUY2  21 fps
     640 x  480   MJPEG 120 fps   YUY2  30 fps
     352 x  288   MJPEG 120 fps   YUY2  30 fps
     320 x  240   MJPEG 120 fps   YUY2  30 fps

    Of course, the bandwidth requirements for MJPEG are a _lot_ higher than for H.264, but the available link bandwidth can be increased by using a 40 MHz channel, FEC and more than one WiFi adapter, I guess. Maybe for people who need very low latency and very quick recovery from lost data, that would be an option?

    This page (http://stardot.com/bandwidth-and-storage-calculator) says around 20 Mbit for 720p at 30 fps in medium quality, or 7 Mbit for 704x480. So I guess 640x480 or 800x600 at 60 fps should be possible. Not HD anymore, but I guess still _a lot_ better image quality than analog transmission with PAL or NTSC. And real progressive 60 fps, not interlaced.
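A quick back-of-the-envelope check of those numbers (the bits-per-pixel figure is an assumed typical value for medium-quality MJPEG, not anything from the camera's spec):

```python
def mjpeg_mbit_s(width: int, height: int, fps: int,
                 bits_per_pixel: float = 0.75) -> float:
    """Rough MJPEG bitrate estimate; ~0.75 bpp is a common
    rule-of-thumb for medium-quality JPEG frames."""
    return width * height * fps * bits_per_pixel / 1e6

print(f"1280x720 @ 30: {mjpeg_mbit_s(1280, 720, 30):.1f} Mbit/s")
print(f"800x600  @ 60: {mjpeg_mbit_s(800, 600, 60):.1f} Mbit/s")
print(f"640x480  @ 60: {mjpeg_mbit_s(640, 480, 60):.1f} Mbit/s")
```

With 0.75 bpp this lands at ~20.7 Mbit/s for 720p30, matching the calculator's figure, and ~14-22 Mbit/s for the 60 fps SD modes, i.e. within reach of a decent injection-mode link.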

    • In theory I don't see why not; the Raspberry/OMX decodes JPEG/MJPEG in hardware and you should be able to consume it with v4l2 / gstreamer UVC. It is, as you say, somewhat inefficient, but perhaps if you can get enough bandwidth it will give a more reliable signal.

      You still have the problem of where to mount the camera. A lot of people use FPV to see where a GoPro or other 'proper' camera is pointing when shooting photos/videos. I've tried various ways of sticking the Raspberry camera to a GoPro on a gimbal; none work very well or reliably. The holy grail IMO for this project would be to get GoPro/clone output straight into the Raspberry or a similar SBC, and from there into wifibroadcast. That would allow top-quality, stabilised, realtime footage. I'm currently trying to find an SBC that supports HDMI in, or else a GoPro clone that supports UVC.

      • My dream too, but lacking any tech knowledge or skills it stays just a dream, haha. Keep us in the loop and share the love before going commercial, haha.

      • Hmm, not sure what you mean. My knowledge of Linux video stuff is limited :) I thought of using v4l to get the MJPEG stream out of the cam, then piping it into the wifibroadcast tx application. No decoding or re-encoding done. Then on the receiving Pi, pipe the output of the rx application into some application that can decode and display MJPEG.

        Regarding HDMI in: wouldn't that add even more latency? First the latency of the GoPro HDMI out (80 ms or so, if I remember correctly) and then the latency of the H.264 encoder.

        I think if we want really low latency with H.264, we need an SBC that has low-latency H.264 features (e.g. slices) and also a cam that outputs video without much latency. I have read somewhere that TI DM368 boards are a good candidate for this, but they are not open source and not easily available to non-businesses, I think.

          • My Linux video knowledge is pretty limited too :) But yes, that's essentially what I meant: use v4l/gstreamer on the TX side and pipe it to wifibroadcast tx, then on the RX side pipe the rx output to gstreamer/omx. If you can get away with not using gstreamer on the TX side, all the better.

            For HDMI in, yes, potentially, but I'm hoping not too much latency. The 3DR Solo seems to do essentially this: it takes the HDMI from the GoPro and feeds it into an HDMI input on a custom companion board based on the i.MX6, then sends the video over WiFi to the handheld controller, which contains another SBC that in turn forwards it (possibly with intermediary processing, who knows) over yet another WiFi link to the tablet display. It claims very low latency, pretty much on par with what people are getting with Raspberry Pi camera methods. So I'm really interested in this new board:

           http://www.gateworks.com/product/item/ventana-gw5510-single-board-c...

            It has an HDMI input, which I hope is married to a hardware H.264 encoder. It supports various WiFi/comms options on the mini PCIe socket, so this could be a real winner. HDMI in is difficult to find, although there are a few other options. A decent H.264 encoder shouldn't need to add much latency.

            Alternatively, UVC could be an option, but few GoPro-like cameras support it; or possibly RTSP, but that would involve yet another wireless link between the camera and the Raspberry/SBC.

            Then, whichever method is chosen, you have to perfect the mounting between the camera and the SBC/WiFi without transmitting vibrations. I suspect a lot of the work on the Solo has been about engineering a solid commercial version of the above, and it looks like they've done a great job, just not in an open way (understandably).

            • The GW5510 is a nice board, but there is a catch: the H.264 encoder support for gstreamer on the i.MX6 platform is very limited. It's more like a demo than a real encoder. One would have to write a custom encoder to have any meaningful control over the stream.

            Oh, and you can get 10 RPis for the price of one GW5510.

              • How is it limited? The GW5514 is quad core and each core supports NEON, so maybe encoding through x264 is possible. How much are they? I've emailed for prices. If they're $350+ I don't see anyone buying them; surely they can't shoot themselves in the foot that badly.

            • By the way, I tested the latency of the Solo video link and it was 130 ms, so it's about the same as what I see with wifibroadcast (115-170 ms for me), but it's more regular. befinitiv has investigated the latency and I think he has some improvements for it, although I haven't looked for or tested them yet.

