Hi

Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...

Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles this is not one of the many "I took a raspberry and transmitted my video over WIFI"-projects.

The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets (a minimal monitor-mode setup sketch follows the list below). What advantages does this give?

- No association: A receiver always receives data as long as it is in range

- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)

- Error tolerant: Normal WIFI throws away erroneous frames even though they might still contain usable data. My project uses every bit of data it receives.
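For reference, a minimal monitor-mode setup on the TX/RX machines looks roughly like this. This is only a sketch: wlan0 and channel 13 are assumptions, and the exact steps depend on your dongle/driver.

sudo ip link set wlan0 down
sudo iw dev wlan0 set type monitor    # raw injection/reception needs monitor mode
sudo ip link set wlan0 up
sudo iw dev wlan0 set channel 13      # TX and RX must sit on the same channel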

For FPV usage this means:

- No stalling image feed as with other WIFI FPV projects

- No risk of disassociation (which equals blindness)

- Graceful degradation of the camera image instead of stalling (or worse: disassociation) as you go out of range

The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides and that works just fine. I also ported the whole stack to Android. If I have bystanders, I just hand them my tablet so they can join the FPV fun :)
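To give an idea of how the pieces fit together, the pipelines look roughly like this. This is a sketch only: the exact tx/rx options may differ from the current code, and wlan0 is an assumption.

# TX side (Raspberry with camera): pipe the H264 stream from raspivid into the tx tool
raspivid -n -w 1280 -h 720 -fps 30 -b 4000000 -t 0 -o - | sudo ./tx wlan0

# RX side: pipe whatever the rx tool receives into a player
sudo ./rx wlan0 | mplayer -fps 30 -cache 1024 -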

Using this system I was able to achieve a range of 3km without any antenna tracking. At that distance there was still enough power for a few more km, but my line of sight was limited to 3km...

In the end, what does it cost? Not much. You just need:

2x Raspberry A+

2x 8€ wifi dongles

1x Raspberry camera

1x Some kind of cheap display

Happy to hear your thoughts/rebuild reports :)

See you,

befinitiv.


Replies to This Discussion

I think it depends on how you want to use it.

Right now it uses inter-process communication; with the tun/tap solution it would be network communication, which is more versatile.

The idea materialized when I wasn't sure whether befinitiv would implement FEC inside wifibroadcast, because it would allow people to use udpcast or whatever other network application they like. One could also make a transparent wireless bridge with this and connect an ethernet cam or any other ethernet device to it.
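To illustrate the bridge idea (this is not an existing wifibroadcast feature): one would create a tap device, bridge it with the ethernet port the camera hangs off, and have a small helper shuttle raw frames between the tap device and the tx/rx pipes. Only the tap/bridge plumbing is shown below; the helper program is hypothetical.

sudo ip tuntap add dev tap0 mode tap   # virtual ethernet interface
sudo ip link add name br0 type bridge
sudo ip link set tap0 master br0
sudo ip link set eth0 master br0       # eth0 = port the ethernet cam is plugged into (assumption)
sudo ip link set tap0 up
sudo ip link set br0 up
# a small (hypothetical) helper would then copy frames between tap0 and the tx/rx pipes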

For me personally, the approach with FEC inside wifibroadcast is fine as I don't need any other functionality and it is simpler.

Oh cool, I must have missed that. How did you do it, patched the tx application?

Here are some interesting links about how to change the medium access strategy with the ath9k_htc driver. A patched driver is already included which allows changing these things nicely via debugfs.

https://github.com/vanhoefm/modwifi

http://people.cs.kuleuven.be/~mathy.vanhoef/papers/acsac2014.pdf

http://ath9k-devel.ath9k.narkive.com/2mGfRO52/disabling-physical-sp...

Looks like it just needs to be compiled and then virtual and physical carrier sensing can be disabled completely with:

echo 1 > /sys/kernel/debug/ieee80211/phy*/ath9k_htc/registers/force_channel_idle
echo 1 > /sys/kernel/debug/ieee80211/phy*/ath9k_htc/registers/ignore_virt_cs

If carrier sensing is disabled, the problem of latency building up from packets queued on the TX side when the medium is congested will probably disappear. In general the traffic will also be less "jittery", which should give a smoother stream. To take it even further, one could also alter the interframe and backoff timings to get more usable bandwidth.

Keep in mind that when doing this you are changing the way WiFi is designed in a way that is very unfair to other users of the spectrum. Basically, the card will occupy the channel almost all the time without leaving time for other users to transmit frames, similar to an analog video transmitter (which is the reason why legal TX power limits are usually lower for analog devices, so you might want to decrease TX power).
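If the driver honours it, the transmit power can be capped with iw. The 5 dBm value below is only an example; pick whatever keeps you within your local regulations.

# iw takes the power in mBm, i.e. hundredths of a dBm (500 mBm = 5 dBm)
sudo iw dev wlan0 set txpower fixed 500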

Does anybody know whether it would be possible to use a USB board cam like this, which outputs MJPEG, with the Raspi?

http://www.aliexpress.com/item/2pieces-1080p-hd-cmos-ov2710-sensor-...

Description says "Linux with UVC" and it can do the following framerates:

Resolution      MJPEG      YUY2
1920 x 1080     30 fps     6 fps
1280 x 1024     30 fps     6 fps
1280 x  720     60 fps     9 fps
1024 x  768     30 fps     9 fps
 800 x  600     60 fps     21 fps
 640 x  480     120 fps    30 fps
 352 x  288     120 fps    30 fps
 320 x  240     120 fps    30 fps

Of course, the bandwidth requirements for MJPEG are a _lot_ higher than for H264, but the available bandwidth can be increased by using 40MHz channel bandwidth, FEC and more than one wifi adapter, I guess. Maybe for people who need very low latency and very quick recovery from lost data, that would be an option?

This page (http://stardot.com/bandwidth-and-storage-calculator) says around 20Mbit for 720p at 30fps medium quality, or 7Mbit for 704x480. So I guess 640x480 or 800x600 at 60fps should be possible. Not HD anymore, but I guess still _a lot_ better image quality than analog transmission with PAL or NTSC. And real 60fps, not interlaced 60fps.

In theory I don't see why not; the Raspberry/OMX decodes JPEG/MJPEG in hardware and you should be able to consume it with v4l2 / gstreamer UVC. It is, as you say, somewhat inefficient, but perhaps if you can get enough bandwidth it will give a more reliable signal.

You still have the problem of where to mount the camera. A lot of people use FPV to see where a gopro or other 'proper' camera is pointing when doing photography/videos. I've tried various ways of sticking the raspberry camera to a gopro on a gimbal; none work very well or reliably. The holy grail IMO for this project would be to get a gopro/clone output straight into the raspberry or a similar SBC and from there into wifibroadcast. That would allow top quality, stabilised, realtime footage. I'm currently trying to find an SBC that supports HDMI in, or else a gopro clone that supports UVC.

Hmm, not sure what you mean. My knowledge about linux video stuff is limited :) I thought of using v4l to get the MJPEG stream out of the cam and then piping it into the wifibroadcast tx application. No decoding or re-encoding done. Then, on the receiving Pi, pipe the output of the rx application into some application that can decode and display MJPEG.
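Roughly like this (the GStreamer element names and /dev/video0 are assumptions on my side, and the tx/rx calls are only sketched):

# TX: grab MJPEG straight from the UVC cam, no re-encoding, and pipe it into tx
gst-launch-1.0 -q v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=60/1 ! fdsink fd=1 | sudo ./tx wlan0

# RX: decode and display the stream (on a Pi one would probably swap jpegdec for a hardware decoder)
sudo ./rx wlan0 | gst-launch-1.0 fdsrc fd=0 ! jpegparse ! jpegdec ! videoconvert ! autovideosink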

Regarding HDMI in: Wouldn't that add even more latency? First the latency from the Gopro HDMI out (80ms or something if I remember correctly) and then the latency from the h264 encoder.

I think if we want really low latency with h264, we need an SBC that has h264 low-latency features (e.g. slices) and also a cam that outputs video without much latency. I have read somewhere that TI DM368 boards are a good candidate for this. But they are not open source and not easily available for non-businesses, I think.

My dream, but lacking any tech knowledge or skills means it's just a dream haha. Keep us in the loop and share the love before going commercial haha

@Nicolas

DM368 dev kits are as readily available as the RPi --> http://www.mouser.com/new/leopardimaging/leopardimaging368/

There would be no problem implementing wifibroadcast on a DM36x. There is no open source hardware video encoder on the market and I don't think there will be one, at least not in the near future.

@Fnoop Dogg

I think the RPi camera is good enough. If you mount good quality optics to the sensor it should be at least as good as a gopro 2.

If I made a GoPro-style camera with quality optics based on the Raspberry, would you buy it? I could even integrate the wifi module into the case.

Yeah, there are different adapters available to mount quality lenses. Some people even managed to mount really big zoom lenses to it: http://www.truetex.com/raspberry_pi_13.jpg

http://www.truetex.com/raspberrypi

Yep, I just changed the main loop (amongst other things) to write to two cards. I've seen it work: at range (through a few walls in my house), one channel is clearer than the other and thus continues to give me a clean picture. :)
