Hi

Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...

Basically, it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WiFi dongles, this is not one of the many "I took a Raspberry Pi and transmitted my video over WiFi" projects.

The difference is that I use the cards in injection mode. This makes it possible to send and receive arbitrary WiFi packets (a minimal code sketch of the idea follows the list below). What advantages does this give?

- No association: A receiver always receives data as long as it is in range

- Unidirectional data flow: Normal WiFi uses acknowledgement frames and therefore requires a two-way communication channel. My project makes an asymmetrical link possible (i.e. different antenna types for RX and TX)

- Error tolerant: Normal WiFi throws away erroneous frames even though they could contain usable data. My project uses every bit of data it receives.
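To make the injection idea concrete, here is the minimal sketch mentioned above: injecting a raw 802.11 frame through a monitor-mode interface with libpcap, which is roughly the mechanism such a transmitter builds on. The interface name, addresses and payload are placeholders, and the card must already be switched into monitor mode (e.g. via iw/iwconfig):

    /* Minimal sketch: inject a raw 802.11 data frame through a
     * monitor-mode interface with libpcap. "wlan0", the MAC addresses
     * and the payload are placeholders. */
    #include <pcap.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char errbuf[PCAP_ERRBUF_SIZE];
        pcap_t *p = pcap_open_live("wlan0", 800, 1, 20, errbuf);
        if (!p) { fprintf(stderr, "%s\n", errbuf); return 1; }

        /* minimal 8-byte radiotap header: version 0, length 8, no fields */
        const uint8_t radiotap[8] = {0x00, 0x00, 0x08, 0x00,
                                     0x00, 0x00, 0x00, 0x00};
        /* 802.11 data frame header with placeholder addresses */
        const uint8_t dot11[24] = {
            0x08, 0x00,                               /* frame control: data */
            0x00, 0x00,                               /* duration */
            0x13, 0x22, 0x33, 0x44, 0x55, 0x66,       /* addr1 */
            0x13, 0x22, 0x33, 0x44, 0x55, 0x66,       /* addr2 */
            0x13, 0x22, 0x33, 0x44, 0x55, 0x66,       /* addr3 */
            0x00, 0x00                                /* sequence control */
        };
        const char payload[] = "video bytes go here";

        uint8_t frame[sizeof radiotap + sizeof dot11 + sizeof payload];
        memcpy(frame, radiotap, sizeof radiotap);
        memcpy(frame + sizeof radiotap, dot11, sizeof dot11);
        memcpy(frame + sizeof radiotap + sizeof dot11, payload, sizeof payload);

        if (pcap_inject(p, frame, sizeof frame) == -1)
            pcap_perror(p, "pcap_inject");
        pcap_close(p);
        return 0;
    }

Since no association takes place, any receiver in range that captures these frames in monitor mode can pick up the payload.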

For FPV usage this means:

- No stalling image feed as with the other WiFi FPV projects

- No risk of disassociation (which amounts to blindness)

- Graceful degradation of the camera image instead of stalling (or worse, disassociation) when you get out of range

The project is still beta but already usable. On both the TX and RX side you can use any Linux machine you like; I use Raspberry Pis on both ends, which works just fine. I have also ported the whole stack to Android. If there are bystanders, I just hand them my tablet so they can join the FPV fun :)

Using this system I was able to achieve a range of 3 km without any antenna-tracking gear. At that distance there was still enough power left for a few more kilometers, but my line of sight was limited to 3 km...

In the end, what does it cost? Not much. You just need:

2x Raspberry Pi A+

2x 8 € WiFi dongles

1x Raspberry Pi camera

1x Some kind of cheap display

Happy to hear your thoughts/rebuild reports :)

See you,

befinitiv.


Replies to This Discussion

Good, if it's at least 720p with a decent bitrate.

Hi Andreas, I got similar results to you, between 50-100m. I think these dongles are junk for tx. I asked CSL several times what power these dongles are capable of, but they couldn't answer, not even vaguely. I suspect they just package/badge a generic Chinese dongle and don't really know anything about it. The driver disk that comes with it just contains files downloaded from the Ralink website.

I'm intending to use these as RX only, with an Alfa AWUS051NH V2 as TX. I'm waiting for it to be shipped over here from the UK, so I should have some preliminary results by Tuesday.

It would be really nice if this technique could be ported to something like an ESP8266 module with a decent frontend added for good range.

Amen!

Very nicely put-together quad. Mine is a complete random mess in comparison!

Quick update: I have no time for flying at the moment but did one quick flight tonight. I flew the same 1.5 km mission as before but increased the altitude to 100 m to keep it over the hill/horizon. Interestingly, the helical antenna reached out to about 650 m, much better than I expected. The patch antenna was still going strong at 1.5 km when the auto mission turned back home. It was getting some drops from about 1.2 km but recovered immediately, and I was still getting perfectly strong video at 1.5 km.

This is still at the 18 Mbit/s rate, 720p. I'll take the bitrates and framerates higher and lower when I have time and see what the ranges are like for the different configurations.

One thing I've noticed is that the video tx sometimes stopped. Restarting the rx doesn't make any difference because it's the tx side that has stopped, and unlike with traditional associated/ad-hoc WiFi you can't connect to the Raspberry over SSH to restart it, so there's nothing you can do until you can physically reset the Raspberry, which is a pain. It seems to be a bug in the kernel dongle driver, so it's probably worth trying updated drivers.
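One possible mitigation, sketched below on the assumption that the stall shows up as the tx process dying: a small supervisor that restarts the pipeline whenever it exits. The command line is a placeholder, not the project's actual invocation, and if the driver hangs while the process stays alive you would additionally need a traffic watchdog on top:

    /* Hedged sketch of a tx-side watchdog: restart the capture/transmit
     * pipeline whenever it exits, so a dead transmitter recovers without
     * physical access. The pipeline command is a placeholder. */
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        for (;;) {
            pid_t pid = fork();
            if (pid == 0) {
                /* placeholder pipeline: adjust paths/arguments to your setup */
                execl("/bin/sh", "sh", "-c",
                      "raspivid -t 0 -o - | /home/pi/wifibroadcast/tx wlan0",
                      (char *)NULL);
                _exit(127);               /* exec failed */
            }
            if (pid > 0)
                waitpid(pid, NULL, 0);    /* block until the pipeline dies */
            sleep(1);                     /* brief backoff before restarting */
        }
    }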

Also, when running the rx in software diversity mode, if one interface disappears (e.g. if it accidentally gets unplugged), the rx program bombs out. It would be great if it could carry on with a single interface.
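For what it's worth, a sketch of that suggested behavior: a receive loop that drops a failed interface from its poll set and carries on instead of exiting. The names (rx_loop, fds, n_ifaces) are illustrative, not taken from the actual rx source:

    /* Sketch: keep receiving on the remaining interfaces when one
     * vanishes, instead of aborting the whole rx program. */
    #include <poll.h>
    #include <stdio.h>
    #include <unistd.h>

    void rx_loop(struct pollfd *fds, int n_ifaces) {
        while (n_ifaces > 0) {
            if (poll(fds, n_ifaces, -1) < 0)
                continue;                       /* e.g. EINTR: retry */
            for (int i = 0; i < n_ifaces; i++) {
                if (fds[i].revents & (POLLERR | POLLHUP | POLLNVAL)) {
                    /* interface gone: close it and compact the array */
                    close(fds[i].fd);
                    fds[i] = fds[n_ifaces - 1];
                    n_ifaces--;
                    fprintf(stderr, "interface lost, %d left\n", n_ifaces);
                    i--;                        /* re-examine the swapped-in entry */
                    continue;
                }
                if (fds[i].revents & POLLIN) {
                    /* read and process one packet from fds[i].fd here */
                }
            }
        }
    }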

1280x720 @ 700 kbps in the case of 2 fps, 3 Mbps otherwise.

I've found where in the pipeline the last frame gets stuck. Raspivid uses fwrite() to output the H.264 data to stdout. Since fwrite() is buffered, not all data received from the encoder is written out immediately. I therefore modified raspivid to use a plain write() call to get rid of the buffering.
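The change amounts to something like the following; buffer->data and buffer->length stand in for the fields of the encoder output buffer that raspivid hands to its callback, and a loop is needed because write() may return short counts:

    /* The replacement in a nutshell: an unbuffered write loop instead
     * of fwrite(), so each encoder buffer leaves the process immediately. */
    #include <errno.h>
    #include <stdint.h>
    #include <unistd.h>

    int write_all(int fd, const uint8_t *data, size_t len) {
        while (len > 0) {
            ssize_t n = write(fd, data, len);
            if (n < 0) {
                if (errno == EINTR)
                    continue;             /* interrupted: retry */
                return -1;
            }
            data += n;
            len  -= (size_t)n;
        }
        return 0;
    }

    /* before: fwrite(buffer->data, 1, buffer->length, stdout);        */
    /* after:  write_all(STDOUT_FILENO, buffer->data, buffer->length); */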

Another issue is the conversion of the byte stream into NALUs. AFAIK the only way to determine the length of the current NALU is to wait for the start header (0x00000001) of the next NALU, but by then you are already one frame too late. Apparently raspivid (with my change from fwrite() to write()) always outputs complete NALUs in a single write. I therefore made a hack that appends the NALU start header at the end of the current write call and removes the start header at the beginning of the next write (and so on).
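A sketch of that start-code shuffle, reusing write_all() from the previous snippet and assuming (as observed above) that each write carries exactly one complete NALU prefixed with a 4-byte Annex-B start code:

    /* Move the start code from the front of each NALU to its tail, so
     * the downstream parser can finish the NALU without waiting for the
     * next frame. */
    #include <stdint.h>
    #include <string.h>

    int write_all(int fd, const uint8_t *data, size_t len); /* see above */

    static const uint8_t start_code[4] = {0x00, 0x00, 0x00, 0x01};

    static void emit_nalu(int fd, const uint8_t *data, size_t len) {
        if (len >= 4 && memcmp(data, start_code, 4) == 0) {
            data += 4;                    /* strip the leading start code */
            len  -= 4;
        }
        write_all(fd, data, len);         /* NALU payload first ... */
        write_all(fd, start_code, 4);     /* ... then the next NALU's start
                                             code, sent one frame early so
                                             the reader can close this NALU */
    }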

My changes allowed me to repeat the encoding latency tests without any frame getting stuck in the pipeline. This time I worked directly on the Raspi with the camera to really measure just the encoding latency (without transmission and decoding). I attached a screen to the Raspi with a running timestamp on it. In parallel I captured video data from my modified raspivid and timestamped each NALU. With 1280x720, 700 kbps, 2 fps, and every second frame a keyframe, I measured the encode latency to be somewhere between 70 and 80 ms. This of course also includes the latency of my screen, so the actual latency should be a few ms below that.

While my hacks helped to reduce latency a bit (roughly by the duration of one frame, which at 48 fps is about 20 ms), it wasn't a big step forward. But at least now we have an understanding of which part of the system causes what latency.

cool, great work!

Using the same technique I also timestamped capture and encode. This was just a quick hack, so no guarantees of correctness. But:

Encoding a 1280x720 frame from memory to memory takes 10 ms (hello_encode)

Capturing a frame using raspistillYUV takes ~150 ms

This doesn't quite add up, and I don't have enough insight into the raspicam modules to explain the discrepancy. However, it is very interesting to see that most likely the frame capture is what takes most of the time, not the encoding.

Have you taken a look at the rpicamsrc project? --> https://github.com/thaytan/gst-rpicamsrc

It offers the same functionality as raspivid but can be used with GStreamer without going through a pipe. Some people say that using pipes also adds latency because pipes have buffers.

I'm using it, but I have not compared it to raspivid in terms of latency.
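As a rough illustration (also untested for latency), a pipeline along these lines could replace the raspivid-to-stdout pipe by building everything in-process; the properties and caps are illustrative and may differ between rpicamsrc versions:

    /* Rough sketch, assuming rpicamsrc is installed: build the capture
     * pipeline in-process with GStreamer instead of piping raspivid's
     * stdout. fdsink fd=1 still emits the H.264 stream on stdout. */
    #include <gst/gst.h>

    int main(int argc, char *argv[]) {
        gst_init(&argc, &argv);
        GError *err = NULL;
        GstElement *pipe = gst_parse_launch(
            "rpicamsrc bitrate=3000000 preview=false ! "
            "video/x-h264,width=1280,height=720,framerate=30/1 ! "
            "h264parse ! fdsink fd=1", &err);
        if (!pipe) {
            g_printerr("pipeline error: %s\n", err->message);
            return 1;
        }
        gst_element_set_state(pipe, GST_STATE_PLAYING);

        /* run until an error or end-of-stream message arrives */
        GstBus *bus = gst_element_get_bus(pipe);
        GstMessage *msg = gst_bus_timed_pop_filtered(
            bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
        if (msg)
            gst_message_unref(msg);
        gst_element_set_state(pipe, GST_STATE_NULL);
        gst_object_unref(bus);
        gst_object_unref(pipe);
        return 0;
    }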

raspistill is known to be slow. The reason is that it starts up the camera and performs some mandatory tasks just to capture one frame, most of which are sensor calibration, such as exposure. Comparing frame latency between raspistill and raspivid is not very fair.

 
