Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...
Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles this is not one of the many "I took a raspberry and transmitted my video over WIFI"-projects.
The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets. What advantages does this give?
- No association: A receiver always receives data as long as it is in range
- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)
- Error tolerant: Normal WIFI throws away erroneous frames even though they could still contain usable data. My project uses all the data it receives.
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of camera image instead of stalling (or worse: disassociation) when you are getting out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders I just hand them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3km without any antenna tracking stuff. At that distance there was still enough signal for a few more km, but my line of sight was limited to 3km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
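For reference, the basic video pipeline looks roughly like this. This is only a sketch: the exact `tx`/`rx` flags (block/FEC/packet-size parameters) and the player path are assumptions based on befinitiv's published examples and may differ between wifibroadcast versions.

```shell
# TX side (air Pi): pipe the camera's raw H.264 stream into the injector.
# The -b/-r/-f values shown here are assumptions; check your version's README.
raspivid -ih -t 0 -w 1280 -h 720 -fps 30 -b 4000000 -n -g 60 -o - \
  | sudo ./tx -b 8 -r 4 -f 1024 wlan0

# RX side (ground Pi): receive on the monitor-mode interface and feed
# the H.264 stream into a hardware-accelerated player.
sudo ./rx -b 8 -r 4 -f 1024 wlan0 \
  | /opt/vc/src/hello_pi/hello_video/hello_video.bin
```

Both dongles must be in monitor mode on the same channel before starting; neither side ever associates with the other.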
Happy to hear your thoughts/rebuild reports :)
Take a look at the "hello_font" program. It should be straightforward to adapt this for an OSD (running on the ground station).
Things to do (on my plan when I have some spare time):
- Transmit telemetry over second port via wifibroadcast to the ground
- Receive telemetry via wifibroadcast and pipe the data into a modified hello_font program to display information.
This should be much better than the "analog" way of drawing into the video:
- Video stays clean
- Data can be recorded and is machine readable
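The telemetry plan above could be wired up along these lines. This is a hedged sketch: the `-p` port flag, the serial device path, and the `./osd` program name are all assumptions (the OSD would be the modified hello_font mentioned above), so adjust to your wifibroadcast version.

```shell
# Air side: forward the FrSky telemetry stream from the flight
# controller's serial port over a second wifibroadcast port.
# (-p selects the port; device path and parameters are assumptions.)
sudo ./tx -p 1 -b 1 -r 1 -f 64 wlan0 < /dev/ttyAMA0

# Ground side: receive port 1 and pipe the bytes into the OSD program
# ("osd" is a hypothetical name for the modified hello_font) so the
# data is drawn as an overlay instead of being burned into the video.
sudo ./rx -p 1 -b 1 -r 1 -f 64 wlan0 | ./osd
```

Because the telemetry travels on its own port, the video stream stays untouched and the raw telemetry bytes can also be tee'd to a file for machine-readable logging.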
Works as expected: https://youtu.be/EagDJrwleHQ :)
This one transmits the frsky telemetry from the naze32 over wifibroadcast. The battery voltage is displayed on the ground station PI as an overlay onto the video.
I think once a few of us have flown an aircraft using this digital video link, this should go on the wiki. This was the last missing part of the now-perfect FPV/autonomous aircraft. It needs to be on the wiki fast. A lot of us are not into Linux programming, so a wiki becomes essential. (wish it was in Arduino :)
Just for information, on this page I found this:
Toshiba HDMI to MIPI CSI2 bridge chip TC358743XBG
Could this be used for HDMI input, in theory?
Come on dude, read the previous posts before posting. This matter has already been discussed.
This chip will work if someone writes new firmware for the RPi.
I was off grid for 2 days and didn't realize that the matter had been closed in that way.
I'm wondering whether RPi 2 is capable of running Mission Planner.
Meanwhile I stumbled upon this GCS --> https://github.com/multigcs/multigcs
The A+ model is quite suitable for UAVs as it's quite a bit smaller than the pi2/B+ and draws less power. It has the same videocore which does most of the heavy lifting, and from what little I've seen so far the tx/rx code from befinitiv doesn't consume much CPU, so it should work OK. I have one lying around so I'll give it a go when I have a moment.
It runs APMPlanner2, if that helps. I always had very unsatisfactory results running MP under wine, even on a reasonably powerful desktop.
Missed this, apologies. I only tried them in 2.4G but they worked fine. I did pick up the adapters befinitiv recommended since he'd done the work on patching their kernels and I wanted to minimise differences from the base platform.
A couple of people have measured the latency of their system very accurately, so to save some people some effort I'd like to show the procedure that Jaime used to measure the "lens to screen" latency on his system (which is actually using regular point-to-point wifi, but the same procedure works for any system).
What you do is:
1. on the ground station computer (RPi, Ubuntu, Mac, etc) pull up estopwatch.net and click the start button
2. point the vehicle camera at the ground station computer's screen. This should produce an endless image-within-image tunnel.
3. do a screen capture on the ground station computer
4. the latency is the time difference between any two consecutive stopwatch readings in the tunnel. So in the image below it's 52.885 - 52.707 = 0.178 seconds (or 178ms).