Hi
Over the last couple of months I have been working on a project that might be of interest to you: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/
Basically it is a digital transmission of video data that mimics the (advantageous) properties of an analog link. Although I use cheap WIFI dongles this is not one of the many "I took a raspberry and transmitted my video over WIFI"-projects.
The difference is that I use the cards in injection mode. This allows sending and receiving arbitrary WIFI packets. What advantages does this give?
- No association: A receiver always receives data as long as it is in range
- Unidirectional data flow: Normal WIFI uses acknowledgement frames and thus requires a two-way communication channel. My project makes an asymmetrical link possible (-> different antenna types for RX and TX)
- Error tolerance: Normal WIFI throws away erroneous frames even though they could have contained usable data. My project uses all the data it receives.
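To make the injection-mode idea concrete, here is a rough Python sketch of what an injected frame looks like on the wire: a minimal radiotap header in front of a hand-built 802.11 data frame. This is illustrative only, not wifibroadcast's actual code; the addresses and payload are arbitrary placeholders, and actually transmitting the frame would need a monitor-mode interface plus a raw socket or libpcap, which are omitted here.

```python
import struct

def build_injection_frame(payload: bytes, seq: int = 0) -> bytes:
    """Build a raw 802.11 data frame behind a minimal radiotap header,
    the kind of buffer you would hand to a monitor-mode interface."""
    # Minimal radiotap header: version 0, pad 0, length 8, empty present bitmap
    radiotap = struct.pack("<BBHI", 0, 0, 8, 0)
    # Frame control: protocol 0, type 2 (data), subtype 0, no flags
    frame_control = struct.pack("<H", 0x0008)
    duration = struct.pack("<H", 0)
    # Addresses are placeholders; without association the receiver just
    # filters on whatever addresses the sender chose to tag the stream with
    addr1 = b"\x01\x02\x03\x04\x05\x06"  # "receiver" address
    addr2 = b"\x13\x22\x33\x44\x55\x66"  # "transmitter" address
    addr3 = b"\x13\x22\x33\x44\x55\x66"
    # Sequence number lives in bits 4..15 of the sequence-control field
    seq_ctrl = struct.pack("<H", (seq & 0x0FFF) << 4)
    return (radiotap + frame_control + duration +
            addr1 + addr2 + addr3 + seq_ctrl + payload)

frame = build_injection_frame(b"video chunk 0", seq=1)
```

Because no acknowledgement ever comes back, the TX side just keeps emitting frames like this and the RX side takes whatever arrives, which is exactly the asymmetry described above.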
For FPV usage this means:
- No stalling image feeds as with the other WIFI FPV projects
- No risk of disassociation (which equals blindness)
- Graceful degradation of camera image instead of stalling (or worse: disassociation) when you are getting out of range
The project is still beta but already usable. On the TX and RX side you can use any Linux machine you like. I use Raspberry Pis on both sides, which works just fine. I also ported the whole stack to Android. If I have bystanders I just give them my tablet so they can join the FPV fun :)
Using this system I was able to achieve a range of 3 km without any antenna tracking. At that distance there was still enough power for a few more km, but my line of sight was limited to 3 km...
In the end, what does it cost? Not much. You just need:
2x Raspberry A+
2x 8€ wifi dongles
1x Raspberry camera
1x Some kind of cheap display
Happy to hear your thoughts/rebuild reports :)
See you,
befinitiv.
Replies
Wolke, sorry to bother you, what board is it? Odroid c1 or raspi?
@Wolke: I'm expecting a Blackmagic Micro Cinema camera soon and I also want to try the B101 with it.
How is the B101 working for you? Maybe you could test it with a GoPro...
Your gimbal looks nice, do you plan to make the 3D files available so others can test it?
I've sent you a PM, please check.
Currently I am not testing the B101 much more; I am still waiting a little bit. It looks like the software (firmware and maybe kernel driver) for the B101 (Toshiba TC358743XBG bridge) is currently under development.
https://www.raspberrypi.org/forums/viewtopic.php?f=38&t=120702
I have no GoPro here, so the second choice for me is the E12 board:
http://www.auvidea.eu/index.php/theme-styles/2013-06-22-21-23-36/e12 from Auvidea.
I think this little device keeps things simple. I will order one next week to test it with my GH4.
My Blackmagic Micro Cinema gimbal is still under development, and the design (3D models) is still in progress. I need the camera to finish a release candidate!
But once it is finished it will get a completely separate thread here. All parts will be under a CC license and available for everyone. I think not only the nylon-based gimbal mechanics are interesting: the control software (gimbal side, ground-station side, and also how the gimbal is controlled) can become a very interesting project and an alternative to some commercial products. So once everything is together, including some parts of the software which I wrote, and the license questions are solved, I will start the thread here. It will also get its own website (blog) and GitHub repository to work together with other interested people.
The focus for this gimbal is professional filming. This means you cannot control all gimbal and camera functions as the pilot alone. The main design here is to operate the gimbal via a camera operator and also, if available, a focus puller. This means the gimbal is only fully operable with a minimum of two people.
But in Germany, where I live, you have to fly in a two-person setup anyway to follow the flight rules for commercially used UAS. The pilot must ensure that he can control the UAS at every moment of the flight visually and at least in manual mode (if something goes wrong). GPS-based auto modes are a grey zone: if I understand it correctly, the pilot must ensure he can interrupt GPS-based auto modes at any time, and he must also ensure he never loses visual (eye) contact with the UAS. So complex camera operations which need a lot of attention are simply not possible (allowed) when operated by the pilot alone.
/g
wolke
Hello Wolke,
How do you intend to transmit the stream using this E12 board?
Hi Wolke, sorry I cannot help you with your predicaments, but would you share how you installed drivers for the TP-Link USB dongle?
Many thanks in advance
This board may be of interest
B101 HDMI to CSI-2 bridge for Raspberry Pi is shipping now
Yes, I saw that as well. I wonder if it adds much latency. It seems it would allow you to attach a number of consumer cameras to an RPi (even a GoPro).
hi,
I have had one for a few days now and tested it with a netcat pipe, with and without a FIFO file pipe. The latency depends on your network quality and your software (I use mplayer on the RX side); it is the same as with an attached Raspicam module at 1080p 30fps.
The downside: the camera I tried to attach to this module, my test camera (Lumix GH4), does not support 1080p at 30fps, which is at the moment the only supported HDMI input mode.
Curiously, if I connect an ODROID-C1 via HDMI at 1080p 24Hz I get an image. See the screenshot: in the background you see mplayer with the ODROID desktop; in the foreground are my terminals to the Raspberry Pi, the ODROID HDMI setup configuration, and the local mplayer terminal.
The second image shows the setup. The third image is my desktop with the GH4 connected at 1080p 25Hz. Here you see that only the top of the image is filled with colorful stripes; the lower part of the image is still the remains of the connection to the ODROID-C1 desktop, which was not cleared.
The last image is the GH4 connected to the B101.
I hope that it is possible to fix the 1080p 30Hz issue on the software (driver) side. My question: is there anybody out there who can say which part is used by raspivid, or which kernel module must be patched, so that more HDMI input modes become available?
PS:
The gimbal in these images is at an early development stage. It is a 3-axis gimbal for the Blackmagic Micro Cinema camera. The camera is still not available here, so I cannot test the B101 with the Micro Cinema camera yet. Once the gimbal gets all its components, it becomes a "companion gimbal" :) (a name I created for this device). This means the gimbal computer, a Raspberry Pi 2, is connected via slip ring to the flight controller. Besides image processing it also runs mavproxy to create a bidirectional telemetry connection via the 5GHz network. The STorM32 controller is also connected via MAVLink to the flight controller. Also not shown is a Teensy 3.1 board, which can speak S-Bus and will be connected via USB to the RPi 2. The Teensy's S-Bus can communicate with the camera's S-Bus interface as well as with the STorM32 controller. In this case no receiver needs to be used, and you can control the gimbal via the network connection. If you attach a simple S-Bus receiver, an S-Bus splitter talks to the STorM32 and to the camera's S-Bus interface.
Once the gimbal reaches a later development stage I will create a separate thread here. The gimbal is fully 3D-printable. The materials are nylon (white parts) and co-polyester (grey parts). Nylon is extremely strong and, if printed with 20% infill, also extremely lightweight. Co-polyester is strong and easy to print. I also print the vibration mount with nylon because it is flexible but with a super smooth damping factor: there is no reverberation in this material. It acts like an oil damper in combination with metal springs.
/g
wolke
Okay, Raspberry Pi noob here, I tried to set this up but I am struggling haha
I copied this into LXTerminal,
and got the following screenshots, where it would abort even if I chose yes,
as well as trying to compile it using this.
Help a rookie out please haha :)
pi2.jpg
pi3.jpg
pi4.jpg