I am working on a project that wants to put a thermal and a visual camera on a UAV gimbal, do some video processing, and transmit a fused image in real time to the ground.
Today they are transmitting the GoPro camera image without any video processing, but I will work on a solution that fuses the images from the two cameras into one and sends that down instead.
I have no prior knowledge of UAVs, but that is not really relevant to my part of the project either. I will only work on fusing the images and transmitting them; it could just as easily have been an on-ground project. I have a background in programming and computer vision.
My idea right now is to feed the two video streams into a single-board computer on the UAV, use OpenCV to fuse the images, and send the result down with the same transmitter and receiver they are using now. Is there any obvious flaw in this idea?
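To be concrete about what I mean by "fuse", a minimal sketch of the per-frame operation I have in mind is a simple alpha blend of the two frames, assuming both streams have already been resized and registered to the same resolution (function and parameter names here are just placeholders, and real thermal/visual fusion would need proper camera alignment first):

```python
import numpy as np

def fuse_frames(visual, thermal, alpha=0.6):
    """Naive pixel-wise alpha blend of two registered frames.

    Assumes both inputs are uint8 arrays of the same shape; this is
    a stand-in for whatever OpenCV-based fusion I end up using
    (e.g. something like cv2.addWeighted would do the same job).
    """
    blended = alpha * visual.astype(np.float32) + (1 - alpha) * thermal.astype(np.float32)
    return np.clip(blended, 0, 255).astype(np.uint8)

# Synthetic stand-ins for one frame from each camera
visual = np.full((480, 640, 3), 200, dtype=np.uint8)
thermal = np.full((480, 640, 3), 100, dtype=np.uint8)
fused = fuse_frames(visual, thermal)
```

In the real pipeline this would run on each pair of frames pulled from the two capture devices before the result is handed to the transmitter.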
As I understand it, the easiest way of getting a stream from a GoPro is over a WiFi connection. Is this true?
Suggestions for single-board computers to use are welcome. It will need to be Linux compatible and more powerful than an ordinary Raspberry Pi.