So I think we can all agree that a reliable detect-and-avoid system would be the holy grail of the UAV world. The current regulatory impasse hinges on the fact that UAVs cannot sense the presence of other traffic and therefore cannot avoid collisions.
With the technology currently at hand, it seems to me that vision-based systems are the only option for small UAVs, given both the cost and weight constraints of radar systems. There are already some research centers doing work on this, such as these guys:
I am also compelled to start research in this area, out of both personal interest and necessity. Having Project Andromeda puts me in a good position, as we can develop and test the system on a reliable platform.
I am wondering whether an Open Source version of such a system would be possible. I would set up the project and provide hosting (most likely as part of the PA website). The target demographic would be people interested in digital image processing, SLAM systems, vision-based spatial tracking and general robotics.
A great deal of work has already been done as part of the Project Andromeda effort, and I will keep PA code separate from work done on this project to preserve license integrity (as PA is not under an Open Source license).
Here's a short description of the proposed system:
The picture above is the main camera of the Project Andromeda platform, the Sony FCB-EX20DP.
The PA platform also includes a 1.4 GHz embedded PC with an on-board frame grabber that captures frames from the FCB-EX20DP in both compressed and raw form, for transmission to the ground and for processing respectively. The camera is attached to a pan-tilt (PT) gimbal which allows it to be stabilized. I'm currently using OpenCV to research different methods.
The proposed system would use a modification of the current Project Andromeda PTZ camera to scan the skies for traffic. It would use dense optical flow fields coupled with feature detection to spot objects moving against the background. It would then use Kalman filters to track them in 3D space, and mono-SLAM to obtain range values where possible. An alternate version could pair a lightweight laser range-finder with the main camera to obtain accurate ranging for nearby objects.
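As a rough illustration of the "objects moving against the background" idea, here is a toy sketch in Python/NumPy. It uses naive exhaustive block matching as a crude stand-in for a proper dense optical flow method (in practice this would come from OpenCV), and flags blocks whose motion deviates from the dominant background motion. All names, sizes and thresholds here are made up for the example:

```python
import numpy as np


def block_motion(prev, curr, block=8, search=3):
    """Estimate a per-block displacement (dy, dx) between two grayscale
    frames by exhaustive SSD block matching -- a toy stand-in for a
    dense optical flow field."""
    h, w = prev.shape
    flow = np.zeros((h // block, w // block, 2))
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = prev[y:y + block, x:x + block].astype(int)
            best, best_dv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue
                    cand = curr[yy:yy + block, xx:xx + block].astype(int)
                    ssd = np.sum((ref - cand) ** 2)
                    if ssd < best:
                        best, best_dv = ssd, (dy, dx)
            flow[by, bx] = best_dv
    return flow


def moving_blocks(flow, thresh=1.0):
    """Flag blocks whose motion deviates from the dominant (background)
    motion -- these are the candidate airborne objects."""
    background = np.median(flow.reshape(-1, 2), axis=0)
    dev = np.linalg.norm(flow - background, axis=2)
    return dev > thresh


# Synthetic test: a bright 8x8 object slides 2 px right over static texture.
rng = np.random.RandomState(0)
base = rng.randint(0, 50, (32, 32)).astype(np.uint8)
prev = base.copy(); prev[8:16, 8:16] = 255
curr = base.copy(); curr[8:16, 10:18] = 255
flow = block_motion(prev, curr)
mask = moving_blocks(flow)
```

A camera on a moving, panning aircraft makes the background motion far from uniform, of course, which is exactly where real dense flow plus feature tracking earns its keep over a toy like this.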
The end result is a system that maintains a repository of objects by tracking them in the surrounding 3D space. The zoom ability allows the system to quickly obtain close-up images of each object. This could either be used to notify a ground-based controller or to match up against an onboard database of objects.
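The per-object tracking could be sketched as a bank of constant-velocity Kalman filters, one per repository entry. What follows is a generic textbook filter in image coordinates (pure NumPy), not the actual PA tracker; the real system would track in 3D and fold in mono-SLAM range estimates:

```python
import numpy as np


class Track:
    """One tracked object: a constant-velocity Kalman filter over the
    state [x, y, vx, vy], observing only pixel position [x, y]."""
    F = np.array([[1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)   # state transition (dt = 1)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)   # we measure position only
    Q = np.eye(4) * 0.01                  # process noise (assumed)
    R = np.eye(2) * 1.0                   # measurement noise (assumed)

    def __init__(self, x, y):
        self.state = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4) * 10.0         # initial state covariance

    def predict(self):
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.state[:2]             # predicted position

    def update(self, z):
        innov = np.asarray(z, float) - self.H @ self.state
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.state = self.state + K @ innov
        self.P = (np.eye(4) - K @ self.H) @ self.P


# Feed a track with detections moving at (1, 2) px/frame; the velocity
# estimate should converge to that motion.
t = Track(0.0, 0.0)
for k in range(1, 31):
    t.predict()
    t.update((float(k), 2.0 * k))
```

Each frame, every track in the repository is `predict`-ed, new detections are associated with the nearest predicted position, matched tracks get `update`-d, and unmatched detections spawn new tracks; the gimbal zoom can then be slewed to each entry in turn for a close-up.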
So, this is obviously something I've been thinking about for a while. It could go two ways: either I develop it closed-source as part of PA, or it becomes an open source project that relies on PA and other similar projects for air-time and reliability testing. The end result would be an Open Source system comprising the mechanical, electronic and software blueprints necessary for operation.
Is anyone interested in something like this? If you are, or if you have any other thoughts, please shout out below. I'll be following this carefully to see if it's at all feasible as an Open Source project.