Vision Based Collision Detection System


So I think we can all agree that a reliable detect-and-avoid system would indeed be the holy grail of the UAV world. The current impasse in UAV regulation hinges on the fact that UAVs cannot sense the presence of other traffic, and therefore cannot avoid collisions.

With the current technology at hand, it seems to me that vision-based systems are the only practical option for small UAVs, given both the cost and weight constraints of radar systems. There are already some research centres doing work on this, such as these guys:


I am also compelled to start research in this area both due to personal interest and necessity. Having Project Andromeda puts me in a good position where we can develop and test the system on a reliable platform.

I am wondering whether an Open-Source version of the same system would be possible. I would set up the project and provide hosting (most likely as part of the PA website). The target demographic would be people interested in digital image processing, SLAM systems, vision-based spatial tracking and general robotics.

A great deal of work has already been done as part of the Project Andromeda effort, and I will ensure separation between PA code and work done on this project to preserve license integrity (as PA is not under an Open Source license).

Here's a short description of the proposed system:

The main camera of the Project Andromeda platform is the Sony FCB-EX20DP.

The PA platform also includes a 1.4 GHz embedded PC with an on-board frame grabber that captures frames from the FCB-EX20DP in compressed and raw form, for transmission to the ground and for processing. The camera is attached to a pan-tilt gimbal which allows it to be stabilized. I'm currently using OpenCV to research different methods.

The proposed system would use a modification of the current Project Andromeda PTZ camera to scan the skies for traffic. It would use dense optical flow fields coupled with feature detection to spot objects moving against the background. It would then use Kalman Filters to track them in 3D space and use mono-SLAM to obtain range values where possible. An alternate version could use a lightweight laser range-finder coupled with the main camera to obtain accurate ranging for objects that are close.
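To make the tracking stage concrete, here is a minimal sketch of a constant-velocity Kalman filter following a detected object in the image plane. This is purely illustrative (not PA code): the state, noise values and frame rate are assumptions you would tune for a real camera.

```python
import numpy as np

# Illustrative constant-velocity Kalman filter for one tracked object.
# State is [x, y, vx, vy] in pixels / pixels-per-frame; we observe [x, y].
class Track:
    def __init__(self, x, y, dt=1.0):
        self.x = np.array([x, y, 0.0, 0.0])            # state estimate
        self.P = np.eye(4) * 100.0                     # state covariance
        self.F = np.array([[1, 0, dt, 0],              # constant-velocity model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],               # only position observed
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01                      # process noise (tune)
        self.R = np.eye(2) * 2.0                       # measurement noise (px)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                              # predicted pixel position

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

# Simulate an object drifting right at 2 px/frame with noisy detections.
rng = np.random.default_rng(0)
track = Track(100.0, 100.0)
for k in range(1, 50):
    track.predict()
    track.update([100.0 + 2.0 * k + rng.normal(0, 1),
                  100.0 + rng.normal(0, 1)])
print(track.x)  # estimated position and velocity
```

The predict step lets the filter coast through frames where the detector loses the object against clutter, which is exactly what you need when scanning sky with intermittent detections.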

The end result is a system that maintains a repository of objects by tracking them in the surrounding 3D space. The zoom ability allows the system to quickly obtain close-up images of each object. This could either be used to notify a ground-based controller or to match up against an onboard database of objects.

So, this is obviously something I've been thinking about for a while. It could go two ways: I would either develop it closed-source as part of PA, or it would be an open source project that would rely on PA and other similar projects for air-time and reliability testing. The end result would be an Open Source system comprising the mechanical, electronic and software blueprints necessary for operation.

Is anyone interested in something like this? If you are, or if you have any other thoughts, please shout out below. I'll be following this carefully to see if it's at all feasible as an Open Source project.


Comments

  • Chris

    Here is the FAA final rule on ADS-B.

    Tom
  • Moderator
    If it's anything like here, Chris, then yes, all aircraft including paragliders, hang gliders, etc. I carry a Mode S transponder in my balloon already. I can't see any flight anywhere being allowed outside of VLOS without detect, sense and avoid being sorted. Actually, that's not true: I can see flight out to maybe 1.5 km with a come-home-very-quick or stop-flying (parachute) system. In the UK you would have to make a very good safety case for it. That said, the CAA chaps are willing to talk, which is very positive. So maybe not everywhere, but in pockets thought safe enough.
  • Are you sure that ALL aircraft in class A, B and C airspace will require a mode S transponder? Including LSA?
  • I agree with Earl that Mode S provides a much simpler and cheaper solution. All aircraft in class A, B, and C airspace will have to have it by 2020, and I hope they extend that to all airspace. This would allow UAVs with ADS-B "in" a "sense and avoid" capability.

    Keep in mind that any "sense and avoid" system would need to be certified if you want it to fly in the NAS. I doubt there will ever be an open source product certified.

    My $0.02.
  • Thank you Irvin. My main concern is that my time is very limited and I don't think I'd be able to push this forward on top of Project Andromeda all by myself. What I am willing to do however is provide the opportunity for people interested in the topic to be able to work on and test their ideas on a real-world system.

    If you are capable in image processing, vision-based systems or anything related, and are interested in something like this, please shout out. I'll see how many people are interested and then we can discuss the details on the forums.
  • Nima,

    The idea of an open source project to implement a 'see and avoid' capability for amateur UAVs should get a lot of support. We have limited equipment, budget, and capabilities, but we do have the need for such a system. We are willing to contribute ideas, critical review, and possibly system or component testing.

    Let's hope for a successful project.
  • Maybe I am a bit out of my league here but what about IR or thermopile based?

    Roborealm also supports image tracking.
  • Mode S transponders are small and cheap now.
    Of course, the OTHER plane must have a transponder also.
    Earl
  • Camera placement is yet to be determined.
    Sub-pixel disparity estimation has been done:
    http://www.google.com/url?sa=t&source=web&cd=1&ved=0CBc...
    My plan is to develop a scalable system, so a hemisphere of sensors can be used in conjunction for both long range and short range. Look at Geof's setup (although for optical flow):
    http://centeye.com/graphics/slides/MAOS2.JPG

    I am using Firefly Small sensors:
    http://centeye.com/pages/products/products.html
    http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.64.6994&rep=rep1&type=pdf
  • How far apart are your cameras placed? I am looking at a system that would eventually match, or come close to, a pilot's eyes. To get a usable disparity map at 1 nm you'd need cameras placed quite far apart to resolve a disparity larger than a pixel.

    I have looked at both SURF and SIFT and intend to use them for the feature-detection component. However, this is just an idea. The premise is to promote collaboration, set up a website and repository, and also generate awareness of the project through Project Andromeda and other media.
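The baseline point above can be checked with a quick back-of-the-envelope calculation using the standard stereo relation d = f_px * B / Z. The lens and pixel-pitch numbers below are purely illustrative assumptions, not from any real setup:

```python
# Smallest stereo baseline (metres) that yields at least `min_disparity_px`
# of disparity at distance `range_m`, from d = f_px * B / Z.
def min_baseline(range_m, focal_mm, pixel_um, min_disparity_px=1.0):
    f_px = focal_mm * 1000.0 / pixel_um      # focal length in pixels
    return min_disparity_px * range_m / f_px

NM = 1852.0                                   # one nautical mile in metres
# Assumed example optics: a 5 mm lens on 5 um pixels -> f = 1000 px.
b = min_baseline(NM, focal_mm=5.0, pixel_um=5.0)
print(b)  # required baseline in metres
```

With those assumed optics the answer comes out to roughly 1.85 m for a single pixel of disparity at 1 nm, which illustrates why wing-tip-to-wing-tip camera spacing (or a longer lens) is needed for long-range stereo.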