(More info and full post here)

I've been experimenting with putting 360-degree vision, including stereo vision, onto a Crazyflie nano quadrotor to assist with flight in near-Earth and indoor environments. Four stereo boards, each holding two image sensor chips and lenses, together see in all directions except up and down. We developed the image sensor chips and lenses in-house for this work, since there is nothing available elsewhere that is suitable for platforms of this size. The control processor (on the square PCB in the middle) uses optical flow for position control and stereo vision for obstacle avoidance. The system uses a "supervised autonomy" control scheme in which the operator gives high-level commands via control sticks (e.g., "move in this general direction") and the control system implements the maneuver while avoiding nearby obstacles. All sensing and processing is performed on board. The Crazyflie itself was unmodified other than a few lines of code in its firmware to get the target Euler angles and throttle from the vision system.
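
To make the "supervised autonomy" idea concrete, here is a minimal sketch, in C (the language of the Crazyflie firmware), of how stick commands might be blended with stereo range readings to produce target Euler angles and a throttle value. The eight-sector range layout, the gains, and all names below are illustrative assumptions, not Centeye's actual implementation.

```c
#include <math.h>
#include <stdio.h>

#define NUM_SECTORS   8        /* hypothetical: 8 horizontal look directions */
#define SAFE_RANGE_M  1.0f     /* start avoiding inside this range (assumed) */
#define AVOID_GAIN_DEG 10.0f   /* max attitude correction per obstacle (assumed) */
#define STICK_GAIN_DEG 15.0f   /* stick deflection (-1..1) to attitude, degrees */
#define PI_F          3.14159265f

typedef struct {
    float roll_deg;   /* target roll sent to the flight controller */
    float pitch_deg;  /* target pitch */
    float yaw_deg;    /* target yaw (passed through unchanged here) */
    float throttle;   /* 0..1, from altitude hold / operator */
} setpoint_t;

/* Blend operator sticks with stereo ranges: the sticks give the desired
 * direction, and any nearby obstacle adds a tilt away from its sector. */
setpoint_t supervised_autonomy(float stick_pitch, float stick_roll,
                               float stick_yaw_deg, float throttle,
                               const float range_m[NUM_SECTORS])
{
    setpoint_t sp = { .roll_deg  = STICK_GAIN_DEG * stick_roll,
                      .pitch_deg = STICK_GAIN_DEG * stick_pitch,
                      .yaw_deg   = stick_yaw_deg,
                      .throttle  = throttle };

    for (int i = 0; i < NUM_SECTORS; ++i) {
        if (range_m[i] >= SAFE_RANGE_M)
            continue;                                   /* far enough away */
        float urgency = (SAFE_RANGE_M - range_m[i]) / SAFE_RANGE_M; /* 0..1 */
        float bearing = 2.0f * PI_F * (float)i / NUM_SECTORS; /* body frame */
        /* Tilt away from the obstacle: obstacle ahead -> pitch back, etc. */
        sp.pitch_deg -= AVOID_GAIN_DEG * urgency * cosf(bearing);
        sp.roll_deg  -= AVOID_GAIN_DEG * urgency * sinf(bearing);
    }
    return sp;
}

int main(void)
{
    /* Operator pushes forward, but an obstacle sits 0.4 m ahead (sector 0). */
    float ranges[NUM_SECTORS] = { 0.4f, 5, 5, 5, 5, 5, 5, 5 };
    setpoint_t sp = supervised_autonomy(0.5f, 0.0f, 0.0f, 0.45f, ranges);
    printf("pitch=%.1f deg  roll=%.1f deg  throttle=%.2f\n",
           sp.pitch_deg, sp.roll_deg, sp.throttle);
    return 0;
}
```

In this sketch the operator's forward command is reduced (but not reversed) by the obstacle directly ahead, which matches the spirit of "the operator gives the high-level command, the control system implements it while avoiding nearby obstacles."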

Below is a video from a few flights in an indoor space. It is best viewed on a laptop or desktop computer so you can see the annotations in the video. The performance is not perfect, but it is much better than the pure "hover in place" systems I had flown in the past, since obstacles are now avoided. I would not have been able to fly in the last room without the vision system to assist me! There are still obvious shortcomings (for example, the stereo vision currently does not respond to blank walls), but we'll address these soon...
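
The blank-wall issue is a general limitation of patch-based stereo: with no texture, every candidate disparity matches equally well, so there is nothing to lock onto. A minimal sketch of sum-of-absolute-differences matching with a simple ambiguity test illustrates the failure mode; the window size, search range, threshold, and function names are assumptions for illustration, not the pipeline used here.

```c
#include <limits.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define WIN          5     /* half-window for the SAD patch (assumed) */
#define MAX_DISP     16    /* disparity search range in pixels (assumed) */
#define AMBIG_MARGIN 20    /* best cost must beat the runner-up by this much */
#define ROW_LEN      64

/* Sum of absolute differences between a left patch centered at xl and a
 * right patch centered at xl - d (rectified scanlines). */
static int sad_cost(const uint8_t *left, const uint8_t *right, int xl, int d)
{
    int cost = 0;
    for (int k = -WIN; k <= WIN; ++k)
        cost += abs((int)left[xl + k] - (int)right[xl - d + k]);
    return cost;
}

/* Returns the best disparity at column xl, or -1 if the match is ambiguous,
 * which is exactly what happens on textureless (blank-wall) regions. */
int match_disparity(const uint8_t *left, const uint8_t *right, int xl)
{
    int best_d = -1, best = INT_MAX, second = INT_MAX;
    for (int d = 0; d <= MAX_DISP; ++d) {
        int c = sad_cost(left, right, xl, d);
        if (c < best)        { second = best; best = c; best_d = d; }
        else if (c < second) { second = c; }
    }
    if (second - best < AMBIG_MARGIN)   /* all costs nearly equal -> reject */
        return -1;
    return best_d;
}

int main(void)
{
    uint8_t textured[ROW_LEN], blank[ROW_LEN], right[ROW_LEN];
    for (int x = 0; x < ROW_LEN; ++x) {
        textured[x] = (uint8_t)((x * 37) % 251);  /* pseudo-texture */
        blank[x]    = 128;                        /* uniform blank wall */
    }
    /* Right image: textured scene shifted by 4 pixels (true disparity). */
    for (int x = 0; x < ROW_LEN; ++x)
        right[x] = textured[(x + 4) % ROW_LEN];

    printf("textured wall disparity: %d\n", match_disparity(textured, right, 32));
    printf("blank wall disparity:    %d\n", match_disparity(blank, blank, 32));
    return 0;
}
```

On the textured row the match is unambiguous; on the uniform row every disparity scores the same cost and the function reports no reliable match, which is the situation a blank wall creates for a stereo-only sensor.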

Comments

  • Great work Geoffrey! Can you tell us about the pros and cons of using a stereo system for sense-and-avoid?

  • Frickin awesome!

    Up to now, the majority of the focus on UAVs has been flight control. Now it seems the 'second phase' is coming, in which companion computers and cameras become aware of the environment around them and process/respond. Bring on Skynet!

  • Thank you for the kind words!

    @Cliff- Wow! I remember those OF units! I'd love to hear what you did with them! As for power, the system here on the Crazyflie pulls around 180mA from a single LiPo.

    @JB- For this one the acquired imagery does not get downlinked. What gets sent down is just high level sense/control information.

  • Very nice!

    Do you also have the ability to see what the cameras are looking at on the GCS, or is it only using the imagery on board?

  • Great job Geoffrey!  We still have one of your original optiflow units (the epoxied lens model!) on a ground vehicle (flying became an issue with payloads/occlusions) for R&D....

    Looks simpler than the AscTec system from last year's CES. How much power are you pulling from that thing?!

  • This has to be the most impressive thing I've seen in a while. Not only is it fully integrated onboard, but it's tiny!


    Very well done :)

  • Good progress
    http://techcrunch.com/2012/08/19/tc-makers-centeye-creates-insect-l...

    TC Makers: Centeye Creates Insect-Like Flying Robots In A DC Basement
  • Very impressive! Great work Geoffrey! And the system is so tiny.

  • That is really awesome actually. It's really responsive to objects coming close by.
