Visual Odometry for GPS-Denied Flight with a Kinect

The Robust Robotics Group at MIT has developed a system that allows a quadrotor outfitted with a Kinect sensor to fly autonomously in GPS-denied environments while building a map of its surroundings.
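The core of visual odometry is estimating how the camera moved between consecutive frames. As a rough illustration only (not the group's actual implementation, which is not shown here), the sketch below aligns matched 3D feature points from two Kinect frames using the standard SVD-based least-squares fit; the function names and the use of NumPy are my assumptions.

```python
# Hypothetical sketch, not the MIT group's code: estimate the rigid-body
# motion between two Kinect frames from matched 3D points (the depth image
# lets matched image features be back-projected into 3D directly).
import numpy as np

def estimate_rigid_transform(src_pts, dst_pts):
    """Find R, t minimizing ||R @ src + t - dst|| over matched 3D points.

    src_pts, dst_pts: (N, 3) arrays of corresponding points from the
    previous and current frames.
    """
    src_centroid = src_pts.mean(axis=0)
    dst_centroid = dst_pts.mean(axis=0)

    # Cross-covariance of the centered point sets.
    H = (src_pts - src_centroid).T @ (dst_pts - dst_centroid)

    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    # Guard against a reflection (det = -1) in degenerate configurations.
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    t = dst_centroid - R @ src_centroid
    return R, t
```

Chaining these frame-to-frame motion estimates gives the vehicle's pose relative to where it took off, which is what stands in for GPS.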

Unlike previous work, our system does NOT rely on external motion capture to localize and control the vehicle. All sensing and computation required for local position control are performed onboard the vehicle.
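To give a sense of how the onboard pose estimate closes the control loop, here is a minimal, hypothetical position-control sketch; the gains, function names, and the assumption of a separate attitude inner loop are mine, not details of the MIT controller.

```python
# Hedged sketch, not the vehicle's actual controller: with a pose estimate
# from onboard visual odometry, a simple PD loop can hold the quadrotor at
# a commanded position without any motion-capture feedback.
import numpy as np

KP = np.array([1.5, 1.5, 2.0])   # assumed proportional gains (x, y, z)
KD = np.array([0.8, 0.8, 1.0])   # assumed derivative gains

def position_control(pos_est, vel_est, pos_setpoint):
    """Return a commanded acceleration from the current state estimate."""
    pos_error = pos_setpoint - pos_est
    accel_cmd = KP * pos_error - KD * vel_est
    # An attitude/thrust inner loop would convert accel_cmd into roll,
    # pitch, and collective-thrust commands for the motors.
    return accel_cmd
```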

We have collaborated with members of the Robotics and State Estimation Lab at the University of Washington, using their RGBD-SLAM algorithms to perform simultaneous localization and mapping (SLAM) and to build metrically accurate, visually pleasing models of the environment through which the MAV has flown.
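For map building, the idea is that once SLAM assigns each keyframe a globally consistent pose, the Kinect depth images can be back-projected and fused into one metric point cloud. The sketch below assumes typical Kinect-style intrinsics and hypothetical function names; it is not the RSE Lab's RGBD-SLAM code.

```python
# Hypothetical illustration of fusing depth keyframes into a metric map,
# given per-keyframe poses (R, t) produced by SLAM.
import numpy as np

# Assumed Kinect-style intrinsics (focal lengths and principal point in pixels).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def depth_to_points(depth_m):
    """Back-project an (H, W) depth image in meters to an (N, 3) point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth return

def build_map(keyframes):
    """keyframes: list of (R, t, depth_image) with poses from SLAM."""
    world_points = []
    for R, t, depth in keyframes:
        pts_cam = depth_to_points(depth)
        world_points.append(pts_cam @ R.T + t)   # camera frame -> world frame
    return np.vstack(world_points)
```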



More info can be found here:
http://groups.csail.mit.edu/rrg/index.php?n=Main.VisualOdometryForGPS-DeniedFlight
Comments

  • Moderator
    Moore's law
  • Yeah for realistic quad sensor development!!

    Seriously, the videos from the two groups using external movie-studio-quality motion tracking are passed around the office now to get a good chuckle from my co-workers. It must be fun for them to live in a dream world of perfect localization that has zero practicality in real life.