Visual-Inertial SLAM for Arducopter platform

Recently we developed a visual-inertial approach for a hexacopter platform (ArduCopter) to perform SLAM (simultaneous localization and mapping) without GPS or any other absolute positioning sensor.

The vision node provides full 6DOF vehicle pose measurements, whilst the low-cost inertial system designed in-house (less than AUD $40) enables continuous metric mapping and navigation. Indoor and outdoor experiments were performed, demonstrating the robustness of the proposed approach in challenging environmental conditions. (The publication is under review; the documentation will be updated soon.)

Cheers

 


Comments

  • The publication can be found at: http://www.araa.asn.au/acra/acra2013/papers/pap135s1-file1.pdf

    Brief notes about the Kinect outdoors:

    Outdoors, the Kinect works in shade and similar conditions, but under direct sunlight it behaves as a monocular camera (colour only, no depth). With this in mind, we developed an approach that lets the Kinect work outdoors with the aid of the inertial sensor; the key idea is switching from colour-depth processing to monocular-only processing under conditions like direct sunlight or limited range. In cluttered outdoor environments (e.g. dense forest), the Kinect works as a colour-depth (or colour-only) sensor, which is handy for building 3D maps.

    http://www.araa.asn.au/acra/acra2013/papers/pap135s1-file1.pdf
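    The switching idea above can be sketched as a simple mode check on depth validity. This is an illustrative assumption, not the authors' code: the function name, range limits, and threshold are hypothetical.

    ```python
    # Hypothetical sketch of switching between colour-depth and monocular-only
    # processing, as described above. Thresholds and ranges are assumptions.

    def select_mode(depth_samples, min_valid_fraction=0.3):
        """Choose RGB-D or monocular-only processing from depth validity.

        depth_samples: list of depth readings in metres (None = no return).
        Direct sunlight washes out the Kinect's IR pattern, so few valid
        depth returns come back and the pipeline falls back to monocular.
        """
        valid = sum(1 for d in depth_samples
                    if d is not None and 0.5 < d < 4.0)  # nominal Kinect range
        fraction = valid / len(depth_samples)
        return "rgbd" if fraction >= min_valid_fraction else "monocular"
    ```

    In shade most samples fall inside the valid range and the colour-depth pipeline is used; under direct sunlight the valid fraction collapses and the estimator switches to monocular features.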
  • Thanks for your interest... Sooner or later I will make it open-source (as I am a big supporter of open source :) )

    We are using a separate IMU, as we require high data rates for position estimation of the hexacopter.

    The Kinect provides a 3D point cloud and is not omni-directional (in the visualization you can move around the map to view the point cloud). Regarding data fusion, we fuse the inertial attitude information with the Kinect-based processing algorithm in an EKF.

    Please do write to me if you want to know more about it :) ... I will be glad to answer.

    Cheers
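    The fusion described above, inertial data propagating the state and Kinect-derived pose correcting it, can be sketched with a one-dimensional Kalman-style predict/update toy. This is an assumed minimal structure, not the paper's actual filter; the noise values are illustrative.

    ```python
    # Toy 1-D predict/update cycle: the IMU rate drives prediction, the
    # Kinect pose measurement drives correction. Assumed noise values.

    def predict(x, P, u, q=0.01):
        """Propagate state x with inertial increment u; inflate variance P."""
        x = x + u
        P = P + q
        return x, P

    def update(x, P, z, r=0.05):
        """Correct state x with vision measurement z of variance r."""
        K = P / (P + r)          # Kalman gain
        x = x + K * (z - x)
        P = (1.0 - K) * P
        return x, P
    ```

    Running many fast IMU predictions between slower vision updates is what lets a high-rate inertial system bridge the gaps between Kinect pose measurements.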

  • Awesome! I purchased a Kinect but have yet to implement it on anything. Looking forward as well to more information and hopefully this will end up being open source.

    It looks like, when he moves, the 3D mapping information is only "forward"-looking, so imaging must be done using copter yaw?

  • Wow!  I was hoping this would be coming, but not this soon!

    You state you are using an inertial system designed in-house, so you are not using the APM for the inertial measurements?

    Will the source code be released?  

    How do you get the Kinect device to look in all directions? I thought they were directional. Do you just yaw the whole copter around?
