Dear community members,

At this year's ICRA we presented our work on "Uncertainty-aware Receding Horizon Exploration and Mapping using Aerial Robots".

This work is now accompanied by an open-source ROS toolbox, available at: 

These videos present the planner in operation:


If you find it interesting, we would be happy to help you fully integrate it into your research.



  • Some details:

    * perception system: stereo vision, software-synchronized with the IMU; the software synchronization works just fine (in fact, the video in darkness was recorded that way)

    * processing power (all processing takes place onboard): Intel NUC i7

    * autopilot for low-level controls (attitude, thrust): Pixhawk/PX4

    * middleware: ROS

    * localization: visual-inertial odometry (GPS-denied)
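
The software camera-IMU synchronization mentioned above can be sketched as pairing each camera frame with the IMU sample nearest in time. This is purely illustrative; the rates, the 5 ms threshold, and all names are hypothetical, not taken from the released toolbox.

```python
# Illustrative sketch of camera-IMU software synchronization:
# pair each camera frame with the IMU sample nearest in time.
# Rates and the 5 ms threshold are assumptions for the example.
from bisect import bisect_left

def nearest_imu(imu_stamps, t):
    """Return the IMU timestamp closest to camera time t."""
    i = bisect_left(imu_stamps, t)
    candidates = imu_stamps[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s - t))

def sync(camera_stamps, imu_stamps, max_offset=0.005):
    """Pair camera frames with their nearest IMU sample, dropping
    pairs whose time offset exceeds max_offset seconds."""
    pairs = []
    for t in camera_stamps:
        s = nearest_imu(imu_stamps, t)
        if abs(s - t) <= max_offset:
            pairs.append((t, s))
    return pairs

# IMU at 200 Hz, camera at 20 Hz (timestamps in seconds)
imu = [k * 0.005 for k in range(200)]
cam = [k * 0.05 + 0.001 for k in range(20)]
print(len(sync(cam, imu)))  # → 20 (every frame matched within 5 ms)
```

In a ROS setup the same idea is typically handled by an approximate-time message synchronizer rather than hand-rolled code; the sketch just shows what "software synchronization" buys you when no hardware trigger is available.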

    @Gary: thanks a lot! Answered above.

    @Hugues: I'm not sure I got your comment, but if you mean that it takes more time to process, the answer is yes, it does (as opposed to fully deterministic approaches); on the other hand, it ensures (locally) more consistent mapping. No free lunch.

    @Andreas: thanks! Answered above.

  • Very impressive. I take it this is structure from motion? What hardware does the heavy lifting? How would it perform outdoors at indefinite range?

  • MR60

    This experiment shows very well why nature decoupled the vision/perceptive organs from conscious brain decisions (whether to move here or there). It is faster and more efficient.

  • Really excellent work guys,

    Would really like to know more about the sensor and computing hardware you used, and whether it is all on board the copter or not.

    Best Regards,

