Dear community members,
During this ICRA we presented our work on "Uncertainty-aware Receding Horizon Exploration and Mapping using Aerial Robots".
This work is now accompanied by an open-source ROS toolbox, available at: https://github.com/unr-arl/rhem_planner
These videos present the planner in operation:
If you find it interesting, we would be happy to support you in fully integrating it into your research.
Best,
Kostas
Comments
Some details:
* perception system: stereo visual camera synchronized with an IMU; software synchronization with the IMU works just fine (in fact, the video in darkness was recorded that way)
* processing power (all processing takes place onboard): Intel NUC i7
* autopilot for low-level controls (attitude, thrust): Pixhawk/PX4
* middleware: ROS
* localization: visual-inertial odometry (GPS-denied); a small sketch of consuming such an estimate in ROS follows below
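
For readers unfamiliar with the stack: a minimal sketch of subscribing to a visual-inertial odometry estimate in ROS is shown below. Note this is an illustrative assumption, not the toolbox's documented interface; the topic name "/vio/odometry" is a placeholder, so check the rhem_planner repository for the actual topics and message types.

    #!/usr/bin/env python
    # Minimal sketch: reading a visual-inertial odometry estimate and its
    # covariance in ROS. The topic name "/vio/odometry" is a placeholder.
    import rospy
    from nav_msgs.msg import Odometry

    def odom_callback(msg):
        p = msg.pose.pose.position
        # The 6x6 pose covariance (row-major, 36 entries) is what an
        # uncertainty-aware planner reasons about; here we just report
        # the trace of its position block.
        cov = msg.pose.covariance
        position_uncertainty = cov[0] + cov[7] + cov[14]  # var(x)+var(y)+var(z)
        rospy.loginfo("pose: (%.2f, %.2f, %.2f), position cov trace: %.4f",
                      p.x, p.y, p.z, position_uncertainty)

    if __name__ == "__main__":
        rospy.init_node("vio_listener")
        rospy.Subscriber("/vio/odometry", Odometry, odom_callback)
        rospy.spin()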
@Gary: thanks a lot! Answer above
@Hugues: not sure I got your comment, but if you are referring to the fact that it takes more time to process, the answer is yes, it does (as opposed to fully deterministic approaches), but it also ensures (locally) more consistent mapping. No free lunch.
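
To illustrate where the extra compute goes: this is a toy sketch, not the planner's actual algorithm. It assumes a linear-Gaussian model with made-up parameters (F, Q, gain_per_step, weight are all hypothetical) and shows that an uncertainty-aware score pays an extra covariance-propagation cost per step of every candidate path, on top of the exploration-gain evaluation a deterministic planner would do alone.

    # Toy sketch of the trade-off (not the paper's algorithm): each
    # candidate path propagates a pose covariance P through an assumed
    # linear-Gaussian model, adding a matrix cost per step that a purely
    # deterministic gain computation avoids.
    import numpy as np

    def propagate_belief(P, F, Q):
        """EKF-style covariance prediction: P' = F P F^T + Q."""
        return F @ P @ F.T + Q

    def score_path(path_steps, P0, F, Q, gain_per_step=1.0, weight=0.5):
        """Exploration gain penalized by accumulated localization uncertainty."""
        P = P0
        uncertainty = 0.0
        for _ in range(path_steps):
            P = propagate_belief(P, F, Q)
            uncertainty += np.trace(P)
        return path_steps * gain_per_step - weight * uncertainty

    P0 = 0.01 * np.eye(6)   # initial pose covariance (assumed)
    F = np.eye(6)           # assumed state-transition Jacobian
    Q = 0.001 * np.eye(6)   # assumed process noise
    for steps in (5, 10, 20):
        print(steps, score_path(steps, P0, F, Q))

Longer candidate paths accumulate more predicted uncertainty, so the score trades exploration gain against expected localization quality; that penalty term is what deterministic approaches skip.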
@Andreas: thanks! Answer above.
Very impressive. I take it this is structure from motion? What hardware does the heavy lifting? How would it perform outdoors at indefinite range?
This experiment shows very well why nature decoupled vision/perceptive organs from conscious brain decisions (to move here or there). It is faster and more efficient.
Really excellent work guys,
Would really like to know more about the sensor and computing hardware you used and whether it is all on board the copter or not.
Best Regards,
Gary