With all the talk of integrating drones into civilian airspace, there is a clear need for safer, GPS-independent navigation methods. Visual navigation and obstacle avoidance are paramount to bringing drones (micro aerial vehicles, MAVs) into our cities and beyond, because external navigation aids cannot be relied on in every situation, and neither can pilot experience.
Visual navigation addresses these challenges, and we present an aerial robot designed from the ground up to meet these requirements. We demonstrate several facets of the system, including visual-inertial SLAM for state estimation, dense real-time volumetric mapping, obstacle avoidance and continuous path planning.
In search of better extensibility, and a better fit for the research goals I had set myself, I started working as part of the open-source PX4 Autopilot development community in 2013. Aware of the limitations in the field, I started Artemis, my research project on visual navigation for aerial vehicles, in 2014. Today, I'm happy to present our intermediate (and satisfying!) results.
At its core, Project Artemis is a research project that aims to provide improved navigation solutions for UAVs.
All of the Artemis MAVs follow the same basic, distributed system architecture: a high-level onboard companion computer and a low-level embedded flight controller, typically a PX4 autopilot board or a derivative. The middleware of choice is ROS (Robot Operating System) on the high-level companion computer and the PX4 Middleware on the deeply embedded controller.
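To make the split concrete, here is a minimal sketch (not our production code) of a node on the ROS side commanding the flight controller, assuming the commonly used mavros package bridges ROS and the flight controller's MAVLink interface; the topic name and rate are the standard mavros conventions, not necessarily those of our stack.

// Minimal sketch: a ROS node on the companion computer that streams a
// position setpoint to the PX4 flight controller via the mavros bridge.
#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "offboard_setpoint_sketch");
  ros::NodeHandle nh;

  // mavros converts setpoints published here into MAVLink messages
  // and republishes vehicle state from the flight controller on ROS topics.
  ros::Publisher setpoint_pub =
      nh.advertise<geometry_msgs::PoseStamped>("/mavros/setpoint_position/local", 10);

  geometry_msgs::PoseStamped sp;
  sp.pose.position.z = 2.0;      // hover 2 m above the local origin
  sp.pose.orientation.w = 1.0;

  ros::Rate rate(20.0);          // offboard control expects a steady setpoint stream
  while (ros::ok())
  {
    sp.header.stamp = ros::Time::now();
    setpoint_pub.publish(sp);
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}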
Software
Visual Navigation
Multiple cameras provide exteroceptive information about the environment, used for mapping and localisation. Forward-facing stereo cameras are used to compute depth images in real time.
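As a rough illustration of the depth step, the sketch below computes a disparity map from a rectified stereo pair with OpenCV's semi-global matcher and converts it to metric depth. The matcher parameters and calibration values are placeholders; the onboard implementation is heavily optimised beyond this.

// Illustrative only: dense disparity from a rectified stereo pair, then
// conversion to metric depth via Z = f * B / d.
#include <opencv2/opencv.hpp>

int main()
{
  cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
  cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);

  // 0 = min disparity, 96 = disparity range (multiple of 16), 9 = block size
  cv::Ptr<cv::StereoSGBM> sgbm = cv::StereoSGBM::create(0, 96, 9);
  cv::Mat disp16;
  sgbm->compute(left, right, disp16);           // fixed-point disparity (scaled by 16)

  cv::Mat disp;
  disp16.convertTo(disp, CV_32F, 1.0 / 16.0);   // to float pixel units

  const float fx = 425.0f;        // focal length in pixels (placeholder calibration)
  const float baseline = 0.11f;   // stereo baseline in metres (placeholder)

  cv::Mat depth(disp.size(), CV_32F);
  for (int v = 0; v < disp.rows; ++v)
    for (int u = 0; u < disp.cols; ++u)
    {
      float d = disp.at<float>(v, u);
      depth.at<float>(v, u) = (d > 0.5f) ? fx * baseline / d : 0.0f;
    }
  return 0;
}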
All cameras are synchronised in time with respect to each other, and to the IMU (Inertial Measurement Unit) of the flight controller. Depth images are inserted into a volumetric mapping framework based on an Octree representation, and a 3D map of the environment is built incrementally onboard the vehicle.
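The sketch below shows the flavour of that incremental mapping step, using the OctoMap library as one example of an octree-based framework; the voxel resolution and the synthetic "scan" are made up for illustration, and in practice the points come from the depth images transformed into the world frame.

// A minimal sketch of incremental occupancy mapping with OctoMap.
#include <octomap/octomap.h>

int main()
{
  octomap::OcTree tree(0.05);            // 5 cm voxel resolution (assumed)

  // Fake "depth" measurement: a small wall of points 2 m in front of the sensor.
  octomap::Pointcloud scan;
  for (float y = -0.5f; y <= 0.5f; y += 0.05f)
    for (float z = 0.0f; z <= 1.0f; z += 0.05f)
      scan.push_back(2.0f, y, z);

  // Ray-cast each point from the sensor origin: free space along the ray,
  // occupied at the endpoint. Each incoming depth frame updates the map this way.
  octomap::point3d sensor_origin(0.0f, 0.0f, 0.5f);
  tree.insertPointCloud(scan, sensor_origin);

  tree.updateInnerOccupancy();           // propagate occupancy up the tree
  tree.writeBinary("incremental_map.bt");
  return 0;
}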
We also run a SLAM (Simultaneous Localisation and Mapping) system on the robot. It continuously constructs a sparse map of the environment, which is optimised in the background. Unlike GPS, visual SLAM is globally consistent and accurate to the centimetre level, and it works both indoors and outdoors. Tight fusion with time-synchronised inertial measurements greatly increases robustness and accuracy.
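For readers who want the gist of that tight fusion: in the formulation typical of visual-inertial systems of this class (the exact residuals in our estimator may differ), the optimisation jointly minimises visual and inertial error terms,

\[
\min_{\mathbf{x}} \; \sum_{i,j} \mathbf{e}^{\mathrm{vis}\,\top}_{i,j} \, \mathbf{W}^{\mathrm{vis}}_{i,j} \, \mathbf{e}^{\mathrm{vis}}_{i,j}
\;+\; \sum_{k} \mathbf{e}^{\mathrm{imu}\,\top}_{k} \, \mathbf{W}^{\mathrm{imu}}_{k} \, \mathbf{e}^{\mathrm{imu}}_{k}
\]

where \(\mathbf{e}^{\mathrm{vis}}_{i,j}\) is the reprojection error of landmark \(j\) observed in keyframe \(i\), \(\mathbf{e}^{\mathrm{imu}}_{k}\) is the mismatch between the integrated inertial measurements and the estimated motion between consecutive keyframes, the weights \(\mathbf{W}\) are inverse measurement covariances, and \(\mathbf{x}\) stacks the keyframe poses, velocities, IMU biases and landmark positions.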
State Estimation
The system is designed to navigate using all sensors available in the environment: GPS and vision outdoors, and pure vision indoors. Since sensor availability is not guaranteed, a modular sensor fusion approach based on a hybrid Kalman filter with fault detection is used to maintain a robust state estimate. The motivation for using information from all the sensors is that even if a particular subset or module fails, overall system performance is not compromised.
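The toy sketch below illustrates the core idea behind the fault detection: each sensor's measurement is gated with a chi-square test on its innovation before it is allowed to correct the state. The real estimator is a full hybrid Kalman filter; the matrices, thresholds and the GPS-glitch example here are placeholders.

// Toy innovation-gated measurement update (Eigen), not the production filter.
#include <Eigen/Dense>
#include <iostream>

using Eigen::MatrixXd;
using Eigen::VectorXd;

// Returns true if the measurement passed the gate and was fused.
bool fuseMeasurement(VectorXd &x, MatrixXd &P,
                     const VectorXd &z, const MatrixXd &H, const MatrixXd &R,
                     double gate_threshold)
{
  VectorXd y = z - H * x;                         // innovation
  MatrixXd S = H * P * H.transpose() + R;         // innovation covariance
  VectorXd Sinv_y = S.ldlt().solve(y);
  double mahalanobis2 = y.dot(Sinv_y);            // chi-square test statistic

  if (mahalanobis2 > gate_threshold)              // fault / outlier detected:
    return false;                                 // skip this sensor, keep the estimate

  MatrixXd K = P * H.transpose() * S.inverse();   // Kalman gain
  x += K * y;
  P = (MatrixXd::Identity(x.size(), x.size()) - K * H) * P;
  return true;
}

int main()
{
  // 2D position-only toy example: state [px, py].
  VectorXd x(2); x << 0.0, 0.0;
  MatrixXd P = MatrixXd::Identity(2, 2);

  MatrixXd H = MatrixXd::Identity(2, 2);          // GPS-like position measurement
  MatrixXd R = 0.25 * MatrixXd::Identity(2, 2);

  VectorXd z_good(2); z_good << 0.3, -0.2;        // plausible fix
  VectorXd z_jump(2); z_jump << 50.0, 80.0;       // simulated GPS glitch

  std::cout << "good fix fused: " << fuseMeasurement(x, P, z_good, H, R, 9.21) << "\n";
  std::cout << "glitch fused:   " << fuseMeasurement(x, P, z_jump, H, R, 9.21) << "\n";
  return 0;
}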
Obstacle Avoidance
The global volumetric map is used to continuously compute a collision-free trajectory for the vehicle. In operator-assist mode, the motion planner intervenes only if the operator's high-level position commands would lead to a collision. In autonomous modes, the planner computes optimal trajectories based on a next-best-view analysis in order to optimise 3D reconstruction. The planner sends its commands to the minimum-snap trajectory controller running on the low-level flight controller, which computes the motor outputs.
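To give a feel for how the map is consumed, here is a sketch of a straight-line collision check against an octree map, using the same OctoMap API as the mapping sketch above. The actual planner evaluates full trajectories rather than single segments, and how unknown space is treated depends on the flight mode; the file name, step size and waypoints are illustrative.

// Sketch: sample a candidate segment against the octree map and reject it
// if any sample falls in an occupied (or unknown) voxel. Not the real planner.
#include <octomap/octomap.h>

// Returns true if the straight segment from 'a' to 'b' is collision-free.
bool segmentIsFree(const octomap::OcTree &tree,
                   const octomap::point3d &a, const octomap::point3d &b,
                   double step = 0.05)
{
  octomap::point3d dir = b - a;
  double length = dir.norm();
  if (length > 0.0) dir /= length;

  for (double s = 0.0; s <= length; s += step)
  {
    octomap::point3d p = a + dir * s;
    octomap::OcTreeNode *node = tree.search(p);
    if (node == nullptr)             // unknown space: treated as unsafe here
      return false;
    if (tree.isNodeOccupied(node))   // occupied voxel: collision
      return false;
  }
  return true;
}

int main()
{
  octomap::OcTree tree("incremental_map.bt");    // map produced by the mapping node
  octomap::point3d start(0.0, 0.0, 1.0), goal(3.0, 0.0, 1.0);
  return segmentIsFree(tree, start, goal) ? 0 : 1;
}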
It is important to point out that this can be achieved *today* with open-source systems, albeit with some perseverance and experience. Better documentation on how to closely reproduce our results is underway, and will be made available soon via the UASys website (http://www.uasys.io/research) and the PX4 Autopilot developer wiki (http://dev.px4.io).
The open-sourced portions of our software stack are available here: www.github.com/ProjectArtemis
I will also be presenting a talk on Project Artemis and our software stack at the Embedded Linux Conference in San Diego, CA. Please attend if you'd like an in-depth view of the system's workings! The presentation will introduce the current state of the aerial vehicle market and the limitations it faces due to the lack of technological breakthroughs in consumer systems. Newcomers, existing developers and system integrators will get a chance to understand these limitations and what embedded Linux systems can do for the field, including but not limited to visual (GPS-denied) navigation, mapping, obstacle avoidance and high-definition video streaming. The talk also looks at how the open-source development communities can contribute to improving the current state of the art, be it through cross-middleware interoperability, modular and reusable software design, or inexpensive and extensible hardware design.
Slides are available here: http://events.linuxfoundation.org/sites/events/files/slides/artemis_elc16_final.pdf
Learn more about my session at http://sched.co/6DAs and register to attend at http://bit.ly/1ZuUtiu.
Stay updated!
Website: http://www.uasys.io
GitHub: https://www.github.com/mhkabir
Instagram: https://www.instagram.com/uasys/
Twitter: https://twitter.com/UASysOfficial
Facebook: https://www.facebook.com/UASys
Cheers,
Kabir