
With all the talk of integrating drones into civilian airspace, there is a need for better safety and GPS-agnostic navigation methods. Visual navigation and obstacle avoidance are paramount to integrating drones (micro aerial vehicles, MAVs) into our cities and elsewhere, because external navigation aids cannot be relied on in every situation, and neither can pilot experience.

Visual navigation addresses these challenges, and we present an aerial robot designed from the ground up to meet these requirements. We demonstrate several facets of the system, including visual-inertial SLAM for state estimation, dense real-time volumetric mapping, obstacle avoidance and continuous path planning.

In search of better extensibility and a better fit for the research goals I had set for myself, I started working as part of the open-source PX4 Autopilot development community in 2013. Aware of the limitations in the field, I started Artemis, my research project on visual navigation for aerial vehicles, in 2014. Today, I'm happy to present our intermediate (and satisfying!) results.

At its very core, Project Artemis is a research project which aims to provide improved navigation solutions for UAVs.


All of the Artemis MAVs follow the same basic, distributed system architecture: a high-level onboard companion computer and a low-level embedded flight controller, typically a PX4 autopilot board or a similar derivative. The middleware of choice is ROS (Robot Operating System) on the high-level companion computer and the PX4 middleware on the deeply embedded controller.
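
To illustrate the split, here is a minimal sketch of a ROS node on the companion computer talking to the flight controller through MAVROS, the usual MAVLink-over-ROS bridge. The topic names are the standard MAVROS defaults and the setpoint values are placeholders, not necessarily our exact onboard configuration:

```python
# Minimal companion-computer node: listen to flight-controller state and
# local position over MAVROS, and stream position setpoints back down.
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.msg import State

def state_cb(msg):
    # Flight-controller status bridged over MAVLink (arming state, flight mode)
    rospy.loginfo_throttle(5.0, "FCU mode: %s, armed: %s" % (msg.mode, msg.armed))

def pose_cb(msg):
    # Local position estimate published by the flight controller
    rospy.loginfo_throttle(1.0, "altitude: %.2f m" % msg.pose.position.z)

if __name__ == "__main__":
    rospy.init_node("companion_bridge")
    rospy.Subscriber("/mavros/state", State, state_cb)
    rospy.Subscriber("/mavros/local_position/pose", PoseStamped, pose_cb)
    setpoint_pub = rospy.Publisher("/mavros/setpoint_position/local",
                                   PoseStamped, queue_size=10)

    rate = rospy.Rate(20)  # PX4 expects a steady setpoint stream in offboard mode
    while not rospy.is_shutdown():
        sp = PoseStamped()
        sp.header.stamp = rospy.Time.now()
        sp.header.frame_id = "map"
        sp.pose.position.z = 2.0  # hold 2 m altitude (placeholder command)
        setpoint_pub.publish(sp)
        rate.sleep()
```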


Software

Visual Navigation

Multiple cameras provide exteroceptive information about the environment, which is used for mapping and localisation. Forward-facing stereo cameras are used to compute depth images in real time.

[Image: disparity map computed from the forward stereo cameras]
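
As a rough illustration of the stereo depth step, the sketch below computes a disparity map with OpenCV's semi-global block matcher and converts it to metric depth. The matcher parameters and calibration values (focal length, baseline) are placeholders, not our onboard pipeline:

```python
# Depth from a calibrated, rectified stereo pair using OpenCV SGBM.
import cv2
import numpy as np

left  = cv2.imread("left.png",  cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,      # must be divisible by 16
    blockSize=9,
    P1=8 * 9 * 9,           # smoothness penalties for small/large disparity steps
    P2=32 * 9 * 9,
    uniquenessRatio=10,
    speckleWindowSize=100,
    speckleRange=2,
)

# SGBM returns fixed-point disparities scaled by 16
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# depth = f * B / disparity, with focal length f (px) and baseline B (m)
f, B = 425.0, 0.11          # placeholder calibration values
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]
```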

All cameras are time-synchronised with respect to each other and to the IMU (Inertial Measurement Unit) of the flight controller. Depth images are inserted into a volumetric mapping framework based on an octree representation, and a 3D map of the environment is built incrementally onboard the vehicle.
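
A heavily simplified stand-in for the map-insertion step is sketched below: it back-projects a depth image and updates a sparse voxel map with log-odds occupancy along each camera ray. The real system uses an octree-based framework rather than a Python dictionary, and the intrinsics and resolution shown are placeholders:

```python
# Toy volumetric map update from one depth image.
import numpy as np

RES = 0.10                    # voxel edge length [m]
L_HIT, L_MISS = 0.85, -0.4    # log-odds increments for occupied / free updates

voxels = {}                   # (i, j, k) -> log-odds occupancy

def key(p):
    return tuple(np.floor(p / RES).astype(int))

def insert_depth_image(depth, fx, fy, cx, cy, T_world_cam):
    h, w = depth.shape
    origin = T_world_cam[:3, 3]
    for v in range(0, h, 8):              # subsample pixels for speed
        for u in range(0, w, 8):
            z = depth[v, u]
            if not np.isfinite(z) or z <= 0.1 or z > 8.0:
                continue
            p_cam = np.array([(u - cx) * z / fx, (v - cy) * z / fy, z, 1.0])
            p_world = (T_world_cam @ p_cam)[:3]
            # mark free space along the ray, occupied at the endpoint
            n = int(np.linalg.norm(p_world - origin) / RES)
            for t in np.linspace(0.0, 1.0, max(n, 1), endpoint=False):
                k = key(origin + t * (p_world - origin))
                voxels[k] = voxels.get(k, 0.0) + L_MISS
            k = key(p_world)
            voxels[k] = voxels.get(k, 0.0) + L_HIT
```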


We also run SLAM (Simultaneous Localisation and Mapping) on the robot. The system continuously constructs a sparse map of the environment, which is optimised in the background. Unlike GPS, visual SLAM is globally consistent, accurate to the centimetre level, and works both indoors and outdoors. Tight fusion with time-synchronised inertial measurements greatly increases robustness and accuracy.
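
As a small illustration of what hardware time synchronisation buys us, the helper below gathers the IMU samples that fall between two image timestamps, as a tightly coupled visual-inertial backend would before integrating them between frames. The data layout here is assumed purely for illustration:

```python
# Collect IMU samples between consecutive, hardware-stamped image times.
import bisect

def imu_between(imu_stamps, imu_samples, t_prev_img, t_curr_img):
    """imu_stamps: sorted list of IMU timestamps [s];
    imu_samples: matching list of (gyro, accel) tuples."""
    lo = bisect.bisect_right(imu_stamps, t_prev_img)
    hi = bisect.bisect_right(imu_stamps, t_curr_img)
    return imu_samples[lo:hi]
```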

[Image: incremental 3D mapping onboard the vehicle]

State Estimation


The system is designed to navigate using all sensors available in the environment: both GPS and vision outdoors, and pure vision indoors. Since sensor availability is not guaranteed, a modular sensor fusion approach based on a hybrid Kalman filter with fault detection is used to maintain a robust state estimate. The motivation for using information from all the sensors is that even if a particular subset or module fails, overall system performance is not compromised.
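
The fault-detection idea can be sketched in a few lines: each measurement update is gated on its innovation, so a sensor whose reading is statistically inconsistent with the current estimate is simply skipped for that cycle. The matrices and threshold below are illustrative, not our actual filter:

```python
# One gated Kalman measurement update with a chi-square consistency check.
import numpy as np

def gated_update(x, P, z, H, R, gate=9.21):    # 9.21 ~ chi2(2 dof, 99%)
    y = z - H @ x                              # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    d2 = float(y @ np.linalg.solve(S, y))      # squared Mahalanobis distance
    if d2 > gate:
        # Measurement inconsistent with the current estimate: treat this
        # sensor/module as faulty for this cycle and skip the update.
        return x, P, False
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P, True
```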

Obstacle Avoidance

The global volumetric map is used to continuously compute a collision-free trajectory for the vehicle. In operator-assist mode, the motion planner intervenes only if the operator’s high-level position commands could lead to a collision. In autonomous modes, the planner computes optimal trajectories based on a next-best-view analysis in order to optimise the 3D reconstruction. The planner sends its commands to the minimum-snap trajectory controller running on the low-level flight controller, which computes the motor outputs.
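
A toy version of the collision check is shown below: it samples a candidate straight-line motion against a voxel map like the one sketched earlier and inflates obstacles by the vehicle radius. The onboard planner works on smooth minimum-snap trajectories rather than straight segments, and the thresholds here are placeholders:

```python
# Check a straight-line segment for collisions against a sparse voxel map.
import numpy as np

OCCUPIED_THRESHOLD = 0.5   # log-odds above which a voxel counts as occupied
SAFETY_MARGIN = 0.4        # inflate obstacles by the vehicle radius [m]

def segment_is_free(voxels, start, goal, res=0.10):
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    n = max(int(np.linalg.norm(goal - start) / (0.5 * res)), 1)
    r = int(np.ceil(SAFETY_MARGIN / res))
    for t in np.linspace(0.0, 1.0, n + 1):
        c = np.floor((start + t * (goal - start)) / res).astype(int)
        # check a cube of voxels around the sample point (vehicle footprint)
        for di in range(-r, r + 1):
            for dj in range(-r, r + 1):
                for dk in range(-r, r + 1):
                    k = (c[0] + di, c[1] + dj, c[2] + dk)
                    if voxels.get(k, 0.0) > OCCUPIED_THRESHOLD:
                        return False
    return True
```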

It is important to point out that this can be achieved *today* with open-source systems, albeit with some perseverance and experience. Better documentation on how to closely reproduce our results is underway. It will be made available soon via the UASys website (http://www.uasys.io/research) and the PX4 Autopilot developer wiki (http://dev.px4.io).

The open-sourced portions of our software stack are available here: www.github.com/ProjectArtemis

I will also be presenting a talk on Project Artemis and our software stack at the Embedded Linux Conference in San Diego, CA. Please attend if you'd like an in-depth view into the system's workings! The presentation will give a quick introduction to the current state of the aerial vehicle market and the limitations it faces due to the lack of technological breakthroughs in consumer systems. Newcomers, existing developers and system integrators will get a chance to understand these limitations and what embedded Linux systems can do for the field, including but not limited to visual (GPS-denied) navigation, mapping, obstacle avoidance and high-definition video streaming. The talk also aims to encourage the current open-source development communities and show how they can contribute to improving the state of the art, be it through cross-middleware interoperability, modular and reusable software design, or inexpensive and extensible hardware design.

Slides are available here: http://events.linuxfoundation.org/sites/events/files/slides/artemis_elc16_final.pdf

Learn more about my session at http://sched.co/6DAs and register to attend at http://bit.ly/1ZuUtiu


Stay updated!
Website: http://www.uasys.io
GitHub: https://www.github.com/mhkabir
Instagram: https://www.instagram.com/uasys/
Twitter: https://twitter.com/UASysOfficial
Facebook: https://www.facebook.com/UASys

Cheers,

Kabir


Comments

  • @kabir: This is really great work. Could you give us some camera recommendations? I'm working on getting semi-direct visual odometry working. The SVO wiki uses the Matrix Vision Bluefox; is that still a good camera? It's a bit tricky for my purchasing because I can't find a US-based supplier. I've looked at the 0.3 MP FireFly MV mono, but that still costs $295.

  • Could you send a link to the i7 board you used?

  • @Kabir

    I read your slides and learned about your navigation pipeline. Why do you mount a bottom camera on the quadrotor? What role does it play in the navigation pipeline? I think an optical-flow sensor could maybe replace the function of the bottom view, so what would happen if the bottom camera were removed?

  • Donation done :-)

    Keep up this great work, Kabir!

  • Developer

    Thanks Andy! Will implement your suggestion now.

  • Developer

    Done! Well, it's only a small donation but I hope it all helps.

    Just a thought... possibly a link to that Donate section in the home page header would be useful, so it's visible right away.

    I think this is an awesome project :)

    regards

    Andy

  • Developer

    Hi Andy,

    If you go to www.uasys.io/research, there is a donate button at the bottom of the page. Thanks!

  • Developer

    Wow, this is genius!

    If you can't get hold of your TX1, I would be happy to make a small donation towards one, if there is somewhere I can donate.

    regards

    Andy

  • Moderator

    I'm with you on the pain and struggle surrounding certain computer boards. I feel I (partly) wasted many hours trying to make something work, and in reflection it wasn't really worth the brute-force fight. I do hope that one day something the size of an RPi appears, also piggybacking on the development there, but with a replaceable SoC, mini-PCIe, a couple more camera inputs and USB 3. I would probably pay $99 for an RPi like that.

  • Developer
    The VI-Sensor was pulled from production. I'm working on a different FPGA-based system, and I hope to share results soon. The primary challenge of using an FPGA is the number of man-hours that need to go into getting something working. Not exactly a one-man job :)

    Concurrently, I'm working on the Snapdragon and Jetson platforms as well. Currently, though, an i7 system ends up cheaper than or on par with either of those two. The total compute power and flexibility available on the i7 are very nice to work with. Trust me, I've worked with many, many processing architectures and their associated BSPs, and most are a pain.

    BTW, does anyone have a TX1 module they're willing to donate for development work?