By Lorenz Meier
(crossposted from here with permission)
Due to public demand, the slides for the PX4 Update talk at ELC 2016 are now online. As the slides lack the narrative, this post details most of the content.
The PX4 project has come a long way since its humble beginnings in 2008. While it was originally focused on open hardware (Pixhawk) and flight control (PX4), and produced a communication protocol (MAVLink) and ground control station (QGroundControl) “by accident” to support these, in 2015 the project set out to tackle the next challenges: vision-based, GPS-denied flight and obstacle avoidance.
PX4 is an independent, international and professionally run open source effort, with full-time staff driving the main development and quality assurance. However, it has always benefited greatly from being hosted by an ETH research lab, the Computer Vision and Geometry Lab of ETH Zurich. While it may have seemed slightly odd in the past that a vision lab sported its own flight control project, this becomes a critical factor for success in 2016: with computer vision becoming as important as flight control on drones, PX4 is perfectly positioned to deliver a full stack, including obstacle avoidance, in 2016. We will do so in a collaborative effort within Dronecode where possible, but are confident we have all the knowledge in-house to drive it.
2015 was a very intense year for PX4 development, with contributions skyrocketing compared to previous years.
Linux / POSIX Support
PX4 added broad POSIX support in 2015, gaining support for Linux, Mac OS (helpful for development) and QuRT. The Snapdragon Flight is the first fully supported Linux target, and the RPi 2 and Navio are supported at a bench-test level as well.
While Qualcomm already offered a PX4 port to their customers in mid-2015, the PX4 dev team and Qualcomm collaborated on a second generation of this SDK, which is now completely open source and on PX4 Firmware master.
PX4 has supported VTOL since 2014 and has extended the initial tail-sitter support to tilt rotors and quad planes. While the initial focus was on core flight control and safety-critical features, the development team has shifted its attention towards mission-critical features, like ensuring smooth missions with forward and back transitions in all combinations.
Dronekit was recently added as a Dronecode project and now offers initial support for PX4. This is particularly critical for Dronekit, as its ambition of becoming a standard drone API will require it, over time, to be entirely MAVLink-generic.
The current range of drone simulators is relatively fragmented, with each autopilot project maintaining its own simulation backends. This is not only inefficient, duplicating effort across projects, but also undesirable for the open source ecosystem: it should be possible to freely combine any simulator with any autopilot stack, which will help drive both software components individually to perfection. The PX4 project and the AutoQuad project have recently started to collaborate on a MAVLink API for simulation. Simulationkit is still in its infancy, but is already good enough to allow both projects to share common simulation infrastructure. The three videos below show that very different simulators and very different vehicle types can be supported with a relatively simple set of MAVLink messages.
Baseline jMAVSim cross-platform simulator:
High-fidelity Gazebo simulating a Quad:
Gazebo simulating a VTOL plane in a fully autonomous mission:
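The simulator bridge above works because MAVLink framing is compact and easy to implement on any platform. As a purely illustrative sketch (real code would use pymavlink's generated marshaling, not hand-packing), here is how a minimal MAVLink 1.0 HEARTBEAT frame can be assembled by hand; the constants (msgid 0, CRC_EXTRA 50, PX4 autopilot id 12) come from the MAVLink 1.0 specification:

```python
import struct

def x25_crc(data, crc=0xFFFF):
    """CRC-16/MCRF4XX as used by MAVLink (seed 0xFFFF)."""
    for b in data:
        tmp = b ^ (crc & 0xFF)
        tmp = (tmp ^ (tmp << 4)) & 0xFF
        crc = ((crc >> 8) ^ (tmp << 8) ^ (tmp << 3) ^ (tmp >> 4)) & 0xFFFF
    return crc

def pack_heartbeat(seq=0, sysid=1, compid=1):
    """Build a MAVLink 1.0 HEARTBEAT frame (msgid 0) by hand."""
    # MAVLink reorders payload fields by size: the uint32 custom_mode comes first.
    # custom_mode=0, type=2 (quadrotor), autopilot=12 (PX4),
    # base_mode=0, system_status=3 (standby), mavlink_version=3
    payload = struct.pack('<IBBBBB', 0, 2, 12, 0, 3, 3)
    header = struct.pack('<BBBBBB', 0xFE, len(payload), seq, sysid, compid, 0)
    # Checksum covers everything after the start byte, plus the per-message
    # CRC_EXTRA seed byte (50 for HEARTBEAT).
    crc = x25_crc(header[1:] + payload + bytes([50]))
    return header + payload + struct.pack('<H', crc)
```

A 9-byte payload plus 6-byte header and 2-byte checksum yields a 17-byte frame, which is why a HEARTBEAT fits comfortably on even very slow telemetry links.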
Complete Drone Stack
A drone software stack in 2016 looks very different from one in 2015: GPS-denied navigation based on computer vision (optical flow and visual-inertial odometry) and obstacle avoidance are becoming increasingly important. The PX4 dev team has had a broad research background (video) in these topics since 2012. We also pioneered the use of vision sensors for avoidance with the Autonomous Systems Lab (paper, video). While this was pure research throughout the last years, we have now started to work on vision-based flight and obstacle avoidance within the open source project, enabling PX4 to be embedded in a complete stack to build a drone with vision-based positioning and obstacle avoidance.
This involves the ability to simulate vision based localisation accurately, as shown in this optical flow example.
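To make the optical flow idea concrete: flow-based localisation boils down to measuring how far image content has shifted between consecutive camera frames. The following toy sketch (not PX4 code, and far simpler than the sensor fusion PX4 actually performs) estimates the integer-pixel flow of the central patch by exhaustive block matching:

```python
def sad(a, b):
    """Sum of absolute differences between two equal-size patches."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def patch(img, x, y, w, h):
    """Crop a w*h sub-image starting at (x, y)."""
    return [row[x:x + w] for row in img[y:y + h]]

def estimate_flow(prev, curr, max_shift=2):
    """Integer-pixel flow of the central patch via exhaustive block matching.

    Returns (u, v): how far image content moved from `prev` to `curr`.
    """
    m = max_shift
    w, h = len(prev[0]) - 2 * m, len(prev) - 2 * m  # leave a search margin
    ref = patch(curr, m, m, w, h)                   # patch in the new frame
    best = None
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            cost = sad(patch(prev, m + dx, m + dy, w, h), ref)
            if best is None or cost < best[0]:
                # Flow is opposite to the shift that locates the source patch.
                best = (cost, -dx, -dy)
    return best[1], best[2]
```

Scaled by height above ground and frame rate, such a pixel displacement becomes a metric velocity estimate, which is the principle behind flow-based position hold.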
Our recent work on obstacle avoidance is also operational in simulation, and will soon find its way into outdoor flights:
While avoidance is necessary for safety, it doesn’t create user delight – it does not make the drone easier to navigate, just safer. This is where planning kicks in: it takes load off the operator, allowing her to specify only where the drone should go, not how:
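The "where, not how" idea above is classic motion planning. As a hedged toy illustration (this is not PX4's planner, and a real drone plans in 3D with dynamics constraints), here is A* on a 2D occupancy grid: the operator gives only start and goal, and the planner finds an obstacle-free route:

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid; grid[y][x] == 1 marks an obstacle.

    Returns the path as a list of (x, y) cells, or None if no path exists.
    """
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    tie = itertools.count()  # tiebreaker so the heap never compares nodes
    frontier = [(h(start), next(tie), 0, start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, _, g, node = heapq.heappop(frontier)
        if node == goal:  # reconstruct the path by walking parents backwards
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if not (0 <= ny < len(grid) and 0 <= nx < len(grid[0])):
                continue  # off the map
            if grid[ny][nx]:
                continue  # occupied cell
            g2 = g + 1
            if g2 < cost.get(nxt, float("inf")):
                cost[nxt] = g2
                came_from[nxt] = node
                heapq.heappush(frontier, (g2 + h(nxt), next(tie), g2, nxt))
    return None  # goal unreachable
```

The operator's job reduces to picking the goal cell; routing around the obstacle map is the planner's problem.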
2015 was a great year for PX4, but also the last year in which the project will focus entirely on flight control. A lot of new challenges are waiting to be tackled, but with more than eight years of experience with vision-enabled drones and quickly increasing broad industry support and adoption, the project is set up to handle them.
We would like to thank our open source collaborators: the APM dev team for several years of great collaboration on the PX4 middleware and Pixhawk, the AutoQuad dev team for a great start on simulation, ETH Zurich and the Computer Vision and Geometry Lab for their ongoing support of the independent open source project, and AUAV, Zubax Robotics and our corporate industry partners for their ongoing support and great collaboration.