We just realized how relevant our system from 2009 still is: marker-based position hold and vision-based pattern detection, all onboard, with a full companion architecture. It's great to see this approach spreading more broadly, in particular in the open source space.
PX4 has been supporting tailsitters [video], tilt rotors [video] and quad planes for over half a year now and is constantly refining the performance. This video from Marco Robustini shows the latest testing results.
The video below (enable audio!) shows a detailed overview of the new Pixracer flight controller.
It is the 4th generation of the Pixhawk flight controller family (make: FMUv4) and, like the first generations, designed by the Pixhawk Open Hardware team in collaboration with an international dev team. It supports the PX4 and APM flight stacks. If you would like to try PX4 on it, follow the user guide.
PX4 has been flying tailsitters, tiltrotors and quad planes since last summer (instructions here), but we didn't get around to adding support to our 3D Gazebo based simulator until now. As it uses standard-compliant MAVLink, it's easy to reuse and we've made it part of Dronecode.
The video shows a physically simulated transition of a Quad plane with Gazebo and QGroundControl side by side.
The PX4 flight stack has a unique architecture: it uses a single codebase and mission logic for all vehicle types.
This allows us to easily support VTOL vehicles of any kind: tailsitters with just two rotors, tiltrotors and quad planes [INSTRUCTIONS TO BUILD YOUR OWN HERE]. These have been flying great since summer, and our latest release includes full support for all three airframes. You can reach the dev team via the mailing list or chat if you're interested in trying it.
PX4 development has in the past focused on smaller planes and wings not requiring a runway, but with laser rangefinders available for some time now, adding runway support made sense. We focused on high accuracy, as can be seen in the video. The plane is a nice composite model and a Kickstarter project, the Albatross UAV.
Progress on VTOL for the PX4 Flight Core: Full forward and back transitions now with tailsitters and tilt-rotors.
We now have full VTOL support for tilt-rotors, in addition to tailsitters, which were introduced six months ago (video). People interested in flying these can find contact details and build logs on this page. You can follow us on G+ or on Twitter for general PX4 platform updates.
This is a preview of the upcoming high fidelity simulation capabilities and native ROS / Linux port of the PX4 flight stack. It includes simulated cameras, inertial sensors, flight physics and even sensors like laser rangefinders. The same software running during software in the loop simulation can also be executed on a Linux computer for live flight, and leverages the ROS middleware, logging, replay, data streaming and 3D visualization infrastructure.
Call for VTOL test pilots! The PX4 standard firmware (master branch, downloadable via QGroundControl) supports its first VTOL vehicle (duo-rotor tailsitter). This video demonstrates the transitions from hover to fixed wing and back to hover. Contact: http://px4.io/vtol.
Additional vehicles are currently in testing, including a BirdsEyeView FireFly Y6 (tilt rotor) and a Quadshot (quad tailsitter). The main benefit of the TBS Caipirinha during initial bring-up of VTOL support is its small size and high operational safety, combined with the inherent ruggedness of a flying wing airframe.
As the PX4 flight stack (running with the PX4 middleware on Pixhawk) is a unified platform without separate multicopter or plane codebases, using the same command & control (arming, logging, safety) state machines and mission management for all platforms, it's perfectly suited for VTOL development. Adding VTOLs didn't mean introducing a new vehicle category; rather, it mostly meant adding a new mixer, a VTOL transition controller and some minor tweaks.
Note that these are initial results leveraging the existing multicopter and fixed wing controllers combined with a VTOL transition controller - the dev team is meanwhile experimenting with model predictive controllers. But even this initial baseline implementation works quite well and is ready for some flying.
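The approach described above — reusing the existing multicopter and fixed wing controllers and blending between them during transition — can be sketched roughly like this. This is a minimal illustration with made-up names and a simple linear cross-fade, not the actual PX4 transition logic:

```python
# Illustrative sketch only, NOT the PX4 implementation: actuator outputs
# from the multicopter (MC) and fixed wing (FW) controllers are cross-faded
# over the transition duration.

def blend_outputs(mc_out, fw_out, t, t_transition=3.0):
    """Linearly cross-fade from MC to FW actuator outputs.

    mc_out, fw_out: lists of normalized actuator commands in [-1, 1]
    t: seconds since transition start
    t_transition: hypothetical transition duration in seconds
    """
    w = min(max(t / t_transition, 0.0), 1.0)  # FW weight, ramps 0 -> 1
    return [(1.0 - w) * mc + w * fw for mc, fw in zip(mc_out, fw_out)]
```

At t = 0 the vehicle flies purely on the multicopter outputs; once t reaches the transition duration, the fixed wing controller has full authority.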
Having started as a crazy idea for a small side project, the optical flow sensor quickly hacked together by a group of PhDs (we conduct our research on other topics) is now being adopted by more and more systems. And because it's a useful tool for research (not so much a research contribution in itself), it even made for a short paper. Optical flow is a pretty old and basic technique, but quite robust if the camera runs at a very high rate - and it is the main success factor making the AR.Drone so robust and easy to fly.
Aside from the easy integration into different systems, the main benefit of a standalone design is the extremely low latency of the velocity output. In our latest firmware version, currently in testing, the camera sensor runs at 450 Hz, which makes the output extremely robust despite the relatively basic algorithm. And it means we can bump the maximum speed to ~3 m/s per meter of altitude - in other words: allow a maximum speed of 30 m/s at 10 m altitude and still allow a smooth precision landing at 3 m/s at 1 m altitude.
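The altitude-proportional speed limit works out to a simple linear relationship. A quick sketch — the function name and the hard cap are our own illustration, not firmware parameters:

```python
def flow_speed_limit(altitude_m, k=3.0, v_max=30.0):
    """Maximum trackable ground speed at a given altitude.

    k: ~3 m/s of speed per meter of altitude (the figure from the post)
    v_max: illustrative absolute cap
    """
    return min(k * altitude_m, v_max)
```

So 10 m of altitude allows 30 m/s, while at 1 m the limit drops to a gentle 3 m/s for the landing approach.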
Because the ground distance noise feeds into the velocity noise, the next big step will be the integration of laser based altitude estimates - and with the Lidar-Lite carrying a really low price tag, it will be an ideal combination. The ultrasound ranging has in fact been the biggest limitation so far, and with it removed, we think an optical flow sensor should become a default addition to GPS - it's a great complementary technology to improve robustness and accuracy.
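Why the distance noise matters so much: the metric velocity is the rotation-compensated flow rate multiplied by the ground distance, so any ranging error scales with the observed flow rate. A back-of-the-envelope sketch with hypothetical noise figures:

```python
def velocity_noise(flow_rate_rad_s, sigma_distance_m):
    """Velocity error induced by ground-distance error.

    Since v = rate * h, a distance error sigma_h maps to
    rate * sigma_h of velocity error.
    """
    return abs(flow_rate_rad_s) * sigma_distance_m

# e.g. at 1 rad/s of flow, ~5 cm of sonar noise vs ~1 cm of laser noise
# (hypothetical figures) gives 5 cm/s vs 1 cm/s of velocity noise.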
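Why the distance noise matters so much: the metric velocity is the rotation-compensated flow rate multiplied by the ground distance, so any ranging error scales with the observed flow rate. A back-of-the-envelope sketch with hypothetical noise figures:

```python
def velocity_noise(flow_rate_rad_s, sigma_distance_m):
    """Velocity error induced by ground-distance error.

    Since v = rate * h, a distance error sigma_h maps to
    rate * sigma_h of velocity error.
    """
    return abs(flow_rate_rad_s) * sigma_distance_m

# e.g. at 1 rad/s of flow, ~5 cm of sonar noise vs ~1 cm of laser noise
# (hypothetical figures) gives 5 cm/s vs 1 cm/s of velocity noise.
```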
And by becoming a default, the cost of making the module will come down to a level where it's a no-brainer to get one. Besides our own system, a number of autopilot systems have been successfully interfaced (AutoQuad, MikroKopter and lately ArduCopter).
This video shows a scenario called an urban canyon, where multipath reflections off the environment and the sky view blocked by buildings make GPS reception extremely challenging and much less accurate than in an open field:
And of course there are many more cool videos on YouTube - I just couldn't embed all of them here.
The very low cost ruggedized case MAX 505S (complete series) nicely fits the Iris with props unmounted. This is a custom / personal DIY setup, not a product - although we would guess that 3DR will be offering something along these lines. The case is water-tight and has a pressure valve. It comes with pick-and-pluck foam which can be shaped by hand. The RC transmitter could also be fitted underneath, but in this example we decided to keep it external to have a clean 'all things Iris' box. Iris has a classic (nicely machined, black anodized) prop mount which tightly fits the APC SF props. We're working with 3DR to speed up rotor mounting / unmounting with a simplified 'quick mount'. The larger version of the case, the MAX 505, would have plenty of space for additional items, but obviously also takes up more volume. The MAX 505S dimensions should fit the cabin luggage requirements of most airlines, and Iris + accessories + case should weigh below 6 kg.
This video illustrates a bit better why we believe the safety switch, buzzer and multicolor LED help to greatly improve ground and air safety. The video just showcases a few new key safety features and is not a complete reference / mapping, as blink patterns might differ depending on the flight stack running and/or user settings. Some details:
The safety pushbutton indicates safety on with a slow blink pulse and becomes solid when armed
The main LED shows a breathing pattern when disarmed and becomes solid when armed
Arming is only possible after the safety has been disengaged. This is to prevent accidental arming via RC
On arming, the buzzer first emits the arming tune, and then after a short delay the props are slowly spun up to a configurable idle speed. User tests show that users can disarm fast enough after having accidentally armed.
There are two distinct low battery patterns, for low and critically low battery. Since the buzzer is driven at 32 V, it can easily be heard from a distance, so even when not looking at the GCS and battery voltage, there is now an intuitive warning, in time to land safely.
We believe that this will greatly help to improve situational awareness of the pilot and prevent a range of potential ground and air accidents.
The above picture shows PX4 Airspeed (in fact it's so new that this is a picture of a (proven) prototype), a small, digital airspeed sensor. It features a Measurement Specialties (coincidentally Swiss, too) 4525DO sensor with a 1 psi measurement range (roughly up to 100 m/s, or 360 km/h or 223 mph). Its resolution of 0.84 Pa is quite good, delivered as 14 bit data from a 24 bit delta-sigma ADC. It also measures temperature, which allows calculating true airspeed from indicated airspeed using the MS5611 static pressure sensor on Pixhawk. As the temperature is not influenced by the heat of nearby processing components, it is much closer to the air temperature than with the previous analog sensor setup. It comes with M3 / 6-32 mounting holes. It is supported on all PX4 autopilot generation boards (and in the PX4 flight stack as well as in APM on PX4). Andrew Tridgell has intensively test flown the sensor and will share his experience soon.
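The indicated-to-true airspeed calculation mentioned above boils down to standard textbook formulas: the differential (dynamic) pressure gives indicated airspeed, and the local air density computed from static pressure and temperature scales IAS up to TAS. A sketch with our own helper names — the constants are standard-atmosphere values, not sensor calibration data:

```python
import math

RHO0 = 1.225      # sea-level standard air density, kg/m^3
R_AIR = 287.05    # specific gas constant for dry air, J/(kg*K)

def indicated_airspeed(dp_pa):
    """IAS in m/s from differential (dynamic) pressure in Pa: v = sqrt(2*q/rho0)."""
    return math.sqrt(2.0 * max(dp_pa, 0.0) / RHO0)

def true_airspeed(ias, static_p_pa, temp_c):
    """TAS from IAS via local air density (ideal gas law): TAS = IAS*sqrt(rho0/rho)."""
    rho = static_p_pa / (R_AIR * (temp_c + 273.15))
    return ias * math.sqrt(RHO0 / rho)
```

Plugging in the sensor's 1 psi (about 6895 Pa) full scale gives roughly 106 m/s of IAS, consistent with the "roughly up to 100 m/s" figure above.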
This is just the fanciest peripheral, but maybe not the most popular one: The high-brightness multicolor led featured inside PX4 is also available as external option, and it is even supported by the APM 2.x boards.
Last but not least, a peripheral only available for Pixhawk, simple yet convenient: a USB port extender, which allows mounting the micro USB port outside of the fuselage / shell. In contrast to normal USB extenders, this one is for DF13 plugs and thus does not require the large bend radius of a normal USB cable in the fuselage.
All these peripherals are intended to improve the ease of use and robustness when working with a PX4 based system, and will be available shortly.
We would be very interested in feedback on the multicolor LED. There are a few considerations to add:
We know the DJI pattern, and believe it is too complex. We would be looking for something simpler
It would be beneficial to limit the number of blink patterns and colors to a minimum
It will be impossible to map all system states, so the core and important points need to be prioritized and indicated
We are looking for feedback in particular on these aspects:
What are the system properties you would like to see visualized (arming status, low battery, GPS lock, mode, etc.)?
What do you consider suitable blinking patterns and how many different ones? (e.g. breathe, steady on, fast blink)
How should patterns and colors be assigned to things to show? (e.g. one breathe for disarmed, steady on for armed and color for something else? Or yellow disarmed, green armed and pattern for something else?)
Any other ideas / considerations?
We will not be able to suit everybody's needs, but we want to hear from the community what matters and see if we can pick up some good new ideas to get this 'right'. We're looking forward to lots of feedback (and opinions).
The new PX4 Pixhawk module is an evolution of the existing FMU and IO modules and completely compatible with them. The main difference is the target audience: while the FMU and IO stack is super small (the size of an average 8 ch RC receiver) but in some ways almost too densely packed, Pixhawk has more space, more serial ports and more PWM outputs.
As the above picture shows, there are two groups of servo connectors: one main group of 8 outputs which are wired through the backup processor, and an auxiliary group of 6 outputs directly wired to the main processor. The port labeled "RC" can take normal PPM sum or Futaba S.Bus inputs, and the port labeled "SB" can read RSSI or output S.Bus to servos. A Spektrum satellite compatible port is on top (labeled SPKT/DSM).
The basic operation is the same, and the software is shared. Inside Pixhawk, an FMUv2 and an IOv2 do their duties on a single board (and developers will find that the software refers to FMUv2 and IOv2).
The main differences between old and new are:
14 PWM outputs vs. 12 PWM (old)
All PWM outputs on servo connectors (old: 8 on servo, 4 on DF13)
5 serial ports vs. 4 (with some double functionality, so only 3 in some configurations on old version)
256 KB RAM and 2 MB flash vs 192 KB RAM and 1 MB flash (old)
Modernized sensor suite (latest generation)
High-power buzzer driver (old: VBAT driven, not as loud)
High-power multicolor led (old: only external BlinkM support)
Support for panel-mounted USB extension (old: not present)
Revised, improved power architecture
Better protection on all input / output pins against shorts and over voltage
Better sensing of power rails (internal and external, e.g. servo voltage)
Support for Spektrum Satellite pairing (needed some manual wiring work in v1, but also software-supported)
No more solid state relays on v2 (they were not really used)
Connectors are easier to disconnect with the case on, as the surrounding plastic helps to place the fingers correctly (more on this in a separate post)
The case prevents off-by-one misplacement of servo connectors
The new unit is considerably larger but has the same height, and in general offers more handling convenience.
External power supply similar to existing 3DR power brick (every unit comes with a free module)
Both generations offer the same backup / override processor that allows failover to manual if the autopilot fails in fixed wing setups. For software developers the differences are nicely abstracted in the PX4 middleware, and can be sensed / configured at runtime.
Almost exactly one year after the first PX4 announcement, we would like to introduce our newest member of the family, Pixhawk! For those familiar with the existing PX4 electronics, it is the all-in-one board combining PX4FMU + PX4IO, combined with a processor and sensor update and a number of new features. The current board revisions will however remain in full service and active development and are fully compatible. Pixhawk is designed for improved ease of use and reliability while offering unprecedented safety features compared to existing solutions.
Pixhawk is designed by the PX4 open hardware project and manufactured by 3D Robotics. It features the latest processor and sensor technology from ST Microelectronics which delivers incredible performance and reliability at low price points.
The flexible PX4 middleware running on the NuttX Real-Time Operating System brings multithreading and the convenience of a Unix / Linux like programming environment to the open source autopilot domain, while the custom PX4 driver layer ensures tight timing. These facilities and additional headroom in RAM and flash will allow the addition of completely new functionality to Pixhawk, like programmatic scripting of autopilot operations.
The PX4 project offers its own complete flight control stack, and projects such as APM:Copter and APM:Plane have ported their software to run as flight control applications. This allows existing APM users to seamlessly transition to the new Pixhawk hardware and lowers the barriers to entry for new users to participate in the exciting world of autonomous vehicles.
The flagship Pixhawk module will be accompanied by new peripheral options, including a digital airspeed sensor, support for an external multi-color LED indicator and an external magnetometer. All peripherals are automatically detected and configured.
32 bit ARM Cortex M4 Processor running NuttX RTOS
14 PWM / Servo outputs (8 with failsafe and manual override, 6 auxiliary)
Abundant connectivity options for additional peripherals (UART, I2C, CAN)
Integrated backup system for in-flight recovery and manual override with dedicated processor and stand-alone power supply
Backup system integrates mixing, providing consistent autopilot and manual override mixing modes
Redundant power supply inputs and automatic failover
External safety switch
Multicolor LED main visual indicator
High-power, multi-tone piezo audio indicator
microSD card for long-time high-rate logging
32bit STM32F427 Cortex M4 core with FPU
256 KB RAM
2 MB Flash
32 bit STM32F103 failsafe co-processor
ST Micro L3GD20H 16 bit gyroscope
ST Micro LSM303D 14 bit accelerometer / magnetometer
MEAS MS5611 barometer
5x UART (serial ports), one high-power capable, 2x with HW flow control
Servo rail high-power (up to 10V) and high-current ready (10A +)
All peripheral outputs over-current protected, all inputs ESD protected
Monitoring of system and servo rails, over current status monitoring of peripherals
Weight: 38g (1.31oz)
Width: 50mm (1.96")
Thickness: 15.5mm (0.613")
Length: 81.5mm (3.21")
This announcement is a service to our users and developers to allow them to plan their hardware roadmaps in time, and to show what we're currently working on. The board will not be immediately available, but 3D Robotics is taking pre-orders for Pixhawk now, and will begin shipping in late October [Update 11/11: the current expected ship date is late Nov]. The price is $199.99.
Our new video shows position hold (the position can be moved on the fly, so it's actually full position control) using the PX4FMU autopilot plus PX4FLOW on the AR.Drone 2.0 frame. We have created a set of example apps to show how flow can be used for position control - these are however just textbook-like examples: no bells and whistles, no complete sensor fusion and no attempt to max out performance.
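In the same textbook spirit as the example apps, a position hold loop on top of the flow velocity output can be sketched as a PD controller acting on a dead-reckoned position. Gains and names here are our own illustration, not the actual example app code:

```python
def position_hold_step(pos_est, vel_est, setpoint, dt, kp=1.0, kd=0.5):
    """One step of a textbook PD position controller on top of flow velocity.

    pos_est:  position integrated from the flow velocity (m)
    vel_est:  flow-derived velocity (m/s)
    setpoint: desired position (m)
    Returns (control command, updated position estimate).
    """
    err = setpoint - pos_est
    cmd = kp * err - kd * vel_est          # P on position, D damps velocity
    new_pos = pos_est + vel_est * dt       # dead-reckoned position from flow
    return cmd, new_pos
```

Run once per flow sample; the command would feed the attitude (tilt) setpoint in a real vehicle.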
This post also announces the availability of the PX4FLOW open source firmware. We are only releasing it now because we wanted to do all the necessary restructuring before release, so that no open source contribution is left behind by dramatic structural changes. We invested a full person-month into getting things clean, nice and simple, and we hope developers will appreciate the effort.
A quick guide is available here for early adopters / testers interested in flying this immediately. As for any beta release, you will need to have the PX4 toolchain installed (installation guide).
Just some very early results, but here we go with our first autonomous onboard video. This is the PX4 native stack flying a stock Bormatec Camflyer Q (the estimation and control used in the video were contributed by James Goppert). PX4FMU + PX4IO fits fine, but we are already working with Bormatec on a wider heavy-duty version (photos of the first prototype). Always keep in mind that the design goal of PX4 is to provide a modular autopilot platform, not only for our own purposes but also for others to build on. While we work closely with the APM dev team to make sure APM runs fine on it, this approach lets us experiment and push the state of the art (certainly on the platform side, and just getting started on flight control), offering a flight control app on the same hardware for every purpose. For those who noticed: yes, QGroundControl has some cool new features coming. The AHRS display was contributed by Soren Kuula, and much more (not shown in the video) is coming from Michael Carpenter, with some really important design guidance by Michael Oborne and Craig Elder's experience going in as well.
The image shows the successful flight of an extremely simple fixed wing controller tutorial (it flies GPS waypoints or in manual mode, supports MAVLink parameters, and runs automatically synchronized to the attitude filter). Simple can mean many things; in this case we mean simple for hobbyists or students interested in estimation and control. Developing on PX4 is simple in the same way that writing a "Hello World!" application from scratch is simple on a normal Unix or Windows machine (the equivalent is the PX4 "Hello Sky" tutorial). To write flight-control code there is no need to dig through the whole codebase, or to mess with the main loop (and introduce unintended side effects).
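To give a flavor of what "simple" means here, a waypoint-steering rule in the same spirit fits in a few lines: compute the bearing to the waypoint and command a proportional, limited roll. This is a sketch with made-up names and gains, not the tutorial code itself:

```python
import math

def roll_command(pos, waypoint, heading_rad, kp=0.8, roll_limit=0.6):
    """Proportional roll command steering toward a local-frame waypoint.

    pos, waypoint: (north, east) positions in meters
    heading_rad:   current heading in radians (0 = north)
    kp, roll_limit: illustrative gain and saturation (rad)
    """
    bearing = math.atan2(waypoint[1] - pos[1], waypoint[0] - pos[0])
    # wrap heading error into [-pi, pi) so the plane turns the short way
    err = (bearing - heading_rad + math.pi) % (2.0 * math.pi) - math.pi
    return max(-roll_limit, min(roll_limit, kp * err))
```

A real controller adds altitude and airspeed loops on top, but the point stands: each piece can be written and tested as a small, self-contained application.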
Of course it is still flexible enough to run monolithic designs, and APM Plane and Copter are fully operational (thanks to Andrew Tridgell and Randy Mackay) and are the recommended flight app at this point for average users.
The open source autopilot field has evolved quite a bit recently, and while everybody these days can hack together a flying quadrotor within a few days, taking a "simple" hardware platform like Arduino or Maple just means reinventing the wheel, since people have done it already. Not many of these new projects ever make it even close to what e.g. APM provides. To really improve over the state of the art, it's important to focus on the flight code and make it fly better, instead of creating yet another half-done piece of hardware.
The whole rationale behind PX4 is different: similar to VxWorks for the automotive and aerospace industries (guess what the Curiosity rover runs, and guess what most likely controls the vehicle you're driving), we're trying to provide a real-time, flexible base platform (based on NuttX, which is POSIX-inspired like VxWorks) and add common library blocks like mixers, estimators, sensor drivers and controllers to it. But if a new developer wants to drive a different platform (a rover, boat, blimp) with it, or run their own flight controller, there is no need to rip everything apart - it can just be added and run as an application, without affecting other users of the platform.
For the same reason, the linked tutorials look so similar to normal Unix programming examples. And there is a quite successful case study for this platform-based approach - if you look at the PX4 interprocess communication, you will note that working with PX4 is in many ways similar to working with ROS (Robot Operating System) in classic robotics.
If you want to get your hands dirty, here are the most relevant links from the text again:
Finally, the last missing board from the PX4 series has been released: the PX4FLOW smart camera module. It can replace GPS in indoor and outdoor applications and provides a metric position close to the ground with only very little drift. It is essentially a microcontroller hooked up to an automotive-grade machine vision sensor, and it can be freely programmed.
Given the wide distribution of the AR.Drone one might ask what the purpose is, and the answer is simple and well illustrated by the video: You can use this board with PX4FMU and the PX4 native autopilot stack, but you can also interface it with any other system, including Linux onboard computers running ROS. It works indoors and outdoors and brings the same stability the AR.Drone shows in flight to any aerial robot.
It has a native resolution of 752×480 pixels and calculates optical flow on a 4x binned and cropped area at 250 Hz (bright, outdoors), giving it very high light sensitivity. Unlike many mouse sensors, it also works indoors and in low outdoor light conditions without the need for an illumination LED, at 120 Hz (dark, indoors).
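The principle behind the metric output: the pixel flow (converted to an angular rate via the lens focal length) minus the gyro-measured rotation rate gives the pure translational rate, which multiplied by the ground distance yields metric velocity. A simplified sketch — axis conventions and names here are illustrative, not the firmware's:

```python
def flow_to_velocity(flow_x_rad, flow_y_rad, gyro_x, gyro_y, distance_m, dt):
    """Convert accumulated optical flow (radians over dt) into metric ground
    velocity, compensating rotation with the onboard gyro.

    Sketch of the principle: v = (flow_rate - body_rate) * ground_distance.
    """
    rate_x = flow_x_rad / dt - gyro_x   # rad/s of pure translation
    rate_y = flow_y_rad / dt - gyro_y
    return rate_x * distance_m, rate_y * distance_m
```

This is why the onboard gyro matters: without rotation compensation, a pure attitude change would be misread as translation.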
168 MHz Cortex M4F CPU (128 + 64 KB RAM)
752×480 MT9V034 image sensor
L3GD20 3D Gyro
16 mm M12 lens (with IR block filter)
As the aerial image with overlaid trajectory shows, the position estimate is very accurate. This is without GPS, captured in flight at 1.6 m altitude and in one pass.
The work on this module has been accepted at the International Conference on Robotics and Automation (ICRA 2013) in Karlsruhe, Germany: Dominik Honegger, Lorenz Meier, Petri Tanskanen and Marc Pollefeys. An Open Source and Open Hardware Embedded Metric Optical Flow CMOS Camera for Indoor and Outdoor Applications, ICRA2013
The module has been developed by Samuel Zihlmann, Laurens Mackay, Dominik Honegger, Petri Tanskanen and Lorenz Meier.
Frequently Asked Questions
When will it be available? - It is available here and starts shipping next Monday, according to 3D Robotics
Is the software available? - The software will be made available shortly under an open source license. The module comes pre-flashed with the latest firmware.
Will you offer a low-cost version? - The module has been designed to meet scientific standards and provide a baseline for what one can achieve with a reduced design. A low cost version is not planned, but we're interested in hearing whether someone is willing to contribute a cell-phone camera based design.
Can I hook it up to robot XY? - Almost certainly, as it already outputs the flow in m/s and in MAVLink format.