As many probably know, this year's Sparkfun Autonomous Vehicle Competition (AVC) is similar to last year's except that a new challenge has been added in which we can gain extra points by finding and popping 1m red balloons. It's pretty difficult to do (see failed attempts #1, #2, #3) but as you can see from the video above it's finally getting the balloons.
The setup is detailed on this page on the developer wiki (scroll down to the Red Balloon entry in the Example Projects section). It uses a Pixhawk running a modified version of AC3.2 (code here, AC3.2 testing thread here). The Pixhawk retains all the regular ArduCopter functionality, but in addition it accepts commands from an external navigation computer, an Odroid U3, when it is in the Guided flight mode or when it is in AUTO running a newly created MAVLink NAV_GUIDED mission command. The NAV_GUIDED command takes some special arguments including a timeout (in seconds), a minimum and maximum altitude, and a maximum distance; if the Odroid takes too long to find the balloon, or gives crazy commands that would cause the copter to fly away, the Pixhawk retakes control.
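To make that failsafe idea concrete, here is a minimal sketch of the kind of checks the Pixhawk could run while the Odroid is in control. This is not the actual AC3.2 implementation; the class name, field names, and thresholds are all my own illustrative choices.

```python
import math

class NavGuidedLimits:
    """Hypothetical container for the NAV_GUIDED limits described above.

    The real mission command packs these into MAVLink command parameters;
    these field names are purely illustrative.
    """
    def __init__(self, timeout_s, alt_min_m, alt_max_m, dist_max_m):
        self.timeout_s = timeout_s
        self.alt_min_m = alt_min_m
        self.alt_max_m = alt_max_m
        self.dist_max_m = dist_max_m

def should_retake_control(limits, start_time, start_pos, now, pos):
    """Return True if the Pixhawk should take control back from the Odroid.

    pos and start_pos are (north_m, east_m, alt_m) tuples relative to home.
    """
    # 1. The Odroid has had too long to find and pop the balloon.
    if now - start_time > limits.timeout_s:
        return True
    # 2. Altitude has left the allowed band.
    alt = pos[2]
    if alt < limits.alt_min_m or alt > limits.alt_max_m:
        return True
    # 3. The copter has strayed too far from where guided control began.
    dist = math.hypot(pos[0] - start_pos[0], pos[1] - start_pos[1])
    if dist > limits.dist_max_m:
        return True
    return False
```

In the real firmware checks like these run every loop; the sketch is just enough to show how a timeout, an altitude band, and a distance fence combine into a single "retake control" decision.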
While the Odroid is in control, it first requests that the Pixhawk rotate the copter slowly while it pulls images from the Logitech C920 webcam and uses OpenCV to search for blobs of red in the images. During the search it keeps track of the largest red blob it has seen, so after the copter has rotated 360 degrees it sends commands to point the vehicle in the direction of that blob, and then sends 3D velocity requests five times per second to guide the copter towards the top of the balloon. Leonard Hall helped me greatly in improving the control algorithm, both for the new velocity controller that was added to ArduCopter and with the strategy the Odroid should use to best get to the balloon.
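The real search uses OpenCV on camera frames, but the scan-then-point strategy can be sketched in plain Python. Everything here (function names, thresholds, the pixel format) is my own simplification, not the code from the repository.

```python
def find_largest_red_blob(image, r_min=150, gb_max=80):
    """Very simplified stand-in for the OpenCV red-blob search.

    image is a list of rows of (r, g, b) tuples. Returns (pixel_count,
    centroid_x, centroid_y) of the red pixels, or None if none are found.
    The thresholds are arbitrary illustrative values.
    """
    count, sum_x, sum_y = 0, 0, 0
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if r >= r_min and g <= gb_max and b <= gb_max:
                count += 1
                sum_x += x
                sum_y += y
    if count == 0:
        return None
    return count, sum_x / count, sum_y / count

def best_heading(frames_by_heading):
    """After a 360-degree scan, pick the heading whose frame contained
    the biggest red blob, mirroring the strategy described above."""
    best = None
    for heading_deg, image in frames_by_heading:
        blob = find_largest_red_blob(image)
        if blob and (best is None or blob[0] > best[1]):
            best = (heading_deg, blob[0])
    return None if best is None else best[0]
```

In practice OpenCV does the thresholding and blob extraction far more robustly (and in colour spaces better suited to lighting changes), but the "remember the largest blob, then turn back to it" logic is the same.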
All of the communication between the Pixhawk and the Odroid uses Kevin Hester's Drone API and Tridge's MAVProxy. They are relatively easy to use and allowed me to write all the code in Python. That code is all here.
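The 3D velocity requests mentioned above boil down to turning a direction-to-balloon estimate into a NED velocity vector. Here is a hedged sketch of that geometry; the function and its simplified trigonometry are mine, not the actual code, and in the real system the resulting vector is sent to the Pixhawk via the Drone API about five times per second.

```python
import math

def velocity_toward_target(bearing_deg, pitch_deg, speed_ms):
    """Turn a direction-to-balloon estimate into a NED velocity request.

    bearing_deg: compass bearing from copter to balloon (0 = north).
    pitch_deg:   angle above (+) or below (-) the horizon to the target.
    speed_ms:    desired closing speed.
    Returns (vel_north, vel_east, vel_down) in m/s. This is my own
    simplified geometry, not the exact code from the repository.
    """
    b = math.radians(bearing_deg)
    p = math.radians(pitch_deg)
    horiz = speed_ms * math.cos(p)
    return (horiz * math.cos(b),      # north component
            horiz * math.sin(b),      # east component
            -speed_ms * math.sin(p))  # down component (NED: down is positive)
```

The NED convention (down is positive) is worth noting: a balloon above the horizon produces a negative "down" velocity, i.e. a climb.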
A couple of extra things came out of this. I used Python's multiprocessing module to run the image capture and image processing in separate processes so they run in parallel. It seems that unless images are pulled from the camera at 15-30 Hz, they become extremely laggy; in one test I measured a 1.3 second lag in the images from the camera! With multiprocessing that is down to less than 0.2 seconds, and there are two more cores still free on the Odroid, so there is room for more sophisticated work to be done!
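The capture/process split can be sketched like this. It is a toy version: integers stand in for image frames, and the function names are my own, but the structure, one process grabbing frames into a queue while another consumes them, is the idea described above.

```python
import multiprocessing as mp

def capture(frame_queue, num_frames):
    """Stand-in for the camera-grab loop: in the real setup this pulls
    frames from the webcam as fast as possible so they never go stale."""
    for i in range(num_frames):
        frame_queue.put(i)      # a real frame would be an image array
    frame_queue.put(None)       # sentinel: capture finished

def process_frames(num_frames=10):
    """Run capture in its own process while this process 'analyses' the
    frames, mirroring the two-process split described above."""
    q = mp.Queue()
    grabber = mp.Process(target=capture, args=(q, num_frames))
    grabber.start()
    latest = None
    while True:
        frame = q.get()
        if frame is None:
            break
        latest = frame          # image recognition would happen here
    grabber.join()
    return latest
```

Keeping the grab loop free-running in its own process is what cuts the lag: the camera's buffer never backs up, so the processing side always works on a recent frame.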
For testing, at Tridge's suggestion, I set up a balloon simulator that works within SITL. It uses the simulated copter's position and attitude to create a simulated image of the horizon and the balloon (if it is within the field of view of the simulated camera). This fake image is then fed into the image-recognition Python code, making it an end-to-end test. This was all possible because both the Odroid and our simulator run on Ubuntu. Having the simulator allowed a lot of bench testing before any real flights happened.
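The core of such a simulator is projecting the balloon's position into the fake camera image. Here is a toy, yaw-only version of that geometry; the function, its linear angle-to-pixel mapping, and the default field of view are my own illustrative choices, not the actual simulator code.

```python
import math

def balloon_pixel(copter_pos, copter_yaw_deg, balloon_pos,
                  fov_deg=60.0, image_width=640):
    """Project the simulated balloon into a simulated camera image.

    A toy version of a SITL-style balloon simulator: yaw-only camera,
    horizontal plane only, linear pixel mapping. Positions are
    (north_m, east_m). Returns the balloon's x pixel column, or None
    if it is outside the horizontal field of view.
    """
    dn = balloon_pos[0] - copter_pos[0]
    de = balloon_pos[1] - copter_pos[1]
    bearing = math.degrees(math.atan2(de, dn))            # 0 = north
    # Wrap the bearing offset from the camera axis into [-180, 180).
    off = (bearing - copter_yaw_deg + 180.0) % 360.0 - 180.0
    half_fov = fov_deg / 2.0
    if abs(off) > half_fov:
        return None                                       # out of frame
    # Map the angular offset linearly onto the image columns.
    return int(round((off + half_fov) / fov_deg * (image_width - 1)))
```

Feeding images built this way into the recognition code gives an end-to-end test: if the detector reports the balloon where the projection put it, the whole search-and-point pipeline can be exercised on the bench.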
If you want to see a really impressive use of image processing in ArduPlane, check out Michael Darling's video here of one ArduPlane following another. Michael's set-up is similar in that he uses the same camera, an external nav computer (a BeagleBone Black), and OpenCV. I'm not sure what communication method he used between the BBB and the Pixhawk.
Also, I hope that over the next year we will enable more image-processing computers to work with ArduCopter (and ArduPlane and ArduRover), perhaps leading to the safe precision landings Jesper Andersen suggested in this blog post.
All feedback welcome!