We've now had a few days to get back into a normal routine after the excitement of the Outback Challenge last week, so I thought this was a good time to write up a report on how things went for the team I was part of, CanberraUAV. There have been plenty of posts already about the competition results, so I will concentrate on the technical details of our OBC entry: what went right and what went badly wrong.
For comparison, you might like to have a look at the debrief I wrote two years ago for the 2012 OBC. In 2012 there were far fewer teams, and nobody won the grand prize, although CanberraUAV came quite close. This year there were more than 3 times as many teams, and 4 teams completed the challenge, with the winner coming down to points. That means the Outback Challenge is well and truly "done", which reflects the huge advances in amateur UAV technology over the last few years.
The drama for CanberraUAV really started several weeks before the challenge. Our primary competition aircraft was the "Bushmaster", a custom built tail dragger with 50cc petrol motor designed by our chief pilot, Jack Pittar. Jack had designed the aircraft to have both plenty of room inside for whatever payload the rest of the team came up with, and to easily fit in the back of a station wagon. To achieve this he had designed it with a novel folding fuselage. Jack (along with several other team members) had put hundreds of hours into building the Bushmaster and had done a great job. It was beautifully laid out inside, and really was a purpose built Outback Challenge aircraft.
Just a couple of days after our D3 deliverable was sent in we had an unfortunate accident. A member of our local flying club was flying his CAP232 aerobatic plane at the same time we were doing a test mission, and he misjudged a loop. The CAP232 came down right through the rear of the Bushmaster fuselage, slicing it off. The Bushmaster hit the ground at 180 km/h, with predictable results.
Jack and Greg set to work on a huge effort to build a new Bushmaster, but we didn't manage to get it done in time. Luckily the OBC organisers allowed us to switch to our backup aircraft, a VQ Porter 2.7m RTF with some customisations. We had included the Porter in our D2 deliverable just in case it was needed, and already had plenty of autonomous hours up on it as we had been using it as a test aircraft for all our systems. It was the same basic style as the Bushmaster (a petrol powered tail dragger), but was a bit more cramped inside for fitting our equipment and used a slightly smaller engine (a DLE35).
Strategy
Our basic strategy this year was the same as in 2012. We would search at a relatively low altitude (100m AGL this year, compared to 90m AGL in 2012), with an interleaved "mow the lawn" pattern. This year we set up the search with 60% overlap compared with 50% last year, and we had a longer turn around area at the end of each search leg to ensure the aircraft got back on track fully before it entered the search area. With that search pattern and a target airspeed of 28m/s we expected to cover the whole search area in 20 minutes, then we would cover it again in the reverse direction (with a 50m offset) if we didn't find Joe on the first pass.
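For a feel of the numbers behind that 20 minute estimate, here is a back-of-envelope sketch. The across-track camera footprint is an assumed example value; only the airspeed and overlap figures are what we actually flew.

airspeed = 28.0      # m/s target airspeed (as flown)
overlap = 0.60       # fraction of footprint shared between adjacent legs (as flown)
footprint = 200.0    # metres of ground covered across-track at 100m AGL (assumed example)

leg_spacing = footprint * (1.0 - overlap)    # 80m between adjacent search legs
track_per_pass = 20 * 60 * airspeed          # 33.6 km of track flown in a 20 minute pass
print("leg spacing %.0fm, %.1fkm of track per pass" % (leg_spacing, track_per_pass / 1000.0))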
As with 2012 we used an on-board computer to autonomously find Joe. This year we used an Odroid-XU, which is a quad core system running at 1.6GHz. That gave us a lot more CPU power than in 2012 (when we used a pandaboard), which allowed us to use more CPU intensive image recognition code. We did the first level histogram scanning at the full camera resolution this year (1280x960), whereas in 2012 we had run the first level scan at 640x480 to save CPU. That is why we were happy to fly a bit higher this year.
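To give a feel for what a first-level scan of that kind looks like, here is a minimal sketch using OpenCV: score each block of the frame by how much its colour histogram differs from the whole frame, and only pass the unusual blocks on to the more expensive second stage. The block size, histogram bins and threshold are illustrative; this is not the actual cuav algorithm.

import cv2

def first_level_scan(frame, block=32, threshold=0.5):
    # score each block by how different its hue/saturation histogram is
    # from the whole frame; return the blocks worth a closer look
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    ref = cv2.calcHist([hsv], [0, 1], None, [16, 16], [0, 180, 0, 256])
    cv2.normalize(ref, ref)
    hits = []
    height, width = hsv.shape[:2]
    for y in range(0, height - block, block):
        for x in range(0, width - block, block):
            roi = hsv[y:y+block, x:x+block]
            hist = cv2.calcHist([roi], [0, 1], None, [16, 16], [0, 180, 0, 256])
            cv2.normalize(hist, hist)
            # Bhattacharyya distance: 0 means identical, 1 means very different
            score = cv2.compareHist(ref, hist, cv2.HISTCMP_BHATTACHARYYA)
            if score > threshold:
                hits.append((x, y, score))
    return hits

# e.g. hits = first_level_scan(cv2.imread('frame0001.png'))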
While the basic approach to image recognition was the same, we had improved the details of the implementation a lot in the last two years, with hundreds of small changes to the image scoring, communications and user interface. Using our 2012 image data as a guide (along with numerous test flights at our local flying field) we had refined the cuav code to provide much better object discrimination, and also to cope better with communication failures. We were pretty confident it could find Joe very reliably.
The takeoff
When the running order was drawn from a hat we were the 2nd last team on the list, so we ended up flying on the Friday. We'd been up early each morning anyway in case the order was changed (which does happen sometimes), and we'd actually been called out to the airfield on the Thursday afternoon, but didn't end up flying then due to high wind.
Our time to takeoff finally came just before 8am Friday morning. As with 2012 we did an auto takeoff, using the new tail-dragger takeoff code added to APM:Plane just a couple of months before.
Unfortunately the takeoff did not go as planned. In fact, we were darn lucky the plane got into the air at all! As soon as Jack flicked the switch on the transmitter to put the plane into AUTO it started veering left on the runway, and nearly scraped a wing as it limped its way into the air. A couple of seconds later it came quite close to the tents where the OBC organisers and our GCS were located.
It did get off the ground though, and missed the tents while it climbed, finally switching to normal navigation to the 2nd waypoint once it got to an altitude of 40m. Jack had his finger on the transmitter switch that would have given him manual control, and very nearly aborted the takeoff. This would have to go down as one of the worst takeoffs in OBC history.
So why did it go so badly wrong? My initial analysis when I looked at the logs later was that the wind had pushed the plane sideways. After examining the logs more carefully though I discovered that while the wind did play a part, the biggest issue was the compass. For some reason the compass offsets we had loaded were quite different from the ones we had been testing with in all our flights in Canberra. I still don't know why the offsets changed, although the fact that we had compass learning enabled almost certainly played a part. We'd done dozens of auto takeoffs in Canberra with no issues, with the plane tracking beautifully down the center line of the runway. To have it go so badly wrong for the flight that mattered was a disappointment.
I've decided that the best way to fix the issue for future flights is to make the auto takeoff code completely independent of the compass. Normally the compass is needed to get an initial yaw for the aircraft while it is not moving (as our hobby-grade GPS sensors can't give yaw when not moving), and that initial yaw is used to set the ground heading for the takeoff. That means that with the current code, any compass error at the time you change into AUTO will directly impact the takeoff.
Because takeoff is quite a short process (usually 20s or less), we can take an alternative approach. The gyros won't drift much in 20s, so what we can do is just keep a constant gyro heading until the aircraft is moving fast enough to guarantee a good GPS ground track. At that point we can add whatever gyro based yaw error has built up to the GPS ground course and use that as our navigation heading for the rest of the takeoff. That should make us completely independent of compass for takeoff, which should solve the problem for everyone, rather than just fixing it for our aircraft. I'm working on a set of patches to implement this, and expect it to be in the next release.
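To make the idea concrete, here is a conceptual sketch of the two-stage heading source. The real change is a set of C++ patches to APM:Plane; the names, speed threshold and sign conventions below are illustrative only.

GPS_MIN_SPEED = 5.0   # m/s of ground speed before the GPS course is trusted (assumed value)

class TakeoffCourse:
    def __init__(self, gyro_yaw_at_auto):
        # stage 1: hold whatever heading the gyros reported when AUTO was engaged
        self.initial_yaw = gyro_yaw_at_auto
        self.course = gyro_yaw_at_auto % 360.0
        self.locked = False

    def update(self, gyro_yaw, gps_speed, gps_course):
        # returns the ground course (degrees) to steer during the takeoff
        if not self.locked and gps_speed >= GPS_MIN_SPEED:
            # stage 2: the GPS ground course is now reliable, so add back the
            # yaw change the gyros have seen since AUTO was engaged and use
            # that as the navigation heading for the rest of the takeoff
            drift = self.initial_yaw - gyro_yaw
            self.course = (gps_course + drift) % 360.0
            self.locked = True
        return self.course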
Note that once the initial takeoff is complete the compass plays almost no role in the flight of a fixed wing plane if you have the EKF enabled, unless you lose GPS lock. The EKF rejected our compass as being inconsistent, and happily got correct yaw from the fusion of GPS velocity and other sensors for the rest of the flight.
The search
After the takeoff things went much more smoothly. The plane headed off to the search area as planned, and tracked the mission extremely well. It is interesting to compare the navigation accuracy of this year's mission with the 2012 mission. In 2012 we were using a simple vector field navigation algorithm, whereas we now use the L1 navigation code. This year we also used Paul's EKF for attitude and position estimation, and the TECS controller for speed/height control. The differences are really remarkable. We were quite pleased with how our Mugin flew in 2012, but this year it was massively better. The tracking along each leg of the search was right down the line, despite the 15 knot winds.
Another big difference from 2012 is that we were using the new terrain following capability that we had added this year. In 2012 we used a python script to generate our waypoints, and that script automatically added intermediate waypoints to follow the terrain of the search area. This year we just set all waypoints to 100 meters AGL and let the autopilot do its job. That made things a lot simpler and also resulted in better geo-referencing and search area coverage.
On our GCS imaging display we had 4 main windows up. One is a "live" view from the camera. That is set up to only update if there is plenty of spare bandwidth on our radio links, and is really just there to give us something to watch while the imaging code does its job, plus to give us some situational awareness of what the aircraft is tracking over.
The second window is the map, which shows the mission flight path, the geo-fence, two plane icons (AHRS position estimate and GPS position estimate) plus overlays of thumbnails from the image recognition system of any "interesting" objects the Odroid has found.
The 3rd window is the "Mosaic" window. That shows a grid of thumbnails from the image recognition system, and has menus and hot-keys to allow us to control the image recognition process and sort the thumbnails in various ways. We expected Joe to have an image score of over 1000, but we set the threshold for thumbnails to display in the mosaic at 500. Setting a lower threshold means we get shown a lot of not very interesting thumbnails (things like oddly shaped tree stumps and patches of dirt that are a bit out of the ordinary), but at least it means we can see the system is working, and it stops the search being quite so boring.
The final window is a still image window. That is filled with a full resolution image of whatever image we have selected for download from the aircraft, allowing us to get some context around a thumbnail if we want it. Often it is much easier to work out what an object is if you can see the surroundings.
On top of those imaging windows we also have the usual set of flight control windows. We had 3 laptops setup in our GCS tent, along with a 40" LCD TV for one of the laptops (my thinkpad). Stephen ran the flight monitoring on his laptop, and Matt helped with the imaging and the antenna tracker from his laptop. Splitting the task up in this way helped prevent overload of any one person, which made the whole experience more manageable.
Stephen's laptop had the main MAVProxy console showing the status of the key parts of the autopilot, plus graphs showing the consistency of the various subsystems. The ardupilot code on Pixhawk has a lot of redundancy at both the sensor level and the algorithm level, and it is useful to plot graphs showing if we get any significant discrepancies, giving us a warning of a potential failure. For this flight everything went smoothly (apart from the takeoff) but it is nice to have enough screens to show this sort of information anyway. It also makes the whole setup look a bit more professional to have a few fancy graphs going :-)
About 20 minutes after takeoff the imaging system spotted Joe lying in his favourite position in a peanut field. We had covered nearly all of the first pass across the search area so we were glad to finally spot him. The images were very clear, so we didn't need to spend any time discussing if it really was Joe or not.
We clicked the "Show Position" menu item we'd added to the MAVProxy map, which pops up a window giving the position in various coordinate systems. Greg passed this to the judges who quickly confirmed it was within the required 100 meters of Joe.
The bottle drop
That initial position was only a rough estimate though. To refine the position we used a new trick we'd added to MAVProxy. The "wp movemulti" command allowed us to move and rotate a pre-defined confirmation flight pattern over the top of Joe. That set up the plane to fly a butterfly pattern over Joe at an altitude of 80 meters, passing over him with wings level. That gives an optimal set of images for our image recognition system to work with, and the tight turns allow the wind estimation algorithm in ardupilot to get an accurate idea of wind direction and speed.
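The geometry behind moving and rotating a pattern like that is straightforward: shift the waypoints so the pattern's reference point sits on Joe, then rotate them about that point. Here is a minimal flat-earth sketch of the idea (MAVProxy's own implementation differs in the details):

import math

METRES_PER_DEG = 1.113195e5   # metres per degree of latitude

def move_and_rotate(waypoints, centre, target, rotation_deg):
    # translate (lat, lon) waypoints so `centre` lands on `target`,
    # then rotate them about the target by rotation_deg (compass sense).
    # Flat-earth approximation - fine over a few hundred metres.
    clat, clon = centre
    tlat, tlon = target
    cos_lat = math.cos(math.radians(tlat))
    theta = math.radians(rotation_deg)
    out = []
    for lat, lon in waypoints:
        dn = (lat - clat) * METRES_PER_DEG               # north offset from pattern centre
        de = (lon - clon) * METRES_PER_DEG * cos_lat     # east offset from pattern centre
        rn = dn * math.cos(theta) - de * math.sin(theta)
        re = dn * math.sin(theta) + de * math.cos(theta)
        out.append((tlat + rn / METRES_PER_DEG,
                    tlon + re / (METRES_PER_DEG * cos_lat)))
    return out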
In the weeks leading up to the competition we had spent a lot of time refining our bottle drop system. We realised the key to a good drop is timing consistency and accurate wind drift estimation.
To improve the timing consistency we had changed our bottle release mechanism to be in two stages. The bottle was held to the bottom of the Porter by two servos. One servo held the main weight of the bottle inside a plywood harness, while the second servo was attached to the top of the parachute by a wire, using a glider release mechanism.
The idea is that when approaching Joe for the drop the main servo releases first, which leaves the bottle hanging beneath the plane held by the glider release servo with the parachute unfurled. The wind drags the bottle at an angle behind the plane for 2 seconds before the 2nd servo is released. The result is much more consistent timing of the release, as there is no uncertainty in how long the bottle tumbles before the chute unfolds.
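As an illustration of the release sequence, this is roughly what the two stages look like when driven over MAVLink with pymavlink. The servo channels and PWM value are made-up examples rather than our actual configuration, and on the day the release was triggered as part of the mission rather than by a standalone script.

import time
from pymavlink import mavutil

MAIN_SERVO = 6        # holds the bottle in the plywood harness (assumed channel)
RELEASE_SERVO = 7     # glider release holding the top of the parachute (assumed channel)
OPEN_PWM = 1900       # assumed "released" servo position

def set_servo(mav, channel, pwm):
    mav.mav.command_long_send(mav.target_system, mav.target_component,
                              mavutil.mavlink.MAV_CMD_DO_SET_SERVO,
                              0, channel, pwm, 0, 0, 0, 0, 0)

mav = mavutil.mavlink_connection('udpin:127.0.0.1:14550')
mav.wait_heartbeat()
set_servo(mav, MAIN_SERVO, OPEN_PWM)     # stage 1: bottle hangs below the plane, chute unfurls
time.sleep(2)                            # let the bottle trail steadily behind the aircraft
set_servo(mav, RELEASE_SERVO, OPEN_PWM)  # stage 2: glider release lets the bottle go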
The second part of the bottle drop problem is good wind drift estimation. We had decided to use a small parachute as we were not certain the bottle would withstand the impact with the ground without a chute. That meant significant wind drift, which meant we really had to know the wind direction and speed quite accurately, and also needed some data on how fast the bottle would drift in the wind.
In the weeks before the competition we did a lot of bottle drops, but the really key one was a drop just a week before we left for Kingaroy. That was the first drop in completely still conditions, which meant it finally gave us a "zero wind" data point, which was important in calculating the wind drift. Combining that drop with some previous drop results we came up with a very simple formula giving the wind drift distance for a drop from 80 meters as 5 times the wind speed in meters/second, as long as we dropped directly into the wind.
We had added a new feature to APM:Plane to allow an exact "acceptance radius" for an individual waypoint to be set, overriding the global WP_RADIUS parameter. So once we had the wind speed and direction it was just a matter of asking MAVProxy to rotate the drop mission waypoints to match the wind (which was coming from -121 degrees) and to set the acceptance radius for the drop to 35 meters (70 meters for zero wind, minus 5 times the 7m/s wind speed the EKF had estimated).
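In code form the arithmetic is as simple as it sounds; the 70 meter zero-wind radius and the factor of 5 come straight from the drop testing described above.

def drop_acceptance_radius(wind_speed, zero_wind_radius=70.0, drift_per_mps=5.0):
    # acceptance radius in metres for a drop from 80m, released straight into the wind
    return zero_wind_radius - drift_per_mps * wind_speed

print(drop_acceptance_radius(7.0))   # 35.0 metres, the value we used on the day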
The plane then slowed to 20m/s for the drop, and did a 350m approach to ensure the wings were nice and level at the drop point. As the drop happened we could see the parachute unfurl in the camera view from the aircraft, which was a nice confirmation that the bottle had been successfully dropped.
The mission was set up for the aircraft to go back to the butterfly confirmation pattern after the drop. We had done that to allow us to see where the bottle had actually landed relative to Joe, in case we wanted to do a 2nd drop. We had 3 bottles ready (one on the plane, two back at the airfield), and were ready to fit a new bottle and adjust the drop point if we'd been too far off.
As soon as we did the confirmation pass it became clear we didn't need to drop a 2nd bottle. We measured the distance of the bottle to Joe as under 3 meters using the imaging system (the judges measured it officially as 2.6m), so we asked the plane to come back and land.
The landing
This year we used a laser rangefinder (an SF/02 from LightWare) to help with the landing. Using a rangefinder really helps ensure the flare happens at the right altitude and produces much more consistent landings.
The only real drama we had with the landing was that we came in a bit fast, and the plane ballooned more than it should have in the flare. The issue was that we hadn't used a shallow enough approach. Combined with the quite strong (14 knot) cross-wind it was an "interesting" landing.
We also should have set the landing point a bit closer to the end of the runway. We had put it quite a way along the runway as we weren't sure if the laser rangefinder would pick up anything strange as it crossed over the road and airport boundary fence, but in hindsight we'd put the touchdown point a bit too close to the geofence. Full size glider operations running at the same time meant only part of the runway was available for OBC teams to use.
The landing was entirely successful, and was probably better than a manual landing would have been by me in the same wind conditions (I'm only a mediocre pilot, and landings are my weakest point), but we certainly can do better. Paul Riseborough is already looking at ways to improve the autoland, hopefully to the point where it earns a round of applause from spectators at future landings.
Radio performance
Another part of the mission that is worth looking at is the radio performance. We had two radio links to the aircraft. One was an RFD900 900MHz radio, and that performed absolutely flawlessly as usual. We had around 40 dB of fade margin at a range of over 6km, which is absolutely huge. Every team that flew in the OBC this year used an RFD900, which is a great credit to Seppo and the team at RFDesign.
The 2nd radio was a Ubiquiti Rocket M5, which is a 5.8GHz ethernet bridge. We used an active antenna tracker this year for the 5.8GHz link, with a 28dBi MIMO antenna on the ground, and a 10dBi MIMO omni antenna in the aircraft (the protrusion from the top of the fuselage is for the antenna). The 5.8GHz link gave us lots of bandwidth for the images, but was not nearly as reliable as the RFD900 link. It dropped out 6 times over the course of the mission, with the longest dropout lasting just over a minute. The dropouts were primarily caused by poor magnetometer calibration on the antenna tracker - during the mission we had to add some manual trim to the tracker to improve the alignment. That worked, but really we should have used a ground antenna with a bit less gain (maybe around 24dBi) to give us a wider beam width.
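The tracker's job is simple in principle: compute the azimuth and elevation from the ground antenna to the aircraft's reported position and point the dish there. The sketch below shows that calculation in a flat-earth approximation; the point is that with a high gain dish even a few degrees of heading error from a badly calibrated magnetometer is enough to drop the link.

import math

METRES_PER_DEG = 1.113195e5

def tracker_az_el(gcs, aircraft):
    # gcs and aircraft are (lat, lon, alt) tuples; returns (azimuth, elevation) in degrees.
    # Flat-earth approximation, which is plenty for antenna pointing at these ranges.
    dn = (aircraft[0] - gcs[0]) * METRES_PER_DEG
    de = (aircraft[1] - gcs[1]) * METRES_PER_DEG * math.cos(math.radians(gcs[0]))
    dalt = aircraft[2] - gcs[2]
    ground_range = math.hypot(dn, de)
    azimuth = math.degrees(math.atan2(de, dn)) % 360.0
    elevation = math.degrees(math.atan2(dalt, ground_range))
    return azimuth, elevation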
Another alternative would have been to use a lower frequency. The 5.8GHz Rocket gives fantastic bandwidth, but we don't really need that much bandwidth for our system. The Robota team used 2.4GHz Ubiquiti radios and much simpler antennas and ended up with a much better link than we had. The difference in path loss between 2.4GHz and 5.8GHz is quite significant.
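It is easy to put a number on that difference. Free-space path loss scales with frequency, so at the same range 5.8GHz costs roughly 7.7dB more than 2.4GHz before you even consider the antennas:

import math

def fspl_db(freq_hz, dist_m):
    # free-space path loss in dB
    c = 299792458.0
    return 20.0 * math.log10(4.0 * math.pi * dist_m * freq_hz / c)

dist = 6000.0   # roughly the sort of range we saw during the mission
print(fspl_db(5.8e9, dist) - fspl_db(2.4e9, dist))   # about 7.7 dB extra loss at 5.8GHz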
The reason we didn't use the 2.4GHz gear is that we do most of our testing at a local MAAA flying club, and we know that if someone crashes their expensive model while we have a powerful 2.4GHz radio running then there will always be the thought that our radio may have caused interference with their 2.4GHz RC link.
So we're now looking into the possibility of using a 900MHz ethernet bridge. The Ubiquiti M900 looks like a real possibility. It doesn't offer nearly as much bandwidth as the 5.8GHz or 2.4GHz radios, as Australia only has 13MHz of spectrum available in the 900MHz band for ISM use, but that should still be enough for our application. We have heard that the spread spectrum M900 doesn't significantly interfere with the RFD900 in the same band (as the RFD900 is a TDM frequency hopping radio), but we have yet to test that theory.
Alternatively we may use two RFD900s in the same aircraft, with different numbers of hopping channels and different air data rates to minimise interference. One would be dedicated to telemetry and the other to image data. An RFD900 at 128kbps should be plenty for our cuav imaging system as long as the "live camera view" window is set to quite a low resolution and update rate.
Team cooperation
One of the most notable things about this years competition was just how friendly the discussions between the teams were. The competition has a great spirit of cooperation and it really is a fantastic experience to work closely with so many UAV developers from all over the world.
I don't really have time to go through all the teams that attended, but I do want to mention some of the highlights for me. Top of the list would have to be meeting Ben and Daniel Dyer from team SFWA. They put in an absolutely incredible effort to build their own autopilot from scratch. Their build log at http://au.tono.my/log/index.html is incredible to read, and shows just what can be achieved in a short time with enough talent. It was fantastic that they completed the challenge (the first team to ever do so) and I look forward to seeing how they take their system forward.
I'd also like to offer a tip of the hat to Team Swiss Fang. They used the PX4 native stack on a Pixhawk and it was fantastic to see how far they pushed that autopilot stack in the lead up to the competition. That is one of the things that competitions like the OBC can do for an autopilot - push it to much higher levels.
Team OpenUAS also deserves a mention, and I was especially pleased to meet Christophe, who is one of the key people behind the Paparazzi autopilot. Paparazzi is a real pioneer in the field of amateur autopilots. Many of the posts we make on "ardupilot has just achieved X" on diydrones could reasonably be responded to by saying "Paparazzi did that 3 years ago". The OpenUAS team had bad luck in both the 2012 competition and again this year. This time round it was an airspeed sensor failure that led to a crash soon after takeoff, which is really tragic given the effort they have put in and the pedigree of their autopilot stack.
The Robota team also did very well, coming in second behind our team. Particularly impressive was the performance of the Goose autopilot on quite a small foam plane in the wind over Kingaroy. The automatic landing was fantastic. Robota took a much simpler approach to the search, using a 2.4GHz Ubiquiti link to send a digital video stream to 3 video monitors and having 3 people staring at those screens to find Joe. Extremely simple, but it worked. They were let down a bit by the drop accuracy in the wind, but it was a great effort done with style.
I was absolutely delighted when Team Thunder, who were also running APM:Plane, completed the challenge, coming in 4th place. They built their system partly on the image recognition code we had released, which is exactly what we hoped would happen. We want to see UAV developers building on each others work to make better and better systems, so having Team Thunder complete the mission was great.
Overall ardupilot really shone in the competition. Over half the teams that flew in the competition were running ardupilot. Our community has shown that we can put together systems that compete with the best in the world. We've come a long way in the last few years and I'm sure there is a bright future for more developments in the UAV S&R space that will see ardupilot saving lives on a regular basis.
Thank you
On behalf of CanberraUAV I'd like to offer a huge thank you to the OBC organisers for the massive effort they have put in over so many years to run the competition. Back in 2007 when the competition started it took real insight for Rod Walker and Jon Roberts to see that a competition of this nature could push amateur UAV technology ahead so much, and it took a massive amount of perseverance to keep it going to the point that teams were able to finally complete the competition. The OBC has had a huge impact on amateur UAV technology.
We'd also like to thank our sponsors, 3DRobotics, who have been a huge help for CanberraUAV. We really couldn't have done it without you. Working with 3DR on this sort of technology is a great pleasure.
Next steps
Completing the Outback Challenge isn't the end for CanberraUAV and we are already starting discussions on what we want to do next. I've posted some ideas on our mailing list and we would welcome suggestions from anyone who wants to take part. We've come a long way, but we're not yet at the point where putting together an effective S&R aircraft is easy.
Comments
Really appreciate this summary and the tremendous work everybody put into hosting and participating in this competition.
It has been a driving force behind a lot of the evolution that has taken place over the past few years that has been of considerable benefit to us all.
I truly hope a next generation competition can be put together with clear, but challenging and achievable goals.
As far as Search and rescue is concerned this was an excellent first step, for the next one it will start to be real.
And you guys have already developed working foundations of some of the most important technologies.
Best To You All and thank you for all your effort.
Gary
Amazing, thanks for sharing the details with us :) and for representing us (APM users) so well ;)
Who knew the 3DR Radio had evolved into the RFD900, or that there was a 5.8GHz ethernet bridge? Feeds & speeds continue to evolve in a parallel universe, while the news focuses on wearable dingbats & 3D printers.
Well done sir, nice achievement for the CUAV team! Thanks for sharing the events of the day!
Excellent Tridge. About the TO. I don't trust any auto calibration of a raw sensor in the field, ie flying. Excellent work on the drop.
Legend
Well done, fellas - thanks for the write-up, Tridge.
@Mark, yes we improved the geo-referencing accuracy a lot, mostly by simplifying the handling of the timing code. There are several different clocks involved (on the GPS, the autopilot, the Odroid and the camera), and all of these time sources are not quite aligned with each other. We had some overly complex code to try to handle that in 2012, which we simplified a lot for this year. That resulted in much better geo-referencing. It also helps that the EKF gives better attitude estimates, which helps in projecting the image onto the ground.
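For readers who haven't seen this done before, geo-referencing here means projecting a pixel in a captured frame onto a ground position using the aircraft position and attitude at the moment of capture, which is why clock alignment matters so much. A heavily simplified flat-terrain sketch of the idea follows; the 90 degree field of view is an assumed example, and the real cuav code handles lens distortion and the full 3D rotation properly.

import math

METRES_PER_DEG = 1.113195e5

def pixel_to_ground(lat, lon, agl, roll, pitch, yaw, px, py,
                    img_w=1280, img_h=960, fov_h=90.0):
    # project one pixel from a downward-looking camera onto flat ground.
    # Angles in degrees, agl in metres; small-angle treatment of attitude,
    # illustrative only.
    fov_v = fov_h * img_h / img_w
    ang_x = math.radians((px - img_w / 2.0) / img_w * fov_h)
    ang_y = math.radians((py - img_h / 2.0) / img_h * fov_v)
    # body-frame ground offsets, tilted by the aircraft attitude
    forward = agl * math.tan(ang_y + math.radians(pitch))
    right = agl * math.tan(ang_x + math.radians(roll))
    # rotate by yaw into north/east offsets
    t = math.radians(yaw)
    north = forward * math.cos(t) - right * math.sin(t)
    east = forward * math.sin(t) + right * math.cos(t)
    return (lat + north / METRES_PER_DEG,
            lon + east / (METRES_PER_DEG * math.cos(math.radians(lat))))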