Interview: Andrew Bugera on the 2008 AUVSI Student UAV contest

(Left to Right) Don Hatch, Tim Herd
Performing an engine check before the competition practice flight.
Location: Webster Field


We've written a bit about this year's AUVSI student UAV contest, about which there isn't much information on the web. One of the participants, Andrew Bugera, who chairs the University of Manitoba UAV group (which won $1,800 for Honorable Mention in Mission Performance and a Prize Barrel for automatic takeoff), kindly wrote in to offer to explain more about the contest from a team's perspective.

Here's my Q&A with Andrew:

Q: Was your UAV essentially the one you used last year and described in this paper? What was the overall budget for your UAV?

The aircraft itself was the same as the one flown in 2007. We bought a newer autopilot revision (MicroPilot MP2128g) in order to use certain camera stabilization and tracking features. We changed our entire imaging payload this year in response to the competition rules (see question 4).

While it's difficult to pin down the exact cost of the aircraft and other components, I would estimate that we spent $3,000-$5,000 to build the plane and outfit it with an autopilot and other items. Travel costs make up the remainder of our project budget. In addition to the monetary costs of the project, we also had to dedicate a lot of time.

Q: Last year's UAV used an off-the-shelf autopilot, but it appears that the team did most of its custom work on image processing and transmission, using an on-board computer. Was that what you focused on this year, too?


Basically, yes. Because the autopilot has worked well for us in the past (especially in 2006 when we were awarded Best Flight for mission performance and First Place overall), we were more concerned with implementing a good imaging system. In 2006, we used a still camera and pressed the shutter release button with a servo running on a timer. In 2007, we attempted to interface directly with the camera over its USB interface using gPhoto. Unfortunately, with only one software developer on the team, there wasn't enough time to test the system before the competition. An unknown error resulted in no photos being captured during our competition flight.

This year, we were again focused on the imaging side (with some time spent dealing with the new autopilot features we were trying to use) but in a different way.

Q: What was your contest strategy this year? How did you design your UAV for that?

One of the aspects the judges added to the competition starting in 2007 was the opportunity for us to provide "live" (during flight) target location information if possible. This year we decided to switch to using two fixed-lens analog video cameras: one wide-angle and the other telephoto. The idea was to get a wider view of the field and pick out a target location with the wide camera and then aim the telephoto camera with a gimbal at that potential target to get a better look. These video streams were sent to the ground station for our imaging operator.

Q: What's the hardest part of the contest? The easiest?

The "easiest" part of the competition (or at least the one we sometimes take for granted) is the autonomous navigation aspect. Although this is mostly a solved problem and we can use off-the-shelf autopilots to control the aircraft, sometimes the competition prompts us to do things a little bit differently than the manufacturers intended. In general, using off-the-shelf components makes development easier but you still need to make sure they are suitable for your purpose.

On the technical side, managing the interactions between many (sometimes poorly-documented) systems in order to create your desired product is probably the most difficult part. The way I've tried to deal with that over the past two years is to recruit team members from multiple disciplines (Computer, Electrical, and Mechanical Engineering and Computer Science). With this varied knowledge at our disposal, making decisions becomes much easier.

An additional challenge that this competition introduces is transportation. We need to be able to disassemble our plane and bring it with us as airline luggage. The overall trip usually takes 12-14 hours from Winnipeg to Lexington Park, so we're usually pretty tired when we finally get to our hotel.

Q: Can you tell us how your team did in the contest? What worked? What didn't?

We haven't received the official scores yet. From the prize money awarded, we didn't do as well as we have in the past.

The aircraft was stable and had no aerodynamic problems. Because we've been using this aircraft for the last two years, it is fairly well-tested and we are comfortable with its performance. Unfortunately, a small problem at the competition led to bigger issues.

As one of the safety requirements of the competition, if an aircraft can no longer detect the signal from either the RC safety pilot (required to be standing by in case of autopilot failure or other problem) or from the ground control station, it must perform a series of actions. Our particular implementation waits 30 seconds and then flies toward the takeoff location. After an additional 3 minutes without signal, the aircraft is required to cut the throttle and fully deflect all control surfaces to effect a "minimum energy descent".
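The lost-link procedure described above is essentially a small timer-driven state machine. Here is a minimal sketch of that logic, using the timings from the interview; the names (`LinkState`, `failsafe_state`, the threshold constants) are illustrative and not from the team's actual implementation:

```python
import enum


class LinkState(enum.Enum):
    NORMAL = "normal"            # signal present, fly the mission
    RETURN_HOME = "return_home"  # after 30 s without signal
    TERMINATE = "terminate"      # after a further 3 min without signal

# Timings taken from the interview; identifiers here are hypothetical.
RETURN_DELAY_S = 30
TERMINATE_DELAY_S = RETURN_DELAY_S + 3 * 60


def failsafe_state(seconds_without_signal: float) -> LinkState:
    """Map time since the last RC/ground-station packet to a failsafe state."""
    if seconds_without_signal < RETURN_DELAY_S:
        return LinkState.NORMAL
    if seconds_without_signal < TERMINATE_DELAY_S:
        # Fly toward the takeoff location while waiting for the link to return.
        return LinkState.RETURN_HOME
    # Cut throttle and fully deflect all control surfaces:
    # the "minimum energy descent" flight termination.
    return LinkState.TERMINATE
```

In a real autopilot this check would run every control cycle, with the counter resetting to zero whenever a valid packet arrives; a momentary dropout, as the team experienced, starts the 30-second clock ticking.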

At one point during the competition, we momentarily lost the connection to the ground station and this failure routine activated. Rather than take the risk of a "flight termination", we elected to transfer control to our safety pilot. This almost certainly resulted in a loss of points because we were no longer performing the mission autonomously.

On the positive side, our imaging system was still running through most of the flight. We were able to see a few targets in the incoming video streams and provide the judges with at least minimal information.


(Left to Right) Tim Herd, Andrew Bugera, Shunjie Lau, Mo Ran Wang
(Not pictured: Andrew Oliver, Ashley Keep, Don Hatch, Rashed Minhaz)
Reassembling the UAV after returning from the competition.
Location: University of Manitoba



Q: What tips did you pick up from the winning teams?


Unfortunately, I didn't have as much time this year to chat with the other teams as I have in the past. We did have some interesting conversations with our friends from the University of Texas at Arlington regarding potential solutions to our radio problem. We also got to see two teams using the open source Paparazzi autopilot: Utah State University and Université de Sherbrooke.

Once the 2008 journal papers are published, we'll all get to take a closer look at what the other teams designed and see if any of their ideas inspire some changes for next year. One of the best parts of this competition is the opportunity to talk to very smart people from around the world.

Q: What will you do differently next year?

Practice, practice, practice! :)
The weather this year was not cooperative for flight testing. A long winter and a rainy spring ate up many of the days the team had available to run test flights, and testing subsystems before the competition is very important.

One of the other comments that the judges made this year was that the use of checklists is a very good idea. As in a manned aircraft, they suggested we include checklists for failure conditions as well. Knowing what to do in advance *when* something goes wrong during your flight would be a positive thing.

Other than that, we'll take a look at the updated rules when they are published and decide what direction to go for the next competition.
