Georgia Tech Aerial Robotics (GTAR) wins American Helicopter Society (AHS) Autonomous MAV Challenge

The Georgia Tech Aerial Robotics Club is a team of undergraduate and graduate students that competes in aerial vehicle competitions each year. The MAV Challenge was held last week in Virginia Beach.

The mission consisted of taking off from a helipad and searching a known search area (20 ft square) for a target. Once the target is located, the vehicle must hover over it for approximately thirty seconds, then transition back to the helipad and land. Our team was able to complete the mission fully autonomously. The arena measures 100 ft x 35 ft, and the vehicle is limited to 500 grams and 1.5 ft in any dimension.

The vehicle was built around a powerful computer and sized with the Georgia Tech Electric UAV Flight Time Calculator. This helped us increase our flight time from 2-3 minutes (last year's vehicle) to 8-10 minutes (this year's). The tool also helped us determine the right propeller and battery sizes to use.
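For a rough sense of what that kind of sizing calculation does, here is a minimal back-of-the-envelope sketch. The numbers and the function are purely illustrative, not the team's actual tool or vehicle figures; the real calculator accounts for far more (motor/prop efficiency curves, payload, voltage sag, etc.):

```python
def hover_time_minutes(capacity_mah, avg_current_a, usable_fraction=0.8):
    """Rough endurance estimate: usable battery capacity / average current draw.

    capacity_mah    : battery capacity in mAh
    avg_current_a   : average current draw at hover, in amps
    usable_fraction : fraction of capacity you allow yourself to use
                      (LiPo packs should not be fully drained)
    """
    usable_ah = (capacity_mah / 1000.0) * usable_fraction
    return (usable_ah / avg_current_a) * 60.0

# Illustrative numbers only: an 800 mAh pack at ~5 A average hover draw
print(round(hover_time_minutes(800, 5.0), 1))  # → 7.7
```

Even this crude estimate shows why battery and prop choice dominate endurance: halving the hover current doubles flight time.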

We use an APM 2.6 board for sensor data. We wrote code to send all the sensor information (very easily, I might add, thanks to all the contributors) to our computer, which processes it and sends motor commands back to the APM. The only time the APM does any onboard processing is when the vehicle is in manual mode (stabilize mode). We are very happy with the APM, and integration time was very quick. In the future we might switch to a Pixhawk board, but we will have to see whether the performance increase is worth the extra cost (maybe 3DR could sponsor our team :) ).
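The companion-computer side of that split can be sketched as a simple framed-packet codec. Everything below is hypothetical: the sync byte, field order, and checksum are made up for illustration and are not the team's actual modified-firmware protocol; real serial port handling is also omitted.

```python
import struct

HDR = 0xA5  # hypothetical sync byte marking the start of a frame


def pack_imu(accel, gyro):
    """Frame one IMU sample (3 accel + 3 gyro float32s) with a 1-byte checksum."""
    payload = struct.pack("<6f", *accel, *gyro)
    csum = sum(payload) % 256
    return bytes([HDR]) + payload + bytes([csum])


def parse_imu(frame):
    """Return (accel, gyro) tuples, or None if the frame fails validation."""
    if len(frame) != 26 or frame[0] != HDR:
        return None
    payload = frame[1:25]
    if sum(payload) % 256 != frame[25]:
        return None
    vals = struct.unpack("<6f", payload)
    return vals[:3], vals[3:]


frame = pack_imu((0.0, 0.0, -9.81), (0.01, -0.02, 0.0))
accel, gyro = parse_imu(frame)
print(accel[2] < 0)  # → True
```

A fixed-length frame with a sync byte and checksum keeps the parser trivial to resynchronize after dropped bytes, which matters on a lossy serial link between autopilot and companion computer.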

Other sensors used are a monocular camera and a MaxBotix sonar. We originally used the APM to process the sonar but found the analog line noisy compared with the digital line (fed through a TTL-USB adapter).
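Reading the digital line on the computer side mostly amounts to parsing the sonar's ASCII stream. A minimal sketch of such a parser is below; it assumes the common MaxBotix serial format of 'R' followed by digits and a carriage return (units vary by model), and leaves out the actual serial-port handling:

```python
def parse_maxbotix(buf):
    """Extract range readings from a MaxBotix-style ASCII byte stream.

    MaxBotix LV/XL-series sensors typically report each range as
    b'R' + digits + b'\r' (e.g. b'R075\r'); units depend on the model.
    Returns (list of integer readings, leftover incomplete bytes).
    """
    readings = []
    while b"\r" in buf:
        line, _, buf = buf.partition(b"\r")
        # Skip anything that isn't a well-formed 'R<digits>' record
        if line.startswith(b"R") and line[1:].isdigit():
            readings.append(int(line[1:]))
    return readings, buf


readings, rest = parse_maxbotix(b"R075\rR076\rR0")
print(readings, rest)  # → [75, 76] b'R0'
```

Returning the unconsumed tail lets you call the parser repeatedly as bytes arrive from the serial port without losing a reading split across reads.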

Without further ado, here are the videos from the competition (first with the ground station view, second raw footage):


We'd like to thank the APM contributors for making it easy to work with ArduPilot. Our website, which is a work in progress, is here.


Comment by Randy on May 14, 2015 at 8:12pm

Congrats! Nicely done.

Comment by Bim on May 15, 2015 at 1:51am

If I understand your web page correctly - your drone carried the Gigabyte Brix computer and was still under 500 gr?

How much did the computer weigh?

Comment by Steve on May 15, 2015 at 3:49am

Somewhere between 230 and 240 grams. It also draws about 2-2.5 A, roughly 1-1.5x the power draw of one motor.

Comment by Paul Marsh on May 15, 2015 at 4:40am

Nice demonstration.  Just curious--why the cyclogyro in your graphic?  Something coming down the pike?  :)

Comment by Steve on May 15, 2015 at 4:58am

The figure is from the rules. I don't know why they decided to use that. A team from Maryland built a cyclocopter and flew it this year. It was a smaller version of this:

It flew pretty well, but the demonstration was manual only. Hopefully next year's competition will add some autonomous requirements for it.

Comment by Paul Marsh on May 15, 2015 at 5:05am

Thanks for the response and the link--very interesting to see.  In any case, well done yourselves with the MAV Challenge win!


Comment by Bim on May 15, 2015 at 6:36am

Steve, this is really impressive!

If you're willing to share your quadcopter build I'm sure a lot of us would be happy to see it :)

Comment by Steve on May 15, 2015 at 6:46am

Some of it is mentioned on the website. It's pretty much a flying i7 with very minimal add-ons. The frame is carbon rods from HobbyKing (super light). Motors are Baby Beasts (from HK) with Afro ESCs (12 A). The i7 was stripped of its case and a 3D-printed case was made. The motors are mounted with 3D-printed mounts.

We are using an APM 2.6 with 2.9.1b firmware. The firmware was modified to send all IMU data over serial and, when in auto mode, listen for motor commands.

The i7 runs the Georgia Tech UAV Simulation Tool, which runs the visual SLAM, target identification, and some other processes. Target identification is a Haar-trained classifier.

As mentioned above, we are using a MaxBotix sonar and a Point Grey Firefly MV. We are in the process of integrating a new USB 3.0 camera from Basler.

Let me know if you have any other questions.


Comment by John Dennings on May 15, 2015 at 8:31am

Congratulations! How did you navigate absent GPS, lidar, or optical flow? You mention visual SLAM: so was it done visually with reference to the "tiles" and lines on the floor, with sonar for altitude?

Comment by Steve on May 15, 2015 at 8:46am
We do not use lines/tiles per se. Our state vector keeps 16 3D feature points that help us navigate. We use the Harris corner method with some additional steps to pick the features to track. We also have to add/remove features from the state vector as they come in and out of view. Sonar is indeed used for altitude, and to help get distance to the feature points.
No lidar or optical flow. We have used lidar on larger vehicles, but it is too heavy here. We considered adding a PX4FLOW sensor, but funds ran short. For this work we don't have an absolute position measurement, so we do get drift. We have done some work that gives an absolute reference indoors from vision, but it requires a priori image data.
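The Harris corner score mentioned above can be sketched in plain NumPy. This is a simplified illustration of the classic response R = det(M) - k*trace(M)^2 over a local window, not the team's actual pipeline, which adds feature selection, tracking, and state-vector management on top:

```python
import numpy as np


def box_sum(a, r=1):
    """Sum each pixel's (2r+1)x(2r+1) neighborhood (wraps at borders)."""
    out = np.zeros_like(a)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out


def harris_response(img, k=0.04, r=1):
    """Harris corner response: det(M) - k*trace(M)^2 of the structure tensor.

    Large positive values indicate corners; large negative values, edges.
    """
    iy, ix = np.gradient(img.astype(float))  # central-difference gradients
    sxx = box_sum(ix * ix, r)                # windowed structure tensor terms
    syy = box_sum(iy * iy, r)
    sxy = box_sum(ix * iy, r)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace


# Synthetic image: a bright square; its corners should score positive,
# its straight edges negative
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
print(R[5, 5] > 0, R[5, 10] < 0)  # → True True
```

In practice you would non-max-suppress the response and keep only the strongest peaks as the tracked features; in a filter-based SLAM setup like the one described, those survive as the 3D points carried in the state vector.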



© 2020 Created by Chris Anderson.
