The Georgia Tech Aerial Robotics Club is a team of undergraduate and graduate students that competes in aerial vehicle competitions each year. The MAV Challenge was held last week in Virginia Beach.


The mission consisted of taking off from a helipad and searching a known search area (a 20 ft square) for a target. Once the target is located, the vehicle must hover over it for approximately thirty seconds, then transition back to the helipad and land. Our team was able to complete the mission fully autonomously. The arena measures 100 ft x 35 ft, and the vehicle is limited to 500 grams and under 1.5 ft in any dimension.

A vehicle was built around a powerful computer and sized with the Georgia Tech Electric UAV Flight Time Calculator. This helped us increase our flight time from 2-3 minutes (last year's vehicle) to 8-10 minutes (this year's). The tool also helped us determine the right propellers and battery to use.
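At its core, a flight-time calculator like this does an energy-budget estimate. A minimal sketch of that arithmetic, where every number is made up for illustration (these are not our actual vehicle parameters):

```python
# Back-of-the-envelope hover-time estimate of the kind a flight time
# calculator automates. All values below are illustrative assumptions.

battery_capacity_mah = 1300   # assumed battery capacity
usable_fraction = 0.8         # don't drain a LiPo below ~20%
hover_current_a = 7.0         # assumed total current draw at hover

usable_capacity_ah = battery_capacity_mah / 1000.0 * usable_fraction
hover_time_min = usable_capacity_ah / hover_current_a * 60.0

print(f"Estimated hover time: {hover_time_min:.1f} min")
```

With those assumed numbers the estimate lands around 9 minutes; the real calculator folds in propeller and motor efficiency curves as well.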

We use an APM 2.6 board for sensor data. We wrote code to send all the sensor information (very easily, I might add, thanks to all the contributors) to our computer, which processes it and sends motor commands back to the APM. The only time the APM does any processing is when the vehicle is in manual (stabilize) mode. We are very happy with the APM, and integration time was very quick. In the future we might switch to a Pixhawk board, but we will have to see whether the increase in performance is worth the extra cost (maybe 3DR could sponsor our team :) ).
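To give a feel for the offboard link, here is a minimal sketch of framing an IMU sample for a serial connection. The packet layout, header bytes, and field order are all assumptions for illustration; the actual wire format of our modified firmware isn't described here:

```python
import struct

# Hypothetical wire format for an APM<->PC link: a 2-byte "IM" header
# followed by six little-endian floats (gyro xyz, accel xyz).
IMU_FMT = "<2s6f"

def encode_imu(gyro, accel):
    """Pack an IMU sample as the autopilot side would transmit it."""
    return struct.pack(IMU_FMT, b"IM", *gyro, *accel)

def decode_imu(packet):
    """Unpack a sample on the PC side; returns (gyro, accel) tuples."""
    hdr, gx, gy, gz, ax, ay, az = struct.unpack(IMU_FMT, packet)
    assert hdr == b"IM", "bad frame header"
    return (gx, gy, gz), (ax, ay, az)

pkt = encode_imu((0.01, -0.02, 0.0), (0.0, 0.0, -9.81))
gyro, accel = decode_imu(pkt)
```

The motor-command packets going the other direction would be framed the same way, just with a different header and payload.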

Other sensors used are a monocular camera and a MaxBotix sonar. We originally used the APM to process the sonar, but found the analog line noisy compared with the digital line (fed through a TTL-USB adapter).
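A common way to tame spiky rangefinder readings like those from a noisy analog line is a small median filter. A sketch, with the window size and units chosen arbitrarily for illustration (not what we actually ran):

```python
from collections import deque
from statistics import median

class MedianFilter:
    """Reject single-sample spikes in noisy rangefinder readings."""
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # sliding window of raw readings

    def update(self, reading_cm):
        self.samples.append(reading_cm)
        return median(self.samples)          # median rejects isolated glitches

f = MedianFilter()
# The 350 cm sample is a single-reading glitch; the filter suppresses it.
out = [f.update(r) for r in [100, 101, 350, 102, 101]]
```

The glitch never makes it to the output, at the cost of a couple of samples of lag.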

Without further ado, here are the videos from the competition (the first with the ground-station view, the second raw footage):


We'd like to thank the APM contributors for making it easy to work with ArduPilot. Our website is here (a work in progress).



  • @Steve  (I am sorry if I met you before and forgot :)

    Great. Definitely shoot me a message. Maybe I can coordinate one of our IR-LOCK tests to line up with fly-day. 

  • @Thomas we'd love to get together (again). We've all got to see some of the very neat stuff you are doing. We'd love to get our hands on IR-LOCK. I'll email you and try to get a meeting together... or at least a fly-day.

  • That's my school! :) Congratulations. You make alumni proud. 

    Let me know if I can hang out with you guys on campus sometime. (thomas at irlock dot com)


  • We do not use lines/tiles per se. Our state vector keeps 16 3D feature points that help us navigate. We use a Harris corner method with some additional steps to pick the features to track. We also have to add/remove features from the state vector as they come in and out of view. Sonar is indeed used for altitude, and to help get the distance to the feature points.
    No lidar or optical flow. We have used lidar on larger vehicles, but those units are too heavy here. We considered adding a PX4FLOW sensor, but funds ran short. For this work we don't have an absolute position measurement, so we do get drift. We have done some work that gives us an absolute reference indoors from vision, but it requires a priori image data.
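    The add/remove bookkeeping described above can be sketched roughly like this. Everything here (names, the 16-feature cap as a hard limit, the 640x480 image size) is illustrative, not our actual code:

```python
# Keep at most 16 tracked feature points: prune ones that have left
# the image, then refill from fresh corner candidates. Assumed values.
MAX_FEATURES = 16
WIDTH, HEIGHT = 640, 480

def in_view(pt):
    """True if a (u, v) pixel coordinate lies inside the image."""
    u, v = pt
    return 0 <= u < WIDTH and 0 <= v < HEIGHT

def update_tracks(tracks, candidates):
    """tracks/candidates: lists of (u, v) pixel coordinates."""
    kept = [p for p in tracks if in_view(p)]   # drop out-of-view points
    for p in candidates:                        # refill up to the cap
        if len(kept) >= MAX_FEATURES:
            break
        if p not in kept and in_view(p):
            kept.append(p)
    return kept

tracks = [(10, 10), (700, 50), (300, 200)]   # (700, 50) has left the image
tracks = update_tracks(tracks, [(50, 60), (640, 0)])
```

    In the real filter each add/remove also grows or shrinks the state vector and its covariance, which is the fiddly part.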
  • Congratulations! How did you navigate absent GPS,  Lidar or optical flow? You mention Visual SLAM: So was it done visually with reference to the "tiles" and lines on the floor, with sonar for altitude?

  • Some of it is mentioned on the website. It's pretty much a flying i7 with very minimal add-ons. The frame is carbon rods from HobbyKing (super light). Motors are Baby Beasts (from HK) with Afro ESCs (12 A). The i7 was stripped of its case, and a 3D-printed case was made. The motors are mounted with 3D-printed mounts.

    We are using an APM 2.6 with 2.9.1b firmware, modified to send all IMU data over serial and, when in auto mode, listen for motor commands.

    The i7 runs the Georgia Tech UAV Simulation Tool, which handles the visual SLAM, target identification, and some other processes. Target identification uses a Haar-trained classifier.

    As mentioned above, we are using the MaxBotix sonar and a Point Grey Firefly MV. We are in the process of integrating a new USB 3.0 camera from Basler.

    Let me know if you have any other questions.


  • Steve, this is really impressive!

    If you're willing to share your quadcopter build I'm sure a lot of us would be happy to see it :)

  • Thanks for the response and the link--very interesting to see.  In any case, well done yourselves with the MAV Challenge win!


  • The figure is from the rules. I don't know why they decided to use that. A team from Maryland built a cyclocopter and flew it this year. It was a smaller version of this:

    It flew pretty well, but the demonstration was manual only. Hopefully next year's competition will add some autonomous features.

  • Nice demonstration.  Just curious--why the cyclogyro in your graphic?  Something coming down the pike?  :)
