3D Robotics

How college Blimp UAV contests work

The main existing US blimp UAV competition is the Indoor Aerial Robot Competition (IARC), held every year since 2005 at Drexel University in Philadelphia. The contest is designed mostly as an exercise in remote control, both human and computer. The blimps appear to be all RC, with wireless cameras pointing down. The robotics part is either an image-processing and control task, in which a ground-based computer analyzes the video and transmits commands to the blimp via a trainer cord to a standard RC transmitter, or one in which a human operator does the same thing, looking only at the video stream. Examples of the computer-driven tasks are line following and fighting gusts of wind, while the human-driven tasks include following a maze and spotting objects in a certain order.

The basic platform is an $850 52" envelope with three motors, servo vectoring (meaning the motors are on a shaft that tilts them up or down), and no onboard intelligence, either processing or sensors (it's just RC, with a wireless camera). The contest description includes the possibility that teams could add ultrasonic sensors or gauge speed and direction with optical flow calculations from the camera, and some have done that with ultrasonic and IR sensors, but all the data is sent to the ground computers, processed there, and turned into RC commands back to the blimp. Here's a paper that describes how one team's blimp works.

The main difference between this approach and the one we're pursuing with our Minimum and Maximum Blimp UAVs is that we're entirely focused on onboard intelligence and sensing. Our blimps have two-way wireless links, but not conventional RC ones, and they're not designed for manual control. At the moment, we're making the autonomous navigation job easier with ground-based beacons, but the aim is to do away with those eventually and navigate like our outdoor UAVs do: entirely on their own.
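To make the ground-side processing concrete, here's a minimal sketch of the kind of vision-to-RC loop a line-following task implies: find the line in a row of camera pixels, then map its offset to a steering command with a simple proportional controller. This is illustrative only, not any team's actual code; the function names, the darkest-pixel heuristic, and the gain value are all assumptions.

```python
def line_offset(row):
    """Find the horizontal offset of a dark line in one grayscale image row.

    row: list of pixel brightness values (0-255); the line is assumed to be
    the darkest pixel. Returns the offset from image center in pixels
    (negative = line is to the left of center).
    """
    darkest = min(range(len(row)), key=lambda i: row[i])
    return darkest - len(row) // 2


def steering_command(offset, gain=0.02):
    """Proportional controller: map a pixel offset to a rudder command.

    The gain is an illustrative placeholder; the output is clamped to
    [-1, 1], roughly the normalized range of an RC channel.
    """
    return max(-1.0, min(1.0, gain * offset))


# Example: a mostly-white row with a dark line 5 pixels right of center.
row = [255] * 15 + [10] + [255] * 4
offset = line_offset(row)          # 5 pixels right of center
command = steering_command(offset)  # small right-turn command, 0.1
```

In the contest setup, that command would be written to a trainer port on a standard RC transmitter, which relays it to the blimp, so the airframe itself stays dumb.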
Here's a video from last year's competition, which was held in April (presumably this year's will be announced soon--I've emailed for more info):
Comments

  • It's an interesting competition. I have worked with the Bryn Mawr group to add image processing functions to their Python-based Myro robot programming environment, and I think Drexel has some of our new boards. The image processing component sounds intimidating, but it's really not bad once you start working with pixels and become familiar with some higher-level image processing functions. The Robocup competitions also make extensive use of video, and some of the leagues require onboard video processing (e.g. the 4-legged league). I wouldn't be surprised to see onboard processing becoming a requirement in some classes of the competition as it further matures.