Five seniors at Portland State University recently finished a six-month capstone project: developing a computer-vision system for a quadcopter.  As you can see from the video, our vision system tracks a fluorescent pink circle on the ground and tries to maintain a stable hover over that target.  The code can easily be changed to track virtually any color, however.  The system runs on a Raspberry Pi attached to the copter, with one webcam pointing straight down (fitted with a fisheye lens for the widest possible viewing angle).  When Raspbian boots, it launches MAVProxy, an open-source Python MAVLink wrapper; the computer-vision code is just a module that gets imported into the MAVProxy environment.  The copter takes autonomous control when it is placed into the ALT_HOLD flight mode with the RC controller AND the target is present in the frame.
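To give a feel for the tracking step described above, here is a minimal sketch of color-blob detection: threshold the frame for the target color, then compute the blob centroid and its offset from the image center. This is not the team's actual code (which uses OpenCV and a fisheye camera); it is a NumPy-only illustration, and the pink color bounds are made-up values you would tune for your own camera.

```python
import numpy as np

# Hypothetical RGB bounds for "fluorescent pink"; tune for your camera.
# A real implementation would likely convert to HSV and denoise first.
PINK_LO = np.array([200, 0, 100])
PINK_HI = np.array([255, 120, 220])

def target_offset(frame):
    """Return (dx, dy) of the pink blob, normalized to [-1, 1],
    or None if the target is not in the frame."""
    # Boolean mask of pixels falling inside the color bounds
    mask = np.all((frame >= PINK_LO) & (frame <= PINK_HI), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no target visible: autonomous control should not engage
    h, w = mask.shape
    cx, cy = xs.mean(), ys.mean()
    # Offset of the blob centroid from the image center, normalized
    return (cx - w / 2) / (w / 2), (cy - h / 2) / (h / 2)

# A frame with a pink patch left of center should give a negative dx.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[50:70, 20:40] = [230, 40, 160]  # pink square
dx, dy = target_offset(frame)
```

The normalized (dx, dy) offset is the kind of signal a controller can turn into lean-angle corrections to recenter the copter over the target.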

Our code can be found on GitHub at dwrtz/MAVProxy.  For those of you already familiar with MAVProxy, you'll want to grab from the 'modules' folder.  The file in the root of the repo contains tips for configuring your vision-tracking system, including the code dependencies and instructions on how to use the utilities in /modules/vision_utils.  We hope that drone enthusiasts with Python skills will take our code and improve upon it.

The git repo will undergo improvements in the coming weeks, including a copy of the technical report and links to all the parts we used.  Until then, feel free to send any questions my way.  Replies may be delayed over the next week as I'm going on vacation, but they will be answered eventually.




  • Hi Colin, I got a similar result to yours (large oscillation). Did you solve it?
    I'm doing a similar job, but my webcam is on the ground and the quadcopter is my tracking target: I control the quadcopter to hover in the center of the image. After a long time testing, unfortunately, the quadcopter still has large oscillations. I suspect it's the PID settings or time delay. Any tips? Thanks!
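For readers wondering how a pixel offset can drive a hover correction at all, here is a generic PID step in sketch form. This is purely illustrative: it is not the project's controller, and the gains, time step, and toy "plant" below are made up for the example.

```python
# Illustrative PID sketch -- not the project's actual controller or tuning.
# Input: the target's normalized offset from image center (the error).
# Output: a correction (e.g. a lean-angle command) that drives it to zero.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Toy plant: pretend each correction directly reduces the offset.
# In flight, the copter's dynamics and the camera/processing delay sit in
# between, which is exactly where oscillations like those discussed come from.
pid = PID(kp=0.5, ki=0.05, kd=0.01)
offset = 1.0
for _ in range(50):
    offset -= pid.step(offset, dt=0.1)
```

In this idealized loop the offset shrinks toward zero; with real latency in the loop, overly aggressive gains produce exactly the sustained oscillation being described.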
  • Micro USB

  • Hi Colin,
    I wanted to know how you have connected the Raspberry Pi to the APM. Which pin on the APM is plugged into the Raspberry Pi?

    Thanks & Regards
  • Sorry, forgot to mention that I plan not to use the Raspberry Pi for certain reasons. Is this project (detect and release mechanism) still possible to do?

  • Hi Colin, thanks for your feedback. Got it. But right now I'm actually thinking of editing your script into another DIY project: detecting a red-colored target (while streaming real-time video) and triggering MAV_CMD_DO_SET_SERVO to release a sand bomb. Can I have your advice on how to edit the script? I'm still new to this and kinda stuck on writing the algorithm. Help please, thanks.
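For readers with the same "detect and release" question: a sketch of what the servo trigger could look like follows. The channel and PWM numbers are made up, and the helper function is hypothetical; per the MAVLink spec, MAV_CMD_DO_SET_SERVO takes the servo output channel as param1 and the PWM value as param2, which with pymavlink is sent via command_long_send (shown in the comment).

```python
# Hypothetical "detect and release" trigger -- not code from the project.
# With pymavlink, the actual send would look roughly like:
#
#   from pymavlink import mavutil
#   master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
#   master.mav.command_long_send(
#       master.target_system, master.target_component,
#       mavutil.mavlink.MAV_CMD_DO_SET_SERVO, 0,
#       servo_channel, pwm, 0, 0, 0, 0, 0)

RELEASE_CHANNEL = 7   # made-up output channel for the drop servo
RELEASE_PWM = 1900    # made-up "open" position
HOLD_PWM = 1100       # made-up "closed" position

def servo_params(target_detected):
    """Return the (param1, param2) pair for MAV_CMD_DO_SET_SERVO:
    the servo channel and the PWM value to command."""
    pwm = RELEASE_PWM if target_detected else HOLD_PWM
    if not 800 <= pwm <= 2200:
        raise ValueError("PWM out of typical servo range")
    return (RELEASE_CHANNEL, pwm)
```

The detection side would set target_detected from the vision module's output, so the release fires only while the red target is actually in frame.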

  • No, it can't, Alexander.  It's been a while since I thought about this project, but if you wanted to stream real-time video back to a base station (and obviously wanted to go DIY), you would probably want to add another system to the copter, like a second Raspberry Pi solely responsible for the video link, and even another camera.  We were working the RPi to its limits as is, I feel.

  • Hi, just wanna ask: can your project display images (real-time flight video) during the flight?

  • Very nice project, Team Colin.

    A little observation: I feel it has too many oscillations while trying to hover directly over the tracked object; maybe it's the PID settings. Could it be made better?

    Just my own opinion.

    Very cool project guys, well done.

  • Great work, thanks for sharing this!

    I've had a quick look at your code, and if you'd like me to merge your module into the upstream release then that would be possible with a few changes.

    I see you've changed the main loop a bit. Is that still needed? I suspect you were trying to avoid it using a tty on startup. What I did to avoid that was to start like this from rc.local:

     screen -s /bin/bash -d -m mavproxy.py --aircraft=test --master /dev/serial/by-id/usb-FTDI*

    That allows you to later log in and connect to the console for debugging by doing "screen -x" to attach to the running screen session, which is pretty useful.

    The other thing that would help is to put all your module pieces into their own subdirectory in modules/, perhaps modules/PSU_Vision/. Just put an __init__.py in there and any Python code needed in that directory. Put any libraries you need in a lib/ subdirectory.
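    A sketch of the layout being suggested (file names other than modules/PSU_Vision/ are illustrative):

```
modules/
  PSU_Vision/
    __init__.py     # entry point MAVProxy imports for the module
    vision.py       # the tracking code (name illustrative)
    lib/            # any bundled third-party libraries
```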

    Thanks for making your work available!

    Cheers, Tridge
