Open-source computer-vision module developed for MAVProxy

Five seniors at Portland State University recently finished a six-month capstone project: developing a computer-vision system for a quadcopter. As you can see from the video, our vision system tracks a fluorescent pink circle on the ground and tries to maintain a stable hover over that target; the code can easily be changed to track virtually any color, however. The system runs on a Raspberry Pi attached to the copter, with one webcam pointing straight down (fitted with a fisheye lens for the widest possible viewing angle). When Raspbian boots, it launches MAVProxy, an open-source Python MAVLink wrapper, and the computer-vision code is simply a module that gets imported into the MAVProxy environment. The copter takes autonomous control only when it is placed into the ALT_HOLD flight mode with the RC controller AND the target is present in the frame.

Our code can be found on GitHub at dwrtz/MAVProxy. For those of you already familiar with MAVProxy, you'll want to grab mavproxy_vision.py from the 'modules' folder. The README.vision file in the root of the repo contains tips for configuring your vision tracking system, including the code dependencies and instructions on how to use the utilities in /modules/vision_utils. We hope that drone enthusiasts with Python skills will take our code and improve upon it.
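For those new to MAVProxy: modules are normally pulled in by name from the MAVProxy console, so loading the vision module should look something like this (the exact module name to use is covered in README.vision):

```
MAV> module load vision
```

This should import mavproxy_vision.py from the modules folder into the running MAVProxy session.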

The Git repo will undergo improvements in the coming weeks, including a copy of the technical report and links to all the parts we used. Until then, any questions can be directed to colindoolittle@gmail.com. Replies may be delayed over the next week as I'm going on vacation, but all questions will be answered eventually.

Thanks! 



Comment by Pete Hollands on June 21, 2013 at 4:05pm

Comment by Pat Hickey on June 21, 2013 at 7:18pm

Great work Colin and team!

You forgot to attach a photo, so here's one I had from your talk. (The webcam is rotated forward in this picture; in operation it faces down.)


Comment by Andrew Tridgell on June 21, 2013 at 7:56pm

Great work, thanks for sharing this!

I've had a quick look at your code, and if you'd like me to merge your module into the upstream release then that would be possible with a few changes.

I see you've changed the main loop a bit. Is that still needed? I suspect you were trying to avoid it using a tty on startup. What I did to avoid that was to start mavproxy.py like this from rc.local:

 screen -s /bin/bash -d -m mavproxy.py --aircraft=test --master /dev/serial/by-id/usb-FTDI*

that allows you to later log in and connect to the console for debugging by doing "screen -x" to attach to the running screen session, which is pretty useful.

The other thing that would help is to put all your module pieces into their own subdirectory in modules/, perhaps modules/PSU_Vision/. Just put an __init__.py in there along with any Python code needed in that directory. Put any libraries you need in a lib/ subdirectory.
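In concrete terms, the layout Tridge is suggesting might look like this (PSU_Vision is his example name; exactly which files end up where depends on the repo):

```shell
# Sketch of the suggested module layout; the directory name is Tridge's
# example and the file placement is illustrative.
mkdir -p modules/PSU_Vision/lib
touch modules/PSU_Vision/__init__.py   # makes the directory a Python package
# module code goes in modules/PSU_Vision/, third-party libraries in lib/
ls modules/PSU_Vision
```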

Thanks for making your work available!

Cheers, Tridge

Comment by Yusuf Onajobi on June 21, 2013 at 10:12pm

Very nice project, team Colin.

A little observation: I feel it has too many oscillations while trying to hover directly over the tracked object; maybe it's the PID settings. Could it be made better?

Just my own opinion.

Very cool project guys, well done.
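Yusuf's hunch about PID tuning is plausible. As a purely illustrative toy (the gains and dynamics below are invented, not the project's actual controller), adding derivative damping to a position loop visibly cuts the overshoot that causes hunting over the target:

```python
# Toy 1-D position controller: a point mass chasing a target at 0,
# starting 1 m away. All gains and dynamics are made up for illustration.

def peak_overshoot(kp, kd, steps=400, dt=0.05):
    """Simulate a PD loop and return how far the mass overshoots
    past the target (larger = more oscillation-prone)."""
    pos, vel = 1.0, 0.0
    prev_err = 0.0 - pos
    min_pos = pos
    for _ in range(steps):
        err = 0.0 - pos
        accel = kp * err + kd * (err - prev_err) / dt
        prev_err = err
        vel += accel * dt
        pos += vel * dt
        min_pos = min(min_pos, pos)
    return max(0.0, -min_pos)

# Same proportional gain, more derivative damping -> less overshoot:
print(peak_overshoot(kp=4.0, kd=0.5))  # under-damped, large overshoot
print(peak_overshoot(kp=4.0, kd=3.0))  # well-damped, small overshoot
```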

Comment by Alexander on March 8, 2014 at 2:18am

Hi, just wanted to ask: can your project display the image (real flight video) during the flight?

Comment by Colin Doolittle on March 8, 2014 at 1:17pm

No, it can't, Alexander. It's been a while since I thought about this project, but if you wanted to stream real-time video back to a base station (and obviously wanted to go DIY), you would probably want to add another system to the copter, such as a second Raspberry Pi solely responsible for the video link, and perhaps a second camera. We were working the RPi to its limits as it was, I feel.

Comment by Alexander on March 9, 2014 at 1:07am

Hi Colin, thanks for your feedback. Got it. Right now I'm actually thinking of adapting your script for another DIY project: detecting a red-colored target (while streaming real-time video) and triggering MAV_CMD_DO_SET_SERVO to release a sandbomb. Can I have your advice on how to edit the script? I'm still new to MAVProxy, so I'm kind of stuck writing the algorithm. Help please, thanks.

Comment by Alexander on March 9, 2014 at 1:13am

Sorry, forgot to mention that I plan not to use a Raspberry Pi for certain reasons. Is this project (the detect-and-release mechanism) still possible?

Comment by chethan j on May 18, 2014 at 9:18pm

Hi Colin,
I wanted to know how you have connected the Raspberry Pi to the APM. Which pin on the APM is plugged into the Raspberry Pi?

Thanks & Regards,
chethan

