For my master's thesis I built a quadcopter based on the AeroQuad platform for vision-guided autonomous landing. All processing is done on-board, using a BeagleBoard-xM for the high-level computations and an Arduino for the low-level ones. OpenCV is used to recognize the landing marker, and a Kalman filter provides robust tracking of the measured marker position. The Kalman filter also performs altitude estimation (position and velocity). The high-level software is written in Python, and the low-level software uses the AeroQuad flight software 3.2 with modifications.
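To give an idea of how the tracking side fits together, here is a minimal sketch (not the thesis code) of a constant-velocity Kalman filter in Python/NumPy of the kind used to smooth a marker position measured by the camera. The class name, state layout, time step, and noise values are my own illustrative assumptions.

```python
import numpy as np

class MarkerKalmanFilter:
    """Constant-velocity Kalman filter for a 2D marker position.

    State vector: [x, y, vx, vy]. Measurement: [x, y] from the
    vision pipeline. All noise parameters are illustrative guesses.
    """

    def __init__(self, dt=0.05, process_var=1.0, meas_var=4.0):
        # State transition: position integrates velocity over dt
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        # Measurement matrix: only the position is observed
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = process_var * np.eye(4)   # process noise covariance
        self.R = meas_var * np.eye(2)      # measurement noise covariance
        self.x = np.zeros(4)               # state estimate
        self.P = np.eye(4) * 100.0         # state covariance (large = uncertain)

    def predict(self):
        # Propagate the state forward one time step
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        # z is the measured marker position [x, y]; call only on detection
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In a loop like this, the filter is predicted forward on every camera frame and updated whenever the marker is detected; between detections the prediction alone still gives a usable position estimate to feed the position controller.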
Video of the landing:
https://www.youtube.com/watch?v=9otkmpm6RUQ
Comments
hey Pal,
Awesome job! I'm currently developing code similar to yours, but for another application, and I have a few questions. What type of tracking do you use? Is it an HSV threshold + cvFindContours, or something more like CamShift with histograms?
Is your controller based on PID loops or LQR? And finally, is your code or paper public?
Thanks
Antoine.
Thanks for the support!
Sean Headrick: That was the initial plan. But I chose to start with the camera in a fixed position and to implement that later if I had the time; unfortunately I did not. However, the system does compensate for its attitude using measurements from the gyro and the ultrasonic sensor.
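For the curious, the compensation amounts to projecting the marker's pixel offset onto the ground plane using the measured altitude and tilt. The sketch below is my own simplification (not the thesis code): it assumes a pinhole camera rigidly mounted pointing straight down, and all names and sign conventions are assumptions.

```python
import math

def marker_ground_offset(px, py, cx, cy, focal_px, altitude_m, tilt_x, tilt_y):
    """Estimate the marker's horizontal offset (metres) from its pixel
    position, compensating for the vehicle's tilt.

    px, py         : marker pixel coordinates
    cx, cy         : image centre in pixels
    focal_px       : focal length in pixels (pinhole camera assumption)
    altitude_m     : altitude above ground from the ultrasonic sensor
    tilt_x, tilt_y : vehicle tilt about the image axes, in radians,
                     derived from the gyro/IMU attitude estimate
    """
    # Angle of the marker ray relative to the camera's optical axis
    angle_x = math.atan2(px - cx, focal_px)
    angle_y = math.atan2(py - cy, focal_px)
    # Add the vehicle tilt so the ray is expressed relative to vertical
    world_angle_x = angle_x + tilt_x
    world_angle_y = angle_y + tilt_y
    # Intersect the ray with the ground plane at the measured altitude
    offset_x = altitude_m * math.tan(world_angle_x)
    offset_y = altitude_m * math.tan(world_angle_y)
    return offset_x, offset_y
```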
Very cool. This is the type of thing a delivery drone would need to identify a landing zone.
Nice!
well done!
Awesome!