Part 2 of my tutorial about precision landing, using a Raspberry Pi, OpenCV, ArduCopter, and DroneKit.
> Plus my first Shout Out... at the very end!
> Plus the first Coding Challenge: submit your precision landing video and the best one will be in my next shout out!
Watch Part 1 here
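For anyone who wants to peek ahead, the heart of the system is a loop that detects the ArUco marker in the Raspberry Pi camera feed and streams LANDING_TARGET MAVLink messages to ArduCopter. Here is a minimal sketch of that loop, not the exact tutorial code: it assumes the pre-4.7 `cv2.aruco` API, a previously calibrated camera, and placeholder values for the marker ID, marker size, calibration file names, and connection string.

```python
# Minimal detect-and-report loop for precision landing (a sketch, not
# the exact tutorial code). MARKER_ID, MARKER_SIZE_CM, the connection
# string, and the calibration file names are all assumptions.
import math
import time

import cv2
import cv2.aruco as aruco
import numpy as np
from dronekit import connect
from pymavlink import mavutil

MARKER_ID = 72            # hypothetical ID of the landing-pad marker
MARKER_SIZE_CM = 40.0     # hypothetical physical edge length of the marker

camera_matrix = np.loadtxt('cameraMatrix.txt')    # from a prior calibration
dist_coeffs = np.loadtxt('cameraDistortion.txt')

vehicle = connect('/dev/ttyAMA0', wait_ready=True, baud=921600)

aruco_dict = aruco.Dictionary_get(aruco.DICT_ARUCO_ORIGINAL)
parameters = aruco.DetectorParameters_create()
cap = cv2.VideoCapture(0)

def send_landing_target(angle_x, angle_y, distance_m):
    """Tell ArduCopter where the pad is: angular offsets (rad) + distance (m)."""
    msg = vehicle.message_factory.landing_target_encode(
        0,                                   # time_usec (0 = time of arrival)
        0,                                   # target_num
        mavutil.mavlink.MAV_FRAME_BODY_NED,  # frame
        angle_x, angle_y,                    # offsets from camera center, rad
        distance_m,                          # distance to target, meters
        0, 0)                                # size_x, size_y (unused here)
    vehicle.send_mavlink(msg)
    vehicle.flush()

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = aruco.detectMarkers(gray, aruco_dict,
                                          parameters=parameters)
    if ids is not None and MARKER_ID in ids.flatten():
        idx = list(ids.flatten()).index(MARKER_ID)
        _, tvecs, _ = aruco.estimatePoseSingleMarkers(
            corners, MARKER_SIZE_CM, camera_matrix, dist_coeffs)
        x, y, z = tvecs[idx][0]              # marker in camera frame, cm
        send_landing_target(math.atan2(x, z), math.atan2(y, z), z / 100.0)
    time.sleep(0.1)
```

On the copter side, the PLND_ENABLED and PLND_TYPE parameters must also be set so ArduCopter acts on these messages during LAND mode.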
Comments
Hi Tiziano, thanks for the kind words. Just to clarify: this video was created after the flight completed. I combine raw action cam footage with a HUD overlay. The HUD is generated directly from the flight data log. I have a collection of Python/OpenCV scripts that automatically time-sync the data with the video and then render out the video + HUD + original audio. I use this system to review the performance of the flight computer. For example, when everything is perfect, the synthetic horizon line should stay pegged to the real horizon line. If the EKF has an error of even half a degree, that shows up visually. I can also use it to monitor the sequencing and performance of higher-level scripted tasks like an auto-approach and landing.
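To make that concrete, here is a rough sketch of just the horizon-overlay step, not the lab's actual pipeline: it assumes a CSV export of the log with time/roll/pitch columns, a known vertical field of view, and a sync offset already worked out; all the file names and values are illustrative, and the sign conventions depend on the camera mount.

```python
# Sketch of the horizon-overlay step only: draw a synthetic horizon from
# logged roll/pitch onto each video frame. The log format, FOV, and sync
# offset are illustrative assumptions.
import csv
import math

import cv2
import numpy as np

V_FOV_DEG = 50.0       # assumed vertical field of view of the action cam
SYNC_OFFSET_S = 0.0    # assumed video-to-log time offset from a sync step

def draw_horizon(frame, roll_rad, pitch_rad):
    h, w = frame.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    px_per_rad = h / math.radians(V_FOV_DEG)   # pixels per radian of pitch
    dy = pitch_rad * px_per_rad                # vertical shift from pitch
    # Rotate the line by roll about the shifted center (signs depend on
    # the camera mount and the log's conventions).
    dx_r = w * math.cos(roll_rad)
    dy_r = w * math.sin(roll_rad)
    p1 = (int(cx - dx_r), int(cy + dy - dy_r))
    p2 = (int(cx + dx_r), int(cy + dy + dy_r))
    cv2.line(frame, p1, p2, (0, 255, 0), 2)
    return frame

# Assumed log columns: t (seconds), roll and pitch (radians).
with open('flight_log.csv') as f:
    log = list(csv.DictReader(f))
times = np.array([float(r['t']) for r in log])
rolls = np.array([float(r['roll']) for r in log])
pitches = np.array([float(r['pitch']) for r in log])

cap = cv2.VideoCapture('flight.mp4')
fps = cap.get(cv2.CAP_PROP_FPS)
size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
        int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
out = cv2.VideoWriter('flight_hud.mp4',
                      cv2.VideoWriter_fourcc(*'mp4v'), fps, size)

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    t = frame_idx / fps + SYNC_OFFSET_S
    # Interpolate the logged attitude at this frame's timestamp.
    roll = np.interp(t, times, rolls)
    pitch = np.interp(t, times, pitches)
    out.write(draw_horizon(frame, roll, pitch))
    frame_idx += 1
cap.release()
out.release()
```

The payoff of a renderer like this is exactly the review loop described above: if the drawn line drifts off the real horizon, the attitude estimate (or the time sync) is off.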
I work at the University of Minnesota UAV lab, and we were lucky enough to have an airborne insect-trapping project this summer, which gave me the opportunity to fly from around sunset up to sunset + 30 minutes and get some nice still shots and video footage.
Hi Tiziano, thanks for making your videos! ArUco markers are amazingly cool! Here's one of my best precision landings (flown with an X-UAV Talon): https://www.youtube.com/watch?v=Su3sL4JhcQc
Sorry, I didn't use ArUco markers for any of this ... but I did use lots of Python and OpenCV to make the video. (And I had to do a camera calibration and work out frames of reference, coordinate systems, and all that fun stuff.)
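For reference, the calibration step mentioned here is commonly done with OpenCV's chessboard routine. Below is a minimal sketch, assuming a 9x6 inner-corner board and a folder of calibration stills; the paths and board size are illustrative. It writes the same cameraMatrix.txt / cameraDistortion.txt files the landing sketch above loads.

```python
# Minimal chessboard calibration sketch (the standard OpenCV recipe).
# The BOARD size and the calib/*.jpg path are assumptions.
import glob

import cv2
import numpy as np

BOARD = (9, 6)  # inner corners per chessboard row and column

# 3D reference points for one view of the board (z = 0 plane).
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob('calib/*.jpg'):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print('RMS reprojection error:', rms)
np.savetxt('cameraMatrix.txt', camera_matrix)
np.savetxt('cameraDistortion.txt', dist_coeffs)
```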
For anyone who is interested, the onboard autopilot is developed in-house at the U of MN UAV lab. We fly our own EKF and flight code. Big chunks of the core system are written in Python and run onboard on a BeagleBone Linux computer. We are doing our small part to add some variety to the open-source UAV gene pool.