Having seen that the AR.Drone uses a look-down camera for stability, I thought I would put together a simple bit of code that looks at a video stream and finds points of interest. The drone seems to lock onto a grid of 25 points of interest, but I have just picked any 100 points. The count is this high because each frame is processed separately, with no attempt to find a tracking point in frame n+1 close to where a point was in frame n.

In my sample video (from Microsoft) it finds points OK, and you can see that there is tracking, i.e. a point on the skier is followed across several frames. The video came from here: http://support.microsoft.com/kb/119383

I have seen examples of using cameras for horizon estimation, but I think the look-down camera has great potential for drone stabilisation. There are built-in functions in OpenCV that calculate so-called optical flow, and I think they combine feature detection across sequential frames. That might be the way to go.

Any suggestions, alternative code, sample videos, etc. would be welcome. To run my code you need Python 2.6 and OpenCV (I am on Windows). You could change the code to use a USB camera, but my laptop slows to a crawl when I try that.
The AR.Drone combines the look-down optical flow with altitude data from the ultrasonic sensors, so it can scale the detected movement appropriately depending on height.
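The altitude scaling is simple pinhole-camera arithmetic: ground motion ≈ pixel flow × altitude ÷ focal length (in pixels). A minimal sketch, where the 700 px focal length, 1.5 m altitude, and 60 fps are made-up illustration values rather than AR.Drone specs:

```python
def pixel_flow_to_ground_velocity(flow_px, altitude_m, focal_px, dt_s):
    """Convert image-plane flow (pixels per frame) into ground-plane
    velocity (m/s) for a downward-looking camera over flat ground,
    assuming the pinhole model and a level camera."""
    metres_per_frame = flow_px * altitude_m / focal_px
    return metres_per_frame / dt_s

# Example: 4 px/frame of flow at 1.5 m altitude with a 700 px focal
# length at 60 fps corresponds to roughly 0.51 m/s over the ground.
v = pixel_flow_to_ground_velocity(4.0, 1.5, 700.0, 1 / 60.0)
```

The same pixel flow at twice the altitude means twice the ground speed, which is why the drone needs the ultrasonic altitude to interpret the camera data.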
Plenty of research has been done on using optical flow for stabilization of UAVs; Google some of Mandyam Srinivasan's publications for initial pointers.
If you're using OpenCV: have a look at the fback (Farneback 2-frame dense optical flow) sample applications included in OpenCV 2.0. These are fairly fast and give dense optical flow. Another option is the Lucas-Kanade tracker (lkdemo), but you likely won't have the resources to run that on your UAV platform.
Also, there is a commercial product: Robbe's helicommand that uses a downward looking camera to hold position on RC helicopters. I wonder what sort of tracking algorithm they're using. There is also a cheaper, plagiarised version from China: the KDS Flymentor.
Replies
I believe that the camera included in the Wiimote is capable (to some degree) of tracking a number of points.
Mungewell.
http://www.mil.ufl.edu/mav/avcaaf/research/vision/
Those links look great. Here is an academic one that is worth a look too.
http://www.mil.ufl.edu/~nechyba/mav/
They do the processing on the ground station (GS).
http://www.roborealm.com/help/Read%20AVI.php
http://www.roborealm.com/help/Optical_Flow.php
http://www.roborealm.com/help/Skyline.php
http://www.roborealm.com/help/Center%20of%20Gravity.php
http://www.roborealm.com/help/Reception_Quality.php
http://www.roborealm.com/help/Path_Planning.php
Andrew.