I've been fascinated by optical flow ever since I learned that you can do some pretty good navigation work with the chip taken from an old optical mouse. I think this is probably going to be one of the methods we use to take Blimpduino navigation to the next level (it's ideal for any unknown environment, as well as GPS-denied ones). As the picture above suggests, optical flow is the method that many animals, especially insects, use to navigate in complex space.
Here's a writeup from Hizook about the latest research in the field. The authors describe the advantages of this technique:
"The main benefit of this method is that the information about the optics and scene depth is wrapped into a model that is easily learned from video from the robot. No calibration grids or complicated lens distortion models are required! The primary limitation from a practical standpoint is the assumption of constant scene depth, which would break down if, for example, the robot turned to directly face a very close wall, departed from the ground plane, or moved from a narrow corridor into a large open area."
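The learned model the Hizook writeup describes is their own contribution, but the brightness-constancy idea underneath most optical-flow methods fits in a few lines. Here's a toy sketch (NumPy only; the synthetic pattern and the single-window Lucas-Kanade solve are my own illustration, not the paper's method): it recovers one global translation between two frames by least-squares on Ix·u + Iy·v + It ≈ 0.

```python
import numpy as np

def estimate_global_flow(f0, f1):
    """Single-window Lucas-Kanade: least-squares fit of one (u, v)
    translation to the brightness-constancy equation Ix*u + Iy*v + It = 0."""
    Ix = np.gradient(f0, axis=1)   # spatial gradient along x (columns)
    Iy = np.gradient(f0, axis=0)   # spatial gradient along y (rows)
    It = f1 - f0                   # temporal difference between frames
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic frames: a smooth pattern translated 1 pixel to the right.
k = 2 * np.pi / 32                         # 32-pixel wavelength
Y, X = np.mgrid[0:64, 0:64].astype(float)
pattern = lambda X, Y: np.sin(k * X) * np.cos(k * Y)
f0 = pattern(X, Y)
f1 = pattern(X - 1.0, Y)                   # scene moved right by one pixel

u, v = estimate_global_flow(f0, f1)
print(round(u, 2), round(v, 2))            # close to (1.0, 0.0)
```

A real system solves this per patch (or per feature, with image pyramids for large motions) rather than once for the whole frame, which is essentially what an optical mouse sensor does in hardware.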
Here's a video that shows how it works:
CVPR 2009 from Richard Roberts on Vimeo.
[Fly image at the top from the Max Planck Society]
Comments
http://www.roadnarrows.com/robotics/store/manufacturers/sensefly
Abey, that was very impressive; I'd love to hear you on a future podcast! Amazing.
And a video of our latest results in this domain (merging GPS nav with OF obstacle avoidance): https://www.youtube.com/watch?v=OmkTZxOe-bE
Yeah... NOW I understand!