
Latest research on optical flow

I've been fascinated by optical flow ever since I learned that you can do some pretty good navigation work with the chip taken from an old optical mouse. I think this is probably going to be one of the methods we use to take Blimpduino navigation to the next level (it's ideal for any unknown environment, as well as GPS-denied ones). As the picture above suggests, optical flow is the method that many animals, especially insects, use to navigate in complex space.
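To make the mouse-chip idea concrete, here's a minimal dead-reckoning sketch in Python. It assumes a mouse-style flow sensor that reports per-frame (dx, dy) displacement counts; the counts-per-metre scale and the function names are illustrative choices of mine, not the spec of any particular part.

    import math

    COUNTS_PER_METRE = 4000.0  # assumed sensor resolution at a fixed height

    def integrate_flow(samples, heading_rad=0.0):
        """Dead-reckon an (x, y) position from a stream of (dx, dy) counts."""
        x = y = 0.0
        for dx, dy in samples:
            # Convert raw counts to metres travelled this frame.
            fwd = dy / COUNTS_PER_METRE
            side = dx / COUNTS_PER_METRE
            # Rotate body-frame motion into the world frame.
            x += fwd * math.cos(heading_rad) - side * math.sin(heading_rad)
            y += fwd * math.sin(heading_rad) + side * math.cos(heading_rad)
        return x, y

    # Example: 100 frames of steady forward motion -> roughly 1 m travelled.
    print(integrate_flow([(0, 40)] * 100))

The catch is that the counts-to-metres scale only holds at one height above the surface, which is the same scale ambiguity the research below has to deal with.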


Here's a writeup from Hizook about the latest research in the field. The authors describe the advantages of this technique:


"The main benefit of this method is that the information about the optics and scene depth is wrapped into a model that is easily learned from video from the robot. No calibration grids or complicated lens distortion models are required! The primary limitation from a practical standpoint is the assumption of constant scene depth, which would break down if, for example, the robot turned to directly face a very close wall, departed from the ground plane, or moved from a narrow corridor into a large open area. "


Here's a video that shows how it works:


[Video: "CVPR 2009" from Richard Roberts on Vimeo]


[Fly image at the top from the Max Planck Society]


Comments

  • I agree: it takes a fair amount of silicon to implement. Some of the vision guys I work with have been using the Xilinx FPGA version of it, same as in the following article:

  • As Reed said, sometimes it works and sometimes it doesn't, but maybe someday it'll be practical.
  • Yes, it is very important not only on Earth but for planetary UAVs too.
  • Thank you for posting this. I've also found optical flow a very interesting idea and would love to read more about it.