Adding optical flow to the mix

What started as a crazy idea for a small side project - an optical flow sensor quickly hacked together by a group of PhD students (our actual research is on other topics) - is now being adopted by more and more systems. And because it's a useful tool for research (if not so much a research contribution in itself), it even made a short paper. Optical flow is a fairly old and basic technique, but it is quite robust if the camera runs at a very high rate - and it is the main success factor making the AR.Drone so robust and easy to fly.

Aside from easy integration into different systems, the main benefit of a standalone design is the extremely low latency of the velocity output. In our latest firmware version, currently in testing, the camera sensor runs at 450 Hz, which makes the output extremely robust despite the relatively basic algorithm. It also means we can bump the maximum speed to ~3 m/s per meter of altitude - in other words, allow a maximum speed of 30 m/s at 10 m altitude while still allowing a smooth precision landing at 3 m/s at 1 m altitude.
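The altitude scaling follows from the flow geometry: for a downward-facing camera, metric ground speed is the angular flow rate times height above ground, so a fixed pixel-displacement budget per frame translates into a speed limit proportional to altitude. A minimal sketch of that relation (the focal length and per-frame tracking limit below are illustrative assumptions, not the PX4FLOW's actual specs):

```python
# Max trackable ground speed for a downward-looking flow camera:
# v_max = (max pixel shift per frame / focal length in pixels) * fps * altitude
FRAME_RATE_HZ = 450.0   # from the post: camera runs at 450 Hz
FOCAL_PX = 450.0        # assumed focal length in pixels (illustrative)
MAX_SHIFT_PX = 3.0      # assumed per-frame tracking limit (illustrative)

def max_ground_speed(altitude_m: float) -> float:
    """Speed (m/s) at which per-frame image motion hits the tracking limit."""
    angular_rate = (MAX_SHIFT_PX / FOCAL_PX) * FRAME_RATE_HZ  # rad/s
    return angular_rate * altitude_m

print(max_ground_speed(1.0))   # ~3 m/s at 1 m altitude
print(max_ground_speed(10.0))  # ~30 m/s at 10 m altitude
```

With these assumed numbers the limit works out to the ~3 m/s per meter of altitude quoted above; the real constants differ, but the linear scaling with height is the point.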

Because ground-distance noise feeds directly into velocity noise, the next big step will be the integration of laser-based altitude estimates - and with the LIDAR-Lite's really low price tag, it will be an ideal combination. Ultrasound ranging has in fact been the biggest limitation so far, and with it removed, we think an optical flow sensor should become the default addition to GPS - it's a great complementary technology to improve robustness and accuracy.
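To see why ranging noise matters so much: metric velocity is v = omega * h (angular flow rate times ground distance), so to first order the velocity uncertainty is sigma_v ~= sqrt((h * sigma_omega)^2 + (omega * sigma_h)^2), and the ranging term grows with speed. A rough sketch comparing two ranging noise levels (all sigma values are illustrative assumptions, not measured sensor specs):

```python
import math

def velocity_sigma(omega, h, sigma_omega, sigma_h):
    """First-order noise propagation for v = omega * h."""
    return math.hypot(h * sigma_omega, omega * sigma_h)

omega = 2.0         # angular flow rate, rad/s (illustrative)
h = 1.0             # altitude, m
sigma_omega = 0.02  # assumed flow-rate noise, rad/s (illustrative)

sigma_sonar = 0.05  # assumed ultrasound ranging noise, m (illustrative)
sigma_lidar = 0.01  # assumed laser ranging noise, m (illustrative)

print(velocity_sigma(omega, h, sigma_omega, sigma_sonar))  # noisier velocity
print(velocity_sigma(omega, h, sigma_omega, sigma_lidar))  # cleaner velocity
```

Under these assumptions, shrinking sigma_h directly shrinks the velocity noise - which is the motivation for swapping ultrasound for a laser rangefinder.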

And by becoming a default, the cost of making the module will come down to a level where it's a no-brainer to get one. Besides our own system, a number of autopilot systems have been successfully interfaced (AutoQuad, MikroKopter and lately ArduCopter).

This video shows a scenario called the urban canyon, where multipath reflections from the environment and the sky view blocked by buildings make GPS reception extremely challenging and much less accurate than on an open field:

And of course there are many more cool videos on YouTube - I just couldn't embed all of them here.

  • Greg Dronsky: I'm working this week on trying to implement the PX4FLOW on ArduCopter. I'll upload the result soon.

  • The company that makes the very small cameras has been making automotive-grade sensors for a number of years. Other issues are involved as well, such as temperature variation.

    Machine vision sensors are notably designed to be used with some form of illumination. This is why you are noticing only fair light sensitivity under natural light.

    The Aptina sensor you are using did not state QE (quantum efficiency) across the spectral band for which it was designed, just a standard light sensitivity in lumens, which is OK for this type of sensor. Dark current was not stated; however, its stated HDR (High Dynamic Range) is suspect due to piecewise voltage signaling.

    If I were to test this sensor using EMVA 1288 standards, I would not be surprised if it would not go beyond 80 or 90 dB. In testing various sensors and cameras, we have found that most sensor manufacturers overstate their sensors' capabilities. We have found this to be especially true of Sony & Aptina sensors at this time.

    I am installing an EMVA 1288 test center in Ohio, at the University of Dayton, in conjunction with their E/O program shortly. This will be the first test and certification equipment in North America for EMVA 1288. I will see if I can get a camera with this sensor in it and test it against Aptina's spec sheet.

    AIA has adopted the standard for North America.

  • Developer

    @Vega: Some interesting claims you're making there. What would be your comment about the light sensitivity of these (anonymous) products compared to an automotive machine vision sensor like the MT9V034 used in this flow sensor?

  • There are some very small CMOS cameras with integrated thin-chip lenses that we currently sell in other markets that would bring the size of your current configuration down by an order of magnitude. The CMOS cameras have LVDS output, so they are digital in this respect.

    The cameras have been used in parallel-processing configurations to increase resolution and to provide panoramic views. These cameras run at 40 fps, but can be run at up to 100 fps. We have others that will run at up to 10,000 fps.

    We have been in the industrial machine & robotics vision markets for the past 12 years.

  • It's in fact extremely easy to integrate and work with (as the numerous videos from many different autopilot projects illustrate). However, support is not yet in any released version of ArduCopter - but I'm sure it will be soon.

  • When might this be implemented in ArduCopter? It's a really amazing feature! I have a PX4FLOW sensor, but it's quite complicated to work with.
