For a long time I've wanted to make an ultra-minimalist vision / optical flow sensor for the hobbyist and experimentalist community. I've been pursuing this as a small IR&D pet project at Centeye. We're almost there.


The above photo shows one of these sensors next to a millimeter scale. The part count is small: one of our 64x64 custom image sensors, an Atmel ATmega644 processor, several resistors and capacitors, and some lightweight flat optics we developed. Two complete sensors are shown, including one with mounted optics (yes, it's that thin!). Total mass is about 440 mg. The primary interface is I2C/TWI, which allows many sensors to be hooked up to a common bus. A secondary connector provides the ISP interface for uploading firmware.


We chose an ATmega processor since hardware hackers love them and they are easy to use. The idea is that one can upload any number of different "application firmwares" to a single sensor to make it whatever one wants, limited only by the processor and the base resolution. One firmware turns it into an optical flow sensor. Another lets it track bright lights. Yet another could turn it into something else entirely. Or someone could write their own firmware, whether by tweaking existing source code (yes, I plan to share it) or by writing something completely new.


An ATmega644 may not sound like much for image processing: 64kB flash, 4kB SRAM, 2kB EEPROM, 20MHz max. Neither does a 64x64 array. But the reality is that if you are witty, you really don't need a lot of resolution or processing power to get some nice results. (We once did an altitude hold demo with just 16 pixels and 1 MIPS back in 2001.)


We've already made our first batch of these (about 20) and handed them out to a few close collaborators. Based on their feedback we are preparing our second run. The new sensors will be slightly larger and heavier (thicker PCB) but more rigid, and will use strictly 0.1" headers for all I/O and power (including programming). Mass should still be under a gram.


We also have an even smaller version in the works, shown below with a chip mounted and wire bonded (sorry about the mess). This board uses an ATtiny, and the 7mm x 8mm board alone weighs about 95mg. I think we can get a whole sensor down to about 120mg, if only I had the time! (Maybe some brave person here would like to take a stab at programming it?)



Comments

  • Fantastic. What does it cost to have these single-run chip fabs?
  • I wouldn't call an Arduino + a 16x16 sensor silly. Try averaging the 16x16 array down to a 1x16 or 16x1 array of rectangular pixels, essentially averaging rows or columns. Then feed the 16-element "line image" into a 1D optical flow algorithm. The result is a projection of the 2D optical flow vector onto a vector representing the array. 1D algorithms typically require a fraction of the computational power per pixel of a 2D algorithm (depending on the algorithm, of course), and you've reduced the number of pixels by a factor of 16. You've just reduced the computational load by a factor of maybe forty. You can do this twice, once in the X direction and once in the Y, and crudely reconstruct the 2D vector if you wish, as in the sketch below. But the truth is you can do altitude hold, terrain following, and even some obstacle avoidance with just a 1D OF measurement.
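    To make that concrete, here is a rough, untested C++ sketch of the idea. It is not Centeye's firmware; the array size and the textbook 1D gradient estimator are just illustrative:

    ```cpp
    #include <stdint.h>

    const int N = 16;

    // Collapse a 16x16 frame into a 16-element "line image" by averaging
    // each column into one rectangular pixel.
    void columnAverage(const uint8_t img[N][N], float line[N]) {
      for (int x = 0; x < N; x++) {
        int sum = 0;
        for (int y = 0; y < N; y++) sum += img[y][x];
        line[x] = sum / (float)N;
      }
    }

    // Textbook 1D gradient (Lucas-Kanade style) estimate of the shift, in
    // pixels per frame, between two successive line images.
    float flow1D(const float prev[N], const float curr[N]) {
      float num = 0, den = 0;
      for (int x = 1; x < N - 1; x++) {
        float dx = (prev[x + 1] - prev[x - 1]) * 0.5f; // spatial gradient
        float dt = curr[x] - prev[x];                  // temporal difference
        num += dx * dt;
        den += dx * dx;
      }
      return (den > 0) ? -num / den : 0.0f; // least-squares shift estimate
    }
    ```

    Run it once on row averages and once on column averages to crudely recover the X and Y components of the 2D flow.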
  • This looks awesome (and a bit of an upgrade from my silly Arduino Nano + 16x16-pixel optical mouse sensor).
  • Maik, good question.

    We haven't tried this particular sensor over grass yet. We have tried it over asphalt (exploring tire slip measurement) and it sensed that texture. However, we did fly earlier sensors over grass and they worked fine. Obviously the sensors could not pick up every blade of grass, but generally there is enough lower spatial frequency texture variation for the sensor to pick up motion; no lawn is perfectly uniform. We've never flown over water; that would be a worthwhile challenge! We did fly an earlier sensor over snow, though, and that worked.

    One consideration is that the "earlier" sensor we flew had a different focal plane architecture and, more importantly, an optical flow algorithm that more gracefully handled low contrast environments. Those were insect vision inspired algorithms, as opposed to more traditional algorithms like Lucas-Kanade or Image Interpolation (both variants of gradient methods).

    We can actually make an application firmware version that uses these insect-inspired methods. On the plus side, they are quite sensitive and fast. On the negative side, they are a bit noisy and not appropriate for, say, odometry; see the rough sketch below.
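    For the curious, here is a very rough sketch of what an insect-style estimator can look like: a generic Hassenstein-Reichardt correlator over a 1D line image. This is the textbook form, not Centeye's algorithm, and the filter constant is made up:

    ```cpp
    const int N = 16;
    static float delayed[N]; // low-passed (delayed) copies of past samples

    // Returns a motion-energy style response: positive for motion in +x,
    // negative for -x. Sensitive and cheap, but uncalibrated and noisy,
    // as noted above (its magnitude depends on contrast as well as speed).
    float emdResponse(const float line[N]) {
      const float ALPHA = 0.5f; // delay filter constant (illustrative)
      float response = 0;
      for (int x = 0; x < N - 1; x++) {
        // Correlate each pixel's delayed signal with its neighbor's
        // current signal, in both directions, and subtract.
        response += delayed[x] * line[x + 1] - delayed[x + 1] * line[x];
      }
      for (int x = 0; x < N; x++) // update the delay line
        delayed[x] = ALPHA * line[x] + (1.0f - ALPHA) * delayed[x];
      return response;
    }
    ```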
  • And at least Paparazzi has pretty good wind estimation, which might make it sufficiently precise for a soft landing. Cool stuff.
    Geoffrey, how well does the OF algorithm cope with flying above water or grass? I could imagine that it's way too uniform a surface to detect optical flow. Then again, I can't imagine it working at all with that kind of resolution and computing power ;)
  • It looks like two people are already interested in programming the Tiny. Awesome!!
    Xander: the earlier altitude hold demos were on a fixed-wing aircraft. Essentially we assumed a (somewhat) constant ground speed, and thus OF is inversely proportional to height. The aircraft tended to fly lower into the wind and higher with it, but at least it kept off the ground, which was most important.
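    (Stated loosely: a downward-looking sensor at height h moving at ground speed v over flat terrain sees a translational optical flow of about omega = v / h radians per second. So if v is held roughly constant, the flow reading is effectively a proxy for 1/h, and regulating omega to a setpoint regulates altitude.)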
  • Sounds fantastic!

    How were you able to measure altitude while looking down? Does that require the sensor to notice a sort of zooming in and out effect of the features it sees? Seems like that would be tough while also moving.

    I look forward to hearing about pricing.
  • It's been a while since I've been to the Centeye website. I'm glad to hear of the progress being made.

    I'd also be willing to try programming one, if you're still looking for volunteers.

    - Roy
  • For 20 grams you could have a whole array of these things looking all around! More seriously, one of these sensors should support terrain following (though we haven't tried that in a long time) and *might* be able to provide obstacle avoidance against larger obstacles, though the latter will take some more algorithm development.
    Regarding landing: Back in 2001 we tried just that. We put a sensor on a fixed-wing foam Wingo (remember those?) and aimed it down. We could control altitude by applying a proportional control rule to the elevator or throttle based on the measured downward optical flow; it actually worked pretty well. Then we tried very simple take-off and landing. For take-off, we would push the throttle up; the Wingo would take off and stop ascending when it reached the set altitude (more or less). For landing, we had the elevator cause the Wingo to flare up a bit before touching down. These of course were simple toy tests and not what you need. But to answer your question: yes, I think one of these sensors could do this. One thing about landing is that the optical flow will generally be forward-backward, so one can use a one-dimensional algorithm and fewer pixels, giving a greater frame rate for a given processor. A toy version of that proportional rule is sketched below.
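    Here is what that rule can look like as code; the gains and setpoint are made up for illustration, and real tuning and safety logic are up to you:

    ```cpp
    // Map measured downward optical flow to a throttle command in [0, 1].
    float altitudeHoldThrottle(float measuredOF, float throttleTrim) {
      const float OF_SETPOINT = 2.0f; // target flow, rad/s (illustrative)
      const float KP = 0.15f;         // proportional gain (illustrative)
      // Flow above the setpoint means the aircraft is too low, so climb;
      // flow below it means too high, so ease off.
      float cmd = throttleTrim + KP * (measuredOF - OF_SETPOINT);
      if (cmd < 0.0f) cmd = 0.0f;     // clamp to the valid throttle range
      if (cmd > 1.0f) cmd = 1.0f;
      return cmd;
    }
    ```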
    Regarding data stream: Right now we have just a single firmware prototype with two operating modes. The first mode outputs six bytes: X OF, Y OF, X integrated OF, Y integrated OF, X low-passed OF, Y low-passed OF. The second mode outputs just two bytes: X integrated OF and Y integrated OF, but with the integral computed in a different manner. So basically once you've set up the sensor (sending it a few commands by I2C) you just do I2C reads from it and read out those bytes. Of course, if you hack the firmware you can provide other options.
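    For anyone wondering what the host side might look like, here is a minimal Arduino-style sketch of the six-byte read. The I2C address and the mode-select command byte are placeholders until the firmware is released, and signed 8-bit values are an assumption:

    ```cpp
    #include <Wire.h>

    const uint8_t SENSOR_ADDR = 0x33; // hypothetical 7-bit I2C address

    void setup() {
      Wire.begin();
      Serial.begin(9600);
      Wire.beginTransmission(SENSOR_ADDR);
      Wire.write(0x01);               // hypothetical "six-byte mode" command
      Wire.endTransmission();
    }

    void loop() {
      // Read the six bytes described above: X OF, Y OF, X integrated,
      // Y integrated, X low-passed, Y low-passed (assuming signed bytes).
      Wire.requestFrom(SENSOR_ADDR, (uint8_t)6);
      if (Wire.available() >= 6) {
        int8_t xOF  = Wire.read(), yOF  = Wire.read();
        int8_t xInt = Wire.read(), yInt = Wire.read();
        int8_t xLP  = Wire.read(), yLP  = Wire.read();
        Serial.print(xOF);  Serial.print(' ');
        Serial.print(yOF);  Serial.print(' ');
        Serial.print(xInt); Serial.print(' ');
        Serial.print(yInt); Serial.print(' ');
        Serial.print(xLP);  Serial.print(' ');
        Serial.println(yLP);
      }
      delay(20);
    }
    ```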
  • Geoff: what kind of data do you hand upstream from your sensor? Is it a 2D vector representing the sensed flow rate in "sensor" space?