Thank you to everyone who came here after reading my blog post from today (and thank you to Chris Anderson for providing a forum like this one!).

At Centeye, we've been developing optical flow / vision sensors and integrating them on RC aircraft test beds for years now. Check out our website for details if you wish. In 2001 we demonstrated altitude hold on a fixed-wing aircraft (a foam Wingo, remember those?) using an early "Ladybug" sensor, and in 2003 basic obstacle avoidance using a scratch-built aircraft. Some of this work was described in a Wired Online article back in 2004, and ever since then many people have expressed an interest in these sensors.

Now, partially due to our participation in the Harvard RoboBee project and partially from our own funds, we are in a position to develop a portfolio of academic/experimentalist/consumer class sensors. The current batch of silicon vision sensors (yes, we design our own silicon!) is in fabrication and should be finished sometime in February.

I would like to find out whether people here would be interested in such sensors. More importantly, what would you like? I know it is a very broad question, but I would like to have this discussion.

For our first version, I am envisioning a true minimalist design using one of our imagers and an Atmel AVR for processing. We could probably make the whole thing weigh less than a gram. If fully functional, the sensor will operate off a single LiPo battery; the vision sensor will have its own integrated linear regulator. The AVR itself can be reprogrammed with different firmware to implement a different sensor using the same piece of hardware. I envision this being useful for air vehicles, ground vehicles, and non-robotic uses as well (sensor networks? home automation?)
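To give a feel for the kind of computation an 8-bit AVR can handle, here is a minimal sketch of a one-dimensional gradient-based optical flow estimate over a short pixel row. This is purely illustrative: `flow_1d` and the 16-pixel row size are invented for the example, and this is not Centeye's actual algorithm.

```c
#define NPIX 16  /* illustrative row length, not a real imager spec */

/* Estimate the sub-pixel displacement d between two 1-D pixel rows,
 * a (earlier frame) and b (later frame), assuming a small shift.
 * Uses the standard gradient constraint: b[i] - a[i] ~ -d * da/dx.
 * Integer accumulators keep the inner loop cheap on an 8-bit MCU;
 * only the final division needs floating point. */
float flow_1d(const unsigned char *a, const unsigned char *b)
{
    long num = 0, den = 0;
    for (int i = 1; i < NPIX - 1; i++) {
        int dt = (int)b[i] - (int)a[i];                /* temporal difference */
        int dx = ((int)a[i + 1] - (int)a[i - 1]) / 2;  /* spatial gradient   */
        num += (long)dt * dx;
        den += (long)dx * dx;
    }
    return den ? -(float)num / (float)den : 0.0f;
}
```

A positive result means the pattern moved toward higher pixel indices; a real sensor would run something like this on many rows/columns and in both axes.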

The interface will likely be I2C or SPI. I am partial to I2C because it allows multiple sensors to be hooked up on the same four-wire bus (ground, power, clock, data), but SPI is faster.
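As a sketch of why the shared bus is attractive: a host could poll several sensors at different addresses over one I2C bus. The register map (`REG_FLOW_X`/`REG_FLOW_Y`) and addresses below are invented for illustration, and the bus read is a stand-in; real firmware would go through the AVR's TWI peripheral or the host's I2C driver.

```c
#define REG_FLOW_X 0x02  /* hypothetical register: signed X flow */
#define REG_FLOW_Y 0x03  /* hypothetical register: signed Y flow */

/* Stand-in for a real I2C register read (e.g. via the AVR TWI
 * peripheral); here it returns fixed values so the sketch is
 * self-contained and runnable. */
static signed char i2c_read_reg(unsigned char addr, unsigned char reg)
{
    return (reg == REG_FLOW_X) ? (signed char)(addr & 0x0F)
                               : (signed char)-1;
}

/* Sum the flow reported by n sensors sharing the bus into (*ox, *oy),
 * e.g. to combine several fields of view before feeding an autopilot. */
void poll_flow(const unsigned char *addrs, int n, int *ox, int *oy)
{
    *ox = 0;
    *oy = 0;
    for (int i = 0; i < n; i++) {
        *ox += i2c_read_reg(addrs[i], REG_FLOW_X);
        *oy += i2c_read_reg(addrs[i], REG_FLOW_Y);
    }
}
```

With SPI each sensor would instead need its own chip-select line, which is the wiring cost the I2C option avoids.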

I'll start with this. I have more thoughts, which I'll add based on interest. I'm hoping all of you wear me out here!



  • Developer
I think these sensors could be very useful for accomplishing lots of UAV tasks. I would be interested in them for a variety of airframe platforms. For inexpensive near-term sensors, my priorities would be:

    Fixed wing functions: Heading hold for auto-takeoff, pattern recognition (runway) and spot tracking for auto-land

    Quad/helicopter functions: optical flow for stationkeeping

    I would prefer a platform that allows tweaking.
  • Hello Geoffrey
    Great work! As far as the wishlist is concerned, I'd like to use it on a fixed wing aircraft for

    1) altitude hold
    2) obstacle avoidance
    3) autonomous landing

    I would not care about it being a turnkey system. I prefer to tinker with things, but that's my personal view. A $50 price per sensor could be OK if a single sensor can perform at least one of the three functions above. I have some questions about the inner workings:

    a) what kind of signal does it output? The x/y components of optical flow? Is it an analog or digital signal?
    b) does it assume the optical flow is uniform across the whole image, as the sensors inside optical mice do?
    c) what happens if there are different flows in the viewed image (different directions / different speeds)?
    d) what happens if the sensor is looking at a fixed (sufficiently textured) scene and rotates about the axis perpendicular to the image plane?
    e) (similar to question c) what happens if the sensor is "walking" down a corridor, i.e. it sees flows with different directions on the two sides of the image?

    My congratulations again, and sorry for boring you with these questions.
    Thank you
  • Developer
    I think your minimalist design sounds like a good idea. It gives people the flexibility to integrate the information with whatever other sensor data they have to build a complete picture.

    I also vote for the I2C interface. If you keep a cache of, say, the last 32 values, that might make it more convenient for CPUs to come and get the values at a time that's convenient for them.

    And keep it cheap! Especially if we need to buy 9 of them!
  • Your first version sounds perfect, and wow, that is lightweight. The main factor determining its success would be the software interface, I think. It needs to be easy to implement. Could a sensor of this kind output a stream of forward/back translation, side-to-side translation, and altitude values (translational movements along the x, y, and z axes) for simple connection to a flight controller board, or do you need a 360-degree ring of cameras to do this?
    I can envisage something like this in an outdoor quadcopter eliminating the need for GPS position hold and aiding barometric pressure hold. It could fight wind gusts just like a hoverfly.
    Wishlist: an optical flow system designed for a quadcopter/multicopter, weighing about 10 g, with a larger lens-based camera, outputting translation along the x, y, and z axes and updating at ~50 Hz. I wonder if you could use high-quality mobile phone camera modules for this?
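The cached-readings idea suggested in the comments could be sketched as a small ring buffer on the sensor: the firmware pushes each new measurement, and the host fetches recent history over I2C/SPI whenever convenient. All names and the 32-entry depth are illustrative.

```c
#define FLOW_HISTORY 32  /* depth suggested in the comments; illustrative */

typedef struct {
    signed char buf[FLOW_HISTORY];
    unsigned char head;   /* next write position */
    unsigned char count;  /* valid entries, saturates at FLOW_HISTORY */
} flow_cache;

/* Store a new measurement, overwriting the oldest once full. */
void cache_push(flow_cache *c, signed char v)
{
    c->buf[c->head] = v;
    c->head = (unsigned char)((c->head + 1) % FLOW_HISTORY);
    if (c->count < FLOW_HISTORY)
        c->count++;
}

/* Return the k-th most recent value (k = 0 is the newest). */
signed char cache_get(const flow_cache *c, unsigned char k)
{
    unsigned char idx =
        (unsigned char)((c->head + FLOW_HISTORY - 1 - k) % FLOW_HISTORY);
    return c->buf[idx];
}
```

On the wire, the host might read `count` followed by the entries it has missed since its last poll, decoupling the sensor's update rate from the autopilot's loop rate.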