Thank you to everyone who came here after reading my blog post today (and thank you to Chris Anderson for providing a forum like this one!).

At Centeye, we've been developing optical flow / vision sensors and integrating them on RC aircraft test beds for years now. Check out our website for details if you wish. In 2001 we achieved altitude hold on a fixed-wing aircraft (a foam Wingo, remember those?) using an early "Ladybug" sensor, and in 2003 we demonstrated basic obstacle avoidance on a scratch-built aircraft. Some of this work was described in a Wired Online article back in 2004. Ever since then, many people have expressed an interest in these sensors.

Now, partially due to our participation in the Harvard RoboBee project and partially with our own funds, we are in a position to develop a portfolio of academic/experimentalist/consumer class sensors. The current batch of silicon vision sensors (yes, we design our own silicon!) is in fabrication and should be finished sometime in February.

I would like to find out whether people here would be interested in such sensors. More importantly, what would you like? I know it is a very broad question, but I would like to have this discussion.

For our first version, I am envisioning a true minimalist design using one of our imagers and an Atmel AVR for processing. We could probably make the whole thing weigh less than a gram. If fully functional, the sensor will operate off a single LiPo battery; the vision sensor will have its own integrated linear regulator. The AVR itself can be programmed with different firmware to implement a different sensor using the same piece of hardware. I envision this being useful for air vehicles, ground vehicles, and non-robotic uses as well (sensor networks? home automation?).

The interface will likely be via I2C or SPI. I am partial to I2C because it allows multiple sensors to be hooked up on the same four-wire bus (ground, power, clock, data), but SPI is faster.

I'll start with this. I have more thoughts, which I'll add based on interest. I'm hoping all of you wear me out here!


Replies

  • Developer
    I think these sensors could be very useful for accomplishing lots of UAV tasks. I would be interested in them for a variety of airframe platforms. For inexpensive near-term sensors, my priorities would be:

    Fixed wing functions: Heading hold for auto-takeoff, pattern recognition (runway) and spot tracking for auto-land

    Quad/helicopter functions: OF for stationkeeping

    I would prefer a platform that allows tweaking.
  • Hello Geoffrey
    Great work! As far as the wishlist is concerned, I'd like to use it on a fixed wing aircraft for

    1) altitude hold
    2) obstacle avoidance
    3) autonomous landing

    I would not care about it being a turnkey system. I prefer to tinker with things, but that's my personal view. A $50 price per sensor could be OK if a single sensor can perform at least one of the three functions above. I have some questions about the inner workings:

    a) What kind of signal does it output? The x/y components of OF? Is it an analog or digital signal?
    b) Does it assume the optical flow is uniform across the whole image, as the sensors inside optical mice do?
    c) What happens if there are different flows in the viewed image (different directions / different speeds)?
    d) What happens if the sensor is looking at a fixed (sufficiently textured) scene and rotates about the axis perpendicular to the image plane?
    e) (Similar to question c) What happens if the sensor is "walking" down a corridor, i.e. it sees flows with different directions on the two sides of the image?

    My congratulations again, and sorry for boring you with these questions.
    Thank you
  • Developer
    I think your minimalist design sounds like a good idea. It gives people the flexibility to integrate the information with whatever other sensor data they have to come up with a complete picture.

    I also vote for the I2C interface. If you keep a cache of, say, the last 32 values, that might make it more convenient for CPUs to come and get the values at a time that's convenient for them.

    And keep it cheap! Especially if we need to buy 9 of them!
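The suggested cache of the last 32 values maps naturally onto a small ring buffer on the AVR. A minimal sketch (names and the 32-sample depth are assumptions from the suggestion above, not a specification):

```c
#include <stdint.h>

#define CACHE_LEN 32  /* power of two so the wrap is a cheap mask */

typedef struct {
    int16_t buf[CACHE_LEN];
    uint8_t head;   /* index of the next slot to write */
    uint8_t count;  /* number of valid samples, saturates at CACHE_LEN */
} of_cache_t;

/* Store a new flow sample, overwriting the oldest when full. */
static void cache_push(of_cache_t *c, int16_t v)
{
    c->buf[c->head] = v;
    c->head = (uint8_t)((c->head + 1) & (CACHE_LEN - 1));
    if (c->count < CACHE_LEN)
        c->count++;
}

/* Return the n-th most recent sample (n = 0 is the newest). */
static int16_t cache_get(const of_cache_t *c, uint8_t n)
{
    return c->buf[(c->head - 1 - n) & (CACHE_LEN - 1)];
}
```

On the I2C side, the host could then read out any of the cached samples whenever it polls, rather than having to synchronize with the sensor's frame rate.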
  • Your first version sounds perfect, and wow, that is lightweight. The main factor determining its success, I think, would be the software interface; it needs to be easy to implement. Could a sensor of this kind output a stream of forward/back translation, side-to-side translation, and altitude values (translational movements along the x, y, and z axes) for simple connection to a flight controller board, or do you need a 360-degree ring of cameras to do this?
    I can envisage something like this in an outdoor quadcopter eliminating the need for GPS position hold and aiding barometric altitude hold. It could fight wind gusts just like a hoverfly.
    Wishlist: an optical flow system designed for a quadcopter/multicopter, weighing about 10 g, with a larger lens-based camera, that outputs translation along the x, y, and z axes, updating at ~50 Hz. I wonder if you could use high-quality mobile phone cameras for this?