3D Robotics

Centeye optical flow copters on TechCrunch!

Our friend Geoff Barrows, who has been working with us on his great optical flow boards, just got covered by TechCrunch! UPDATE: They've disabled embedding for the cool video, but you can see it here.

When we first wandered up to the suburban split-level that houses Centeye Inc., we were a bit confused. Could this be the place where a mad roboticist was building tiny robots with insect eyes and brains that could interact with their environment? We rang the doorbell and weren’t disappointed.

Founded by Geoffrey Barrows, Centeye is dedicated to computer vision. They make little electronic eyes that are cheap to reproduce and "see" only a few thousand pixels. He has a staff of two engineers who work with him on designing and building chips, and he has just released the open-source ArduEye board, a tiny Arduino board with a built-in camera.

Barrows does everything from his basement. Recent advances in fabrication allow him and his staff to design chips on a computer at home and then send the plans to manufacturers in Asia. They can then mass produce their eyes, driving down the cost per unit to a few dollars. They don’t need a big lab because everything is done remotely.

Their robots are actually proofs of concept, but they're really cool. The little helicopters use Centeye eyes to remain stationary in space, and other models can avoid objects as they move. Because each eye takes in a small part of the scene, not much computing power is needed to process each bit of input. As with insects, the brain doesn't have to work very hard to get a lot done.
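The low-compute idea described above can be illustrated with a toy example: estimating one-dimensional optical flow over a handful of pixels by simple shift-and-compare (block matching). This is only a sketch under assumed names and a deliberately basic scheme, not Centeye's actual algorithm:

```python
import numpy as np

def flow_1d(prev, curr, max_shift=3):
    # Find the integer pixel shift that best aligns the previous frame
    # with the current one; that shift is the optical flow estimate.
    # Only the interior pixels are compared so every candidate shift
    # stays within the array bounds.
    best_shift, best_err = 0, np.inf
    a = prev[max_shift:len(prev) - max_shift]
    for s in range(-max_shift, max_shift + 1):
        b = curr[max_shift + s : len(curr) - max_shift + s]
        err = np.sum((a - b) ** 2)  # sum of squared differences
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```

Even over a few thousand pixels, a search like this is a handful of subtractions and additions per frame, which is why a small sensor can get by with very little processing.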

Centeye has contracts with DARPA but is trying to commercialize their hardware with the Arduino offerings. It’s fascinating to see makers in their own habitat and even more exciting to see them make cool stuff in the oddest of places. Check out the video for more information and you can watch all of our TC Makers episodes here


Comments

  • Cool.  It's fascinating to think insect circuits will get cracked enough to have their functions put on a chip. 

    Maybe one day, electronics will actually have insect parts grown and embedded.  Need a camera?  Grow it and hook it up.  Need a flapping mechanism?  Oh, never mind.  Damn that MIT press, back to reality for me.

  • Fusing optical and inertial: for that demo we didn't use a gyro, but of course it makes sense to use one since they are so small and cheap now. A simple way to fuse the two is to use the gyro to compute the rotational optical flow component and subtract it out, leaving the translational component. That can be pretty tricky in practice, though, for four reasons: the two sensory modes can have somewhat different transient responses; the gyro samplings must be timed precisely with the image acquisition; vibrations can affect one sensor or the other; and as the image field of view gets larger, the pixels at the edge are closer together (in angle) than those at the center, so they need a different scaling factor. In practice we've never been able to remove more than 90% of the rotational component with our particular hardware.
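The subtraction Geoff describes can be sketched as follows; the function names, the pinhole-camera focal length, and the cos² angular-scaling term are illustrative assumptions, not Centeye's implementation:

```python
import math

def pixel_angle_scale(x_px, focal_px):
    # In a pinhole model, a pixel at image coordinate x subtends an
    # angle proportional to cos^2(theta): pixels near the edge of a
    # wide field of view cover a smaller angle than those at the center.
    theta = math.atan(x_px / focal_px)
    return math.cos(theta) ** 2

def derotate(flow_px, x_px, gyro_rate_rad_s, dt, focal_px):
    # Predict the rotational flow (in pixels per frame) from the gyro
    # rate, scaled for the pixel's position in the field of view, and
    # subtract it; the residual approximates the translational flow.
    rot_px = gyro_rate_rad_s * dt * focal_px / pixel_angle_scale(x_px, focal_px)
    return flow_px - rot_px
```

Note that the same body rotation produces more pixel motion at the image edge than at the center, which is exactly why a single scaling factor is not enough across a wide field of view.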

  • I've occasionally dreamed of initially fusing optical with inertial, then running a genetic search for the optimum algorithm/network, which is then turned into a chip that only uses optical and can adapt on the fly. Cracker Jacks promised that PhD but it hasn't arrived yet and they won't return my calls, so Centeye will probably beat me to it.

  • Good comments! Monroe, I love your enthusiasm! Carl, that copter just had "hover in place" by flying in formation with nearby objects, so when I moved my hands nearby the copter just tried to keep in formation. The result looks pretty cool, though. Regarding the MIT garage: true, that demo involved flight along a predetermined path with a priori knowledge of the environment, but no doubt it was a lot of work to get that demo working! A slower-moving drone would definitely have an easier time; slower moving means more time available and less energy expended to alter the flight path to avoid danger. If you consider that insects are both tiny and able to turn on a dime (literally), it is not surprising they can meander through the tightest environments...

  • You are right, Alex, this thing could probably negotiate the parking garage more easily than a fixed wing. Unless it could hover, a fixed wing would have to maintain a certain airspeed or it would fall to the ground.

  • @Carl: I would imagine that it might be easier to do with a 'copter since it doesn't move so fast and therefore doesn't have to make such quick decisions.

  • Alex, this thing is totally cool. The MIT fixed wing flew a pre-programmed pattern; it would have flown the same pattern had it been in a football field. The "eye" has to change focus on an object, or it might be a form of sonar like Jack Crossfire mentioned. What I thought made this unique: the guy moved his hands up and it moved out of the way on its own (unless there was someone out of view controlling it?)

  • I wonder if it could navigate the parking garage like the MIT fixed-wing craft did?

  • I should google the names in my inbox more often. The trick is that it's not turning or changing position on its own. You can get a stationary hover out of sonar, too. A preprogrammed set of expected scenes for each location and each heading would be required.

  • I thought it was noteworthy that it's analog. I did a search on neural networks and found something mentioning that bio-inspired computing relies mostly on analog. I'd like to know more.
