Geoffrey L. Barrows's Posts (26)


Arduino Pro Mini optical flow sensor


This is a follow-on to an earlier post about an Arduino-based optical flow sensor prototype. Here I constructed a complete sensor that could actually be integrated into a robotic platform. To achieve a smaller size, I used an Arduino Pro Mini board from Sparkfun and added a "shield" to interface it with a Centeye image sensor and optics. The breakout of parts is shown above: the complete sensor next to a US quarter, plus the individual blue Arduino board and the green "shield" board. On the right are individual "sensor heads", which are small PCBs holding the image sensor chip (here the "FireflySmall"), optional optics, and two capacitors. A sensor head plugs into the green shield board via a Hirose DF30 board-to-board connector. Three sensor heads are shown: on top, one with optics mounted; in the middle, the board-to-board connector side; and on the bottom, the image sensor chip side.

 

I wrote a simple Arduino script to grab pixels from the vision chip (configuring it to grab rectangular pixels), compute 1D optical flow using a variation of Srinivasan's "image interpolation algorithm", and dump a display of the optical flow to the serial monitor. (Some of you may know Professor Mandyam Srinivasan as the Australian biologist who has studied honey bee navigation, in particular how honey bees use optical flow to close control loops using simple but elegant heuristics.) A simple video of the sensor is shown below. I've also attached the Arduino script code, in case anyone is interested.
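
The core of the computation is small enough to show here. Below is a minimal sketch of the one-dimensional image interpolation idea, not the attached script itself; the names and scaling are my own. The current line image is modeled as the previous one shifted by s pixels, interpolated between copies shifted by +1 and -1 pixel, and s is then solved in closed form by least squares:

// Minimal sketch of 1D optical flow via the image interpolation idea.
// 'last' and 'curr' are line images of n pixels from successive frames;
// the return value is the estimated displacement in pixels per frame.
float flow1D(int *curr, int *last, int n) {
  long top = 0, bottom = 0;
  for (int i = 1; i < n - 1; i++) {
    int dx = last[i - 1] - last[i + 1];  // reference shifted by +/- one pixel
    int dt = curr[i] - last[i];          // change between frames
    top    += (long)dx * dt;
    bottom += (long)dx * dx;
  }
  // least-squares solution of dt = (s/2) * dx over the whole line
  return (bottom != 0) ? 2.0 * top / bottom : 0;
}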

 

 

PIO12Firefly_ProMini_LinearOF.pde

Read more…

Make your own plastic mini lens


One of my side projects has been to develop an open-source visual motion / optical flow sensor that could be used in all sorts of devices, robotic or otherwise. I've found, though, that one of the most difficult parts of developing tiny vision sensors is finding the right optics. Ideally we want something that is cheap, easy to mount, and has a decent field of view, yet still forms a reasonable image. For higher resolution images (hundreds of pixels across or more) and image sensors four or more millimeters wide, there are many excellent lens assemblies available from companies like Sunex. (We've used and still use their products in higher resolution sensor designs, always with success.) However, there is really nothing available for image sensors a millimeter wide. Printed pinholes do work, but they don't let through as much light. This post highlights a little project to design a decent lens for the Faraya64 image sensor, whose focal plane measures about 1.1mm across.

 

The basic concept (shown below) is a simple "plano-convex" lens, with the curved surface (the "bump") facing outward and the flat side resting on the image sensor. I chose acrylic as the optical material, since it is light, easily formed, and for our purposes as good as glass. An opaque "stop", perhaps made of paint or a piece of black plastic, would fit over the lens and allow light to reach the image sensor only through the center of the front bump.

 

3689382738?profile=original

For the front surface I decided on a spherical shape: the traditional shape for the vast majority of lenses out there. I used the ray tracing and optimization capabilities of Zemax, a commercial optics design software package, to find an optimal shape. To give the 1.1mm image sensor chip a 60 degree field of view, the dimensions shown above were optimal: the front bump would be a sphere with a radius of 0.61mm, and the total thickness would be 1.6mm. I also allowed the optimizer to try aspherical conic shapes (parabolas, hyperbolas, ellipses) but found that they did not gain much performance over a sphere, so I decided to keep it simple.
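
As a sanity check (these are my own back-of-envelope numbers, not output from the Zemax run), the geometry hangs together: a ray entering the stop 30 degrees off-axis refracts into the acrylic (index n of about 1.49) at asin(sin(30 deg)/1.49), or about 19.6 degrees, and after passing through the 1.6mm thickness lands about 1.6mm x tan(19.6 deg) = 0.57mm off-axis, right at the 0.55mm edge of the 1.1mm focal plane. Likewise, a single refracting surface of radius R focuses distant objects at a depth of roughly nR/(n-1) = 1.49 x 0.61mm / 0.49, or about 1.9mm; the optimizer pulls the actual thickness in a bit to balance focus against aberrations across the field.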

 

The two figures below show two plots generated by Zemax. First is the lens layout, showing the 0, 10, 20, and 30 degree off-axis rays. Second is the spot diagram, showing how the image is blurred relative to the ideal for red, green, and blue rays. The pitch between pixels on the Faraya64 image sensor is about 17.1 microns, so the RMS blurring shown here was not quite optimal, but it was good enough for me to go on.


 

Next I decided to try forming a lens by hand by stamping it. This technique won't match the precision of the Zemax simulation, but I thought it would be fun. Using Alibre, I designed a simple stamp that would produce a lens of the above shape, and had it CNC machined from aluminum by the company FirstCut.


The stamp came back, but it needed more finishing: the part of the mold that forms the spherical lens bump had to be polished to "optical quality". I didn't really know what I was doing, but I did find a way to polish this area! Using a Dremel tool, a toothpick, and some polish (Simichrome Polish by Gesswein), I put a dab of polish in the hole and used the Dremel tool to spin the toothpick. After about 5-10 minutes of spinning and moving the toothpick around, I managed to polish the spherical surface to a decent quality. I also cut channels to let excess plastic escape as the stamp was pressed down. The result is shown below.

 

Now comes the fun part: the pressing! I don't have any photos from this step, but I'll try to describe the technique that worked best. I placed a Pyrex dish on a hot plate and set the heat to low-medium. After a few minutes I placed a piece of aluminum foil on the hot plate, and a tiny piece of acrylic onto the aluminum foil. A few minutes later the acrylic got soft, and I pressed down onto it with the stamp. When I lifted the stamp out, the pressed acrylic piece (and the aluminum foil) came up with it. The picture below shows the business end of the stamp, the pressed piece of acrylic, and a U.S. quarter for size comparison. If you look carefully, you can see the spherical bump in the middle of the piece of acrylic. You can also see extra plastic around the outer edge of the lens; I cut that away with a utility knife.

 

I'd say that about 40% of the lenses I pressed turned out OK. Below and at the very top of this post are front and back pictures of some of the final lenses produced.


The next step was to mount the lens on a chip. I used a UV-curable optical adhesive (Norland 63) to glue the lens directly onto the chip, then used black modeling paint to form the opaque "stop" described above. Finally I added more optical adhesive to encapsulate everything except the lens bump. The result is shown below: the lens mounted onto a Firefly chip (similar to the Faraya), which is in turn mounted on a small 9mm x 9mm PCB. If you look closely you can also see the wire bonds that connect the chip to the PCB.

 

How was the image quality? Actually much better than I expected, especially since the flat side of the lens was not perfect. The optical adhesive's index of refraction is similar to that of acrylic, so I think together they helped smooth out some of the imperfections. Below is a screen shot; the quality is certainly good enough for 32x32 images, and I think it can be improved further with refinement.

 

So now the final step is to automate this process. Rather than press-molding by hand, injection molding is the way to go for quantity. Below is a family mold with four slightly different variations of the same lens, currently being fabricated by Protomold. I should get parts back in the second week of January, and I'll report on them in a few weeks!


 

Read more…



I'm learning to love things open source! To see what the fuss was about, I obtained an Arduino Duemilanove board from Sparkfun and decided to play around with it. It didn't take very long to assemble a (very) simple optical flow sensor using this board and one of my 16x16 Tam vision chips.

The circuit is very simple: the only electronic components are the Arduino board, a 16x16 Tam vision chip I developed at Centeye, and a single bypass capacitor. The vision chip and the bypass capacitor reside on a one inch (25.4mm) square breakout board. This particular Tam chip is easy to operate: aside from the power signals, it requires two digital inputs, a clock and a counter reset, and generates one analog pixel output. A counter on the chip determines which pixel (by row and column) is presented at the analog output line. Every time the clock is pulsed, the counter increments and the next pixel is selected; pixels are read out one row at a time. The pixel circuits themselves operate in continuous time and are always generating a voltage in response to light. The counter merely determines which pixel's voltage is sent to a voltage buffer and then off-chip.
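
To make the readout sequence concrete, here is roughly what the acquisition loop looks like in Arduino code. This is an illustrative sketch rather than the actual program, and the pin assignments are my own assumptions (pinMode setup omitted):

const int PIN_CLOCK = 2;   // increments the on-chip pixel counter
const int PIN_RESET = 3;   // resets the counter to the first pixel
const int PIN_PIXEL = A0;  // buffered analog pixel output

int img[16][16];

void readFrame() {
  digitalWrite(PIN_RESET, HIGH);   // point the counter at pixel (0,0)
  digitalWrite(PIN_RESET, LOW);
  for (int r = 0; r < 16; r++) {         // pixels come out one row at a time
    for (int c = 0; c < 16; c++) {
      img[r][c] = analogRead(PIN_PIXEL); // digitize the selected pixel
      digitalWrite(PIN_CLOCK, HIGH);     // pulse the clock: select next pixel
      digitalWrite(PIN_CLOCK, LOW);
    }
  }
}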

A simple Arduino program reads out the pixel signals, digitizes them with the Arduino/Atmel's ADC, and computes a simple one-dimensional optical flow measurement. For the optical flow algorithm, I chose a variation of the Hassenstein-Reichardt algorithm, a venerable algorithm from the 1950s that was one of the first proposed neural models of visual motion sensing. The Arduino program then dumps a simple running graph of the optical flow to the serial monitor.

The optical flow algorithm is very simple. Let pA and pB be the signals output by pixels A and B respectively. Let lp( ) be a simple low-pass filter function, which can be implemented as a running average. The optical flow estimated from pixels A and B is merely lp(pA*lp(pB)-pB*lp(pA)), with the outer low pass filter having a longer time constant than the inner low pass filters. If we have an array of pixels A, B, C, D, and so on, then we compute this algorithm once for pA and pB, then again for pB and pC, and again for pC and pD, and so on, and average the results. This certainly isn't the best algorithm one could use, but it was very simple to throw together and I was actually curious to see how it would work.
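
In code, the detector described above might look like the following sketch (my own illustrative version of the formula; the filter constants are arbitrary):

#define N 8            // pixels in the line image

float lpIn[N];         // inner low-pass state, lp(pX), per pixel
float lpOut = 0;       // outer low-pass of the averaged motion signal

float updateFlow(float p[N]) {
  float sum = 0;
  for (int i = 0; i < N; i++)            // inner filters: short time constant
    lpIn[i] += 0.5 * (p[i] - lpIn[i]);
  for (int i = 0; i < N - 1; i++)        // lp(pA*lp(pB) - pB*lp(pA)) terms
    sum += p[i] * lpIn[i + 1] - p[i + 1] * lpIn[i];
  sum /= (N - 1);                        // average over adjacent pixel pairs
  lpOut += 0.1 * (sum - lpOut);          // outer filter: longer time constant
  return lpOut;
}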

For this implementation, I'm only reading in the middle 8x8 block of pixels and row-averaging them to form an eight-pixel line image, as shown below. Thus the optical flow output you see is really computed from eight pixels' worth of image data, or seven individual optical flow measurements averaged together as described in the previous paragraph.
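
The row-averaging step is trivial; assuming a 16x16 frame stored in img[][] as in the earlier readout sketch, the middle 8x8 block collapses into a line image like so:

int line8[8];                    // eight-pixel line image
for (int c = 0; c < 8; c++) {
  int sum = 0;
  for (int r = 0; r < 8; r++)    // average each column of the block
    sum += img[r + 4][c + 4];    // middle 8x8 block of the 16x16 array
  line8[c] = sum >> 3;           // divide by 8
}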

The first video above shows the response when the 16x16 Tam chip is exposed to light and a moving card casts a moving shadow across the chip. The second video shows the response when a lens is placed over the chip, so that the image of my moving hand is tracked. The pictures below show the two complete sensor setups, with and without lens, and a close-up of the Tam chip on its breakout board.

The purpose of this experiment was to see how easy it would be to throw together a simple optical flow sensor using an Arduino board and a simple image sensor chip. The results are certainly crude, but the concept works. I think with some more work a decent Arduino-based sensor can be made, and it could be as easy to hack as pretty much any other Arduino project. (Arduino rocks!)

For those who are curious, I have a post on another forum that shows simple ASCII images taken from the image sensor and discusses the operation of the chip in greater detail.

(Note: The "Tam" chip here is similar to but not the same as the "Tamalpais" chip used in the recent post on the 125mg sensor. Both are 16x16, but the Tam has larger pixels and is simpler to operate while the Tamalpais is smaller and has better on-board amplification. There is a story behind both names...)




Read more…







As an exercise in size reduction, we have prototyped a complete optical flow sensor in a package weighing 125 milligrams and measuring 7mm x 7mm. This mass includes optics, image sensing, and all processing. Below is a video and two close-up photographs. In the video, note the green vector indicating the measured optical flow as the image moves.

Image sensor: Centeye Tamalpais 16x16 pixel image sensor (only an 8x8 block is being used), 1.3mm x 4.1mm, focal plane about 0.3mm x 0.3mm.

Optics: Proprietary printed pinhole, about 25 microns wide

Processor: Atmel ATtiny84

Optical flow algorithm: Modified "Image Interpolation" algorithm, originally developed by Prof. Mandyam Srinivasan (well known for his research on honey bee vision and navigation).

Frame rate: About 20Hz.

This work is being performed as part of Centeye's participation in the Harvard University RoboBees project, an NSF-funded project to build a robotic bee. The final target mass for the complete vision system (including processing) is on the order of 10mg to 25mg, and the system will include omnidirectional sensing as well as algorithms to detect flowers. Obviously we still have some more work to do!

We documented the construction of this sensor, with lots of photographs, in case anyone is interested.


Read more…


For a long time I've been wanting to make an ultra minimalist vision / optical flow sensor for the hobbyist and experimentalist community. I've been pursuing this as a small IR&D pet project at Centeye. We're almost there.


The above photo shows one of these sensors next to a millimeter scale. The part count is small: one of our 64x64 custom image sensors, an Atmel ATmega644 processor, several resistors and capacitors, and some lightweight flat optics we developed. Two complete sensors are shown, one with optics mounted (yes, it's that thin!). Total mass is about 440mg. The primary interface is I2C/TWI, which will allow many sensors to be hooked up to a common bus. A secondary connector exposes the ISP interface for uploading firmware.
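
As a taste of what the I2C interface might look like from the host side, here is a hypothetical polling sketch. The addresses and the two-byte flow value are assumptions for illustration; the real register map may differ:

#include <Wire.h>

const uint8_t SENSOR_ADDR[2] = {0x30, 0x31};  // one bus address per sensor (assumed)

int16_t readFlowX(uint8_t addr) {
  Wire.requestFrom(addr, (uint8_t)2);  // request a two-byte signed flow value
  int16_t hi = Wire.read();            // high byte first (assumed byte order)
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup() {
  Wire.begin();                        // join the bus as master
  Serial.begin(9600);
}

void loop() {
  for (int i = 0; i < 2; i++) {        // poll each sensor on the shared bus
    Serial.print(readFlowX(SENSOR_ADDR[i]));
    Serial.print('\t');
  }
  Serial.println();
  delay(50);
}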


We chose an ATmega processor since they are loved by hardware hackers and are easy to use. One can upload any number of different "application firmwares" to a single sensor to make it whatever one wants, limited only by the processor and the base resolution. One firmware turns it into an optical flow sensor. Another lets it track bright lights. Yet another could turn it into something else entirely. Or someone could write their own firmware, whether by tweaking existing source code (yes, I plan to share it) or writing something completely new.


An ATmega644 may not sound like much for image processing (64kB flash, 4kB SRAM, 2kB EEPROM, 20MHz max), and neither does a 64x64 array. But the reality is that if you are witty, you really don't need a lot of resolution or processing power to get some nice results. (We once did an altitude hold demo with just 16 pixels and 1 MIPS back in 2001.)


We've already made our first batch of these (about 20) and handed them out to a few close collaborators. Based on feedback we are preparing our second run. The new sensors will be slightly larger and heavier (thicker PCB) but more rigid, and use strictly 0.1" headers for all IO and power (including programming). Mass should still be under a gram.


We also have an even smaller version in the works, shown below with a chip mounted and wire bonded (sorry about the mess). This board uses an ATtiny, and the 7mm x 8mm board alone weighs about 95mg. I think we can get a whole sensor made for about 120mg, if only I had the time! (Maybe some brave person here would like to take a stab at programming it?)


Read more…

Some of you may have seen Centeye's old website showing our earlier work flying optical flow sensors on small RC-class aircraft. Much of this work was sponsored by DARPA and the U.S. Air Force. More recently we have been hacking an eFlite Blade mCX, a very stable small 7" contra-rotating coaxial helicopter.

The helicopter is a basic mCX airframe, minus the front canopy, with the out-of-box green receiver/controller board replaced by one of our own design. Our board sports an Atmel AVR32 microcontroller and an AT86RF230 wireless chip, as well as carbon resistor strips and transistor circuits to implement the swashplate servos and rotor speed controllers. We have also integrated a 6DOF IMU into the board using standard MEMS components.

In front of our controller board is a sensor ring: eight custom-designed vision sensors mounted on a flexible circuit board, plus a processor board with another AVR32. They are stacked vertically via 0.8mm board-to-board connectors (thank you, cell phone industry!). The processor board operates the eight vision sensors (which form a nice parallel system), acquires the imagery, computes optical flow, and sends high-level control signals to the controller board. The whole sensor ring, including processor, flexible ring, vision sensors, and optics, weighs about 3 grams.

Using a variation of control algorithms developed by Sean Humbert's lab at the University of Maryland at College Park, we were able to have this helicopter "hover in place" for up to six minutes straight. We could even perturb the helicopter slightly by manually moving it, and it would attempt to return to its original position. We have been able to get this demonstration working in a variety of room sizes and illumination levels. For these experiments we did not use the IMU: the helicopter held its position (including yaw angle) using purely visual information. The man in the videos above is Travis Young, who has been executing the control aspects of this project at Centeye.

Just to make it clear: all sensing, processing, and control is being performed on the helicopter. There is no human in the loop in these videos.

Centeye is participating in the NSF-funded Harvard RoboBees project, led by Harvard EECS professor Rob Wood. As part of this project, we will be building vision sensors weighing on the order of tens of milligrams. If all goes well, we should have our first prototype at this scale by this summer!

The RoboBee project will also let us do something that I personally have been wanting to do for a long time: develop a portfolio of purely consumer/academic/hobbyist vision sensors that I can get into the hands of people like the members of this community! I'll be starting a thread soon in the "Sensors and IMUs" forum where I'd enjoy discussing this with everyone.
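
For intuition, here is a cartoon of how a ring of flow sensors can hold position. To be clear, this is my own illustrative sketch of the general wide-field idea, not Centeye's code or the actual algorithm from Prof. Humbert's lab; the sensor geometry, gains, and names are invented:

#include <math.h>

#define NUM_SENSORS 8

float pitchCmd, rollCmd, yawCmd;   // outputs to the attitude controller

void hoverUpdate(const float flowH[NUM_SENSORS]) {
  // Project the ring of horizontal flow measurements onto cosine, sine,
  // and uniform weightings to separate x-drift, y-drift, and yaw rate.
  float tx = 0, ty = 0, wz = 0;
  for (int i = 0; i < NUM_SENSORS; i++) {
    float theta = i * 2.0 * M_PI / NUM_SENSORS;  // sensor azimuth on the ring
    tx += flowH[i] * cos(theta);
    ty += flowH[i] * sin(theta);
    wz += flowH[i] / NUM_SENSORS;  // uniform flow component ~ yaw rotation
  }
  const float KP = 0.8, KY = 0.5;  // arbitrary gains
  pitchCmd = -KP * tx;             // tilt against the measured drift
  rollCmd  = -KP * ty;
  yawCmd   = -KY * wz;
}

Since optical flow measures velocity rather than position, a fuller version would also integrate tx and ty over time and feed the integrals back, so the vehicle returns to where it started instead of merely damping its drift.
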
Read more…