Detecting and locating lights using an Arduino and an image sensor

I've been experimenting with an Arduino-powered vision system to detect and locate point light sources in an environment. The hardware setup is an Arduino Duemilanove, a Centeye Stonyman image sensor chip, and a printed pinhole. The Arduino acquires a 16x16 window of pixels centered underneath the pinhole, which covers a good part of the hemispherical field of view in front of the sensor. (This setup is part of a new ArduEye system that will be released soon...)

The algorithm flags a pixel as a point light source if the following four conditions are met: First, the pixel must be brighter than its eight neighbors. Second, the pixel's intensity must exceed an "intensity threshold". Third, the pixel must be brighter, by a "convexity threshold", than the average of its upper and lower neighbors. Fourth, the pixel must similarly be brighter, by the same threshold, than the average of its left and right neighbors. The algorithm detects up to ten points of light, and the Arduino sketch then dumps the detected light locations to the Arduino serial monitor.
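For readers who want the four conditions spelled out, here is a minimal sketch of that detection rule (not the released LightTracker code). It assumes the 16x16 window is stored row-major in an array, that larger values mean brighter pixels, and that the threshold values are placeholders you would tune for your own sensor:

```cpp
#define WIN 16          // width/height of the acquired pixel window
#define MAX_LIGHTS 10   // maximum number of lights reported

struct Light { byte row; byte col; };

// Scan the interior of the window and return how many lights were found.
byte findLights(const short img[WIN * WIN],
                short intensityThresh, short convexityThresh,
                Light lights[MAX_LIGHTS]) {
  byte count = 0;
  for (byte r = 1; r < WIN - 1 && count < MAX_LIGHTS; r++) {
    for (byte c = 1; c < WIN - 1 && count < MAX_LIGHTS; c++) {
      short p = img[r * WIN + c];

      // Condition 2: brighter than the intensity threshold
      if (p <= intensityThresh) continue;

      // Condition 1: strictly brighter than all eight neighbors
      bool isMax = true;
      for (int8_t dr = -1; dr <= 1 && isMax; dr++) {
        for (int8_t dc = -1; dc <= 1; dc++) {
          if (dr == 0 && dc == 0) continue;
          if (img[(r + dr) * WIN + (c + dc)] >= p) { isMax = false; break; }
        }
      }
      if (!isMax) continue;

      // Conditions 3 and 4: exceed the vertical and horizontal neighbor
      // averages by at least the convexity threshold
      short vertAvg = (img[(r - 1) * WIN + c] + img[(r + 1) * WIN + c]) / 2;
      short horzAvg = (img[r * WIN + (c - 1)] + img[r * WIN + (c + 1)]) / 2;
      if (p - vertAvg < convexityThresh) continue;
      if (p - horzAvg < convexityThresh) continue;

      lights[count].row = r;
      lights[count].col = c;
      count++;
    }
  }
  return count;
}
```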

A 16x16 resolution may not seem like much when spread out over a wide field of view, so to boost accuracy we use a well-known "hyperacuity" technique to refine the pixel position estimate to a precision of about a tenth of a pixel. The picture below shows the technique: if a point of light is detected at a pixel, the algorithm takes that pixel's intensity along with its left and right neighbors, fits a second-order Lagrange polynomial through the three values, and computes the maximum of that polynomial. This gives us "h", a subpixel refinement value that we add to the pixel's whole-valued horizontal position. The algorithm then does the same thing to refine the vertical position using the intensities above and below the pixel in question. (Those of you who have studied SIFT feature descriptors will recognize this technique.) The nice thing about this approach is that you get roughly the precision of a 140x140 image for light tracking without exceeding the Arduino's 2kB memory limit.
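Here is a small sketch of that refinement step, using the closed-form vertex of the parabola through the three samples. The function and variable names are mine, not from the released sketch:

```cpp
// Fit a parabola through the points (-1, left), (0, center), (+1, right)
// and return the horizontal offset "h" of its maximum. When "center" is a
// strict local maximum, the result lies in the range -0.5 .. +0.5.
float subpixelOffset(short left, short center, short right) {
  float denom = (float)left - 2.0f * (float)center + (float)right;
  if (denom == 0.0f) return 0.0f;  // flat neighborhood: no refinement
  return 0.5f * ((float)left - (float)right) / denom;
}

// Example use for the horizontal axis; the vertical axis works the same way
// with the pixels above and below:
//   float refinedCol = (float)c + subpixelOffset(img[r*WIN + c - 1],
//                                                img[r*WIN + c],
//                                                img[r*WIN + c + 1]);
```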

The algorithm takes about 30 milliseconds to acquire a 16x16 image and another 2 or 3 milliseconds to locate the lights.

The first video shows detection of a single point light source, both with and without hyperacuity position refinement. When I add a flashlight, a second point is detected. The second video shows detection of three lights (dining room pendant lamps) including when they are dimmed way down.

It would be interesting to hack such a sensor onto a quadrotor or another robotic platform: bright lights could serve as markers, or even targets, for navigation. Perhaps each quadrotor could have an LED attached to it, and then the quadrotors could be programmed to fly in formation or (if you are brave) pursue each other.

With additional programming, the sensor could also implement optical flow computations much like those I described in a previous post.

SOURCE CODE AND PCB FILES:

The main Arduino sketch file can be found here: LightTracker_v1.zip

You will still need library files to run it. I've put these, as well as support documentation and Eagle files for the PCBs, in the downloads section of a Google Code project, located here: http://code.google.com/p/ardueye-rocket-libraries/downloads/list


Comment by Gareth Rens on January 22, 2012 at 11:38pm

Very, very cool! I'm thinking a landing pad with an auto-charge station that the copter could find using the ArduEye!

Comment by Geoffrey L. Barrows on January 23, 2012 at 8:46am

I've added links at the bottom of the post to the source code, PCB files, and other documentation, for those who are interested.

@Gareth- Yes, this should work. If the ArduEye is mounted on the bottom, pointed at the ground, you should be able to pick up the charging station from a pretty broad range of positions. The only caveat, of course (which applies to all uses), is that the light needs to be clearly brighter than the surrounding area: a modest LED will work in a dark environment, but will never work outside! If this is being used outside in sunlight, though, it might be interesting to experiment with some sort of reflective glitter that reflects sunlight so it appears brighter than the ground.

Comment by ctech4285 on January 23, 2012 at 10:14am

Gareth Rens;

yes, that's perfect

Would this sensor be able to distinguish a bright IR LED from the noisy light in outdoor conditions? Maybe we could pulse the light source, and that would eliminate false positives in a noisy environment.

Comment by ctech4285 on January 23, 2012 at 10:18am

Never mind the IR idea; the sun puts out more energy in that range than in the visible.

http://en.wikipedia.org/wiki/File:Solar_Spectrum.png

What is the wavelength/sensitivity of the camera?

Comment by Geoffrey L. Barrows on January 23, 2012 at 10:33am

The sun is super bright- the difference in light levels between high sunlight and twilight can be something like 4 or 5 orders of magnitude, and the difference between sunlight and starlight a factor of over a million. That means in bright environments any light source you want to track needs to be correspondingly brighter. It's pretty amazing when you think about it.

That said, the image sensor is silicon and thus has the generic spectral response of silicon: it will detect the visible spectrum and a bit of near infrared, e.g. from a generic IR remote control.

Comment by Gareth Rens on January 23, 2012 at 10:41am

Like ctech said, would modulating the light at a very specific frequency, and coding in a feature to look for that specific frequency, not work? What about using a strobe? I'm far from a math-lete, so please correct me if I'm being a moron :)

Comment by Gareth Rens on January 23, 2012 at 10:44am

Or even a visible green laser. Could an algorithm be written where it follows the vertical green line until it gets shorter?

Then once it reaches a pre-defined length relative to the vertical, the copter lands.

?

Comment by ctech4285 on January 23, 2012 at 10:48am

We would be looking at the ground, so we are only seeing reflected light; that should give us a few orders of magnitude. We could restrict the field of view to make sure the sun is not in there.

a 1W LED at 5m might do it, what do you think?

Comment by ctech4285 on January 23, 2012 at 10:57am

Gareth Rens;

Well, you can't tell the length of a light beam that easily; we would have to use ultrasound for that.

We can get the position to within +/-5m via GPS, and then we have the light finder seeing the light from the charging station. That means we need something like a 25-45 degree light cone. You could do it with a laser, but they are more expensive and need a lens. Then we fly until the light source is directly underneath, get the height with ultrasound and the orientation with the magnetometer, and now we have a very accurate fix on the target and can land. The feet make contact with the platform, and charging begins.

Comment by Geoffrey L. Barrows on January 23, 2012 at 11:42am

Sorry I missed the pulsing light question. Yes, you can do that, though the current Arduino's ADC is pretty slow (10 ksps), so you may have to modify the code to sample a single pixel at a high rate until you can decode it. Alternatively, the shield board has space to mount an external ADC, which will boost you to 200 ksps. You could also wait for the Arduino Due with its faster ARM processor.
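For the curious, here is a rough sketch of one way to check for a pulsed light along those lines. It is only an illustration: readTrackedPixel() is a hypothetical helper standing in for however you would repeatedly read that one Stonyman pixel, and the crossing-count test is a crude stand-in for proper decoding:

```cpp
// Hypothetical helper, provided elsewhere in your sketch: reads the single
// pixel currently being tracked and returns its intensity.
short readTrackedPixel();

// Sample one pixel for windowMs milliseconds and count threshold crossings
// to see whether the light is blinking at roughly the expected rate.
bool detectPulsedLight(unsigned int expectedHz, unsigned long windowMs,
                       short threshold) {
  unsigned long start = millis();
  unsigned long crossings = 0;
  bool wasHigh = (readTrackedPixel() > threshold);
  while (millis() - start < windowMs) {
    bool isHigh = (readTrackedPixel() > threshold);
    if (isHigh != wasHigh) crossings++;   // count rising and falling edges
    wasHigh = isHigh;
  }
  // A square wave at expectedHz produces about 2*expectedHz crossings per
  // second; accept anything within a factor of two of that.
  unsigned long expected = 2UL * expectedHz * windowMs / 1000UL;
  return (crossings > expected / 2) && (crossings < expected * 2);
}
```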

In sunlight, a 1W LED will get swamped out. Using filters might help, but I don't know what is available in that regard. A small shiny ball, say a white Christmas ornament, might help in that it can reflect the sunlight. Another possibility may be a bright white spot surrounded by a larger black circle. I haven't tried that, though.

It sounds, though, like for outdoor use you might be better off with a method that looks for landmarks rather than a bright light. We're also playing with cell phone camera lenses that yield a narrower field of view and a higher resolution image. That would take more work (putting the fixed-pattern calibration mask into Flash) but might be a solution.
