Note that Marcy vision is our quadrotor's fuselage with a board from an
abandoned ground rover, complete with H bridges, servo headers, & a
900MHz radio.


So the Logitech C510 has fixed exposure.  Search through the kernel for
V4L2_CID_EXPOSURE_AUTO.  Unfortunately, there's no way to manually set
the exposure & it picks a value which is too bright, but it does get

Decided to see if locked exposure was possible on our other webcams &
sure enough, the oldest one, a ZStar ZC0301 cam, had a fixed gain
option.  It picks an exposure which just happens to work.  It could
track the LED in ambient light.  Its framerate could only reach 15fps.

Unfortunately, after all that hype, the shutter speed on all the cameras
in fixed exposure is too fast to get the continuous ring out of Marcy 1
the object tracker needs.  So fixed exposure without any adjustment
isn't very useful.

With the useless fixed exposure & the lack of clock cycles to process
1280x1024, the Logitech isn't the best camera & we blew $44.  For that
money, we could have built a 3D 640x480 camera.  Still forging ahead
with the Logitech because it cost money.

The Logitech was a squeeze play.  It was drastically reduced & we didn't
expect it to last.  The $16 webcams would probably have been better.
Maybe the CPU power to do object recognition at 1280x1024 will come.

There aren't many stories about how the iPhone is doing object
recognition.  It must have a hardware absdiff comparator & use very low
resolution images.
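The absdiff trick on low resolution images is simple enough to sketch.  A minimal version, assuming consecutive 8-bit grayscale frames & a threshold value pulled out of thin air:

```python
import numpy as np

def motion_mask(prev, curr, thresh=25):
    """Flag pixels whose absolute difference between frames exceeds thresh."""
    # Widen to int16 so the subtraction can't wrap around at 0 or 255
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return diff > thresh
```

At very low resolutions this runs in a few microseconds per frame, which fits the theory that the phone is doing almost nothing per pixel.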

Well, high school geometry applied to the imaging gets very good X, Y, Z
coordinates of the object.  The main problem is the weather.
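The high school geometry is presumably similar triangles in a pinhole model.  A sketch, assuming a known ring diameter for Marcy 1's LED, a focal length in pixels from a one-time calibration, & a 640x480 image center; all the names & numbers here are hypothetical:

```python
def led_position(cx, cy, ring_px, ring_m=0.5, focal_px=700.0,
                 center=(320.0, 240.0)):
    """cx, cy: ring center in pixels; ring_px: apparent ring diameter in pixels.

    Returns (x, y, z) in meters, camera-relative, by similar triangles.
    """
    z = focal_px * ring_m / ring_px       # distance along the optical axis
    x = (cx - center[0]) * z / focal_px   # lateral offset
    y = (cy - center[1]) * z / focal_px   # vertical offset
    return x, y, z
```

With those made-up numbers, a 0.5 m ring spanning 70 pixels at the image center lands 5 m out on the optical axis.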


Got separate servo mounts for every camera.


Swapped in the ZStar.


Tracking the LED with the Logitech.

The 1st campaign with machine vision was a real pain to set up & a
complete mess at tracking.  It locked on to the moon & lights.  The
servos jittered like crazy.  The weather was actually calm enough to get
some good data.  Spent most of the battery trying to get vision to lock
on to the vehicle.  A camera pair would definitely aid in separating out
the background.

The 2nd campaign went better.  Put in different PWM code for the camera
turret, which is very precise.  Pointed the camera at a dark area.  It
locked on the vehicle & never lost it.  Someday, the algorithm should
weed out ambient lights.

Never spent much time on PWM code since it's obsolete; the real
solution is making all the servos & ESC's use I2C, & the failsafe modes
are real busters.  Suspect most people use a single timer & alarms for
the PWM loop.  It's noisy when 2 alarms get too close together.  For
camera pointing, a dedicated timer for each servo is required.
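With a dedicated timer per servo, the software boils down to loading one compare register per channel.  The arithmetic, sketched for a hypothetical 16MHz clock with a /8 prescaler (both assumptions):

```python
F_CPU = 16_000_000          # assumed timer clock, Hz
PRESCALER = 8
TICK_US = PRESCALER / (F_CPU / 1_000_000)   # 0.5 us per timer tick

def pulse_to_ticks(pulse_us):
    """Compare value to load for a given servo pulse width."""
    return round(pulse_us / TICK_US)

def angle_to_pulse(angle_deg, min_us=1000, max_us=2000):
    """Map 0-180 degrees onto the standard 1-2 ms servo pulse range."""
    return min_us + (max_us - min_us) * angle_deg / 180.0
```

Since each channel owns its own compare interrupt, no two alarms ever have to fight for the same timer, which is what kills the single-timer scheme.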

We could swap out the electronics for servos.  Developing ESC's is a lot
more complicated.  That takes overcurrent detection, loss of signal
detection, stall timeouts, & hand tuning back EMF voltage.
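Loss of signal detection is the simplest of those to sketch: a timeout on the last received throttle command.  The class name & timeout are assumptions, not anything from a real ESC:

```python
class SignalWatchdog:
    """Cut the motor if no throttle command arrives within timeout_s."""

    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_cmd = None            # timestamp of last valid command

    def feed(self, t):
        """Call on every valid throttle command, with the current time."""
        self.last_cmd = t

    def throttle_ok(self, t):
        """True only if a command arrived within the timeout window."""
        return self.last_cmd is not None and (t - self.last_cmd) <= self.timeout_s
```

Overcurrent & stall detection follow the same shape, just keyed off current samples & commutation counts instead of timestamps.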

Camera turrets which automatically track objects are cool.  There's 1
more camera mount which could use some object recognition.


Made during the 1st period Major Marcy deleted us.  Always knew whatever
caused that judgement meant She would never see us in real life.
Indeed, She never talked to us in real life, had 6 suitors, broke up
with all of them, & moved away.  You were weighed, measured, & found
lacking before the 1st pitch.

Anyways, have some video of the 1st flights, from the camera turret. 

1st autopilot tests show real stability & real feedback to the machine
vision.  Marcy vision is producing better position information than
we've ever had before.

Tried 1 & 2 LED's.  1 LED at 3.3V worked just as well as 2.  Haven't
used chroma keying because it lowers the framerate, the colors all
saturate to white on the camera, & colors stand out less against a
black background than white does.
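Since the LED saturates to white against a dark background anyway, plain luma thresholding does the job without any color math.  A minimal centroid tracker, assuming an 8-bit grayscale frame & a threshold that's pure guesswork:

```python
import numpy as np

def track_led(gray_frame, threshold=240):
    """Return the (x, y) centroid of saturated pixels, or None if dark."""
    ys, xs = np.nonzero(gray_frame >= threshold)
    if xs.size == 0:
        return None                     # nothing bright enough in frame
    return float(xs.mean()), float(ys.mean())
```

The centroid of the saturated blob gives sub-pixel position for free, which the servo turret loop can consume directly.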

A camera turret made out of servos is way too expensive to ever be a
viable product.  If you could get it to recognize objects in daylight,
it could be a superior alternative to GPS.

The main question is how to track objects which don't spin & how to
handle ambient light.  We're looking at building databases of objects in
every possible angle & distance, with low resolution proxies,
compensating for differences in exposure, & using the 20MB OpenCV
library.
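The low resolution proxy idea can be sketched without OpenCV: downscale, normalize away the exposure, & take the smallest absolute difference over the database.  Everything here is a made-up stand-in for whatever the real pipeline would be:

```python
import numpy as np

def downscale(img, factor=4):
    """Blockwise mean -- the low resolution proxy."""
    h, w = img.shape
    return img[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def normalize(img):
    """Zero mean, unit variance, so exposure differences cancel out."""
    img = img - img.mean()
    s = img.std()
    return img / s if s else img

def best_match(patch, database):
    """Index of the database template with the smallest absolute difference."""
    scores = [np.abs(normalize(downscale(patch)) -
                     normalize(downscale(t))).sum() for t in database]
    return int(np.argmin(scores))
```

Because the mean is subtracted out, a template shifted by a constant exposure offset still matches itself exactly.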

The only problem is the Goo Tube videos of OpenCV don't show the
accuracy we need.  Hard to believe the other machine vision guy died a
year ago.

With all the talk of balloon projected images making the rounds of the
blogs, maybe
sonar has a future in some future blimp which is small enough to fit in
the apartment.

Sleep schedule is pretty screwed up from flying the Marcy special in the
most stable, early morning hours, then spending hours wondering why our
flight controller didn't work.
