For very small UAVs, especially those used indoors, an interesting way to navigate is a technique called "optical flow". It's essentially how flies see: they detect motion rather than resolve images. As you move, the objects closest to you appear to move the fastest, which for a camera chip means their pixels shift position faster.
The video above is from a Swiss team that has used optical flow to steer indoor blimps and microlight aircraft (video here). They've got pretty fancy equipment and lots of money--but is there a way to do the same on the cheap? Yes. It turns out that the sensor in an optical mouse (you probably have a few lying around) can do the job. Here are instructions on how to take the chip from an old mouse, connect it to a Basic Stamp (an Arduino would work even better), and create a low-budget optical flow sensor. Taking the dx, dy information from that sensor and using it to drive the airplane's servos or actuators away from the direction of highest optical flow should be fairly straightforward. The only tricky part is integrating the mouse chip and processor into a package no larger or heavier than the RC receiver that this optical autopilot replaces. The schematic for the mouse-chip-to-Basic-Stamp circuit is below:
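The steering step is simple proportional control: command the servo opposite to the measured flow. A minimal sketch of that mapping, assuming dx/dy motion counts have already been read from the mouse chip (the gain, pulse widths, and function names here are illustrative assumptions, not part of the linked instructions):

```c
/* Map a mouse-sensor motion report (dx, dy counts since the last read)
   to RC servo pulse widths that steer away from the dominant flow.
   1500 us is a typical servo center; 1000-2000 us is a typical range. */

#define SERVO_CENTER_US 1500
#define SERVO_MIN_US    1000
#define SERVO_MAX_US    2000
#define GAIN            5    /* proportional gain: counts -> microseconds */

static int clamp(int v, int lo, int hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Strong rightward flow (positive dx) means the nearest obstacle is on
   the right, so command a leftward correction -- and vice versa. */
int rudder_pulse_us(int dx) {
    return clamp(SERVO_CENTER_US - GAIN * dx, SERVO_MIN_US, SERVO_MAX_US);
}

/* Same idea vertically: strong downward flow means the ground is close,
   so command up-elevator. */
int elevator_pulse_us(int dy) {
    return clamp(SERVO_CENTER_US - GAIN * dy, SERVO_MIN_US, SERVO_MAX_US);
}
```

In a real airframe you'd want to filter the counts and subtract the flow induced by your own turning (rotation produces flow even with nothing nearby), but the core loop really is this small.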
Comments
Hi Chris
Could you please provide a modern link to the bs2 software etc
thanks Clinton
It's the way PIR motion sensors used to be made, before they went to the fancy Fresnel lens.
I started experimenting with this technique on the bench, watching the trace on a CRO.
I then used the circuit from a mouse; I was trying to get head positioning for a VR headset.
It worked for relative motion, but getting absolute head tracking was hard.
http://home.roadrunner.com/~maccody/robotics/croms-1/croms-1.html