After a rough start in summer 2011, pan/tilt vision systems have emerged as the best solution yet for indoor navigation. No surprise, after the success of the Kinect. This was the most advanced system so far.
It wasn't as stable as the AR Drone, but the advantages were size & absolute coordinates. It took a few years, but the budget & technology were finally in place to make an indoor quadcopter that could fly in the world's smallest apartment. Now it's off to a customer.
The breakthrough with this system was converting the image to 1 bit & run-length encoding it on the microcontroller instead of JPEG compression. That allowed 320x240 to go at 70fps. 640x240 went at 40fps & turned out to be the optimum resolution. 40fps gives 20 unique position readouts per second, since the flashing LED only yields a position every 2 frames. That's enough to maintain very tight altitude.
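The 1-bit & run-length step might look something like this sketch (the threshold value & the per-row (value, run) output format are assumptions; the real firmware does this on the STM32):

```python
def encode_row(pixels, threshold=128):
    """Convert one row of 8-bit grayscale pixels to 1 bit per pixel,
    then run-length encode it as (value, run_length) pairs."""
    bits = [1 if p >= threshold else 0 for p in pixels]
    runs = []
    current, count = bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = b, 1
    runs.append((current, count))
    return runs

# A mostly-dark 640-pixel row with a small bright LED blob
# collapses to just 3 runs instead of 640 bytes.
row = [10] * 300 + [250] * 8 + [10] * 332
print(encode_row(row))  # [(0, 300), (1, 8), (0, 332)]
```

A thresholded camera frame is almost entirely background, so runs are long & the encoded size is tiny compared to JPEG, with no DCT work for the microcontroller.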
Since half the rows are skipped, the LED sometimes gets lost, but all the columns are scanned in 640x240. Ideally, this would use an FPGA & do 640x480 so every pixel would be covered.
The other factor making it possible was the arrival of copters stable & cheap enough to do the job. The Syma X1 is more stable than anything else tried. It has no accelerometer, yet automatically levels itself & resists horizontal motion. There seems to be a gyroscopic effect from the propellers.
It needed only a magnetometer for the autopilot to detect heading. It flew level enough that a decent heading could be determined without any tilt compensation.
The Blade CX2 was also tried & found to be hopeless. For some reason, it would not level itself. There's hope a Blade MCX would be more stable, but the CX2 is still the only thing small enough to fit in the apartment with a reasonable payload capacity.
The TCM8230MD, STM32F407, & RTL8192 combination has emerged as the ideal jellybean camera solution. It can do the high frame rates, manual exposure, multiple-camera synchronization, & custom encoding that machine vision needs.
It turned out the TCM8230MD has a blanking interval during which the camera clock can be paused & restarted to synchronize multiple cameras, without affecting the exposure.
The main problem with the pan/tilt camera is determining where it's pointing. The direction derived from the servo PWM doesn't completely agree with the direction in the image. There's also wobble & delay in the servo motion. This creates a position readout that constantly drifts & is noisy.
The ideal solution would be stationary markers in the room, which would show up in the image & give the cameras an exact readout of where each frame is pointing. The most practical idea is 3 gyros mounted directly on a camera, giving an instantaneous pointing direction that is blended with the PWM pointing direction.
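Blending a fast-but-drifting integrated gyro with a slow-but-absolute PWM angle is essentially a complementary filter. A minimal 1-axis sketch (the blend gain, timestep, & degree units are assumptions, not values from the actual autopilot):

```python
def blend_pointing(pwm_angles, gyro_rates, dt=0.025, alpha=0.98):
    """Estimate camera pointing angle by integrating the gyro rate
    (fast, but drifts) while pulling the estimate toward the servo
    PWM angle (noisy & delayed, but doesn't drift).
    alpha weights the gyro; (1 - alpha) weights the PWM reference."""
    angle = pwm_angles[0]
    estimates = []
    for pwm, rate in zip(pwm_angles, gyro_rates):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * pwm
        estimates.append(angle)
    return estimates

# With no rotation, the estimate settles on the PWM angle;
# during a fast slew, the gyro term tracks the motion the
# servo feedback hasn't reported yet.
est = blend_pointing([45.0] * 200, [0.0] * 200)
```

The gyro term carries the estimate through the servo's wobble & delay, while the PWM term bleeds off the gyro's drift over time.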
There are ideas to improve the background separation. The flashing LED works really well, but alternating colors would work better. It's really time to start using FPGAs.
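The flashing-LED separation amounts to differencing a frame with the LED on against a frame with it off, so only the blinking light survives. A toy sketch of the idea (the threshold, frame size, & nested-list frame format are assumptions for illustration):

```python
def led_mask(frame_on, frame_off, threshold=50):
    """Subtract the LED-off frame from the LED-on frame; only pixels
    that got brighter (the flashing LED) survive the threshold."""
    return [[1 if (on - off) > threshold else 0
             for on, off in zip(row_on, row_off)]
            for row_on, row_off in zip(frame_on, frame_off)]

def centroid(mask):
    """Average position of surviving pixels: the LED's image location."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

off = [[10] * 8 for _ in range(8)]
on = [row[:] for row in off]
on[3][4] = 255  # the LED lit in this frame
print(centroid(led_mask(on, off)))  # (4.0, 3.0)
```

Alternating colors instead of on/off would let the LED be found in every frame rather than every other one, doubling the position readout rate at the same frame rate.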