Strategy: Have MatrixPilot / UAV DevBoard (UDB) point the camera roughly in the right direction, then take out the jitter using software stabilization.
The above flight was flown on a day with 10 mph of wind and a lot of turbulence (I included the landing in the video to show the turbulence).
1. The UDB calculates its orientation 40 times / second using a 16-bit Direction Cosine Matrix.
2. I use Bill Premerlani's "High Bandwidth Dead Reckoning", which means MatrixPilot knows its
absolute position 40 times / second by integrating the accelerometers. The accelerometer-derived positions are
corrected by allowing for the GPS delay and some of the GPS dynamics (GPS info arrives at least 1
second after the real event).
3. The camera code computes the target location from the above 40 times / second.
The main issue at the moment is that I'm using a camera with a progressive-scan sensor, which distorts each frame when the camera rotates (accelerates) in a new direction.
Pitch servo resolution is 0.2 degrees, which translates into moving the picture by about 7 pixels. Ideally it would be at least 1/10th of that, i.e. 0.02 degrees.
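A quick back-of-the-envelope check of those numbers: the 0.2-degree step comes from the text, while the frame width and field of view below are assumptions I chose to reproduce the quoted ~7-pixel jump (a 720-pixel-wide frame and roughly a 20.6-degree field of view give about 35 pixels per degree).

```c
/* Pixel shift caused by one minimum servo step.
   frame_px and fov_deg are assumed camera parameters, not measured ones. */
double pixel_jump(double servo_step_deg, double frame_px, double fov_deg)
{
    return servo_step_deg * frame_px / fov_deg;
}
```

With these assumptions, `pixel_jump(0.2, 720.0, 20.6)` is about 7 pixels per step, and a 0.02-degree step would cut that to well under one pixel, which is why finer servo resolution matters so much here.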
Photos of the build of this project are here.
Main Wiki for MatrixPilot is here.
For reference, the flight path, flown in autonomous mode, is shown below.