Matrix Pilot Camera Targeting: Tree Test - Software Stabilized
Strategy: Have MatrixPilot / UAV DevBoard (UDB) point the camera roughly in the right direction.
Take out the jitter using software stabilization.
The above flight was flown on a day with 10 mph of wind and a lot of turbulence (I included the landing in the video to show the turbulence).
1. The UDB calculates its orientation 40 times / second using a 16-bit Direction Cosine Matrix (DCM).
2. I use Bill Premerlani's "High Bandwidth Dead Reckoning", which means MatrixPilot knows its absolute position 40 times / second by integrating the accelerometers. The accelerometer-derived positions are corrected by allowing for the GPS delay and some of the GPS dynamics (GPS info arrives at least 1 second after the real event); a rough sketch of the idea is below the list.
3. From the above, the camera code computes the pointing angles toward the target location 40 times / second (also sketched below).
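For anyone curious how step 2's delay compensation can work, here is a minimal dead-reckoning sketch, not the actual MatrixPilot code: integrate earth-frame acceleration at 40 Hz, keep about a second of past position estimates, and when a (roughly one second old) GPS fix arrives, nudge the current estimate by the difference between the fix and the estimate from the time the fix actually describes. The function names, the fixed 1-second latency, and the correction gain are all assumptions for illustration.

```c
/* Minimal dead-reckoning sketch with GPS latency compensation.
 * NOT the real MatrixPilot implementation: a fixed 1 s GPS delay, a single
 * proportional correction gain, and floating-point math are all assumptions
 * to keep the idea readable. */
#include <stdio.h>

#define DT          0.025   /* 40 updates per second                         */
#define HISTORY_LEN 40      /* ~1 s of position history (GPS delay / DT)     */
#define GPS_GAIN    0.05    /* how hard to pull the estimate toward the fix  */

typedef struct { double x, y, z; } vec3;

static vec3 pos, vel;                   /* current position/velocity estimate */
static vec3 pos_history[HISTORY_LEN];   /* estimates from the last ~second    */
static int  hist_idx;

/* Called 40 times/second with earth-frame acceleration (gravity removed). */
void dead_reckon_step(vec3 accel_earth)
{
    vel.x += accel_earth.x * DT;  vel.y += accel_earth.y * DT;  vel.z += accel_earth.z * DT;
    pos.x += vel.x * DT;          pos.y += vel.y * DT;          pos.z += vel.z * DT;

    pos_history[hist_idx] = pos;             /* remember where we thought we were */
    hist_idx = (hist_idx + 1) % HISTORY_LEN;
}

/* Called when a GPS fix arrives.  The fix is ~1 s old, so compare it against
 * the position estimate from ~1 s ago rather than the current one.          */
void gps_correct(vec3 gps_pos)
{
    vec3 old_est = pos_history[hist_idx];    /* oldest slot == ~1 s ago */
    pos.x += GPS_GAIN * (gps_pos.x - old_est.x);
    pos.y += GPS_GAIN * (gps_pos.y - old_est.y);
    pos.z += GPS_GAIN * (gps_pos.z - old_est.z);
}

int main(void)
{
    vec3 accel = { 0.1, 0.0, 0.0 };          /* constant 0.1 m/s^2 "north"    */
    for (int i = 0; i < HISTORY_LEN; i++)
        dead_reckon_step(accel);             /* one second of dead reckoning  */

    vec3 fix = { 0.04, 0.0, 0.0 };           /* a late-arriving GPS position  */
    gps_correct(fix);
    printf("position estimate after 1 s: %.3f m north\n", pos.x);
    return 0;
}
```

The real code runs on a dsPIC in 16-bit fixed point (hence the 16-bit DCM); floating point is used here only for readability.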
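Step 3, again as a hedged sketch rather than the real camera code: given the aircraft position, its direction cosine matrix, and the known target location, rotate the line of sight into the body frame and pull out pan and tilt angles for the mount. The frame conventions (x forward, y right, z down; R rotating earth-frame vectors into the body frame) and every name here are assumptions.

```c
/* Camera-pointing sketch: pan/tilt toward a known target location.
 * Assumes a body frame of x forward, y right, z down, and that R rotates
 * earth-frame vectors into the body frame.  Illustrative only.            */
#include <math.h>
#include <stdio.h>

#define RAD_TO_DEG (180.0 / 3.14159265358979)

typedef struct { double x, y, z; } vec3;

/* v_body = R * v_earth */
static vec3 earth_to_body(const double R[3][3], vec3 v)
{
    vec3 b = { R[0][0]*v.x + R[0][1]*v.y + R[0][2]*v.z,
               R[1][0]*v.x + R[1][1]*v.y + R[1][2]*v.z,
               R[2][0]*v.x + R[2][1]*v.y + R[2][2]*v.z };
    return b;
}

/* Pan and tilt (radians, in the body frame) needed to look at the target. */
void camera_angles(const double R[3][3], vec3 aircraft, vec3 target,
                   double *pan, double *tilt)
{
    vec3 los_earth = { target.x - aircraft.x,      /* line of sight, earth frame */
                       target.y - aircraft.y,
                       target.z - aircraft.z };
    vec3 los = earth_to_body(R, los_earth);        /* line of sight, body frame  */

    *pan  = atan2(los.y, los.x);                                 /* yaw the mount */
    *tilt = atan2(-los.z, sqrt(los.x*los.x + los.y*los.y));      /* then pitch it */
}

int main(void)
{
    double R[3][3] = { {1,0,0}, {0,1,0}, {0,0,1} };  /* level, nose pointing north */
    vec3 plane  = { 0.0, 0.0, -100.0 };              /* 100 m up (z is down)       */
    vec3 target = { 100.0, 0.0, 0.0 };               /* 100 m ahead, on the ground */

    double pan, tilt;
    camera_angles(R, plane, target, &pan, &tilt);
    printf("pan %.1f deg, tilt %.1f deg\n", pan * RAD_TO_DEG, tilt * RAD_TO_DEG);
    return 0;
}
```

Turning those angles into servo pulse widths is then a simple scaling, which is presumably where the 0.2 degree resolution limit mentioned below comes in.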
The main issue at the moment is that I'm using a camera with progressive scan. This causes each frame of the image to be distorted when the camera rotates (accelerates) in a new direction.
Pitch servo resolution is 0.2 degrees, which translates into the picture moving 7 pixels per step. Ideally it would be at least 1/10th of that, i.e. 0.02 degrees.
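A quick back-of-the-envelope check on those numbers; only the 0.2 degrees per step and 7 pixels per step come from the post, the rest is arithmetic:

```c
/* Implied angular size of one pixel, from the 0.2 deg / 7 px figures above. */
#include <stdio.h>

int main(void)
{
    double servo_step_deg  = 0.2;   /* smallest pitch-servo move (from the post)     */
    double pixels_per_step = 7.0;   /* observed image shift per step (from the post) */

    double deg_per_pixel = servo_step_deg / pixels_per_step;   /* ~0.029 deg */
    printf("one pixel spans about %.3f deg\n", deg_per_pixel);

    /* So the desired 0.02 deg step would move the image less than one pixel. */
    printf("a 0.02 deg step moves the image about %.2f pixels\n", 0.02 / deg_per_pixel);
    return 0;
}
```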
Photos of the build of this project are here.
Main Wiki for MatrixPilot is here.
For reference the flight path, in autonomous mode, is shown below.
Comments
Inspirational
I LOVE your twinstar setup, very much "if it fits, it works". Inspiring.
APM already gives the bearing to the next waypoint, so I'm guessing a POI bearing would use the same code and then attitude adjustments (i.e. banking angle).
To include recognition and visual tracking, I would follow the work from PixHawk, using the UDB and MatrixPilot. The UDB and MatrixPilot would plug into their architecture as the IMU block. Then add another computer running Linux (they call that the Flight Controller), which receives high-speed telemetry from the IMU (MatrixPilot), and add the camera as a USB device to the Linux computer (the Flight Controller). PixHawk already has most of the software and build environment done for that. The communication protocol (MAVLink) is likely to work just fine, and will probably save memory and CPU cycles over what we do now. However, we would have to up our baud rate from 19200 to the 57K range.
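As a rough sanity check on that last point about baud rate, here is a small back-of-the-envelope calculation. The ~36 bytes per attitude packet is my assumption (a MAVLink 1 ATTITUDE message is roughly a 28-byte payload plus 8 bytes of framing); the 40 Hz rate comes from the post and the baud rates from the comment.

```c
/* How much of a serial link does 40 Hz attitude telemetry use?
 * Packet size is an assumption; rates and bauds are from the post/comment. */
#include <stdio.h>

int main(void)
{
    double packet_bytes = 36.0;    /* assumed wire size of one attitude packet       */
    double rate_hz      = 40.0;    /* attitude updates per second (from the post)    */
    double bits_needed  = packet_bytes * 10.0 * rate_hz;   /* 10 bits/byte on a UART (8N1) */

    double bauds[] = { 19200.0, 57600.0 };
    for (int i = 0; i < 2; i++)
        printf("at %5.0f baud, attitude alone uses %4.1f%% of the link\n",
               bauds[i], 100.0 * bits_needed / bauds[i]);
    return 0;
}
```

Attitude alone already fills about three quarters of a 19200 baud link, which matches the point that the 57K range is needed once position and other telemetry are added.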
Without that targeting software, my main goal would be sooo far away in the future... A real work of art! Now we just have to wait until someone writes a recognition and visual tracking application suitable for a dsPIC, and then this turns into extremely interesting stuff...
/Marc
I will implement the same feature on my GluonPilot/EasyStar/Kodak Zx1.
@Pete: do you still use the standard lens on your camera or do you use a wide-angle lens?
Any chance ArduCopter / ArduPilot is gonna get this function? hehe