A few days ago I posted some videos of my object tracking algorithm, and some folks were curious how well it would perform if the object left the frame. Since posting I have hit something of a breakthrough in the algorithm, and it is working much better than I had ever expected. I am still fine-tuning some of the sensitivities, but I have added some neat features:
- Now you get a white box when you click and drag to define the target.
- There is an adjustable accuracy threshold. If the target will not re-acquire, you can lower the accuracy (demonstrated in the video above).
- Improved multi-threading (under the covers).
I am working on both Mission Planner integration and, for those who want to use this for automatic antenna tracking or camera tracking from the ground, Pololu Micro Maestro integration. As you can see in the video above, the software is able to re-acquire the target when it leaves the frame or gets confused. This is thanks to constantly learning the object's features as it changes shape or orientation over time.
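To give a feel for how a threshold-gated re-acquisition step can work, here is a minimal sketch using normalized cross-correlation as the matcher. This is an illustration only: the actual matching method, learning scheme, and function names in my software are not shown here, and `ncc`, `reacquire`, and the `threshold` parameter are hypothetical stand-ins for the adjustable accuracy setting described above.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size grayscale patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def reacquire(frame, template, threshold=0.8):
    """Scan the frame for the template; return (row, col, score) of the
    best match, or None if no location clears the accuracy threshold."""
    th, tw = template.shape
    best_pos, best_score = None, -1.0
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            s = ncc(frame[r:r + th, c:c + tw], template)
            if s > best_score:
                best_pos, best_score = (r, c), s
    if best_score >= threshold:
        return best_pos[0], best_pos[1], best_score
    return None  # target still lost; the caller may lower the threshold
```

Lowering `threshold` here plays the same role as reducing the accuracy setting: the tracker accepts weaker matches, trading false positives for faster re-acquisition.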