We just realized how prevalent our system from 2009 still is: marker-based position hold and vision-based pattern detection, all onboard, with a full companion-computer architecture. It's great to see more of that appearing more broadly, in particular in the open-source space.
Comments
SmartFusion2 = mammal brain with insect reflexes?
Hi Jerry, interesting comparison: insect = FPGA, mammal = CPU.
I think that for the future of our machines, the key will be an optimized FPGA front end handling initial data extraction and compression, paired with a multi-core GPU back end that rapidly processes the extracted data and responds. A rough sketch of that split follows below.
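To picture the division of labor, here is a minimal two-stage pipeline sketch in Python. Everything in it is a toy stand-in of my own (the frame data, the "compression" step, the queue sizes are all made up): the front end does fixed-function reduction on every raw frame, and the back end runs flexible, instruction-based logic on the much smaller feature stream.

```python
import queue
import threading

def frontend(frames, out_q):
    """Stand-in for the FPGA front end: fixed-function
    extraction/compression applied to every raw frame."""
    for frame in frames:
        features = sum(frame) / len(frame)  # toy "compression" of a frame
        out_q.put(features)
    out_q.put(None)  # end-of-stream marker

def backend(in_q):
    """Stand-in for the instruction-based back end: flexible
    decision logic running on the reduced data stream."""
    while (item := in_q.get()) is not None:
        print("acting on features:", item)

q = queue.Queue(maxsize=4)  # bounded, like a hardware FIFO between the stages
frames = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
threading.Thread(target=frontend, args=(frames, q)).start()
backend(q)
```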
In the natural world, an insect's eye works like a light-field optical sensor running on an FPGA, while a mammal's brain is more like a CPU compared to an FPGA, better suited to learning new things.
An instruction-based platform is much more flexible, but an FPGA platform is extremely efficient.
You are the undisputed champions of real-time autonomous flight!!
Reading your papers is like music; it just sounds great. I really wish you could make some of your FPGA code available, in particular the semi-global matching (SGM) stereo:
The FPGA allows us to implement the algorithm in hardware, making it vastly more efficient than a CPU or GPU implementation. We run it on a small, lightweight FPGA and companion CPU board that measures 76 mm x 46 mm and weighs 50 grams.
This is 5 times faster and 5 times more power-efficient than a standard implementation, as compared in this paper:
http://groups.csail.mit.edu/robotics-center/public_papers/Barry15a.pdf
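For anyone curious what SGM actually computes, here is a minimal NumPy sketch of one aggregation path. This is my own illustration, not the authors' FPGA code: the full algorithm sums this recurrence over 8 or 16 path directions before picking the winning disparity, and the function names and penalty values (P1, P2) here are arbitrary.

```python
import numpy as np

def sad_cost_volume(left, right, max_disp):
    """Pixel-wise absolute-difference matching cost C[y, x, d]."""
    h, w = left.shape
    cost = np.full((h, w, max_disp), 255.0)  # large cost where shift is invalid
    for d in range(max_disp):
        cost[:, d:, d] = np.abs(left[:, d:].astype(float)
                                - right[:, :w - d].astype(float))
    return cost

def aggregate_left_to_right(cost, P1=10.0, P2=120.0):
    """One SGM path (left-to-right):
    L(p,d) = C(p,d) + min(L(q,d), L(q,d-1)+P1, L(q,d+1)+P1,
                          min_k L(q,k)+P2) - min_k L(q,k),
    where q is the previous pixel along the path."""
    h, w, D = cost.shape
    L = np.empty_like(cost)
    L[:, 0, :] = cost[:, 0, :]
    for x in range(1, w):
        prev = L[:, x - 1, :]                        # costs at previous pixel
        prev_min = prev.min(axis=1, keepdims=True)   # best disparity so far
        plus = np.roll(prev, -1, axis=1) + P1        # d+1 neighbor, small penalty
        plus[:, -1] = np.inf                         # mask wrap-around
        minus = np.roll(prev, 1, axis=1) + P1        # d-1 neighbor, small penalty
        minus[:, 0] = np.inf
        jump = prev_min + P2                         # any other disparity, big penalty
        best = np.minimum(np.minimum(prev, plus), np.minimum(minus, jump))
        L[:, x, :] = cost[:, x, :] + best - prev_min # subtract to bound growth
    return L

# Usage: disparity map from a single aggregation direction.
# disp = aggregate_left_to_right(sad_cost_volume(left_img, right_img, 64)).argmin(axis=2)
```

The small penalty P1 tolerates one-level disparity changes (slanted surfaces), while the large penalty P2 discourages big jumps except at real depth edges; that regularization is what makes SGM so much smoother than plain block matching, and its regular per-pixel recurrence is exactly what maps well onto FPGA logic.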