3D Robotics

Neuromorphic chip "learns" how to fly

From Technology Review:

There isn’t much space between your ears, but what’s in there can do many things that a computer of the same size never could. Your brain is also vastly more energy efficient at interpreting the world visually or understanding speech than any computer system.

That’s why academic and corporate labs have been experimenting with “neuromorphic” chips modeled on features seen in brains. These chips have networks of “neurons” that communicate in spikes of electricity (see “Thinking in Silicon”). They can be significantly more energy-efficient than conventional chips, and some can even automatically reprogram themselves to learn new skills.
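
To make "communicating in spikes of electricity" a little more concrete, here is a minimal Python sketch of a leaky integrate-and-fire neuron, the textbook model most spiking-chip discussions start from. It is purely illustrative; the function, parameters, and values below are assumptions for the sketch, not anything from HRL's or IBM's actual designs:

```python
import numpy as np

def lif_spike_times(input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron, returning its spike times.

    input_current: 1-D array of injected current samples (one per time step).
    The membrane potential leaks toward zero, integrates the input, and emits
    a spike (then resets) whenever it crosses v_thresh.
    """
    v = 0.0
    spikes = []
    for step, i_in in enumerate(input_current):
        # Euler update of the leaky membrane equation dv/dt = (-v + i_in) / tau
        v += dt * (-v + i_in) / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# A constant drive produces a regular spike train; a stronger drive means a higher rate.
spikes = lif_spike_times(np.full(1000, 1.5))
print(len(spikes), "spikes in 1 s, first few at", spikes[:3])
```

The energy argument comes from this event-driven style: a silicon neuron only does work when a spike arrives, instead of clocking through every instruction the way a conventional processor does.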

Now a neuromorphic chip has been untethered from the lab bench, and tested in a tiny drone aircraft that weighs less than 100 grams.

In the experiment, the prototype chip, with 576 silicon neurons, took in data from the aircraft’s optical, ultrasound, and infrared sensors as it flew between three different rooms.

The first time the drone was flown into each room, the unique pattern of incoming sensor data from the walls, furniture, and other objects caused a pattern of electrical activity in the neurons that the chip had never experienced before. That triggered it to report that it was in a new space, and it also caused the connections between its neurons to change, crudely mimicking learning in a real brain. Those changes meant that the next time the craft entered the same room, it recognized it and signaled as such.
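
The article doesn't spell out HRL's learning rule, but the behavior it describes (flag an unfamiliar sensor pattern as a new room, then adapt connections so that pattern is recognized on the next visit) can be sketched as a toy novelty detector. Everything below, including the class name, similarity threshold, and learning rate, is an illustrative assumption, not the chip's actual algorithm:

```python
import numpy as np

class RoomMemory:
    """Toy novelty detector loosely inspired by the behavior described above.

    Each stored prototype is a normalized sensor-activation vector. A reading
    too dissimilar from every prototype is reported as a new room and stored
    (the "connection change"); otherwise the closest prototype is nudged
    toward the reading, a crude Hebbian-style update.
    """

    def __init__(self, novelty_threshold=0.8, learning_rate=0.1):
        self.prototypes = []  # one vector per room learned so far
        self.novelty_threshold = novelty_threshold
        self.learning_rate = learning_rate

    def observe(self, sensors):
        x = np.asarray(sensors, dtype=float)
        x = x / (np.linalg.norm(x) + 1e-9)          # normalize the fused sensor vector
        if not self.prototypes:
            self.prototypes.append(x)
            return "new room #0"
        sims = [float(p @ x) for p in self.prototypes]
        best = int(np.argmax(sims))
        if sims[best] < self.novelty_threshold:
            self.prototypes.append(x)                # unfamiliar pattern: learn it
            return f"new room #{len(self.prototypes) - 1}"
        # Familiar pattern: strengthen the match (plasticity-like update).
        p = self.prototypes[best] + self.learning_rate * (x - self.prototypes[best])
        self.prototypes[best] = p / np.linalg.norm(p)
        return f"recognized room #{best}"

memory = RoomMemory()
print(memory.observe([0.9, 0.1, 0.3]))    # first room: reported as novel
print(memory.observe([0.1, 0.8, 0.5]))    # different layout: reported as novel
print(memory.observe([0.88, 0.12, 0.31])) # back in the first room: recognized
```

On a spiking chip the same idea would play out in spike timing and synaptic weights rather than explicit vectors, but the loop is the same: unfamiliar input reshapes the network, familiar input is recognized by it.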

The chip involved is far from ready for practical deployment, but the test offers empirical support for the ideas that have motivated research into neuromorphic chips, says Narayan Srinivasa, who leads HRL’s Center for Neural and Emergent Systems. “This shows it is possible to do learning literally on the fly, while under very strict size, weight, and power constraints,” he says.

The drone, custom built for the test by drone maker Aerovironment, based in Monrovia, California, is six inches square, 1.5 inches high, and weighs only 93 grams, including the battery. HRL’s chip accounted for just 18 grams of the craft’s weight and used only 50 milliwatts of power. That wouldn’t be nearly enough for a conventional computer to run software that could learn to recognize rooms, says Srinivasa.

The flight test was a challenge set by the Pentagon research agency DARPA as part of a project under which it has funded HRL, IBM, and others to work on neuromorphic chips. One motivation is the hope that neuromorphic chips might make it possible for military drones to make sense of video and sensor data for themselves, instead of always having to beam it down to earth for analysis by computers or humans.

Prototypes made under DARPA’s program—like HRL’s—have delivered promising results, but much work remains before such technology can perform useful work, says Vishal Saxena, an assistant professor working on neuromorphic chips at Boise State University. “The biggest challenge is identifying what the applications will be and developing robust algorithms,” he says.

Researchers also face a chicken-and-egg scenario, with chips being developed without much idea of what algorithms they will run and algorithms being written without a firm idea of what chip designs will become established. At the same time, neuroscientists are still discovering new things about how networks of real brain cells work on information. “There’s a lot of work to be done collectively between circuit and algorithm experts and the neuroscience community,” says Saxena.


Comments

  • Old news.  This research doesn't even come close to existing efforts.  Probably why they're inventing their own terms to describe existing ideas.

    How about an article on scam and hype pimps who don't credit the inventors of ideas they're trying to co-opt?

  • Personally I think a small stack of IBM's new TrueNorth chips should do it.

    http://spectrum.ieee.org/computing/hardware/how-ibm-got-brainlike-e...

    Might still be a tad power hungry though, may have to scale up a bit.

    We could call it Sky Net.

  • ASIC? Do you mean FPGAs?

    As for AI, the boffins are always saying they just need a bit more and then it will work ;-)

  • They will need much more than 576 neurons to achieve anything practical. More generally, I wonder whether flight control based on the intrinsically unpredictable output of a neural network makes sense...

  • It seems that the next evolution should be ASICs tailored to specific algorithmic needs; hopefully that evolution will appear in short order.
