From BotJunkie:

"This little helicopter is able to understand you when you tell it what to do. No pushing buttons, no using special commands, you just tell it where you want it to go and (eventually) it goes. Of course, I’m sure it required a bit of work to define where “door” and “elevator” and “window” are, but it’s a much more intuitive way to control a UAV that works when your hands are full, when you’re stressed (think military), or simply when you have no idea how to control a UAV.

I don’t have much in the way of other details on this project, besides the fact that it probably comes from the Robust Robotics Group at MIT, and possibly from someone who lives in this dorm. How do I know? Well, one of the research goals of the RRG is “to build social robots that can quickly learn what people want without being annoying or intrusive,” and this video is on the same YouTube channel. ‘Nuff said.

[ MIT Robust Robotics Group ]"


Comment by Krzysztof Bosak on March 18, 2010 at 3:18am
Not exactly related but close:
I have seen a music-controlled research quadcopter at a polytechnic institute in Zurich.
It oscillated in sync with music from a typical MP3 player fed to the machine.
Comment by Rana on March 18, 2010 at 7:27am
Nice video, Chris!
Nice videos, Bosak! Amazing!
Comment by Roy Brewer on March 18, 2010 at 9:41am
I wish they had robot quadrotor projects back when I was living in that dorm...
Comment by Dimitar Kolev on March 18, 2010 at 9:44am
Thanks for the link, Chris! The flight arena and the quadrotor's autonomous triple flips at 1300 deg/s are impressive; the other projects are very interesting, too.
Comment by Jack Crossfire on March 18, 2010 at 10:23am
Sort of going out of his way to get iPhone points. Shouldn't there be a microphone on the aircraft with him talking to the aircraft?
Comment by Mike on March 18, 2010 at 12:16pm
Not that impressive really. Just connecting two technologies, and the end result is a lot of latency compared with a joystick/TX or even a keyboard. For what they're aiming for, maybe that's OK though...
Comment by Dave Glidden on March 18, 2010 at 12:38pm
"fly to the kitchen and bring me a sandwich"
Comment by passunca on March 18, 2010 at 3:25pm
Very nice... It looks to me like a very nice combination of several different technologies: voice recognition, natural language processing, navigation in a 3D environment, indoor positioning, and autonomous flight. The commands given to the machine are simple navigation commands, like "go to a known location" or relative movements. The flight plan is basically a sequential processing of those multiple commands.

So I would say it works like this:
- The speech recognition translates the voice input into text.
- The natural language processing system translates the text into a flight plan.
- The 3D navigation system computes the optimal path.
- The quad executes the flight plan, with adjustments made along the way to avoid moving obstacles (like people) until the drone reaches its target location.

Easy, right? No, not at all... great work!
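The pipeline above could be sketched in a few lines. This is a toy illustration only, not the MIT group's actual system: the location names, coordinates, and the keyword-matching "NLP" are all invented for the example, and real path planning and obstacle avoidance are reduced to a comment.

```python
# Toy sketch of the voice-to-flight-plan pipeline described above.
# All names and coordinates are invented for illustration.

KNOWN_LOCATIONS = {
    "door": (0.0, 5.0, 1.5),
    "window": (3.0, 2.0, 1.2),
    "elevator": (6.0, 0.0, 1.5),
}

def text_to_flight_plan(text):
    """Crude stand-in for NLP: scan the recognized text for known
    location names, in order, and emit one goto command per match."""
    plan = []
    for word in text.lower().replace(",", " ").split():
        if word in KNOWN_LOCATIONS:
            plan.append(("goto", KNOWN_LOCATIONS[word]))
    return plan

def execute(plan, start=(0.0, 0.0, 0.0)):
    """Sequentially process the flight plan. Here each goto simply
    moves the simulated quad to the target; a real quad would plan a
    path and adjust it around moving obstacles."""
    pos = start
    for cmd, target in plan:
        if cmd == "goto":
            pos = target
    return pos

plan = text_to_flight_plan("fly to the window then go to the door")
print(plan)          # two goto commands, in spoken order
print(execute(plan)) # final position is the last target
```

The speech-recognition step (voice to text) is assumed to happen upstream, e.g. on the phone.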
Comment by Ravi Gaddipati on March 20, 2010 at 10:52am
I think a premade voice processor was used. You can see the iPhone showing the text, so it seems the phone does all the processing and transmits the result to the copter.

How do they do the map thing in the lower right corner?
Comment by Roy Brewer on March 20, 2010 at 2:24pm
Here's a tech paper on it

Although this level of sophistication is probably too much for the typical hobbyist, I'll bet it wouldn't be that difficult to get a groundstation to interpret much simpler commands (e.g. "left", "stop", "climb", etc.) to command a telemetry-equipped UAV.
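A minimal sketch of that hobbyist idea: a small lookup from recognized keywords to stick-style commands. The command set, channel names, and values are all made up for illustration; a real groundstation would push the result out over its telemetry link rather than print it.

```python
# Hypothetical keyword-to-command table for a simple voice groundstation.
# Channel names and magnitudes are invented for illustration only.
SIMPLE_COMMANDS = {
    "left":    ("roll", -10),      # degrees of bank
    "right":   ("roll", 10),
    "climb":   ("throttle", 5),    # percent change
    "descend": ("throttle", -5),
    "stop":    ("hover", 0),
}

def interpret(word):
    """Map one recognized keyword to a (channel, value) pair,
    or return None if the word isn't in the command set."""
    return SIMPLE_COMMANDS.get(word.strip().lower())

print(interpret("Left"))      # a roll command
print(interpret("sandwich"))  # not a command -> None
```

The speech-to-text part would again be handled by an off-the-shelf recognizer feeding words into `interpret`.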

- Roy



© 2019   Created by Chris Anderson.