This forum will discuss and demonstrate the use of natural gestural interactions for communicating with an autonomous UAV. These interactions could be inspired by the ways humans interact with other people and with the environment (for example, hand signals for marshalling helicopters and directing infantry personnel, or instinctive reactions such as raising the arms over the head to ward off immediate danger), or by human-animal relationships as a model for the relationship between a human and an autonomous UAS (for example, the herding commands a shepherd gives his dog, or the orders a falconer gives his raptor).
Could the more experienced members here advise: I want to use a gyroscope to control a car that has two motors. How do I write the Arduino program, and what kind of signal does the gyroscope produce? Thank you.
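A minimal sketch of the idea, assuming an MPU-6050 gyroscope on the I2C bus and two motors whose speed is set by PWM on pins 5 and 6 (all hypothetical wiring to adapt to your driver board): the gyro exposes angular rate as signed 16-bit register values, and the Z-axis rate can be used to steer by driving the two motors differentially.

```cpp
// Sketch (assumed hardware): steer a two-motor car with an MPU-6050 gyro.
// Turning the sensor speeds up one motor and slows the other.
#include <Wire.h>

const int MPU_ADDR  = 0x68;  // MPU-6050 default I2C address
const int LEFT_PWM  = 5;     // PWM pin driving the left motor (assumed)
const int RIGHT_PWM = 6;     // PWM pin driving the right motor (assumed)

void setup() {
  Wire.begin();
  // Wake the MPU-6050 (it powers up in sleep mode): clear PWR_MGMT_1.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);
  Wire.write(0);
  Wire.endTransmission(true);
  pinMode(LEFT_PWM, OUTPUT);
  pinMode(RIGHT_PWM, OUTPUT);
}

int16_t readGyroZ() {
  // Gyro Z-axis high/low bytes live at registers 0x47/0x48.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x47);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 2, true);
  int hi = Wire.read();
  int lo = Wire.read();
  return (int16_t)((hi << 8) | lo);
}

void loop() {
  int16_t gz = readGyroZ();                     // raw rate, about -32768..32767
  int steer = constrain(gz / 128, -100, 100);   // scale to a small steering term
  int base = 150;                               // base throttle (0-255)
  analogWrite(LEFT_PWM,  constrain(base + steer, 0, 255));
  analogWrite(RIGHT_PWM, constrain(base - steer, 0, 255));
  delay(20);
}
```

This assumes a motor driver that only needs a PWM speed per side; if yours has direction pins, you would set those as well.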
Comments
TCIII Admin
@Keith,
What is presented in the video by Prof. Richards is a mix of wireless surface EMG and inertial measurements - physical gestures coupled with an IMU. If we could remove the IMU from the sensor, we would get pure electromyography. Detecting and recording EMG signals requires a much more complicated hardware front end. To my understanding, what Prof. Richards presents is muscle-gesture control - just another game pad.
Check out "Research into Serious Games - UCLan": how to use muscle activity to fly a quadcopter drone. Professor Richards and Steven Lindley show their latest work on Serious Games at the University of Central Lancashire.
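If you want to play with the EMG side on the cheap, here is a minimal sketch, assuming an Arduino with a hobby surface-EMG board that outputs a rectified analog envelope (a MyoWare-style sensor) on pin A0. The threshold is a placeholder you would calibrate per user and electrode placement:

```cpp
// EMG "gesture" detector sketch (assumed sensor): reads an analog EMG envelope
// and reports when muscle activation crosses a calibrated threshold.
const int EMG_PIN = A0;      // analog envelope output (assumed wiring)
const int THRESHOLD = 350;   // placeholder; calibrate per user

bool active = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int level = analogRead(EMG_PIN);   // 0-1023 on a 5 V AVR board
  if (!active && level > THRESHOLD) {
    active = true;
    Serial.println("muscle ON");     // here you would trigger a command
  } else if (active && level < THRESHOLD - 50) {  // hysteresis to avoid chatter
    active = false;
    Serial.println("muscle OFF");
  }
  delay(10);
}
```

Fusing this with an IMU, as in the video, would mean timestamping these activation events alongside gyro/accelerometer readings - EMG alone only tells you that a muscle fired, not where the arm is.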
I'm very new to UAVs - been flying toys for a few months to get some chops... (no pun intended). I've just about finished my first scratch build and am waiting on a Pixhawk. I never even considered gesture control! I should have, because back in the 90s I worked with gesture-controlled musical instruments. I was a professor of music at NYU, and we had an entire room outfitted with motion sensors - we worked with body suits, too. Anyone heard of Laurie Anderson (Lou Reed's widow - I hate to refer to her that way, but everyone's heard of Lou Reed, I hope)? I saw her using some amazing gestural control (for visuals and sound) back in the 80s at the Brooklyn Academy of Music.
I once tried neurofeedback (a form of biofeedback that senses brain waves) as a treatment for migraines. It didn't work for the headaches, but I got very proficient at moving objects around on a computer screen. Back then those devices were around $5000. That was 15 years ago - I'll bet there are now cheaper things to experiment with.
Best,
Klaus
A quick update for anyone looking to use Myo with a drone - We've released a basic controller for the AutoFlight application for the Parrot AR Drone here.
Internally at Thalmic we're also working on an iPad application to control the AR Drone which will be released at some point in the next few months.
Just started looking for groups, and had to join, as this is a great concept!
I personally prefer audio (speech or tone recognition) and radio/computer control, but this is fascinating. I imagine it takes some extra technology for this kind of control, but I will be checking into it!
Hi,
Thanks to Keith for inviting me here.
I have a personal project: I would like a specific sound (a clap, for instance) to command a servo - essentially a clap switch sending a signal to the servo. The sound would be picked up through a very directional microphone oriented toward the ground, so that the drone's motors do not interfere too much. (A sketch of the idea follows after the video link below.)
https://www.youtube.com/watch?v=oWu9TFJjHaM
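A minimal sketch of that clap switch, assuming an electret microphone module with an analog output on A0 and a servo on pin 9 (both hypothetical pins). Note that a bare amplitude threshold like this will also fire on motor noise; in practice you would tune the threshold against the noise floor or add band-pass filtering:

```cpp
// Clap-switch servo sketch (assumed pins): a loud spike on the microphone
// toggles the servo between two positions.
#include <Servo.h>

const int MIC_PIN = A0;                // analog output of an electret mic module
const int SERVO_PIN = 9;               // servo signal pin
const int CLAP_THRESHOLD = 600;        // placeholder; tune against motor noise
const unsigned long LOCKOUT_MS = 500;  // ignore echoes right after a clap

Servo servo;
bool toggled = false;
unsigned long lastClap = 0;

void setup() {
  servo.attach(SERVO_PIN);
  servo.write(0);
}

void loop() {
  int level = analogRead(MIC_PIN);
  unsigned long now = millis();
  if (level > CLAP_THRESHOLD && now - lastClap > LOCKOUT_MS) {
    lastClap = now;
    toggled = !toggled;
    servo.write(toggled ? 90 : 0);  // toggle between two positions
  }
}
```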
The MYO armband by Thalmic Labs could be the first COTS device that enables gesture control to replace the RC transmitter. This would be very good for cold-weather operations, or for one-handed (one-arm) control. I plan to start development work on this during the spring, once the MYO unit has arrived!
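For anyone who wants a head start while waiting for their unit: a minimal sketch of pose handling with the Myo C++ SDK's DeviceListener interface. The pose-to-command mapping here is just placeholder print statements - bridging to the autopilot (PPM trainer port, MAVLink, etc.) is left open:

```cpp
// Minimal Myo pose listener (Myo C++ SDK): maps hand poses to placeholder
// flight commands. Replace the printouts with your RC/MAVLink bridge.
#include <iostream>
#include <myo/myo.hpp>

class PoseListener : public myo::DeviceListener {
public:
  void onPose(myo::Myo* myo, uint64_t timestamp, myo::Pose pose) override {
    if (pose == myo::Pose::fist) {
      std::cout << "fist -> hold position" << std::endl;
    } else if (pose == myo::Pose::waveIn) {
      std::cout << "wave in -> yaw left" << std::endl;
    } else if (pose == myo::Pose::waveOut) {
      std::cout << "wave out -> yaw right" << std::endl;
    } else if (pose == myo::Pose::fingersSpread) {
      std::cout << "fingers spread -> climb" << std::endl;
    }
  }
};

int main() {
  try {
    myo::Hub hub("com.example.gesture-uav");  // application identifier (placeholder)
    if (!hub.waitForMyo(10000)) {             // wait up to 10 s for an armband
      std::cerr << "No Myo found." << std::endl;
      return 1;
    }
    PoseListener listener;
    hub.addListener(&listener);
    while (true) hub.run(50);                 // pump events, 50 ms at a time
  } catch (const std::exception& e) {
    std::cerr << "Error: " << e.what() << std::endl;
    return 1;
  }
}
```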