This forum will discuss and demonstrate the use of natural gestural interactions for communicating with an autonomous UAV. These gestures could be drawn from human interactions with other people and the environment, reflecting user needs (for example, hand signals for marshalling helicopters and directing infantry personnel, or natural reactions like raising the arms around the head to protect against an immediate danger), or from the relationships between humans and animals applied to an autonomous UAS (for example, the herding commands a shepherd gives his dog or the orders a falconer gives his raptor).


Myo by Thalmic Labs

Could 2014 be the year we see Gesture Control replace the RC transmitter?


Comments

  • This comment has been deleted due to it being inappropriate for posting on this Forum.

    TCIII Admin

  • @Keith,

What is presented in the video by Prof. Richards is a mix of wireless surface EMG and inertial measurement, i.e. physical gestures coupled with an IMU. If we could remove the IMU from the sensor we would get pure electromyography, but detecting and recording EMG signals requires a much more complicated hardware stage. To my understanding, what Prof. Richards presents is Muscle Gesture Control, just another game pad.
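    To make the coupling concrete, here is a minimal sketch of one way EMG and IMU data could be fused: an RMS envelope on the EMG window gates the command, and the IMU's roll/pitch picks the direction. The thresholds, signal ranges, and command names are illustrative assumptions, not what Prof. Richards actually implemented.

```python
# Illustrative sketch: fuse a surface-EMG activation signal with IMU attitude
# to pick a drone command. Thresholds and command names are assumptions.
import numpy as np

EMG_RMS_THRESHOLD = 0.3    # assumed activation level (normalised units)
TILT_THRESHOLD_DEG = 15.0  # assumed minimum tilt to register a direction

def emg_rms(window: np.ndarray) -> float:
    """Root-mean-square envelope of one EMG window."""
    return float(np.sqrt(np.mean(np.square(window))))

def classify(emg_window: np.ndarray, roll_deg: float, pitch_deg: float) -> str:
    """Return a command only while the muscle is contracted."""
    if emg_rms(emg_window) < EMG_RMS_THRESHOLD:
        return "hover"                      # relaxed arm -> do nothing
    if pitch_deg > TILT_THRESHOLD_DEG:
        return "forward"
    if pitch_deg < -TILT_THRESHOLD_DEG:
        return "backward"
    if roll_deg > TILT_THRESHOLD_DEG:
        return "right"
    if roll_deg < -TILT_THRESHOLD_DEG:
        return "left"
    return "hover"

# Example with synthetic data: a contracted muscle and a 20-degree forward tilt.
print(classify(np.random.uniform(-0.8, 0.8, 200), roll_deg=2.0, pitch_deg=20.0))
```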

  • Check out "Research into Serious Games - UCLan: How to use muscle activity to fly a quadcopter drone." Professor Richards and Steven Lindley show their latest work on Serious Games at the University of Central Lancashire.

  • I'm very new to UAVs - been flying toys for a few months to get some chops...  (No pun intended.)  (Just finished, almost, my first scratch build - waiting for a Pixhawk.)  I never even considered gesture control!  I should have, because back in the 90s I worked with gesture-controlled musical instruments.  I was a professor of music at NYU and we had an entire room outfitted with motion sensors - we worked with body suits, too.  Anyone heard of Laurie Anderson (Lou Reed's widow - I hate to refer to her that way, but everyone's heard of Lou Reed, I hope)?  I saw her using some amazing gestural control (for visuals and sound) back in the 80s at the Brooklyn Academy of Music.

    I once tried neurofeedback (a form of biofeedback that senses brain waves) as a treatment for migraines.  It didn't work for the headaches, but I got very proficient at moving objects around on a computer screen.  Back then those devices were around $5000. That was 15 years ago - I'll bet there are now cheaper things to experiment with.

    Best,

    Klaus

  • A quick update for anyone looking to use Myo with a drone - we've released a basic controller for the AutoFlight application for the Parrot AR Drone here.

    Internally at Thalmic we're also working on an iPad application to control the AR Drone, which will be released at some point in the next few months.

  • Just started looking for groups, and had to join, as this is a great concept!

    I personally prefer audio (speech or tone recognition) and radio/computer control, but this is fascinating. I imagine it takes some extra technology for this kind of control, but I will be checking into it!

  • Hi,

    Thanks to Keith for inviting me here.

    I have a personal project: I would like a specific sound (a clap, for instance) to command a servo (so I guess a clap switch giving a signal to the servo). The sound would be picked up through a very directional microphone oriented toward the ground so that the drone motors do not interfere too much.
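    A minimal sketch of the detection side, assuming the microphone samples arrive as a normalised array (e.g. from a sound card or an ADC on the flight computer); the thresholds, window size, and the stand-in servo action are illustrative assumptions:

```python
# Illustrative clap-switch sketch: flag a short, loud transient in a block of
# audio samples and toggle a (hypothetical) servo command in response.
# Microphone input, threshold values, and the servo interface are assumptions.
import numpy as np

CLAP_THRESHOLD = 0.6   # assumed peak amplitude for a clap (samples in -1..1)
AMBIENT_LIMIT = 0.2    # assumed background level from the motors

def is_clap(block: np.ndarray) -> bool:
    """A clap is a brief peak well above the ambient motor noise floor."""
    peak = float(np.max(np.abs(block)))
    ambient = float(np.sqrt(np.mean(np.square(block))))  # RMS of the block
    return peak > CLAP_THRESHOLD and ambient < AMBIENT_LIMIT

servo_open = False

def on_audio_block(block: np.ndarray) -> None:
    """Toggle the servo each time a clap is heard."""
    global servo_open
    if is_clap(block):
        servo_open = not servo_open
        # In a real build this would drive a PWM output or send a command to
        # the autopilot; printing stands in for that here.
        print("servo ->", "open" if servo_open else "closed")

# Synthetic test: a quiet block, then a block containing one sharp spike.
on_audio_block(np.random.uniform(-0.05, 0.05, 1024))
spike = np.random.uniform(-0.05, 0.05, 1024); spike[500] = 0.9
on_audio_block(spike)
```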

  • https://www.youtube.com/watch?v=oWu9TFJjHaM

    The MYO arm band by Thalmic Labs could be the first COTS device to enable gesture control to replace the RC transmitter. This would be very good for cold-weather operations, or for one-handed (one-arm) control. I plan to start development work on this during the spring, once the MYO unit has arrived!
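    To make that concrete, here is a minimal sketch of how armband poses might be mapped onto the four RC channels. The pose names, channel values, and output format are illustrative assumptions, not the Myo SDK's actual API:

```python
# Illustrative mapping from armband gestures to RC-style stick commands.
# Pose names, channel ranges, and the output format are assumptions; a real
# build would read poses from the vendor SDK and feed an autopilot or app.
from dataclasses import dataclass

@dataclass
class StickCommand:
    roll: float = 0.0      # -1 (left) .. +1 (right)
    pitch: float = 0.0     # -1 (back) .. +1 (forward)
    yaw: float = 0.0       # -1 (ccw)  .. +1 (cw)
    throttle: float = 0.0  # -1 (down) .. +1 (up)

# One gesture per basic manoeuvre; everything else holds position.
POSE_MAP = {
    "fist":           StickCommand(pitch=+0.5),  # move forward
    "fingers_spread": StickCommand(pitch=-0.5),  # move back
    "wave_in":        StickCommand(roll=-0.5),   # bank left
    "wave_out":       StickCommand(roll=+0.5),   # bank right
    "rest":           StickCommand(),            # hover
}

def command_for(pose: str) -> StickCommand:
    """Unknown poses fall back to hover, which is the safe default."""
    return POSE_MAP.get(pose, StickCommand())

for pose in ("fist", "wave_out", "rest", "unknown"):
    print(pose, command_for(pose))
```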


[Gesture Control of a U.A.V] Current Status

Hi folks, so it's been a few weeks since I posted last and, truth be told, I feel like I've made some good progress on all fronts. The main aspects of this project were namely:
- The fabrication of a gesture recognition device (seeing as our MYO Armbands failed to turn up)
  - Including the ability to detect finger positions
  - Including the integration of an IMU
  - Must be mobile/portable
- The programming of the gesture recognition device to actually recognise gestures
- The programming and calibration of a 9DOF IMU…
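On the 9DOF calibration point, a common first step is estimating the gyro bias while the board sits still and subtracting it from later readings. A minimal sketch of that idea, with the sample source simulated and the axis layout assumed:

```python
# Illustrative gyro-bias calibration: average readings while the IMU is held
# still, then subtract that bias from subsequent samples. The reading source
# is simulated here; on the glove it would come from the 9DOF board.
import numpy as np

def read_gyro_simulated(n: int) -> np.ndarray:
    """Stand-in for n raw gyro samples (deg/s) with a constant bias + noise."""
    true_bias = np.array([1.2, -0.4, 0.7])
    return true_bias + np.random.normal(0.0, 0.05, size=(n, 3))

def calibrate(samples: np.ndarray) -> np.ndarray:
    """Bias estimate is just the mean of the stationary samples."""
    return samples.mean(axis=0)

bias = calibrate(read_gyro_simulated(500))
corrected = read_gyro_simulated(10) - bias   # later readings, bias removed
print("estimated bias (deg/s):", np.round(bias, 3))
print("corrected sample mean:", np.round(corrected.mean(axis=0), 3))
```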


[Gesture Control of a U.A.V] CN-06 GPS Receiver Super U-blox GPS Module w/ Active Ceramic Antenna MWC

Hi folks, so I am also designing and building a gesture recognition and control 'glove', in conjunction with Filip, which will be used to control a Parrot AR Drone. I've received the following components:
- Arduino
- Flex sensors
- Glove
- GPS module
- and some really oversized switches :P

I'm waiting on:
- The 9DOF board
- The WiFi shield
- Some breadboard equipment

I've performed some initial experimentation on the CN-06 Super Ublox component and had some very good results. The module itself is very…
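For anyone repeating that initial check: u-blox style modules stream standard NMEA sentences over serial, so a quick sanity test is to parse the $GPGGA line for fix quality and position. A minimal sketch of that parsing (the sample sentence and field handling are illustrative, not tied to the CN-06 specifically):

```python
# Illustrative NMEA sanity check for a u-blox style GPS module: pull the fix
# quality, satellite count, and position out of a $GPGGA sentence. The sample
# sentence below is made up; on the glove it would arrive over the serial port.

def parse_gpgga(sentence: str):
    """Return (fix_quality, satellites, lat, lon) from a $GPGGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        return None
    fix_quality = int(fields[6] or 0)
    satellites = int(fields[7] or 0)

    def to_degrees(value: str, hemisphere: str) -> float:
        # NMEA packs latitude as ddmm.mmmm and longitude as dddmm.mmmm
        dot = value.index(".")
        degrees = int(value[:dot - 2])
        minutes = float(value[dot - 2:])
        result = degrees + minutes / 60.0
        return -result if hemisphere in ("S", "W") else result

    lat = to_degrees(fields[2], fields[3]) if fields[2] else None
    lon = to_degrees(fields[4], fields[5]) if fields[4] else None
    return fix_quality, satellites, lat, lon

# Example sentence (illustrative values only).
sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print(parse_gpgga(sample))
```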


[Gesture Control of a U.A.V] AR Drone 2.0 Initial Experimentation

Hi folks, so I'm also working on the design and manufacture of a gesture recognition device which will eventually be used as a control interface for a U.A.V - in my particular case, a Parrot AR Drone. The main difference between an AR Drone quadcopter and a conventional fixed-wing drone, apart from their methods of flight, is that the AR Drone is controlled via Wi-Fi. This means that the transmission and control aspects of the project can utilize powerful software tools such…
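Because the link is plain Wi-Fi, the drone can be commanded from a laptop with nothing more than a UDP socket. A minimal sketch, based on my understanding of the AR Drone's AT command protocol (drone at 192.168.1.1, AT commands on UDP port 5556); verify the constants against the official SDK before flying:

```python
# Illustrative sketch: send takeoff/land AT commands to a Parrot AR Drone over
# its Wi-Fi link. IP, port, and the AT*REF bitfields reflect my understanding
# of the AR Drone SDK; verify against the official documentation before use.
import socket
import time

DRONE_IP = "192.168.1.1"   # default address when joined to the drone's Wi-Fi
AT_PORT = 5556             # UDP port for AT commands
REF_TAKEOFF = 290718208    # base bitfield with the takeoff bit set
REF_LAND = 290717696       # base bitfield only

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq = 1                    # AT commands carry an increasing sequence number

def send_ref(value: int) -> None:
    global seq
    cmd = "AT*REF={},{}\r".format(seq, value)
    sock.sendto(cmd.encode("ascii"), (DRONE_IP, AT_PORT))
    seq += 1

if __name__ == "__main__":
    send_ref(REF_TAKEOFF)   # the drone should rise to roughly a metre and hover
    time.sleep(5)
    send_ref(REF_LAND)
```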
