3D Robotics

Humanoid robot learns to fly a flight simulator!

This is both adorable and really cool. From IEEE Spectrum:

As much trouble as humanoid robots are to build and control, we keep on trying to make it work because it's easiest to operate in a human environment if you can do the same things that a human can. There are some good arguments for why it makes a lot more sense to modify our environments to better suit robots, but the fact is, if you can pull it off, humanoid is still the best way to go.

Even for flying airplanes.

If this sounds crazy to you, it sounded crazy to us too, until we saw it basically working at an IROS presentation.

The little robot in the picture above is a PIBOT, a small, very low-cost humanoid (actually a Bioloid Premium from Robotis). It's been slightly modified to be able to work the controls of a scaled-down, simulated aircraft cockpit, as in the pic above. PIBOT is able to identify and use all of the buttons and switches and stuff that you'd find in the cockpit of a normal light aircraft designed for humans.


Most of the inputs come from the simulator itself (roll, pitch, yaw, airspeed, GPS location), although the robot does use vision for some things, like identifying the runway using edge detection. And this is all it takes, according to the researchers, who state that: "PIBOT can satisfy the various requirements specified in the flying handbook by the Federal Aviation Administration."
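The paper doesn't spell out the vision pipeline, but a runway-edge detector of the kind described could be sketched with a standard Canny-plus-Hough approach; the function below and all of its thresholds are illustrative assumptions, not values from the researchers' system.

```python
# Hypothetical sketch: pick out runway edge lines in a camera frame.
# The article only says PIBOT uses "edge detection" to identify the runway;
# this Canny + probabilistic Hough pipeline is an assumed illustration.
import cv2
import numpy as np

def find_runway_lines(frame_bgr):
    """Return candidate runway edge lines as (x1, y1, x2, y2) tuples."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)  # binary edge map
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=100, maxLineGap=20)
    if lines is None:
        return []
    candidates = []
    for x1, y1, x2, y2 in lines[:, 0]:
        # Runway edges seen from an aircraft near the centerline run
        # roughly toward the horizon, i.e. steeply in the image.
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if 60 <= angle <= 120:
            candidates.append((int(x1), int(y1), int(x2), int(y2)))
    return candidates
```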

You can see PIBOT rocking a simulation in the video below, and for you pilot-types, appended is a comprehensive description of what the robot is doing. Remember, this is all autonomous.

The airplane is initially parked on a runway of an airport. The robot prepares for the flight by 1) pulling the throttle back to zero, 2) turning on the battery, 3) the altimeter, 4) the avionics, and 5) the fuel pump, and 6) starting the engine, pressing the switches on the panel as it goes. Then PIBOT grabs the two control sticks for flight control and the brakes are released. When the heading of the airplane aligns with the runway to within 5 degrees and its speed exceeds the taxiing speed, the second sequence begins and PIBOT increases the power. The airplane takes off at the proper speed, and PIBOT controls both pitch and speed so that the vertical velocity of the airplane reaches the initial rate of climb. At a certain distance from its departure point, PIBOT starts Sequence 3. The airplane turns in the opposite direction while maintaining its speed and altitude at their given references. In this sequence, PIBOT performs straight-and-level flight, turns, and climbs. In order to land on the runway, PIBOT turns the airplane while decreasing speed once it has established a sufficient distance from the expected landing point. This is Sequence 4, the base leg. Final approach starts in Sequence 5. PIBOT aligns the aircraft with the runway and gradually pitches down at a slower speed. When it is flying at around 20 feet above the ground, it flares and gently lands.
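For readers who think in code, here's a rough sketch of those five sequences as a simple state machine. The phase names, telemetry fields, and numeric thresholds are my own illustrative assumptions for readability; they are not taken from the KAIST implementation.

```python
# Rough, illustrative sketch of the five flight sequences described above.
# Phase names, telemetry fields, and thresholds are assumptions, not the
# researchers' actual code.
from enum import Enum, auto

class Phase(Enum):
    PREFLIGHT = auto()  # Sequence 1: throttle to zero, battery, altimeter, avionics, fuel pump, engine
    TAKEOFF = auto()    # Sequence 2: add power once aligned with the runway
    PATTERN = auto()    # Sequence 3: straight-and-level flight, turns, climbs
    BASE_LEG = auto()   # Sequence 4: turn toward the runway while slowing down
    FINAL = auto()      # Sequence 5: align, descend, flare at about 20 ft

def next_phase(phase: Phase, t: dict) -> Phase:
    """Advance the flight phase based on simulator telemetry (a dict of readings)."""
    if phase is Phase.PREFLIGHT and t["engine_running"] and abs(t["heading_error_deg"]) < 5:
        return Phase.TAKEOFF
    if phase is Phase.TAKEOFF and t["altitude_agl_ft"] > 500:
        return Phase.PATTERN
    if phase is Phase.PATTERN and t["distance_to_landing_point_nm"] < 2.0:
        return Phase.BASE_LEG
    if phase is Phase.BASE_LEG and t["aligned_with_runway"]:
        return Phase.FINAL
    return phase
```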

Your question right now is probably the same as ours was: "When are you going to get it out of the simulator and flying a real plane?" That work will be presented at a forthcoming conference, but they're doing it already, and you can see a little teaser in the picture at the top of this article: the MacBook on the right is playing a video showing a little humanoid robot at the controls of a small-scale model biplane, flying it fully autonomously with its grippers on the controls.

The robot wasn't doing the best job of keeping the model plane stable, but being a robot, it doesn't get airsick and puke all over the instrument panel like I would. It can do takeoffs, follow waypoints, maneuver, and make a final approach to landing, although at this point it still needs some human help for the final touchdown. By the time the researchers publish, however, the 'bot may have nailed that too: there are still some perception challenges to solve, but they're getting very, very close.

A Robot-Machine Interface for Full-functionality Automation using a Humanoid, by Heejin Jeong, David Hyunchul Shim and Sungwook Cho from KAIST in South Korea, was presented yesterday at IROS 2014 in Chicago.


Comments

  • You will be surprised over the next 10 years. I expect things to grow exponentially, and there will be things we never even dreamed of coming out. There's just so much this tech can do. We are already approaching the limit of anything I ever dreamed of with it. Can't wait to see what it will look like in 2024. Wow

  • ardudroid next? If we start incorporating humanoid robotics into our world, they'd better not be using hands to drive; they'd better be plugging into the OBD port or something (kind of like R2-D2 being able to plug in and control whatever). The last thing I need is a humanoid driving me while its grip is slipping. (I'll only let that happen when a humanoid can outlast a bull rider.) My 2 cents

  • Moderator

    Made my week. Love it. Amazeballs.

  • This is truly amazing!

    I hope that in a few years' time the APM equivalent will support humanoid robots.
