3D Robotics

Nvidia demos visual navigation on 3DR Iris+

From Nvidia: Here's the full paper.

Most drones would be lost without GPS. Not this one.

A drone developed by NVIDIA researchers navigates even the most far-flung, unmapped places using only deep learning and computer vision, powered by an NVIDIA Jetson TX1 embedded AI supercomputer.

Although initially designed to follow forest trails to rescue lost hikers or spot fallen trees, the low-flying autonomous drone could work far beyond the forest — in canyons between skyscrapers or inside buildings, for example — where GPS is inaccurate or unavailable.

“This works when GPS doesn’t,” said Nikolai Smolyanskiy, the NVIDIA team’s technical lead. “All you need is a path the drone can recognize visually.”

The drone navigates without GPS, relying instead on deep learning; researchers built it with off-the-shelf components to reduce costs.

No GPS? No Problem

Although the technology is still experimental, it could eventually search for survivors in damaged buildings, inspect railroad tracks in tunnels, check stock on store shelves, or examine communications cables underwater, Smolyanskiy said.

The team has already trained it to follow train tracks and ported the system to a robot-on-wheels that traverses hallways. The drone also avoids obstacles such as people, pets and poles.

“We chose forests as a proving ground because they’re possibly the most difficult places to navigate,” he said. “We figured if we could use deep learning to navigate in that environment, we could navigate anywhere.”

Unlike a more urban environment, where there’s general uniformity in, say, the height of curbs, the shape of mailboxes and the width of sidewalks, the forest is relatively chaotic. Trails in the woods often have no markings. Light filters through the leaves and varies from bright sunlight to dark shadow. And trees vary in height, width, angle and branching.

Flight Record

To keep costs low, the researchers built their device using an off-the-shelf drone equipped with the NVIDIA Jetson TX1 and two cameras.

“Our whole idea is to use cameras to understand and navigate the environment,” Smolyanskiy said. “Jetson gives us the computing power to do advanced AI onboard the drone, which is a requirement for operating in remote environments.”

The NVIDIA team isn’t the first to pursue a drone that navigates without GPS, but the researchers achieved what they believe is the longest and most stable flight of its kind. Their fully autonomous drone flies along the trail for a kilometer (about six-tenths of a mile), avoiding obstacles and maintaining a steady position in the center of the trail.
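Holding the center of the trail implies closing a control loop around the network's output. A minimal sketch of how view-orientation probabilities could be turned into a yaw-rate command; the output format, function name and gain here are assumptions for illustration, not NVIDIA's actual controller:

```python
# Hedged sketch: map a trail-following network's probabilities that the drone
# is rotated left / aligned / rotated right relative to the trail into a signed
# yaw-rate command. Names and the linear gain are illustrative assumptions.

def yaw_command(p_left: float, p_center: float, p_right: float,
                gain: float = 1.0) -> float:
    """Return a yaw rate (positive = turn right).

    If the network believes the drone is rotated left of the trail
    (p_left high), command a right turn to re-center, and vice versa.
    """
    return gain * (p_left - p_right)

# Example: the network is fairly sure the drone has drifted/rotated left,
# so the command is positive (yaw right to recover).
print(yaw_command(0.7, 0.2, 0.1))
```

A proportional law like this is the simplest plausible choice; a real controller would also smooth commands over time to avoid the wobble the article mentions.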

Team member Alexey Kamenev played a big role in making this happen. He developed deep learning techniques that allowed the drone to smoothly fly along trails without sudden movements that would make it wobble. He also reduced the need for massive amounts of data typically needed to train a deep learning system.

In the video above, the drone follows a trail in the forest near the researchers’ Redmond, Wash., office. The areas in green are where the robot decided to fly and the red areas are those it rejected.

No Breadcrumbs Needed

The drone learned to find its way by watching video that Smolyanskiy shot along eight miles of trails in the Pacific Northwest. He took the video in different lighting conditions with three wide-angle GoPro cameras mounted on the left, center and right of a metal bar on a mini Segway.
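The left/center/right camera rig suggests a neat self-labeling trick: footage from the left-mounted camera shows the trail as if the drone had rotated left, so its corrective label is "turn right", and symmetrically for the other cameras. A sketch of that labeling idea, with hypothetical function and label names (not from the paper's code):

```python
# Hypothetical sketch of self-supervised labeling from a three-camera rig.
# Each camera's viewpoint implies the steering correction that would
# re-center a drone seeing that view; no manual annotation is needed.

CAMERA_TO_LABEL = {
    "left": "turn_right",     # left camera view -> drone should steer right
    "center": "go_straight",  # center camera view -> drone is on course
    "right": "turn_left",     # right camera view -> drone should steer left
}

def label_frames(frames):
    """Attach a steering label to each (camera, image) pair."""
    return [(image, CAMERA_TO_LABEL[camera]) for camera, image in frames]

# Example: three frames captured simultaneously from the rig
frames = [("left", "img_L.png"), ("center", "img_C.png"), ("right", "img_R.png")]
print(label_frames(frames))
```

This is why a single data-collection walk yields three labeled training streams at once.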

In addition to their own footage, researchers trained their neural network — called TrailNet — on video recorded on trails in the Swiss Alps by AI researchers at the Istituto Dalle Molle di Studi sull’Intelligenza Artificiale (IDSIA) in Lugano, Switzerland.

In fact, IDSIA’s work on drone forest navigation was one inspiration for NVIDIA’s autonomous drone team. The other inspiration was NVIDIA’s self-driving car, BB8.

Next Steps

The team now plans to create downloadable software for Jetson TX1 and Jetson TX2 so others can build robots that navigate based on visual information alone.

Long term, the idea is to tell the robot to travel between two points on any map — whether it’s a Google map or a building plan — and have it successfully make the trip, avoiding obstacles along the way.

For more information about the team’s work, see “Toward Low-Flying Autonomous MAV Trail Navigation using Deep Neural Networks for Environmental Awareness” or watch their talk at the GPU Technology Conference.


Comments

  • Good job. So I guess future drones will cruise around the house when you buy one. Maybe in five years this will be sold at Best Buy.

  • Amazing !

  • What's the ITAR status of this, given the US military can't jam it...?

  • 100KM

    I have always liked the Iris. Even though I have a Solo now, I think the Iris is a better work quadcopter than the Solo. The Iris has both a primary RF control link and a secondary RF link for telemetry. Video would be a third frequency.

    The separation of function and redundancy of the control links made for more reliable control of the quadcopter.
