Precision Land ArduCopter Demo

TURN UP THE VOLUME!!! THE AUDIO IS QUIET!!!!

For the last few months I've been working on vision-assisted landing for ArduCopter. The hope is to provide a means of landing a multirotor in a precise manner, which is currently not attainable with the GPS-based landing supported by ArduCopter.

The development of this software was stimulated by Japan's recent effort to increase the use of UAVs for Search and Rescue. More can be read about that here. This sub-project of the S&R effort is being funded by Japan Drones, a 3DR retailer, and Enroute, also a 3DR retailer and a member of the DroneCode Foundation.

This specific feature, precision land, is a very small part of the larger project and is designed for multirotor recovery. The idea is to fly a multirotor to a disaster zone, survey the land, and relay intel (such as pictures) back to a base station. The base station may be a couple of miles away from the disaster location, so precious flight time, and ultimately battery, is spent flying the copter to and from the disaster location. Multirotors are not known for their lengthy flight times, so conserving battery for surveying rather than traveling is critical to a successful mission.

That's where precision landing comes in. The idea is to station rovers, or unmanned ground vehicles, near the disaster location. These rovers will have a landing pad on top for a multirotor. That way a multirotor can use all of its battery to survey an area, land on top of a rover, and hitch a ride back to the base station on the rover.

The specifics:

Autopilot: Pixhawk with ArduCopter 3.2

Companion Computer: Odroid U3

Camera: Logitech c920

Vision algorithm: OpenCV Canny edge detection, OpenCV ellipse detector, and my own concentric-circle algorithm (very simple; see the sketch below)

Performance (on four cores): processes images at 30+ fps in good light and 10 fps in low light. Performance is limited by camera exposure, not the Odroid's processing power!
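
For readers curious what such a detector might look like, here is a minimal sketch built on OpenCV's Canny and ellipse-fitting calls. The function name, thresholds, and grouping heuristic are my own illustrative assumptions, not the project's actual code.

```python
# Minimal sketch: Canny edges -> fitted ellipses -> concentric-circle check.
# Thresholds and the grouping heuristic are illustrative assumptions.
import cv2

def find_concentric_target(frame, center_tol=10.0, min_rings=3):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    # [-2] keeps this working across the OpenCV 3 and 4 return conventions.
    contours = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)[-2]

    # cv2.fitEllipse needs at least 5 points per contour.
    ellipses = [cv2.fitEllipse(c) for c in contours if len(c) >= 5]

    # Treat a stack of ellipses whose centers agree to within center_tol
    # pixels as the concentric-circle landing target.
    for (cx, cy), _axes, _angle in ellipses:
        rings = sum(1 for (ox, oy), _, _ in ellipses
                    if abs(ox - cx) < center_tol and abs(oy - cy) < center_tol)
        if rings >= min_rings:
            return (cx, cy)  # target center in pixel coordinates
    return None
```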

The future:

1. I hope to have my first live test in a week or so. More testing needs to be done in the simulator to check all the edge cases and make the landing logic more robust.

2. Integrate the code more closely with the ArduCopter code. Currently the companion computer takes control of the aircraft when it is in GUIDED mode; see the sketch after this list. The hope is to have the companion computer take control in landing modes (RTL).

3. Check the performance on other companion computers: Intel Edison, BeagleBone Black, Raspberry Pi (maybe).
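
To make item 2 above concrete, here is a rough sketch of how a companion computer can steer the copter in GUIDED mode by streaming velocity setpoints over MAVLink with pymavlink. The serial port, baud rate, and example velocities are placeholder assumptions; the field layout follows the standard SET_POSITION_TARGET_LOCAL_NED message, not the project's own code.

```python
# Sketch: companion computer sends body-frame velocity setpoints in GUIDED mode.
# Port, baud rate, and the velocities below are placeholder assumptions.
from pymavlink import mavutil

master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)  # hypothetical link
master.wait_heartbeat()

def send_velocity(vx, vy, vz):
    """Send a body-frame velocity setpoint in m/s (NED: +z is down)."""
    type_mask = 0b0000111111000111  # use only the velocity fields
    master.mav.set_position_target_local_ned_send(
        0,                         # time_boot_ms (0 = stamped by autopilot)
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
        type_mask,
        0, 0, 0,                   # x, y, z positions (ignored)
        vx, vy, vz,                # velocities in m/s
        0, 0, 0,                   # accelerations (ignored)
        0, 0)                      # yaw, yaw rate (ignored)

# e.g. drift toward a target seen ahead and to the right while descending
send_velocity(0.5, 0.2, 0.3)
```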

The code:

The code can be found on my GitHub. Be cautious!

Thanks to Randy Mackay for helping me integrate the code with ArduCopter. 

Daniel Nugent

Comments

  • http://diydrones.com/profiles/blogs/precision-land-arducopter

    An update for anyone who is interested.

    Daniel
  • What you mentioned is possible. Hopefully we will see more visual navigation applications integrated with ArduCopter in the near future. This could allow for a more multipurpose and modular vision system.

    I am currently working out some interesting bugs, but once I feel comfortable flying it I will put a wiki together for others to try at their own risk. End of March at the latest.

  • I do believe your software would receive a great following, especially if you could support a gimbal-mounted camera for landing purposes.

       > Perhaps the gimbal reverts to a NADIR (straight-down) position when initiating LAND, roughly when BaroAlt says the drone is <10 m from the ground.

    This method would require no additional hardware.

    When do you think you would invite members to risk their machines to test your code?

        > I have the luxury of an R&D budget :)

  • @Dylan

    The hope is to have this feature closely tied to the ArduPilot code base someday. The vision algorithm is still in its infancy, but I have already experimented with creating unique targets using varying ring spacing, like a circular bar code.

    @Doug

    Maybe someday. Currently it is not designed to perform that task. There are two parts to the program: the vision algorithm, which identifies a target in an image, and the controls logic, which determines how the robot reacts to what is in the image. In this case the controls logic lands the multirotor on the target. Someone could use a similar (or the same) vision algorithm with different controls logic to do a precision drop. However, that is outside the scope of this current project.
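
    A minimal sketch of that vision/controls split (the names and gains below are hypothetical, not from the project):

```python
# Sketch of the vision/controls split: the vision algorithm hands over the
# target's pixel location; this controls logic decides how the copter reacts.
# Gains and names are hypothetical illustrations.
def controls_logic_land(target_px, frame_size, gain=0.005, descent_rate=0.3):
    """Turn the target's offset from the image center into a body-frame
    velocity setpoint (forward, right, down) that centers while descending."""
    cx, cy = target_px
    w, h = frame_size
    vx = gain * (h / 2 - cy)   # target toward top of image -> move forward
    vy = gain * (cx - w / 2)   # target right of center -> move right
    return vx, vy, descent_rate

# A precision drop could reuse the same vision output with different controls
# logic, e.g. hover until centered, then trigger a payload release.
```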

  • Can this software be used to perform precision drops?

  • A customizable "landing symbol" would be amazing. Each vehicle carries a large logo on the roof which could serve this purpose.

  • I am looking forward to this feature! My application would call for a multirotor to be deployable from the roof of an armoured vehicle and for it then to return to exactly the same spot :)

    Has a formal feature request been made for this?

    Kudos, Daniel!

  • So far I haven't noticed any major issues with the rolling shutter problem but I haven't done a ton of testing. Still in the prototyping phase. Definitely something to look out for.

    ArduPilot was chosen because it meets the larger project requirements (S&R). Also, ArduPilot reaches a larger user base and is built off the PX4 project.

  • Great work! Looking forward to seeing your flight tests.

    I'm working on a similar project, attempting to land on stationary (and eventually mobile) targets as an exercise in practicing object tracking techniques I'm learning in a computer vision course at school. I'm using a Point Grey Firefly MV camera since it has a global shutter that (I hope) will make it immune to some of the motion effects that plague other cameras.

    Out of curiosity, why did you select the APM autopilot vs PX4?

  • @pal
    Very cool. I haven't done a ton of research into other cameras. The guys at ETH Zurich use an mvBlueFOX-MLC.