Daniel Nugent's Posts (4)


Precision Land ArduCopter

Did some test flights this weekend for my precision landing program (a detailed explanation of this project can be found here). They were short due to the cold and limited battery life, but beneficial. I was able to attempt the landing four times. All attempts went well in that the copter didn't crash and held its position using vision, but only one attempt succeeded: two had to be aborted due to low battery and a poorly secured landing mat, and one missed the target, probably because of wind.

Next I plan to implement a more robust control strategy, which should increase accuracy.
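For illustration, here is a sketch of what one such control strategy might look like: a hypothetical PD loop that turns the target's pixel offset from the image center into a horizontal velocity command. The gains, names, and velocity interface are all made up for this sketch, not taken from the actual code:

```python
# Hypothetical sketch of a PD controller that converts the landing
# target's pixel offset into clamped horizontal velocity commands.
# Gains (kp, kd) and the function name are illustrative only.

def pd_velocity(err_x_px, err_y_px, prev_err, dt,
                kp=0.004, kd=0.001, v_max=1.0):
    """Map pixel error (target offset from image center) to a
    velocity command in m/s, clamped to +/- v_max."""
    d_x = (err_x_px - prev_err[0]) / dt   # derivative of error
    d_y = (err_y_px - prev_err[1]) / dt
    vx = kp * err_x_px + kd * d_x
    vy = kp * err_y_px + kd * d_y
    clamp = lambda v: max(-v_max, min(v_max, v))
    return clamp(vx), clamp(vy)
```

The derivative term damps the oscillation you tend to get from a pure proportional response, and the clamp keeps a large pixel error from commanding an aggressive lunge at the pad.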



Precision Land ArduCopter Demo


For the last few months I've been working on vision-assisted landing for ArduCopter. The hope is to provide a means of landing a multirotor in a precise manner, which is not currently attainable with the GPS-based landing supported by ArduCopter.

The development of this software was stimulated by Japan's recent effort to increase the use of UAVs for search and rescue. More can be read about that here. This sub-project of the S&R effort is being funded by Japan Drones, a 3DR retailer, and Enroute, also a 3DR retailer and a member of the DroneCode Foundation.

This specific feature, precision land, is a very small part of the larger project and is designed for multirotor recovery. The idea is to fly a multirotor to a disaster zone, survey the land, and relay intel (such as pictures) back to a base station. The base station may be a couple of miles away from the disaster location, so precious flight time, and ultimately battery, is used to fly the copter to and from the disaster location. Multirotors are not known for their lengthy flight times, so conserving battery for surveying rather than traveling is critical to a successful mission.

That's where precision landing comes in. The idea is to station rovers, or unmanned ground vehicles, near the disaster location. These rovers will have a landing pad on top for a multirotor. That way a multirotor can use all of its battery to survey an area, land on top of a rover, and hitch a ride back to the base station.

The specifics:

Autopilot: Pixhawk with ArduCopter 3.2

Companion Computer: Odroid U3

Camera: Logitech c920

Vision algorithm: OpenCV Canny edge detection, OpenCV ellipse detector, and my concentric circle algorithm (quite simple)

Performance (on four cores): processes images at 30+ fps in good light and 10 fps in low light. Performance is limited by camera exposure, not the Odroid's processing power!
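The concentric circle step can be sketched without any OpenCV at all: given the ellipses already extracted (e.g. by `cv2.fitEllipse` on contours after Canny), check whether their centers agree. This is my own illustrative reconstruction, not the actual algorithm; the ellipse format and the pixel tolerance are assumptions:

```python
# Sketch of a simple concentricity test on ellipses that have
# already been detected. Each ellipse is assumed to be
# ((cx, cy), radius) in pixels; center_tol is a made-up threshold.

def is_concentric(ellipses, center_tol=5.0):
    """Return True if at least two ellipses share (approximately)
    the same center, i.e. they plausibly form a landing target."""
    if len(ellipses) < 2:
        return False
    (cx0, cy0), _ = ellipses[0]
    for (cx, cy), _ in ellipses[1:]:
        dist = ((cx - cx0) ** 2 + (cy - cy0) ** 2) ** 0.5
        if dist > center_tol:
            return False
    return True
```

Requiring multiple rings with a shared center is what makes the marker cheap to reject false positives on: random edges in the scene rarely produce two nested ellipses with coincident centers.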

The future:

1. I hope to have my first live test in a week or so. More testing needs to be done in the simulator to check all the edge cases and make the landing logic more robust.

2. Integrate the code more closely with the ArduCopter code. Currently the companion computer takes control of the aircraft when it is in GUIDED mode. The hope is to have the companion computer take control in landing modes (RTL).
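The mode gate described in point 2 can be sketched as a simple check on the companion side: commands are only forwarded while the copter is in a mode the companion is allowed to drive, so the pilot can always reclaim control by flipping the mode switch. Mode names and `send_velocity` are placeholders here, not the real MAVLink integration:

```python
# Sketch of the control handoff between pilot and companion
# computer. ALLOWED_MODES and send_velocity are illustrative.

ALLOWED_MODES = {"GUIDED"}          # later: add "RTL", "LAND"

def companion_step(mode, target_vel, send_velocity):
    """Forward target_vel only while in an allowed mode.
    Returns True if a command was actually sent."""
    if mode not in ALLOWED_MODES:
        return False                # pilot keeps full control
    send_velocity(*target_vel)
    return True
```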

3. Check the performance on other companion computers: Intel Edison, BeagleBone Black, Raspberry Pi (maybe).

The code:

The code can be found on my GitHub. Be cautious!

Thanks to Randy Mackay for helping me integrate the code with ArduCopter. 

Daniel Nugent



I've seen huge improvements in the sensor fusion algorithms implemented on the APM 2.5, but none of those changes have been made to the ArduIMU v3. As a result, the ArduIMU is EXTREMELY sensitive to vibration and drift. I have taken a look at the APM sensor fusion code (in an attempt to port it over) and noticed everything is pretty much in a custom library.

This makes the code especially hard for someone like me (who doesn't know how to use the libraries) to port over, but it should be fairly easy for someone who does.

I don't really know any developers on DIY Drones, but I would love to team up with some of them and improve the ArduIMU code. I mean, it hasn't been updated since August 2012. I have made improvements myself which increase its speed and ease of interfacing... I just don't know where to post them.

The ArduIMU is the only affordable sensor board on the market that comes with sensor fusion software already installed; the software just isn't very good. Why do I feel like this project has been abandoned? It wasn't even transferred to the new downloads page.

Any suggestions?


NugePilot: Communication Timing Issue



For the past year I have been engineering my own autopilot shield for the Arduino Mega (and unintentionally the Due). I hope to be finished by the time AVC rolls around, but I have run into an issue. I am currently using three separate Arduinos (Mega: autopilot; ArduIMU; and ATmega328: fail-safe), and I am having trouble communicating between them efficiently. Currently the Mega just waits for a start signal at the beginning of a stream of data (e.g. "###") from the fail-safe or ArduIMU, and then decodes the data string following the start signal. But sometimes it can take upwards of 130 ms per system to receive one string of data (because it's waiting for the start signal, not because of excess data): 130 ms for the ArduIMU and 130 ms for the fail-safe. This isn't fast enough. The problem is that I am waiting for the two Arduinos (e.g. ArduIMU and Mega) to be in sync, but that doesn't happen very often. I have thought about using interrupts, but I was afraid of interrupting crucial sections of the code. I have also looked into the MAVLink protocol because, if I understand it correctly, the heartbeat that it sends out is used for synchronization between devices?
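One common fix for this kind of problem is to stop blocking on the start marker entirely and instead parse bytes as they arrive: a small framing state machine that is fed one byte at a time and only returns a payload when a complete "###"-delimited frame has accumulated. Here is that idea sketched in Python for clarity (on the Mega it would be the same logic in C++, fed from `Serial.available()` inside `loop()`); the newline terminator is my assumption, not part of the described format:

```python
# Sketch of a non-blocking framed parser for a "###<payload>\n"
# stream. feed() processes one byte and returns the payload only
# when a frame completes; it never waits.

class FrameParser:
    START = b"###"
    END = b"\n"

    def __init__(self):
        self._buf = bytearray()
        self._in_frame = False

    def feed(self, byte):
        """Process one received byte; return payload bytes when a
        full frame has arrived, else None."""
        self._buf.append(byte)
        if not self._in_frame:
            if self._buf.endswith(self.START):
                self._in_frame = True   # marker found, start payload
                self._buf.clear()
            elif len(self._buf) > len(self.START):
                del self._buf[0]        # keep a sliding window
            return None
        if self._buf.endswith(self.END):
            payload = bytes(self._buf[:-1])
            self._buf.clear()
            self._in_frame = False
            return payload
        return None
```

Because the parser keeps its own state between calls, the Mega never sits idle waiting to "sync up" with the sender; it does other work and drains whatever bytes have arrived each pass through the main loop. Note also that the Arduino's HardwareSerial already receives into a buffer via its own RX interrupt, so polling it this way doesn't require adding interrupts to your own code.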

I was hoping someone could point me in a direction that would help me synchronize communications in a way that does not involve a lot of waiting.

Thanks in advance!


If you are interested in reading more about the project, I have a blog. (Way out of date and no pictures yet.)

(Sorry for posting on the main page, I didn't see a suitable place in the forums)
