Thomas Stone's Posts (16)


GPS-denied Position-hold w/ 3DR Solo


15 Minutes of Vision-based position-hold with 3DR Solo: This is the first iteration of a vision-based position hold controller. This initial implementation is simply a roll/pitch angle PD controller. However, it works surprisingly well after limited tuning.

The video shows a full flight with limited pilot input (during take-off and landing only). During the ~15min flight, the copter autonomously loiters above the beacon placed on the ground. Wind gusts push the copter around, but no pilot corrections were required. Thoughts on controls improvements are included below.

The current system is fully functional in any lighting condition (night or day) up to altitudes of 15 meters. The system uses object recognition, which eliminates position drift over time, enabling GPS-denied autonomous flight for extended periods (i.e., much longer than the battery will last).
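For the curious, the basic idea of a roll/pitch angle PD position-hold controller can be sketched in a few lines. This is a simplified illustration with made-up gains and limits, not the actual flight code:

```python
# Minimal sketch of a roll/pitch angle PD position-hold controller.
# The beacon's position error (meters, earth-frame) is mapped directly
# to a commanded lean angle. Gains and the angle limit are hypothetical.

def pd_angle_cmd(pos_err_m, vel_mps, kp=8.0, kd=4.0, max_angle_deg=15.0):
    """Return a commanded lean angle (degrees) from position error and velocity."""
    angle = kp * pos_err_m + kd * vel_mps  # derivative term damps the motion
    return max(-max_angle_deg, min(max_angle_deg, angle))

# Copter is 0.5 m off-target and drifting away at 0.2 m/s:
cmd = pd_angle_cmd(-0.5, -0.2)  # negative command leans it back toward the beacon
```

A controller this simple has no integrator and no sensor fusion, which is consistent with the wind sensitivity described below.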







3DR Solo (*disconnected GPS)

IR-LOCK Sensor (*custom-calibrated)

MarkOne Beacon

SF Rangefinder


Modified ArduCopter-Solo Code (post-rebase)



As mentioned above, the current iteration of the controller is simplistic and is sensitive to wind gusts. The object-recognition and rangefinder readings are used as input to the roll/pitch angle PD controller. The controls performance could be improved via more sophisticated sensor fusion and filtering. It should also be noted that this demo uses a custom-calibrated sensor/lens, which we use for particular commercial projects. This calibration should be improved further in future iterations.


After disconnecting the GPS module and modifying the flight code, the heading state estimated by the flight code drifts in a strangely consistent manner. During the 15-minute flight, the copter slowly makes a 360-degree yaw rotation. This needs to be investigated further (see log file linked above). Perhaps the issue can be solved with some simple parameter modifications.

Stay tuned for more details and updates by signing up for the IR-LOCK Newsletter:

Read more…

Solo Smart Drone Lands Smartly: PART 2

Our precision landing Solo is flying great! :) This test rig has an IR-LOCK sensor (via I2C), an SF/10A rangefinder (via Serial), and is running a customized precision landing firmware based on APM:Copter (solo-rebase). The video above shows two vision-guided landings. The full 10-landing test sequence is shown below. The typical landing accuracy ranges from 0-10cm, even in moderate/gusty wind.


One of the primary challenges of designing a precision landing UAV system is the controls tuning. Every UAV has different flight characteristics, and finding the right PID (+EKF) parameters can be a painstaking process. So it is very helpful to have a popular, consistent hardware platform (e.g., Solo) to develop on. Solo flies nicely out of the box without significant parameter modifications. The precision landing performance is noticeably better than our IRIS+ platform, and our previous Solo test platform (w/out a laser rangefinder). In the 10-landing test sequence, the typical landing accuracy ranged from ~0-10cm. There was only one ‘failure’ (landing #5 out of 10), in which the copter landed far outside of the specified bounds. This will be analyzed and corrected via the controls code.


This test platform is running a customized precision landing firmware, which uses the vision-based localization data to actively manage the landing accuracy. You can read more about the code, and see previous testing here. USER BEWARE: this code is experimental, and it assumes that you have a reliable rangefinder connected. 

Customized Firmware: ArduCopter-v2.px4


Connecting IR-LOCK to Solo: article

Read more…

Precision Landing with Accuracy Management

I was inspired by the recent solar-powered launch system announcement to share some recent precision landing development work.

Automated charging systems require a high level of controls performance, especially during the landing process. This is a challenging problem, especially when operating in outdoor conditions, where a gust of wind can easily push the copter off-course.

The video shows a vision-guided precision landing technique, based on the existing APM:Copter feature (link). The default code has been reported to produce precision landing performance of ~30cm (i.e., the copter lands 0-to-30cm from the visual target). This is great in general, but the aforementioned automation applications require even better performance. Moreover, in some scenarios, a bad landing results in a crashed copter, for example, when ‘landing on a box’.

The video demonstrates a modified version of the default precision landing code. In the modified version, the copter localizes itself with respect to the visual target, and this localization is used to actively monitor and manage the landing accuracy. In this simple example, the copter is programmed to descend when the landing accuracy is within the specified bounds, and to ascend when outside the specified bounds. The image below shows that the error bound is set to 25cm for the AGL altitude range 1m-to-2m.
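In pseudocode, the accuracy-management rule amounts to a small gating function. The bound schedule and names below are simplified illustrations (the 25cm bound in the 1m-to-2m band matches the image; the other values are made up), not the actual firmware:

```python
# Toy sketch of accuracy-managed descent: descend only while the
# horizontal error is inside an altitude-dependent bound, otherwise
# ascend to regain margin. The bound schedule is illustrative.

def error_bound_m(alt_m):
    """Allowed horizontal error for a given AGL altitude (illustrative schedule)."""
    if alt_m <= 1.0:
        return 0.10
    if alt_m <= 2.0:
        return 0.25   # 25cm bound in the 1m-to-2m band, as in the image
    return 0.50

def climb_rate_cmd(horiz_err_m, alt_m, descend=-0.3, ascend=0.2):
    """Descend (m/s) when inside the bound, ascend when outside."""
    return descend if abs(horiz_err_m) <= error_bound_m(alt_m) else ascend
```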


This approach can be extended and customized in a variety of ways depending on the particular application. The ultimate goal is to ensure safe and accurate landings for automated UAV systems.


IRIS+ Copter

SF10/A Rangefinder

IR-LOCK Sensor

MarkOne Beacon

APM:Copter Code (*modified 3.3.2-rc2)

Note: the precision landing feature is enabled by default in the master branch of APM:Copter, which is intended for advanced users/developers

Read more…

Recharge your drone at my place

Does anybody want to move in next door? We could send drones to each other ad infinitum. :) I cannot confirm if this installation increases/decreases the value of your home. 

This test represents an interesting combination of products and technology from 3DR, APM:Copter, Skysense, and IR-LOCK. Previously, each component had been demonstrated separately. ... The only thing missing is a rangefinder to get accurate AGL readings.


Read more…

Copter Localization w/ IR Landmark

The video shows localization data collected during a short flight over an IR marker, which is used to estimate the relative location of the copter with respect to the visual landmark. Note that this was a manually flown flight, but the ultimate goal is automation. Detecting visual landmarks/features is a fundamental task in many forms of robot localization and navigation. For example, the Snapdragon Flight includes 4 camera sensors for visual-inertial odometry.



The plot in the video shows the copter’s vision-based position estimation, versus the traditional position estimation. The red data is logged by APM:Copter running on Pixhawk with a 3DR GPS module. The blue data is derived from IR-LOCK sensor data, as it detects a MarkOne Beacon at approximately 50Hz. LidarLite is used for AGL altitude measurements. The presented data looks nice, since it was a fairly tame test. We need to calibrate the lens before we can correctly handle larger pitch/roll angles.


You can think of this as a ‘flipped’ version of the StarGazer indoor robot localization system, where a unique visual landmark is placed on the ceiling. However, the copter localization problem is a bit more tricky, due to the extra degrees of freedom. It can pitch, roll, ascend, etc. So the copter localization estimation also depends on the flight controller’s state estimation. And ideally, all of the data would be fused together.


One of the key advantages of having a uniquely identifiable visual landmark is that it can be used to remove drift in velocity and/or position estimations, which is typically the function of the GPS. This can also be accomplished by developing a local map (i.e., SLAM) …. With the MarkOne Beacon, we can also operate at night, but the video would be even more boring. :) Robust vision performance in variable lighting conditions typically requires some form of IR projection. (see Intel RealSense specs)

Read more…


NVIDIA's press release states that "Jetson TX1 is the first embedded computer designed to process deep neural networks -- computer software that can learn to recognize objects or interpret information." The 3.4 x 2 inch module includes a Tegra X1 ARM Cortex-A57 processor with 256-core NVIDIA Maxwell graphics, 4GB of LPDDR4 memory, 16GB of eMMC storage, 802.11ac WiFi and Bluetooth, and Gigabit Ethernet support.

AnandTech Article:

The Jetson TX1 Development Kit will be available for preorder starting Nov. 12 for $599 in the United States. The kit includes the Jetson TX1 module, a carrier board (pictured below), and a 5MP camera. The stand-alone module will be available in early 2016 (for $299 in bulk).


The Jetson TK1 (not TX1) was released in 2014 to encourage the development of products based on the Tegra K1 processor. However, according to AnandTech, developers were using the Jetson TK1 outright as a production board, choosing to focus on peripheral and software development instead of system hardware development. With the new TX1, all of the I/O connectivity is provided on a carrier board, enabling rapid development on the credit-card sized TX1 module. After development is finished, the TX1 module can be directly deployed in products, such as drones. 

NVIDIA used a drone application to promote the Jetson TX1


Read more…

Jetson TK1 Promo Offer


I am not an employee of NVIDIA or Make. :) But I am using this promo as an opportunity to see if anyone has made progress on getting the Jetson TK1 airborne.

The Jetson was released over a year ago, and the community support has developed nicely since then. A custom kernel provided by users and 'JetPack' (by NVIDIA) help to accelerate the setup process. I was able to get the Jetson and Pixhawk talking via MAVLink by following the "Communicating with ODroid via MAVLink" wiki entry, without too many issues.


I have not yet flown with the Jetson. This development board is a bit bulky: approximately twice as big and heavy as the Odroid XU4 (XU3 and Jetson pictured). However, it offers top-notch performance for computationally intensive tasks such as vision processing.

Jetson/XU4 Weight:  120g / 60g

Jetson/XU4 Dimensions:  127x127mm  /  82x58mm

*Percepto's Indiegogo campaign proposed TK1-based hardware that is more down-to-size.


It is worth mentioning that DJI has recently released a companion computer based on the Jetson for their development copter. 'Manifold' is approximately the same size/weight as the Jetson development board, but with different I/O.


(image source:

Read more…

How many copters did we sink? :) Check the video above. The development of Precision Landing hardware and software has progressed since our last blog post. The existing controls code has been added to master in APM:Copter (i.e., V3.4-dev). The two demo videos included in this post were produced with a modified version of APM:Copter V3.3 (similar to the master code). This documentation (link) should provide some clarification for advanced users who want to experiment with the features. And the relevant APM Wiki link is here: link

CONTROLS: Keep in mind that the Precision Landing controls code will probably undergo significant changes in the future. As it stands, the sensor detects the target and outputs an 'angle-to-target' reading. The roll/pitch of the copter is subtracted from these readings, assuming the sensor is fixed to the copter frame. Coordinates are transformed from the body frame to the earth frame. Then, the altitude of the copter and the angle-to-target in the earth frame are used to calculate the distance-to-target in the 'x-y' plane. This distance-to-target is used to re-position the copter over the target. One limitation of the current controls method is that the relative altitude wrt the target is required. Fortunately, there are good rangefinder options available now.
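Numerically, the pipeline looks roughly like this. This is a simplified sketch that ignores yaw and uses a naive attitude subtraction instead of the full rotation matrix used by the firmware; the variable names mirror the logged quantities (bX/eX/pX) but the axis/sign conventions here are assumptions:

```python
import math

# Sketch of the angle-to-target pipeline: subtract the copter's attitude
# from the body-frame angle reading, then project to a horizontal x-y
# offset using the AGL altitude. Simplified illustration only.

def target_offset(bx_rad, by_rad, roll_rad, pitch_rad, alt_m, min_alt=0.2):
    # The sensor is fixed to the frame, so its measured angle includes
    # the copter's own lean; subtract it out (small-angle approximation).
    ex = bx_rad - roll_rad
    ey = by_rad - pitch_rad
    # Range-limit the altitude so a negative or very small reading
    # is not used to calculate the position offset.
    alt = max(alt_m, min_alt)
    # Project the earth-frame angle-to-target into a distance-to-target.
    px = alt * math.tan(ex)
    py = alt * math.tan(ey)
    return px, py
```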


VARIABLE LOGS: One helpful improvement is that the Precision Landing variables are logged, enabling much more effective post-flight troubleshooting and analysis.

bX/bY: Angle-to-target measurement in the body-frame reference BEFORE accounting for roll/pitch of copter.
eX/eY: Angle-to-target measurement in the earth-frame reference AFTER accounting for roll/pitch of copter.
pX/pY: Calculated distance of the target from the copter in the earth frame. Note that the distance calculation depends on the altitude measurement. Also, the altitude variable in this particular instance is range-limited, such that a negative or very small value is not used to calculate the position offset.


SENSING: There are improved sensor system options for professional users (MarkOne). There has been significant interest from industrial users/developers regarding the development of automated UAV systems: automated charging, safe automated landing, copter-on-vehicle landing, etc. The boat landing demo video is useful for demonstrating the improved reliability of the sensor readings, which is critical for professional use in automated systems. 

Some of the logged sensor readings from the boat landings are plotted below. The green line indicates the altitude of the copter. The red line ('bX') indicates the angle-to-target measured by the sensor in the x-direction, relative to the sensor (fixed to the copter). We no longer have issues with false detections, even in challenging operating environments. False detections would be indicated by immediate, large changes in the 'bX' value. (See an example of false detections here: link.)

INTERPRETING LOGS: When the LAND mode is initiated, the copter begins moving toward the target, driving the angle measurement toward zero. Oscillations in the angle measurement are typical. 'Flat lines' in the bX/bY plot indicate periods of time when no target is detected. This may be due to the target being out of range or outside the sensor's field of view. In this test (below), the MarkOne beacon is detected at distances of over 15 meters.

Toward the end of the landing, the angle measurement may increase significantly and/or turn into a 'flat line'. A flat line is expected if the copter does not land directly on top of the marker (i.e., 5-30cm away). The marker can easily escape the field of view of the sensor during the final ~10cm of descent. Also, the detection angle can easily become very large when the sensor is very close to the marker.

DEMO 2: Here is a demonstration with 5 consecutive Precision Landings. Below, you will find the following content:
->Plot of logged variables for 5 landings
->Plot of logged variables for a single landing



Read more…

SOLO Smart Drone lands smartly :)

Warning: I think this voids your warranty. :) We integrated an IR-LOCK sensor with a 3DR SOLO. This project was mostly for fun, so don't read too much into the flight performance shown in the video. We only had one test session, and we didn't tune any parameters. 

You might notice that the flight characteristics are similar to those in Randy's precision landing demonstration (link). We ported his code over to the ardupilot-solo repository for this demo. The repositories used for this test are here and here. Unfortunately, the SOLO does not expose the I2C bus that we typically use to connect the IR-LOCK sensor, so we tapped into the power and data lines (SDA/SCL) used by the compass. (btw, it's pretty cool that 3DR sells a SOLO compass leg)



This video also demonstrates a new landing beacon (MarkOne) that we have developed. Initial details on MarkOne are here: Significant improvements have been made to the machine vision performance. Namely, we can operate in bright sunlight over water, cars, etc., without worrying about false detections. 

Recently, the precision landing code was pushed to the ArduCopter master, with the intent of having official support in AC3.4. The relevant developer conversation is here, and the ArduCopter wiki entry is here. As always, many thanks to the AC developers and community!

Read more…

Safe Landings in Tight Spaces

Landing on your back porch? In close proximity to trees? And near your grandmother? :) 

We produced documentation regarding the 'Precision Landing' developments (link). The primary intent of this development is to enable more industrial use-cases for multi-copters, namely, fully automated systems with auto re-charge capabilities. However, it is also useful for assisting manual landings in tight spaces. I typically would not attempt to land on my patio, near a house, under a tree, and on a table. 

The assisted landing uses a Sensor/Beacon system (link) to center the IRIS+ over the IR Beacon when in Loiter or Land flight modes. In the above video, we are using a modified version of AC3.2.1 which uses readings from the Sensor/Beacon system. In short, if the Sensor detects the Beacon, the copter will hover over the Beacon. If the Beacon is not detected, the copter operates as expected in Loiter/Land.
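In pseudocode, the override logic amounts to a simple switch (names are illustrative, not the actual AC3.2.1 symbols):

```python
# Sketch of the Loiter/Land override: if the beacon is detected, steer
# toward it; otherwise fall back to the standard GPS-based behavior.
# All names are illustrative.

def position_target(beacon_visible, beacon_offset, gps_hold_point):
    """Return the horizontal position target for the controller."""
    if beacon_visible:
        return beacon_offset    # center the copter over the beacon
    return gps_hold_point       # behave like normal Loiter/Land
```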

Loitering over a Beacon is demonstrated in the video below.

Randy shot a very nice video of a precision landing. This development will not be in the AC3.3 master. We are still experimenting. ... Also, note that the 3.3-based version (demonstrated below) is somewhat different from the 3.2.1 version. You can read more about it here (link).

I am also curious about the use-cases for fixed wings. We have had requests about this, but my knowledge on the matter is limited. Feel free to shoot me a message (thomas at irlock dot com), or reply below. 

Read more…


Drones provide an “eye in the sky” which can be very useful for water scientists. Many water bodies are surrounded by difficult terrain, making it difficult to observe changes over time. Extreme droughts like the ones currently in California and Texas make the use of drones for monitoring rivers and lakes especially important.

A multinational team of American, German, and Estonian researchers has teamed up to measure river flows with unmanned aerial vehicles (UAVs) using a new real-time particle-tracking technology. Obtaining the speed of the water in a river traditionally involves engineers wading into the river or using boats outfitted with special equipment for flow measurements. ...

Read more at:

Centre for Biorobotics

Read more…

Fully-automated Precision Landing Mission

I finally got around to editing the footage of this fully-automated Precision Landing mission. The purpose of this mission is to demonstrate how vision-assisted Precision Landing can be a solution to the 'last 100 ft' problem, where traditional GPS is not sufficiently accurate to navigate to a precise location (RTK GPS systems are another possible solution). Some application areas are automated landing on charging stations, boats/ground vehicles, or even your front porch. :)

The inputs to the mission are GPS waypoints, which are set at the known coordinates of two stationary IR Beacons. The mission is to travel to the next GPS WP, and then 'Precision Land' on the nearby Beacon. This work was performed a while back with an older version of AC, so we are transferring the code to a fork of AC3.2.1.
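In pseudocode, the mission loop is roughly as follows (a simplified sketch; the actual waypoint and mode handling in AC is more involved, and all names here are illustrative):

```python
# Sketch of the fully-automated mission: navigate to each GPS waypoint
# (placed at a beacon's known coordinates), then hand control to
# vision-based precision landing on the nearby beacon.

def run_mission(waypoints, fly_to, precision_land, takeoff):
    events = []
    for i, wp in enumerate(waypoints):
        fly_to(wp)           # coarse GPS navigation (the 'last 100 ft' begins)
        precision_land()     # vision-guided descent onto the nearby beacon
        events.append(('landed', i))
        if i < len(waypoints) - 1:
            takeoff()        # continue on to the next beacon
    return events
```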

Thanks for watching! If you would like to stay up-to-date with our progress, or to see more details, our website is here:

NOTE that the 'bullseyes' in the video are only used as an accuracy reference (not for the machine-vision processing).
For excellent 'bullseye examples', check the work done by Daniel Nugent. (link)
For more explanations about IR-based machine vision, check out the recent post by Dan Wilson. (link)



Read more…

Maybe we can land on a moving car in our next demo. :)

IR-LOCK is developing a precision landing system for multi-rotors, which is expected to enable high levels of automation in UAS. Imagine autonomously landing on a small charging station, or autonomously performing inspections at very specific points on infrastructure or facilities. In our prototype testing, landing accuracy of 5-30cm has been achieved.

Since the system uses an IR Beacon, it is not hindered by low lighting conditions or even complete darkness. And the beacon modulation is designed to accommodate extremely bright conditions. Overall, the beacon detection has proven to be very reliable, which is obviously important in this application (we don't want to autonomously land on your dog/cat). 
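A toy illustration of why a modulated beacon helps: a constant ambient IR source (e.g., sun glare off water or a car) has no component at the beacon's on/off frequency, so correlating sampled intensity against the expected pattern rejects it. To be clear, the actual MarkOne modulation scheme is not described here; the pattern, samples, and threshold below are invented for illustration:

```python
# Toy sketch: distinguish a blinking beacon from constant ambient IR by
# normalized correlation against the expected on/off pattern.
# The pattern and threshold are illustrative, not the real scheme.

def matches_beacon(samples, pattern, threshold=0.5):
    """Correlate zero-mean samples with a zero-mean on/off pattern."""
    n = len(pattern)
    s_mean = sum(samples) / n
    p_mean = sum(pattern) / n
    corr = sum((s - s_mean) * (p - p_mean) for s, p in zip(samples, pattern))
    norm = (sum((s - s_mean) ** 2 for s in samples) *
            sum((p - p_mean) ** 2 for p in pattern)) ** 0.5
    # A constant source has zero variance, so norm == 0 and it is rejected.
    return norm > 0 and corr / norm > threshold

pattern = [1, 0, 1, 0, 1, 0]
beacon  = [0.9, 0.1, 1.0, 0.0, 0.8, 0.2]   # blinking source: accepted
glare   = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]   # constant glare: rejected
```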

A Precision Landing Development Kit will be made available soon. The controls code will be provided so you can easily integrate the system into your UAS (based on ArduCopter 3.2.1). Preliminary details are here:

Please do not hesitate to ask questions. We definitely appreciate the support and feedback from the DIYDRONES community. 



Read more…


IR-LOCK has been testing the latest version of our sensor/beacon hardware for precision landing, and the results are great! With an IRIS+, we achieve reliable landings at around 5-30cm from the target beacon.

Our development focus is on the reliability and accuracy of the sensor/beacon hardware, and optimal integration with control systems. One feature of the new beacon design is a modulated signal that enables the sensor to avoid false detections. Also, it can be powered by a standard 12V power adapter. 

An obvious next-step is to integrate the beacons into a charging pad/station, or other small landing surfaces. A primary goal of the technology and hardware development is to enable fully-autonomous and remotely controlled systems of quads ... Imagine never changing a battery again :) 

If you want to stay updated on the progress, feel free to sign up at our website. You can also find more details there.

We are currently running a modified version of ArduCopter. Many thanks to the Diydrones and Drone-discuss community of developers! I don't know how to thank you enough. 



Read more…

IR-LOCK Sensor for Precision Landing, etc


We finally have a progress report on the integration of the IR-LOCK (Infrared Pixy) vision sensor with Pixhawk. A group of students at the Georgia Institute of Technology is using the sensor for their 'Package Delivery Drone', which requires precision landings. They plan to use GPS to travel to a waypoint, and then use the IR-LOCK sensor to land precisely on an infrared beacon. Recently, they were able to get an IRIS+ with IR-LOCK to auto-hover over an IR beacon (see videos).

Other applications for IR-LOCK/Pixhawk are in the works (or at least in mind):

  • Search&Rescue (auto-search for emergency IR beacons)
  • Moving target following
  • Aerial surveying (I will be able to post a relevant video soon)
  • ... your thoughts/ideas are welcome :)

We are in the process of making the IR-LOCK/Pixhawk interface more developer-friendly. The current developments (by the student group) are based on an IR-LOCK sensor 'driver' and codebase discussed on this FAQ page (link). A 'more official' implementation should be under development soon. The relevant discussion is here (link). 

Small IR 'Pods' can be detected by the IR-LOCK sensor at ~30-60 ft. The IR-LOCK sensor reports the (x,y) position of IR targets to Pixhawk at 50Hz.


Read more…