Controlling a copter by image recognition

Hi all, we would like to share recent progress on our image-recognition-based copter controller.

The system uses a tablet and a transmitter. The tablet detects the copter's position by recognizing a red circle marker painted on the copter, and tries to keep the copter centered on the tablet's display (the green circled area) via a Bluetooth-to-PPM adapter attached to the trainer port of the transmitter (Futaba 14SG). The control is a simple PID loop: the difference between the red marker and the green center shown on the display is the error for the P term, the change of position feeds the D term, and a small I term is also applied. As shown in the attached video, with a fan blowing from the side and the copter deliberately pushed off course by hand, once the switch is turned on the system automatically recovers the copter to the center and keeps it there.
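The loop above can be sketched in a few lines; this is a minimal illustration with assumed gains and variable names, not the project's actual code. The error is the pixel offset between the detected marker and the screen center, and one controller runs per axis:

```python
# Minimal PID sketch of the control described above (gains are assumed,
# purely illustrative). Error is in pixels on the tablet's display.

class PixelPID:
    def __init__(self, kp=0.8, ki=0.02, kd=0.3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, marker_px, center_px, dt):
        err = center_px - marker_px          # P: offset from the green center
        self.integral += err * dt            # I: small accumulated correction
        d = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err                  # D: change of position
        return self.kp * err + self.ki * self.integral + self.kd * d

# One controller per axis (X and Y on the image plane)
pid_x, pid_y = PixelPID(), PixelPID()
ux = pid_x.update(marker_px=400, center_px=540, dt=0.05)  # marker left of center
```

The output would then be mapped to stick deflections sent through the PPM adapter.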

This is an initial attempt: currently it only controls X and Y on a flat plane, and we are trying to add the Z (depth) and yaw axes by adding markers and detecting changes in marker size. All detection and control is done on the tablet (an NVIDIA Shield Tablet); the tablet's built-in rear camera gives good results indoors, and we will try it outdoors soon. The image recognition uses OpenCV, which runs at approximately 20 fps (at 1080p) with a noticeable time lag. The lag causes some instability in the control loop; to overcome this, we are trying to implement an approximated delayed-feedback model in the control loop.

The copter side uses a Pixhawk running APM 3.2 in AltHold mode. The system was developed for infrastructure-inspection copters, e.g. under a bridge or beside buildings where the GPS signal is weak and the wind blows unpredictably; in such cases one can fix the tablet on a tripod, filming and controlling the copter from there. We met Randy some weeks ago, and he suggested we use MAVLink instead of RC; we plan to modify the system accordingly, which should make it even simpler. This is an open-source project by enRoute Co. Ltd., a Dronecode member from Japan; we will put the source on GitHub once depth and yaw control are implemented.
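For reference, Randy's MAVLink-instead-of-RC suggestion could look roughly like the sketch below using pymavlink's RC override message; the connection string, channel layout, and helper names are assumptions, not the project's implementation:

```python
# Hedged sketch of sending RC overrides over MAVLink instead of the
# trainer port (channel mapping and transport are assumed).

def control_to_pwm(u, limit=1.0):
    """Map a normalized control output in [-limit, limit] to 1000-2000 us."""
    u = max(-limit, min(limit, u))
    return int(1500 + 500 * u / limit)

def send_override(master, roll_u, pitch_u):
    # Assumed mapping: channel 1 = roll, channel 2 = pitch; 0 = no override.
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component,
        control_to_pwm(roll_u), control_to_pwm(pitch_u),
        0, 0, 0, 0, 0, 0)

if __name__ == "__main__":
    from pymavlink import mavutil  # assumed transport: telemetry radio
    master = mavutil.mavlink_connection("/dev/ttyUSB0", baud=57600)
    master.wait_heartbeat()
    send_override(master, roll_u=0.2, pitch_u=-0.1)
```

This keeps the same kill-switch property mentioned later in the thread only if the flight controller is configured to fail over when overrides stop arriving.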




Comment by Chris Anderson on May 24, 2015 at 9:07pm

Impressive. And congrats on becoming a Dronecode member!

Comment by Jiro Hattori on May 24, 2015 at 10:25pm


Great demo. Are you going to use a Jetson TK1 on the copter in the future?

Comment by Randy on May 25, 2015 at 4:13am


The control of the copter is looking really good. There's very little overshoot.
Nicely done!

Comment by Tomoyuki Izu on May 25, 2015 at 5:06am


Yes it is. We are testing the Jetson + e-con80.

Comment by Glenn Gregory on May 25, 2015 at 5:19am
Excellent control! The e-con80 looks interesting.
Comment by Daniel Nugent on May 25, 2015 at 10:47am
Very impressive! Was there any reason you tapped into the trainer port of the transmitter instead of sending MAVLink commands (RC override) directly from the tablet using telemetry radios?
Comment by Kai Yan on May 25, 2015 at 3:42pm

@Chris Thanks a lot :D

@Jiro The Jetson TK1 is a nice platform; however, its USB host controller somehow limits the bandwidth, and 1080p video input is not smooth. Maybe it's just not tuned yet. We are collaborating with NVIDIA and looking forward to testing the Tegra X1 version of the Jetson platform. The e-con camera is very nice, by the way.

@Randy, thanks! The position difference on the display depends on Z (the distance from the copter to the camera in the depth direction), so the PID parameters should be a function of Z. With a constant parameter set, there is overshoot when the copter is closer to the camera. The parameter set is also affected by the specific lens the tablet uses; I am trying to make a simple way to customize and adjust such parameters.
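A rough sketch of that Z-dependent gain scheduling (the numbers and helper are assumed for illustration): at distance Z, a pixel error corresponds to a physical displacement of roughly err_px * Z / f (where f is the lens's focal length in pixels), so scaling the gain linearly with Z keeps the physical response consistent as the copter moves closer or farther away:

```python
# Illustrative gain scheduling: P gain tuned at a reference depth z_ref,
# rescaled for the current depth z (values assumed, not the project's).

def scheduled_kp(kp_ref, z, z_ref=1.0):
    """Return the P gain for depth z, given kp_ref tuned at z_ref metres."""
    return kp_ref * (z / z_ref)
```

This also explains the close-range overshoot with constant gains: nearer the camera, each metre spans more pixels, so the effective gain is too high.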

@Daniel In case of trouble we can immediately cut the signal source with the switch on the transmitter, so initially we used the transmitter + trainer-port input. But it's getting stable now, so at some point we will migrate to MAVLink directly.

Comment by Jiro Hattori on May 25, 2015 at 4:45pm

What are you expecting from the X1's processing power?

Very excited about your development work.

Comment by Kai Yan on May 25, 2015 at 9:53pm

@Jiro, actually we're not sure yet, but hopefully there will be more camera choices beyond USB by the time the Tegra X1 Jetson comes out. The USB host controller is a bottleneck on any platform.

Comment by technicus acityone on May 26, 2015 at 9:47pm

"The tablet detects the copter's position by recognizing the red circle marker painted on the copter, and try to maintain the copter's position within the center of the tablet's display"
2D projection positioning? Very freaky =)
But if u find money for one or two more chinose tablest and try real 3D pilotage it may be very interesting experiments!




