Tablets, copters, & extreme precision





The Blade MCX, the most stable indoor flying thing ever invented, buys most of the precision, but it still took some doing to get the vision, sensors, & feedback good enough to keep it in the smallest box. The Blade MCX is 4 years old, but it had just the right combination of cyclic, flybar, & servos to be more stable than anything since.

The IMUs since then haven't achieved the same motion damping as the mechanical flybar. It's a mystery why the Blade CX2 wasn't as stable.

Got the last of the features onto the tablet which were last done on the RC transmitter, years ago. Manely, manual position control in autopilot mode. Rediscovered the age-old problem where velocity & direction can't be changed while a move is already in progress. It has to stop moving & recalculate a new starting position for a new line to fly.

The algorithm was all part of a plan to make it fly in straight lines. Merely setting fixed velocities didn't make it fly straight, & it would need a much larger flying area for simple velocity changes to be practical. The war on straight lines was a long battle in 2008, comprising many blog posts.
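The stop-and-restart behavior above can be sketched roughly like this. This is a toy model, not the actual flight code; all function names are hypothetical. The idea is that the autopilot chases a target point that slides along a fixed line at a fixed speed, so any change in commanded velocity or direction means abandoning the old line & starting a fresh one from wherever the copter is now.

```python
import math

def start_line(current_pos, heading_rad, speed):
    """Begin a new straight-line segment from the copter's current
    position, with a unit direction vector and a start time."""
    direction = (math.cos(heading_rad), math.sin(heading_rad))
    return {"origin": current_pos, "dir": direction, "speed": speed, "t0": 0.0}

def target_position(line, t):
    """Target point the autopilot chases at time t: the line's origin
    advanced along its direction at the commanded speed."""
    d = line["speed"] * (t - line["t0"])
    return (line["origin"][0] + d * line["dir"][0],
            line["origin"][1] + d * line["dir"][1])

# Fly east at 1 m/s for 2 seconds...
line = start_line((0.0, 0.0), 0.0, 1.0)
pos_at_2s = target_position(line, 2.0)        # (2.0, 0.0)

# ...then the pilot changes direction mid-move: the old line is
# abandoned and a new one starts from the current position.
line = start_line(pos_at_2s, math.pi / 2, 0.5)
line["t0"] = 2.0
```

Chasing an interpolated point on a line, rather than just commanding a velocity, is what keeps the track straight even when the copter gets blown off course.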



As the tablet interface evolves, it's very confusing, with separate inputs for manual mode & autopilot mode.




The final leap in accuracy came from tapping the Blade MCX's integrated gyro to drastically improve the heading detection without increasing the component count.

Its heading is extremely stable, allowing its position tracking to be more stable than with the magnetometer alone. The improvement costs nothing, but would require extra parts on copters without an analog gyro already installed.
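One common way to blend a fast-but-drifting gyro with a slow-but-absolute magnetometer is a complementary filter. This is a generic sketch of that technique, not necessarily the fusion used here; the blend constant & names are assumptions.

```python
import math

def fuse_heading(prev_heading, gyro_rate, mag_heading, dt, alpha=0.98):
    """Complementary filter: integrate the low-latency gyro rate, then
    pull slowly toward the absolute (but noisy) magnetometer heading
    to cancel gyro drift.  Angles in radians."""
    predicted = prev_heading + gyro_rate * dt
    # Wrap the magnetometer correction so it takes the short way around.
    error = math.atan2(math.sin(mag_heading - predicted),
                       math.cos(mag_heading - predicted))
    return predicted + (1.0 - alpha) * error

# Stationary copter with a drifting gyro (0.01 rad/s bias): the filter
# settles near the magnetometer heading instead of wandering off.
heading = 0.0
for _ in range(500):
    heading = fuse_heading(heading, gyro_rate=0.01, mag_heading=0.0, dt=0.01)
```

The gyro dominates over short timescales, so heading jitter stays low, while the magnetometer keeps the long-term average honest.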

The earlier results were purely the magnetometer, without the gyro.

Another discovery with this system was that pointing it 45° left of the cameras during calibration seems to be the optimum alignment for the cyclic phasing.

So far, these micro copters have proven the smallest indoor autopilot works, but what you really want is a flying camera. Dreams of useful-quality video from a monocopter were busted: the available cameras aren't fast enough. There were signs they could be synchronized to the rotation by pausing the clock, but the blurry images would then require a really fast wireless connection.

A camera on a micro copter would take serious investment in really fast, microscopic wireless communication. All roads lead not to building aircraft, but to perfecting a camera & a wireless link.

There is a desire to put the autopilot on a Ladybird or convert something big enough to fly a camera.



Many years ago, a fake test pilot noted that averaged sensor data produced better flying than lowpass-filtered sensor data. Lowpass filtering was the academic way of treating the data, because it got rid of aliases.

The fake test pilot also noted that jittery servos produced better flying than perfectly timed servos.

In all these cases, the noisy unfiltered data had less latency than the filtered data, & glitching the servo PWM around 50Hz conveyed more data than the normal 50Hz update rate allowed. Since there were no data points at an alias frequency with enough amplitude to make the aircraft oscillate, the reduction in latency was a bigger win than the reduction in noise.
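The latency difference can be seen in a step response. This is a toy illustration, not the actual filters used: a boxcar average fully settles a fixed number of samples after a step, while a single-pole IIR lowpass only approaches the new value asymptotically, so its output lags longer.

```python
def boxcar(samples, n):
    """Moving average of the last n samples."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - n + 1):i + 1]
        out.append(sum(window) / len(window))
    return out

def lowpass(samples, alpha):
    """Single-pole IIR lowpass: y += alpha * (x - y)."""
    out, y = [], samples[0]
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# Step input: 5 samples of 0, then 20 samples of 1.
step = [0.0] * 5 + [1.0] * 20
avg = boxcar(step, 4)       # exactly 1.0 four samples after the step
lp  = lowpass(step, 0.25)   # still climbing toward 1.0 at that point
```

Four samples after the step, the boxcar output is already exactly 1.0, while the lowpass is still at about 0.68 & never quite gets there.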



Now the camera system has 2 cameras, each running at 68fps, limited by clock cycles. They're not perfectly timed or synchronized, so images from the 2 cameras are captured at 136 unique points in time each second. A new position is calculated as each of the 136 frames comes in. This allows slightly faster position updating than if the cameras shot at exactly the same 68 points in time, without requiring more horsepower.
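The interleaving can be modeled with a toy timing sketch. The phase offset below is an arbitrary assumption; the point is just that merging 2 free-running 68fps streams yields 136 position updates per second.

```python
def frame_times(fps, phase, duration):
    """Capture timestamps for one free-running camera, offset by phase."""
    period = 1.0 / fps
    n = int(duration * fps)
    return [phase + k * period for k in range(n)]

# Two unsynchronized 68 fps cameras with some arbitrary phase skew.
cam_a = frame_times(68, 0.000, 1.0)
cam_b = frame_times(68, 0.007, 1.0)

# A position update fires on every frame from either camera, so the
# merged stream carries 136 updates/sec instead of 68.
updates = sorted(cam_a + cam_b)
```

Letting the cameras free-run is the cheap trick here: synchronizing them would halve the update rate for the same total frame budget.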



The velocity calculation has only a 1/13 second delay; it's pure noise, but it gives a much tighter flight.

Anyways, the dual 68fps system uses 90% of the raspberry pi with the ground station running. Without the ground station, it uses only 60%. The RLE compression generated by the board cams takes a lot less horsepower to decompress than the JPEG compression from the webcams, but the savings is eaten up by the higher framerate.
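The board cams' actual wire format isn't given here, but a generic run-length decoder shows why RLE is so much cheaper than JPEG: decompression is a single linear pass with no transforms, versus JPEG's entropy decoding & inverse DCT.

```python
def rle_decode(runs):
    """Expand (count, value) run pairs into a flat list of pixel values.
    One linear pass, no math beyond list expansion."""
    pixels = []
    for count, value in runs:
        pixels.extend([value] * count)
    return pixels

# A mostly-dark 320-pixel scanline with one bright 8-pixel blob
# compresses to just 3 runs -- ideal for machine-vision scenes where
# the target LEDs are the only bright thing in frame.
line = rle_decode([(100, 0), (8, 255), (212, 0)])
```

RLE only wins on scenes like this, where long runs of identical pixels dominate; it would do badly on natural video.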

Dual cameras on a single pan/tilt mount at 320x240 70fps are probably as good as a cost-effective system can get. Better results could be had from 640x480 or higher resolution at 70fps, but that would take FPGA design & something faster than a raspberry pi. Webcams max out at 640x480 30fps, but higher framerate has proven more important than higher resolution.

Baby Vicon busted

There was a delusion of having 2 cameras on separate pan/tilt mounts, to give very precise distance readings & eliminate servo wobble.


The problem became obvious immediately after starting the calibration. The pointing direction of the servos can't be known precisely enough to get a distance from the angles of the cameras. The convergence angle needs to be known more precisely than any other angle to get a useful distance.

Cameras in a 2-eye mount have a fixed convergence which can be hard-coded. Cameras with 1 eye per mount have variable convergence which must be deduced from the servo angles, & that can't be known as accurately as hoped. The Hitec HS-311 is the tightest servo known, but it's still not accurate enough.
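A quick triangulation sketch shows how brutally sensitive distance is to convergence error at these geometries. This is illustrative math, not the actual calibration code; the baseline & range numbers are assumptions chosen to be plausible for a tabletop rig.

```python
import math

def triangulate(baseline, angle_left, angle_right):
    """Intersect bearing rays from two cameras a `baseline` apart.
    Angles are measured from the baseline toward the target; returns
    the perpendicular distance to the target.
    From tan(aL) = y/x and tan(aR) = y/(baseline - x):
    y = baseline * tan(aL) * tan(aR) / (tan(aL) + tan(aR))."""
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    return baseline * tl * tr / (tl + tr)

# Target 3 m away, centered over a 0.3 m baseline: each camera points
# nearly straight at it (~87 degrees from the baseline).
b = 0.3
true_angle = math.atan2(3.0, b / 2)
d_true = triangulate(b, true_angle, true_angle)   # recovers 3.0 m

# A pointing error of just 0.5 degrees on one servo shifts the
# computed distance by tens of centimeters at this range.
err = math.radians(0.5)
d_off = triangulate(b, true_angle + err, true_angle)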

If the cameras were on different sides of the room, so they always converged at 90 degrees, the problem would be solved, but that would require having a 270 degree field of view with no lights that could interfere with machine vision. The cameras have to be close together & on the same side of the room to make the lighting practical.



  • Thumbs up Jack! I know who the "Fake test pilot is " Have a Great Day!

  • Jack, there are easier ways to get a hair cut....!!!

  • Jack, brilliant work!  What cameras did you use?

  • Jack, you may very well be "the most interesting man" on diydrones.  Are there any youtube videos of you bull fighting, or skydiving into a volcano?

  • "Manely" indeed.  (Impressive work!)

  • Hi Jack,

    Excellent work. 

    I'm curious how you are able to make PCBs with such nice small traces. I have two questions:

    1. Do you use toner-transfer method or Photo-resist method? If its the former, what medium do you use for toner transfer?

    2. What chemical do you use for etching the PCBs?

    Thanks and Regards,

  • I take it you're using the wii-mote sensors? Those things are amazing.. so cheap and easy to use, too. Too bad they only track up to 4 (is it 4?) blobs..

  • Great stuff jack!  I love reading your blogs 'cuz it tells us where the rest of us will be in 2 years.

  • Just do amazing work! Thanks for sharing!
