### Determining Altitude AGL Using Optical Flow

You are probably familiar with optical flow sensors, which provide extremely precise measurements of ground speed (and therefore position) for quadcopters and computer mice. The PX4FLOW is the most recent example, but there are many others.

### How it works

The optical flow sensor provides a 2D measurement of the angular speed of the image that is moving through its field of view, which is perfect for a mouse but by itself not that useful for a free-flying robot. It cannot differentiate between a near thing moving slowly, a far thing moving quickly, or its own camera rotating. Optical flow must be integrated with other kinds of sensor data for it to make sense in 3D space.

For those familiar with trigonometry, the diagram at right provides an intuitive way of understanding how the various sensors are integrated, and how an unknown value can be calculated from known measurements.

• Point C represents the center of the optical flow sensor’s view.
• Length b is the current altitude above ground level (AGL) as reported by sonar.
• Angle α is the raw optical flow, minus the current pitch rate as reported by the gyro. (Pitching up or down changes the viewpoint of the sensor, adding to the optical flow. We have to subtract this out to get a grounded measurement.)
• Length a is the ground speed of the craft.

So, given altitude from sonar, pitch rate from the IMU, and optical flow, we can calculate ground speed.

While helpful for visualizing the math, the trigonometric method is only actually correct at α = 0. At α = 90°, a and the hypotenuse go to infinity, which is obviously nonsense. To calculate the actual distance traveled, you must sum the movement across point C, which, when you do the calculus, ends up being the same formula used to calculate the arc length of a circular segment. I've illustrated this with the second diagram at right.

It might seem that adding that curve to the equation has hopelessly complicated things, but in reality it makes the equation simpler, with no trigonometric functions at all. If I take out the obvious unit conversions, it looks like this:

ground speed = (optical flow - pitch rate) × altitude
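As a sketch in code (hypothetical helper names; flow and pitch rate in rad/s, altitude in meters):

```python
def ground_speed(flow_rate, pitch_rate, altitude_agl):
    """Ground speed (m/s) from raw optical flow and gyro pitch rate
    (both rad/s) plus sonar altitude AGL (m). A hypothetical sketch of
    the formula above, not PX4FLOW firmware."""
    # Subtract out the flow caused by the craft's own rotation, then
    # scale the remaining angular rate by the distance to the ground.
    return (flow_rate - pitch_rate) * altitude_agl

# 1.5 rad/s of raw flow, 0.5 rad/s of it caused by pitching, at 10 m AGL:
print(ground_speed(1.5, 0.5, 10.0))  # → 10.0 m/s
```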

### A problem, and a solution

In the above formula altitude is a known value directly measured by sonar. Sonar works well in quadcopters and other craft which fly in a hover regime, but tends to be much less reliable in planes, where accurate AGL measurement is probably even more important, especially for landing.

Fortunately, ground speed is not an unknown in planes. GPS and airspeed measurements provide highly accurate ground speed (there are errors, but they're small compared to the higher speed of the craft). With ground speed solved, we can use basic algebra to move the terms around and solve for a different unknown:

altitude = ground speed / (optical flow - pitch rate)

Thus, we have a reliable and accurate measurement of altitude which works at even greater range than sonar.
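The rearranged formula can be sketched the same way (hypothetical names; the near-hover guard is an assumption about sensible behavior, since the formula divides by the corrected flow):

```python
def altitude_agl(speed, flow_rate, pitch_rate):
    """Altitude AGL (m) from ground speed (m/s) plus optical flow and
    pitch rate (rad/s). Hypothetical sketch of the rearranged formula."""
    corrected_flow = flow_rate - pitch_rate
    if abs(corrected_flow) < 1e-6:
        # No translation over the ground: altitude is unobservable.
        return None
    return speed / corrected_flow

# 15 m/s ground speed with 1 rad/s of corrected flow means 15 m AGL:
print(altitude_agl(15.0, 1.5, 0.5))  # → 15.0
```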

### Work to be done

The new PX4FLOW has its own sonar altimeter and gyros and performs all sensor integration internally. To add ground speed as an input and altitude as an output, one of the following will need to occur:

• Autopilot reports ground speed to the PX4FLOW, which then integrates it with its own sensors and sends back the calculated altitude, or
• PX4FLOW reports raw optical flow rates to autopilot. All other PX4FLOW sensors are turned off. Autopilot integrates optical flow with its own sensor data, providing improved accuracy of altitude, ground speed, position, and winds aloft.

The second option is the winner in my opinion, as it enables drop-in use of less complex optical flow hardware. I really don’t understand why a redundant IMU was placed on the PX4FLOW in the first place.

Please note that I’m not in a position to do this work myself. My own plane does not have any optical flow sensor, and probably won’t for a long time.  I’m OK with hard landings. I’m just offering the math that you’ll need to implement your own.

### Caveats

You may find that your optical flow sensor stops sending intelligible data as the ground gets close. This problem is also encountered with sonar, which has a minimum measurable distance due to the sensor deafening itself while generating the same sound that it needs to listen for. Optical flow fails for a different reason. If we solve for a different unknown we can see why:

optical flow = (ground speed / altitude) + pitch rate

As altitude approaches zero, optical flow will approach infinity. I don’t know the limits of the PX4FLOW (and I don’t have one to test) but I assume the limits are lower than you would want. There are several solutions to mix and match:

• Install flaps, land at slower speeds.
• Mount the optical flow board higher, perhaps under a wing instead of the belly.
• Install a wider-angle lens on the optical flow sensor. The PX4FLOW camera has a 16mm focal length. A shorter lens will widen the view and reduce the detected optical flow. Use simple lenses only, not fisheye/GoPro/FPV, as geometric distortion will cause bad results. Do not use any lens which causes vignetting (black corners) as you will then need to digitally reduce the sample area which will defeat the purpose.
• Solve it in software: Before landing, collect terrain data with a fly-by, continue to measure altitude as long as possible on approach, and complete the landing using the other available sensors.
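The rearranged formula also tells you the altitude at which a given sensor saturates. This sketch assumes a hypothetical 5 rad/s flow limit, since the PX4FLOW's real ceiling isn't documented here:

```python
def min_usable_altitude(speed, max_flow_rate):
    """Altitude (m) below which level flight at `speed` (m/s) would push
    the sensor past `max_flow_rate` (rad/s). The limit is an assumed
    figure, not a PX4FLOW specification."""
    # From optical flow = ground speed / altitude (level flight, zero
    # pitch rate), solved for altitude at the sensor's maximum flow.
    return speed / max_flow_rate

# A 15 m/s approach with an assumed 5 rad/s limit goes blind below 3 m:
print(min_usable_altitude(15.0, 5.0))  # → 3.0
```

This is why the mitigations above all amount to either lowering the numerator (land slower) or raising the effective denominator (mount higher, widen the lens).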

• I love these sensors, they have come full circle! I tested several mouse sensors as linear sensors on a wheeled robot chassis about a year ago. The system-on-chip tracks pattern movement, a technique that reportedly originated with military planes flying over icy mountains, a surface which at that scale looks like paper, or like almost any surface when magnified. When I contacted a major sensor manufacturer, I was told 15 to 20% accuracy was realistic. And you can test this (I did): move your desk mouse between fixed points! It has no need to be better, because we compensate.

But that was mice; I haven't tried these and I hope they are better. I've come from balancing robots, and my experience of flying a quadcopter is in wind: permanent pitch, so the sonar also has to be on a gimbal. I hear you say sonar is ineffective due to deafening. The sonar device I have is fixed-amplitude, so that needs to change too. Years ago I made a pulser that delivers a single edge with a neon bulb across the TX transducer; ultrasound should be able to cover 5 m down to 10 cm. I also bought a UT392 laser rangefinder before Christmas for 42 GBP, powered by 2×AA. I estimate the optics (30mm × 13mm × 20mm) at 10 g and the electronics at 20 g, so ditching the screen, keyboard, case, and batteries leaves a 30 g laser rangefinder. For vertical ground measurement you still need a double gimbal, but then you can spin it around. They're sold as "laser distance meters".

PUSH IT  and SEE IF IT COMES BACK

• Developer

Hi Veikko,

Do you have an estimate of how long, and with what kind of accuracy, dead reckoning could be done with an optical flow sensor in a plane?

Well, that depends on what you ask it to do. ArduPlane already switches to dead-reckoning automatically when GPS fails, using the airspeed sensor plus the last wind estimate for ground speed, or the last ground speed value it got from the GPS if there is no airspeed sensor. It then uses the compass for navigation.

If all you are doing is RTL then it should get the heading to within a few degrees, so it should be able to come back to home quite well. If you ask it to continue flying a complex mission then it will lose position much more quickly. I usually count on "reasonable" accuracy for about 3 minutes, to within about 100m position, but that varies a lot depending on how much the wind varies.

The main sources of error are in the ground speed and ground course (which of course is what you need to navigate!). The optical flow gives you both of those, as long as the ground height doesn't vary too much. If flying over hilly terrain then the optical flow sensor will give poor ground speed, but over flat terrain I'd hope its speed will be within 5% or better, though I won't know until I try it. What is really nice is that an optical flow sensor can also give ground course, which means you can continue to update the wind estimate while flying without a GPS. I think that means you could navigate quite long distances (perhaps tens of kilometers) with quite good accuracy. I'd also expect the error to accumulate linearly with distance, which means it won't suddenly explode as happens with accelerometer-based techniques.

It's all guesses till we try it though!

Cheers, Tridge

• Developer

Nice article Jonathan! I've been thinking about using PX4FLOW for altitude in ArduPlane as well.

Perhaps there is no need to change the software on the PX4FLOW. The plane can just report a fake altitude to the PX4FLOW board, for example the barometric altitude. The PX4FLOW will then report a ground speed based on that altitude. Since the reported speed scales linearly with the altitude supplied, the ratio of the GPS ground speed to the PX4FLOW-reported ground speed gives you the correction factor for the altitude. Multiply the altitude you supplied to the PX4FLOW by that ratio and you get the true altitude.

I also don't think that the limits of PX4FLOW as you come in to land are really a problem. As long as you know when it starts to break down, you can use the last good point as a baseline and use barometric changes from then on. The barometric altitude is quite accurate as a relative measure over the time it would take to go from 10m altitude to landing.

What I don't know yet is how accurate the PX4FLOW will be when used in this way. Time for some experiments!

There are other nice things you can do with this as well:

• when you lose GPS, you can switch to barometer for altitude, and use PX4FLOW for ground speed, allowing for nice long term dead reckoning navigation even without an airspeed sensor
• you can be constantly recalibrating your barometer while flying, thus allowing long flights with accurate barometric pressure
• when the PX4FLOW board is not pointing at the ground you can use the barometer for accurate altitude, for example while in turns or inverted
• terrain following will become much easier, even without a terrain model

Cheers, Tridge

• The PX4FLOW outputs the raw pixels, so you can use them as you wish. Even better, the output is at 50 Hz, so I think APM should also be able to handle it.

• @Noth:
I see your point about the task overloading the CPU. I don't think it will, but the story of the camel's back and the straw does come to mind.
In case I gave that impression, I wasn't talking about making the host CPU process the raw video data. That task is best performed on the PX4FLOW. In my scenario, the only thing sent to the host CPU would be the optical flow reports: just two integers indicating how many pixels the image shifted during that frame. This is how most optical flow boards work.
As I demonstrate above, that last step (sensor integration) is not that hard: it's just a fixed-point subtract and then 1-3 multiplies, 250 times a second. If APM can't handle it, I think PX4 certainly can.
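The per-frame work being described can be sketched like this (hypothetical names and floating point for clarity; the real thing would be fixed-point, and this is not APM or PX4 code):

```python
def integrate_flow_frame(flow_x, flow_y, pitch_rate, roll_rate, altitude, dt):
    """One frame of host-side integration: de-rotate the raw flow
    (rad/s) with the gyro, scale by altitude (m) to get velocity, and
    integrate over the frame time dt (s) for the position delta."""
    vx = (flow_x - pitch_rate) * altitude  # one subtract, one multiply
    vy = (flow_y - roll_rate) * altitude
    return vx, vy, vx * dt, vy * dt       # two more multiplies
```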

• Sorry about the missing pics. They were there when I wrote, previewed, and submitted the post but somehow disappeared when the moderator approved it. I was asleep by then. Whoever fixed it, thanks!

• Cool pics added, makes for better understanding of the concept.
I plan on getting a PX4FMU and flow setup soonish, and will be experimenting with the stuff mentioned here. I wonder a lot about the performance impacts mentioned before; unless I've read the flow cam's documentation incorrectly, it currently runs software written specifically for the hardware, without an abstraction layer, precisely for performance reasons.
In past experiments I ran object/symbol detection/tracking on an older ARM CPU at about 90 MHz. It ran at 100% load with varying update rates, often below 1 Hz, which is only usable on a very slow-moving ground-based platform. I'm unsure how the optical flow computation compares in terms of calculations needed.
• The pics/diagrams are indeed missing, but I tend to disagree with the proposed conclusion, simply because the CPU load on the main CPU already tends to be high. That is why you want to do the computation on a second MCU, not to mention that the PX4FLOW can be used on craft that do not have a PX4FMU to do the computing on.
I have an autonomous rover with a setup not dissimilar from PX4FMU plus PX4FLOW, and its main CPU would not have been able to cope with image processing as well as running PID loops fast enough to be useful. In that case I chose a faster MCU to process images than the main controller, for precisely that reason.
• the pic!

• Where is the picture you are referring to?
