[In the previous installments of my interview with Curt Olson, I focused on using FlightGear for GPS simulation--you fly the plane around in the flight simulator and the software outputs GPS data to fool your autopilot into thinking that it's the plane in the flight simulator, so you can see how the autopilot responds to being off course. Now we're going to turn to more sophisticated uses, with sensor data from your UAV (either live or recorded) driving the simulated plane on-screen. This is pretty hard-core stuff and not for everyone, but for those of you who want to really push the envelope, here's Curt, straight from the source. -ca]

FlightGear has the ability to disable its built-in flight dynamics engine (the physics model that computes the motion of the simulated aircraft). Once the built-in physics are turned off, you can feed in a sequence of locations, orientations, control surface positions, and other data from some external source, and FlightGear will dutifully render that data. If you connect FlightGear to a data stream from a real-time flight, or to a replay of a real flight, and feed in a smooth sequence of changing positions and orientations (hmmm, I work at a university, so I should probably call this a sequence of locations and attitudes so as not to offend) :-) then FlightGear will render a smoothly changing view based on the data you are sending.
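As a rough sketch of what that looks like in practice, FlightGear can be started with its internal FDM disabled and told to listen for state data over UDP. The exact flags vary by version, and the protocol file name here (uav.xml) is a made-up example you would have to write yourself, so treat this as a starting point rather than a recipe:

```python
# Hypothetical launch, assuming a generic-protocol file uav.xml in
# $FG_ROOT/Protocol that maps the comma-separated fields sent below:
#
#   fgfs --fdm=external --generic=socket,in,50,,6789,udp,uav

import socket
import time

def read_uav_state():
    # Stand-in for your live telemetry link or flight-log reader.
    # Returns (lat_deg, lon_deg, alt_ft, roll_deg, pitch_deg, heading_deg).
    return (45.0, -93.0, 1200.0, 2.5, 3.0, 270.0)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    # One comma-separated line per packet, in the same field order that
    # the (assumed) uav.xml protocol file declares on the FlightGear side.
    line = ",".join("%.6f" % v for v in read_uav_state()) + "\n"
    sock.sendto(line.encode("ascii"), ("127.0.0.1", 6789))
    time.sleep(0.02)  # ~50 Hz, matching the rate in the --generic option
```

Feed that sender from a log file instead of a live link and you get the replay capability described next.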

This turns out to be a very powerful and popular use of FlightGear. You can watch the FlightGear-rendered flight from a variety of perspectives, you can examine the motion of the aircraft relative to the control surface deflections, you can leave a trail of virtual bread crumbs in the sky to see your flight path, you can place a sphere at the location and altitude of each of your waypoints to see how the autopilot tracks toward them, and so on.

One thing I'm working on now (in partnership with a company in the LA area called LFS Technologies) is a glass cockpit display that shows even more detailed and interesting information. It's not completely finished, but it would connect up to FlightGear and serve as another view into the guts of what the autopilot is thinking and trying to do. I don't know if I ever posted this link publicly, but here is a short/crude movie that shows the "glass" cockpit view in action ... low res and chunky, sorry about that.

Hopefully you can see that there is a map view of the route and waypoints, a compass rose, a VSI (vertical speed indicator), an altimeter tape, and a speed tape. On each of these you can place a "bug": a "heading bug" sits on top of the target heading, the "altitude bug" sits on top of the target altitude, and so on. Then, in addition to all of this, the actual control surface deflections are shown in the upper right.

What this gives you is a real-time view of where you are, where you should be going, and what the autopilot is trying to do to get there. For example, let's say the target altitude is 1500' MSL and the UAV is currently flying at 1200' MSL. The altitude bug should be planted at the 1500' mark on the altimeter tape so we know where we are trying to go, and the altimeter will show the current altitude. If there is an altitude error, the autopilot should want to climb to fix that. So the VSI will have its bug positioned to show what the autopilot wants the target rate of climb to be ... maybe 500 fpm for this example (and the target rate of climb should diminish the closer you get to the target altitude, so you intercept it smoothly). Now you can look at the VSI and see how the target rate of climb compares to the actual rate of climb. If there is a difference between the two, the autopilot will try to adjust the aircraft's pitch angle to increase or decrease the actual rate of climb. The primary flight display (PFD) shows actual pitch and roll, but also contains flight-director-style "vbars" that can be used to show the desired pitch and roll angles. Finally, in the upper right, you can see an indication of the elevator position the autopilot is commanding to try to achieve the target pitch angle. Easy huh? :-)

So in other words, elevator deflection is used to achieve a target pitch angle. The pitch angle is varied to achieve a target rate of climb. The rate of climb is varied to achieve a target altitude. And the glass cockpit display in combination with FlightGear shows *exactly* what the autopilot is thinking and attempting to do for each of these stages of the altitude controller.
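For the control-theory-minded, that chain is a classic cascaded controller. Here is a minimal sketch of it as three nested proportional loops; the gains, limits, and function names are illustrative assumptions only (a real autopilot would add integral and derivative terms, filtering, and careful unit handling):

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

# Illustrative gains and limits -- not real tuned values.
ALT_TO_VSI_GAIN = 2.0       # fpm of climb commanded per foot of altitude error
VSI_TO_PITCH_GAIN = 0.01    # degrees of pitch per fpm of climb-rate error
PITCH_TO_ELEV_GAIN = 0.05   # elevator deflection per degree of pitch error

def altitude_controller(target_alt_ft, alt_ft, vsi_fpm, pitch_deg):
    # Outer loop: altitude error sets a target rate of climb. The commanded
    # climb rate shrinks as we near the target altitude, so we intercept
    # it smoothly instead of overshooting.
    target_vsi = clamp(ALT_TO_VSI_GAIN * (target_alt_ft - alt_ft), -500.0, 500.0)

    # Middle loop: climb-rate error sets a target pitch angle.
    target_pitch = clamp(VSI_TO_PITCH_GAIN * (target_vsi - vsi_fpm), -15.0, 15.0)

    # Inner loop: pitch error sets an elevator deflection (normalized -1..1).
    elevator = clamp(PITCH_TO_ELEV_GAIN * (target_pitch - pitch_deg), -1.0, 1.0)

    return target_vsi, target_pitch, elevator
```

Note how the three intermediate targets the code computes (target climb rate, target pitch, elevator command) are precisely the quantities the glass cockpit exposes as the VSI bug, the flight-director vbars, and the control surface readout.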

And the story is similar for the route following. The autopilot uses aileron deflection to achieve a target bank angle. The bank angle is varied to achieve a desired heading. The heading is varied to fly towards the next waypoint. And again, all of these can be seen changing in real time as the aircraft flies (or in a replay of the flight later on, if you are trying to analyze and tune or debug the autopilot).
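The lateral chain can be sketched the same way. As before, the gains and limits are made up for illustration, and the flat-earth bearing math is a simplification that only holds over short distances:

```python
import math

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

HDG_TO_BANK_GAIN = 1.0    # degrees of bank per degree of heading error
BANK_TO_AIL_GAIN = 0.03   # aileron deflection per degree of bank error

def heading_error(target_deg, actual_deg):
    # Wrap to [-180, 180) so we always turn the short way around.
    return (target_deg - actual_deg + 180.0) % 360.0 - 180.0

def route_controller(wp_lat, wp_lon, lat, lon, heading_deg, bank_deg):
    # Outer loop: bearing to the next waypoint becomes the target heading
    # (crude flat-earth approximation; fine over small distances).
    east = (wp_lon - lon) * math.cos(math.radians(lat))
    north = wp_lat - lat
    target_heading = math.degrees(math.atan2(east, north)) % 360.0

    # Middle loop: heading error sets a target bank angle, limited to 25 deg.
    target_bank = clamp(
        HDG_TO_BANK_GAIN * heading_error(target_heading, heading_deg),
        -25.0, 25.0)

    # Inner loop: bank error sets an aileron deflection (normalized -1..1).
    aileron = clamp(BANK_TO_AIL_GAIN * (target_bank - bank_deg), -1.0, 1.0)

    return target_heading, target_bank, aileron
```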

So how does this help anything? If you are working on designing an autopilot and doing some test flights, then (as an example) maybe you see that the target pitch angle is never achieved. Then you notice from the visualizer that you run out of elevator authority before you get to the target pitch. Or maybe you see things like excessive elevator motion as the system continually overcompensates trying to seek the desired pitch. Then, based on the specific observed behavior, you can tune the gains or limits of the autopilot system to fix the problem. Maybe you need to reduce the sensitivity (gain) of the elevator controller to eliminate the overcompensation; maybe you need to increase the limits on the allowed elevator motion to allow for greater pitch-up or pitch-down angles.
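In terms of the altitude-controller sketch above, those two fixes correspond to two specific knobs (values invented purely for illustration):

```python
# Elevator hunting / overcompensation around the target pitch:
# reduce the inner-loop gain.
PITCH_TO_ELEV_GAIN = 0.025   # e.g. halve the previous 0.05 and re-test

# Running out of elevator authority before reaching the target pitch:
# widen the clamp on the elevator command, hardware travel permitting.
ELEVATOR_LIMIT = 1.0         # was perhaps 0.5 of full deflection
```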

As another example, maybe very early on in your autopilot development you observe that the aircraft always enters a loop when you activate the autopilot. From the FlightGear visualizer, you might see that the elevator is moving in the wrong direction, so that more and more up elevator is commanded as the pitch angle gets higher and higher. You might realize that the direction of elevator deflection should be reversed inside the autopilot, so that less and less up elevator is given as the pitch angle increases.

The alternative to having tools to replay and visualize what is happening is that you fly, something strange happens, you land, scratch your head and try to figure it out, make some stab-in-the-dark change, take off, see if it helps, it probably doesn't, you land, reverse the change you made earlier or make the same change but in the opposite direction, take off, more weirdness, more head scratching, etc. The more tools you have to see and understand exactly what the autopilot is trying to do, the better off you are when it comes to making the adjustments that get the autopilot to even begin to work, or later to refine its behavior so it works better or converges faster.

You could even take a step backwards in the pipeline. All of the above presupposes that you know your location and your attitude (pitch, roll, and yaw). But all of those values are based on noisy and imperfect sensors and are probably processed through a variety of filtering and interpretation algorithms. It may be that you are trying to debug and tune the algorithms that tell you what your roll, pitch, heading, location, rate of climb, etc. are. In this case, a visualization tool is still very useful, because you can compare what the computer is showing you to what you are seeing in real life (or what you remember seeing). You can look at a real-time 3D visualization of the sensor data and say "that ain't right" or "that looks about right". Either a plane looks like it is flying right or it doesn't. If it is flying along on a straight and level path but the nose is pitched up 45 degrees in the visualizer, you immediately see that something is wrong. If it is flying along on a straight and level path but the tail is pointed forward, you can see that something is wrong. A visualization tool like this can help you see and understand what your sensor integration algorithms are doing, and possibly when and where they are breaking down. All of that is useful information to point you towards the exact location of the problem and, hopefully, help you find a solution.
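As one example of the kind of algorithm being debugged at this stage, here is a minimal sketch of a complementary filter for pitch estimation, a common textbook way to blend a noisy accelerometer-derived angle with a drifting gyro rate. This is not Curt's actual sensor-fusion code; the names, axis conventions, and blend constant are assumptions:

```python
import math

GYRO_WEIGHT = 0.98  # trust the gyro over short timescales, accel over long ones

def complementary_pitch(pitch_deg, gyro_pitch_rate_dps, accel_x_g, accel_z_g, dt):
    # Short-term estimate: integrate the gyro's pitch rate.
    gyro_estimate = pitch_deg + gyro_pitch_rate_dps * dt

    # Long-term estimate: derive pitch from the accelerometer's view of
    # gravity. Only valid in (near) unaccelerated flight -- one reason
    # these estimates break down and need visual sanity checks.
    accel_estimate = math.degrees(math.atan2(accel_x_g, accel_z_g))

    # Blend: gyro dominates instant-to-instant, accel slowly cancels drift.
    return GYRO_WEIGHT * gyro_estimate + (1.0 - GYRO_WEIGHT) * accel_estimate
```

If this estimate were wrong (say, the sign of the gyro rate flipped, or the blend constant set to 1.0 so drift never cancels), the visualizer would show exactly the kind of "that ain't right" attitude described above.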

