I just thought I'd share a quick video (and picture). This is a movie I made of a replay of a real flight from a couple of days ago. I collected the data remotely while the flight was in progress via a MaxStream radio modem link between the aircraft and the ground station. I saved the flight data on the ground station so I can replay it later in FlightGear. The display is a "glass cockpit" style display I'm just beginning to develop.
Disclaimer: I have really noisy inertial sensors rigged up at the moment, 10-bit ADCs, you'd laugh if I described my IMU calibration procedure, a probably very non-optimally tuned Kalman filter for these sensors, and a windy day (yeah that's it, blame it all on the wind) :-)
But the really cool stuff (I think), if you can look past some of the less smooth flying, is seeing all the elements of a very dynamic system playing together in one view.
For instance: the altitude tape shows the current altitude and the target altitude (via a magenta altitude bug). If you are below the target altitude, the VSI will show a target climb rate (or descent rate if you are too high). The autopilot tries to match the target rate of climb by manipulating pitch. The green v-bars show the target pitch angle. The yellow bird shows the actual pitch angle. The autopilot manipulates the elevator to try to achieve the target pitch angle ... and you can see all these elements working together to achieve the goal.
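To make that chain concrete, here is a minimal Python sketch of that kind of cascaded loop. This is not the actual autopilot code; the structure (altitude error to target climb rate to target pitch to elevator) follows the description above, but the gains, limits, and names are invented for illustration.

# Not the real autopilot: a minimal sketch of the altitude cascade described
# above.  Gains, limits, and function names are made-up placeholders.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def altitude_cascade(alt_target_m, alt_m, climb_mps, pitch_deg,
                     kp_alt=0.1, kp_climb=2.0, kp_pitch=0.02,
                     climb_limit_mps=3.0, pitch_limit_deg=15.0):
    """Return (target climb rate, target pitch, elevator command)."""
    # Altitude error sets the target climb (or descent) rate -- the VSI target.
    climb_target = clamp(kp_alt * (alt_target_m - alt_m),
                         -climb_limit_mps, climb_limit_mps)
    # Climb-rate error sets the target pitch angle -- the green v-bars.
    pitch_target = clamp(kp_climb * (climb_target - climb_mps),
                         -pitch_limit_deg, pitch_limit_deg)
    # Pitch error (green v-bars vs. the yellow bird) drives the elevator.
    elevator = kp_pitch * (pitch_target - pitch_deg)
    return climb_target, pitch_target, elevator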
There is a similar process with route navigation. The system computes a target ground track heading using WGS84-based math. The target heading is marked by a magenta heading bug on the horizon heading tape. There is a white "V" indicator that floats on the horizon heading tape and indicates the ground track heading. The actual heading shown is "true" heading as computed by a 15-state Kalman filter. (The filter converges to true heading, independent of wind, sideslip, etc.) So the autopilot computes a target roll angle to try to line the white "V" ground track indicator up with the magenta heading bug. And finally the ailerons are manipulated to try to match the target roll angle. If you watch the video you can probably see that I need to increase the gains a bit on my ailerons (maybe the end point limits as well). What do you think? You can see the actual roll angle often lags pretty far behind the target, and this leads to some excessive serpentining as the aircraft flies towards the target. But if I dial up the gains too much, I may start overreacting to my filter's attitude estimate correction jumps. I think there is a balance to be struck between tracking your targets quickly and accurately versus slowing things down a bit to help smooth out the flight.
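The lateral side can be sketched the same way: pick a target ground-track heading to the next waypoint, form a shortest-way heading error, and turn that into a clamped target roll angle. The system described above uses proper WGS84 geodesic math and a Kalman-filtered heading; the simple great-circle bearing, gain, and bank limit below are stand-ins for illustration only.

import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees true."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def roll_target_deg(track_target_deg, track_deg,
                    kp_hdg=1.0, roll_limit_deg=25.0):
    """Shortest-way ground-track error -> clamped target roll angle."""
    err = (track_target_deg - track_deg + 180.0) % 360.0 - 180.0
    return max(-roll_limit_deg, min(roll_limit_deg, kp_hdg * err))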
Finally you can also watch airspeed. Right now the autopilot is configured to try to match the target airspeed by manipulating the throttle, so you can watch the throttle move up and down to try match airspeed. Of course as you fly the course and bank into turns, encounter up and down drafts, and work around filter estimation errors everything is changing all at once.
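The airspeed loop could look something like the following; a simple proportional-plus-integral law with the throttle clamped to its 0..1 range is assumed here, not the actual controller or its gains.

# Hypothetical airspeed-to-throttle loop: a PI law with a clamped integrator.

class AirspeedToThrottle:
    def __init__(self, kp=0.05, ki=0.01):
        self.kp, self.ki, self.integ = kp, ki, 0.0

    def update(self, target_kts, airspeed_kts, dt):
        err = target_kts - airspeed_kts
        # Clamp the integrator so it can't wind up during long climbs.
        self.integ = max(-1.0, min(1.0, self.integ + err * dt))
        # Throttle command is clamped to its physical 0..1 range.
        return max(0.0, min(1.0, self.kp * err + self.ki * self.integ))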
On the one hand, I would like to see much smoother and tighter control, but on the other hand I have to sit back in a bit of wonderment just watching all the pieces working together and doing what they are supposed to do.
I think this is evolving into a really powerful system for evaluating how well an autopilot system is tuned and how well it is tracking its targets. If your PID gains are inducing oscillations, you can quickly see that. If the PID gains are too low, you can see the system react too slowly. You can see your control throws max out at their preset limits (or not, as the case may be).
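As an example of the kind of quick check this enables, a saved flight log could also be scanned for exactly those symptoms: how often a surface command is pinned at its preset limit, and how often the tracking error changes sign (a crude oscillation indicator). The field names and thresholds below are hypothetical.

def saturation_fraction(commands, limit):
    """Fraction of samples where the command sits at +/- its preset limit."""
    if not commands:
        return 0.0
    pinned = sum(1 for c in commands if abs(c) >= 0.999 * limit)
    return pinned / len(commands)

def zero_crossings(errors):
    """Sign changes in the tracking error (target minus actual)."""
    return sum(1 for a, b in zip(errors, errors[1:]) if a * b < 0)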
And for what it's worth, this display can also be fed from real-time telemetry data, so you could optionally have this running during the flight ... I'm not sure why you'd want it ... maybe during the design and development phase, or to impress your wife or girlfriend.
Comments
Hi Curt,
I know it's an old post, but I wanted to ask: is there any documentation about this glass cockpit baby? I don't completely understand the svginstr library!
Thanks in advance!
$ git clone git://gitorious.org/svginstr/svginstr.git
Regards,
Curt.
I'm using FlightGear as the display engine. To create the actual instrument graphics I used a Python library called svginstr, developed by one of our long-time FlightGear developers. You create instruments by writing short Python scripts which generate .svg format graphics, which can then be converted (automatically) to .png for use as FlightGear texture components for the instruments. It's a bit of a tedious process. However, the use of scripts to generate the actual graphics is important. Let's say I was drawing by hand and drew 10 large ticks around a circle, 10 medium ticks at the halfway marks, and 100 small ticks ... and then decided I needed to change the circle diameter or line width or something. If this was all done by hand, I'd just about have to start over. With a script I change one number, rerun the script, and the new version pops out.
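To illustrate the scripted-graphics idea (this does not use the svginstr API; it just writes raw SVG, and the sizes and counts are arbitrary examples), a tiny script like the following keeps the tick count, radius, and line width as single numbers you can change and rerun:

import math

def tick_marks(radius=100.0, cx=128.0, cy=128.0,
               major=10, minor=100, major_len=12.0, minor_len=5.0, width=1.0):
    """Emit an SVG document with minor ticks around a circle and a longer
    tick at every major position."""
    lines = []
    for i in range(minor):
        ang = 2.0 * math.pi * i / minor
        length = major_len if i % (minor // major) == 0 else minor_len
        x1 = cx + (radius - length) * math.cos(ang)
        y1 = cy + (radius - length) * math.sin(ang)
        x2 = cx + radius * math.cos(ang)
        y2 = cy + radius * math.sin(ang)
        lines.append('<line x1="%.2f" y1="%.2f" x2="%.2f" y2="%.2f" '
                     'stroke="white" stroke-width="%.2f"/>' % (x1, y1, x2, y2, width))
    return ('<svg xmlns="http://www.w3.org/2000/svg" width="256" height="256">\n'
            + "\n".join(lines) + "\n</svg>")

if __name__ == "__main__":
    print(tick_marks())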
What software are you using to code the glass cockpit display?
Please let me know if you do distribute it. I teach several classes in flight test engineering and it would be perfect for my students!
Actually I found that FlightGear did not have a glass display that met my specs, so I am working on developing a new display that is optimized with the information I need for my UAV development work. I'm not sure in what context I'll be able to distribute this; that is yet to be determined.
Which version of FlightGear includes the glass cockpit visual you show?