Great post from Curtis Olson, the lead developer of FlightGear and a long-time autopilot developer, on using simulators to do sophisticated autopilot analysis.
Blending real video with synthetic data is a powerful (and cool!) way to visualize your Kalman filter (attitude estimate) as well as your autopilot flight controller.
Conformal HUD Elements
Conformal definition: of, relating to, or noting a map or transformation in which angles and scale are preserved. For a HUD, this means the synthetic element is drawn in a way that visually aligns with the real world. For example: the horizon line is conformal if it aligns with the real horizon line in the video.
- Horizon line annotated with compass points.
- Pitch ladder.
- Location of nearby airports.
- Location of sun, moon, and own shadow.
- If alpha/beta data is available, a flight path marker is drawn.
- Aircraft nose (i.e., exactly where the aircraft is pointing).
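A conformal element like the horizon line can be drawn by rotating a world-frame direction into the camera frame using the EKF attitude estimate, then projecting it through a pinhole camera model. Here is a minimal sketch of that idea; the intrinsics, the forward-facing-camera assumption, and all example values are mine, not taken from Curt's scripts.

```python
# Sketch: project a world (NED) direction into nose-cam pixels.
# Camera intrinsics (fx, fy, cx, cy) are made-up example values.
import numpy as np

def euler_to_dcm(roll, pitch, yaw):
    """Rotation from NED (world) to body frame, standard aerospace 3-2-1 order."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])
    return Rx @ Ry @ Rz

def project_ned_direction(d_ned, roll, pitch, yaw,
                          fx=900.0, fy=900.0, cx=640.0, cy=360.0):
    """Project a unit direction in NED coordinates to pixel coordinates."""
    d_body = euler_to_dcm(roll, pitch, yaw) @ d_ned
    # body (fwd, right, down) -> camera (right, down, fwd); nose camera assumed
    x, y, z = d_body[1], d_body[2], d_body[0]
    if z <= 0:
        return None  # behind the camera
    return (float(cx + fx * x / z), float(cy + fy * y / z))

# A horizon point due north, with the aircraft level and heading north,
# lands at the image center:
print(project_ned_direction(np.array([1.0, 0.0, 0.0]), 0.0, 0.0, 0.0))  # → (640.0, 360.0)
```

The symbology is conformal exactly when this projection matches the camera well enough that the drawn horizon lands on the filmed one.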
Nonconformal HUD Elements
- Speed tape.
- Altitude tape.
- Pilot or autopilot ‘stick’ commands.
Autopilot HUD Elements
- Flight director vbars (magenta). These show the target roll and pitch angles commanded by the autopilot.
- Bird (yellow). This shows the actual roll and pitch of the aircraft. The autopilot attempts to keep the bird aligned with the flight director using aileron and elevator commands.
- Target ground course bug (shown on the horizon line) and actual ground course.
- Target airspeed (drawn on the speed tape.)
- Target altitude (drawn on the altitude tape.)
- Flight time (for referencing the flight data.)
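The relationship between the bird and the flight director comes down to the roll target being a function of course error. A hedged sketch of that mapping (the gain and bank limit are invented example values, not Curt's autopilot parameters):

```python
# Hedged sketch: mapping ground-course error to a flight-director roll
# target. The gain and bank limit are invented example values.
def roll_target_deg(course_deg, target_course_deg, gain=1.5, limit=30.0):
    """Shortest-path course error, scaled and clipped to a bank limit."""
    err = (target_course_deg - course_deg + 180.0) % 360.0 - 180.0
    return max(-limit, min(limit, gain * err))

print(roll_target_deg(0.0, 5.0))     # → 7.5 (5 deg of course error)
print(roll_target_deg(350.0, 10.0))  # → 30.0 (20 deg error, clipped at the bank limit)
```

Noise in the course estimate feeds straight through this mapping: a 5-degree jump in estimated ground course moves the commanded roll by 7.5 degrees, which is the flight-director jumpiness called out in the case studies below.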
Case Study #1: EKF Visualization (above)
What to watch for:
- Notice the jumpiness of the yellow “v” on the horizon line. This “v” shows the current estimated ground track, but the jumpiness points to an EKF tuning parameter issue that has since been resolved.
- Notice the full autonomous wheeled takeoff at the beginning of the video.
- Notice some jumpiness in the HUD horizon and in the aircraft's attitude and heading. This again traces back to an EKF tuning issue.
I may never have noticed the EKF tuning problems had it not been for this visualization tool.
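One way to see why the ground-track marker amplifies EKF noise: if the track is derived from the north/east velocity estimate (my assumption about the method, not confirmed from Curt's code), the same velocity error swings the angle much more at low ground speed.

```python
# Sketch of ground-track sensitivity to velocity noise (assumed method).
import math

def ground_track_deg(vn, ve):
    """Ground course from NED velocity components, in degrees [0, 360)."""
    return math.degrees(math.atan2(ve, vn)) % 360.0

# The same 0.5 m/s east-velocity error at two different ground speeds:
print(round(ground_track_deg(20.0, 0.5), 1))  # → 1.4 (cruise)
print(round(ground_track_deg(4.0, 0.5), 1))   # → 7.1 (slow flight)
```

So the "v" looks jumpiest exactly when the aircraft is slow, which is also where EKF velocity noise is proportionally largest.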
Case Study #2: Spin Testing
What to watch for:
- Notice the flight path marker, which shows actual alpha/beta as recorded by airdata vanes.
- Notice how the conformal alignment of the HUD diverges from the real horizon, especially during aggressive turns and spins. The EKF fits the aircraft attitude estimate through GPS position and velocity, and aggressive maneuvers lead to GPS errors (satellites go in and out of visibility, etc.)
- Notice that no autopilot symbology is drawn because the entire flight is flown manually.
Case Study #3: Skywalker Autopilot
What to watch for:
- Notice the yellow “v” on the horizon is still very jumpy. This is the horizontal velocity vector direction, which is noisy due to EKF tuning issues that had not yet been identified and resolved when this video was created. In fact, this is the flight where the issue was first noticed.
- Notice the magenta flight director is also jumpy, in response to the jumpy horizontal velocity vector. Every jump changes the current heading error, which leads to a change in roll command that the autopilot then has to chase.
- Notice the flight attitude is much smoother than in the Senior Telemaster flight above. This is because the Skywalker EKF incorporates magnetometer measurements as well as GPS measurements, which helps stabilize the filter even with poor noise/tuning values.
- You may notice some crazy control overshoot on final approach. Ignore this! I was testing an idea and got it horribly wrong. I’m actually surprised the landing completed successfully, but I’ll take it.
- Notice that in this video the horizon stays attached pretty well: much better than in the spin-testing video, due to the non-aggressive flight maneuvers, and much better than in the Telemaster video, due to a more accurate GPS (u-blox 7P versus u-blox 6). Going forward I will be moving to the u-blox 8.
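Why a magnetometer stabilizes the filter: it gives the EKF a yaw reference that does not depend on GPS velocity. A standard tilt-compensated compass heading looks like this; note this is the textbook formula, not the Skywalker EKF's actual measurement model.

```python
# Textbook tilt-compensated magnetic heading (not the Skywalker EKF's
# actual measurement model).
import math

def mag_heading_rad(mx, my, mz, roll, pitch):
    """Heading from body-frame magnetometer axes, tilt-compensated."""
    # Rotate the body-frame field into the level (horizontal) frame.
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-yh, xh) % (2 * math.pi)

# Level flight, nose pointed at magnetic north:
print(round(math.degrees(mag_heading_rad(1.0, 0.0, 0.0, 0.0, 0.0)), 1))  # → 0.0
```

Because this yaw observation is available every sample, independent of the GPS solution, it keeps the heading (and through it the attitude) from wandering even when the velocity-based corrections are poorly tuned.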
Impressive work Curt! Congratulations and thanks to Chris for highlighting it here.
I have begun writing a how-to, but it is far from complete. That said, I will post the link here with lots of disclaimers and no promises. If you'd like to work through the process, I'm happy to answer questions. My scripts are tested on Linux (not Windows), so that could be a challenge if you are a Windows person. I have built OpenCV from source (so I can include ffmpeg support), and that could be a challenge for some. And finally, getting your flight logs into a compatible format could be a third challenge. Here is the beginning of a draft explaining "howto":
Curt, I found your videos very interesting. If you are ready to publish the howto of what you did for the post-flight HUD using telemetry, or even just the locations of your tools and scripts, then I for one would be interested for MatrixPilot. We have used Dashware in the past, but I am particularly interested in seeing a better representation of how well the IMU is working overlaid onto the video. Best wishes, Pete
I was (pleasantly) surprised that you picked up my post and included it here! One quick correction on the title of this post: this process is not using FlightGear at all to visualize UAV flights. Instead I'm taking actual nose-cam footage from the flight and overlaying a HUD using OpenCV and Python. The result is a modified nose-cam video with the HUD added, hopefully aligning exactly with the onboard video if all the calibration is done correctly and the EKF is converging to truth.