Interesting read on the limits and human factors complicating flight automation. I contend that all of these issues can be solved with better sensors and greater processing capability and reliability.
From Things With Wings:
"As far back as 1980, renowned aviation human factors guru Earl Wiener (pictured sitting in the captain’s seat of a Northwest Airlines Boeing 757 circa 1992) was asking the question on everyone’s mind after the tragic crash of Asiana Flight 214 in San Francisco earlier this month – Has automation gone too far?
Had he been around and of sound mind, Wiener would surely have weighed in.
However he passed away June 14 at the age of 80, the victim of a long bout with Parkinson’s disease.
Along with his family, his books and scholarly papers, and a new generation of human factors professionals, Wiener left us Wiener's Laws, 15 jewels of wisdom that will keep giving for decades to come because human nature, hence human error, is not changing all that rapidly.
The laws were sent to me by former Wiener student and co-worker Asaf Degani, Ph.D., now a technical fellow at General Motors. “Some are funny and some are dead serious,” says Degani.
I have no explanation of why Laws 1-16 are "intentionally left blank"...
Which one is your favorite?
WIENER’S LAWS
(Note: Nos. 1-16 intentionally left blank)
17. Every device creates its own opportunity for human error.
18. Exotic devices create exotic problems.
19. Digital devices tune out small errors while creating opportunities for large errors.
20. Complacency? Don’t worry about it.
21. In aviation, there is no problem so great or so complex that it cannot be blamed on the pilot.
22. There is no simple solution out there waiting to be discovered, so don’t waste your time searching for it.
23. Invention is the mother of necessity.
24. If at first you don’t succeed… try a new system or a different approach.
25. Some problems have no solution. If you encounter one of these, you can always convene a committee to revise some checklist.
26. In God we trust. Everything else must be brought into your scan.
27. It takes an airplane to bring out the worst in a pilot.
28. Any pilot who can be replaced by a computer should be.
29. Whenever you solve a problem you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
30. You can never be too rich or too thin (Duchess of Windsor) or too careful what you put into a digital flight guidance system (Wiener).
31. Today’s nifty, voluntary system is tomorrow’s F.A.R.
...
Wiener in the early 1980s began researching what happens when humans and computers attempt to coexist on a flight deck. Though his “day job” was professor of management science at the University of Miami, Wiener is widely known for embedding in the jump seats of his airline-pilot subjects as part of research projects funded by the NASA Ames Research Center. Wiener would continue performing NASA human factors work for more than two decades. “Earl was an ongoing grantee,” says a NASA co-worker from that time. “He would publish a paper and 25 people would write their master’s theses or doctoral dissertations on the topic.”
In a 1980 paper he co-wrote with NASA’s Renwick Curry, “Flight-deck automation: promises and problems”, Wiener wrote, “It is highly questionable whether total system safety is always enhanced by allocating functions to automatic devices rather than human operators, and there is some reason to believe that flight-deck automation may have already passed its optimum point.” Compilations of scholarly papers by Wiener and his colleagues resulted in two key human factors books, one of which – Human Factors in Aviation – is still in publication today, albeit as a new edition with new editors."
Comments
"The problem nowadays is that we are stuffing less and less experienced and skilled pilots into the cockpit and they are making bad decisions": 10,000 hrs in the seat & 50 hrs of actual time on the controls...
Another incident similar to the recent SFO tragedy has emerged: an international carrier's aircraft was in a similar situation, low and slow in VFR conditions on approach, and was waved off for a go-around by the SFO tower.
I see autopilots for R/C models being useful not only for autonomous flight, but also to help pilots raise their flying skills. The aim of improving flying skills is why I added TRAINING mode, and is also what I am aiming for with the new ACRO mode. It is also one of the main reasons for geo-fencing.
We use TRAINING mode to help teach people to fly manually, while reducing the number of crashes. It helps teach people to keep the aircraft flat and use rudder in turns, and also reduces stress on the trainer, as the plane is always at a flyable attitude when the buddy switch is released and the trainer takes over.
For ACRO, I'd like to be able to enhance that over time to teach 3D flying. For example, I'd like to be able to get the APM to handle 1 or 2 axes for prop-hang or knife-edge, so the pilot can concentrate on mastering the other axes. Then they would move to controlling all 3 axes once they gain enough confidence, while using geo-fencing as a safety net.
It all depends on what you are trying to get out of APM. If you want a stable plane for photography, or just want to learn about autonomous flight, then modes like FBWA and CRUISE are great. If you want to improve your flying skills, those modes can teach you bad habits, and you are better off with TRAINING, ACRO, or just flying in manual, with the option to switch on a fence or a stabilization mode if something goes wrong.
Cheers, Tridge
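The TRAINING mode described above can be summarized as: the pilot flies the plane directly while it stays inside attitude limits, and the autopilot only steps in to block input that would push further past a limit. Here is a minimal sketch of that logic in Python; the limit values, function name, and normalized command range are illustrative assumptions, not APM's actual parameters or API.

```python
# Hedged sketch of a TRAINING-style attitude limiter. The pilot's stick
# commands (normalized to [-1, 1]) pass through unchanged while the
# aircraft is inside the attitude limits; past a limit, only commands
# that bring the aircraft back toward level are allowed through.
# ROLL_LIMIT_DEG / PITCH_LIMIT_DEG are assumed illustrative values.

ROLL_LIMIT_DEG = 45.0   # assumed maximum allowed bank angle
PITCH_LIMIT_DEG = 20.0  # assumed maximum allowed pitch angle

def training_mode_output(pilot_roll_cmd, pilot_pitch_cmd, roll_deg, pitch_deg):
    """Return (roll_out, pitch_out) commands in [-1, 1].

    Inside the limits the pilot has full manual control; at a limit,
    commands that would increase the excursion are zeroed out, while
    commands back toward level still pass through.
    """
    roll_out = pilot_roll_cmd
    if roll_deg >= ROLL_LIMIT_DEG and pilot_roll_cmd > 0:
        roll_out = 0.0   # banked right to the limit: block further right roll
    elif roll_deg <= -ROLL_LIMIT_DEG and pilot_roll_cmd < 0:
        roll_out = 0.0   # banked left to the limit: block further left roll

    pitch_out = pilot_pitch_cmd
    if pitch_deg >= PITCH_LIMIT_DEG and pilot_pitch_cmd > 0:
        pitch_out = 0.0  # nose high at the limit: block further pitch up
    elif pitch_deg <= -PITCH_LIMIT_DEG and pilot_pitch_cmd < 0:
        pitch_out = 0.0  # nose low at the limit: block further pitch down

    return roll_out, pitch_out
```

Note how this differs from a stabilized mode: nothing corrects the attitude for the student; it only refuses to make an excursion worse, which is why it still exercises real stick-and-rudder skill.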
As a pilot, I feel the push for more automation and the need for stick-and-rudder skills have collided. Pilots these days rely more and more on automation while their overall skills in the airplane decline. Do you think a computer could have pulled off a perfect landing in the Hudson River? The fact is, computers replacing the decision-making ability of a highly experienced pilot is a long way off. The problem nowadays is that we are stuffing less and less experienced and skilled pilots into the cockpit, and they are making bad decisions.

Engineers aren't pilots, and they can't think of every situation that may arise; in those situations the computer tends to fight against the pilot. Take the Hudson River ditching again. The FADEC would not allow the pilots to power the engines up because of the damage sustained from the birds. The computer believed that if the engines were run above a certain power setting they would come apart. That is probably true, but nobody ever anticipated that both engines could sustain damage at the same time. As a pilot your first responsibility is to the passengers; if running those engines at max power and risking them coming apart gets you to land, that is what you are going to do. The FADEC took away that option.
I have experience with the first law on the list; actually, I think anyone who has crashed an RC plane has experience with that one. I didn't know the difference between FBW and stabilize until I threw the stick hard over and couldn't understand why it wasn't holding a nice bank and was in a dive. I hadn't properly worked out switch position with mode.
Lol - very droll R_Lefebvre...
With one revision:
...ridiculousness of combining a sequential 2-position switch and a 3-position switch to choose from 6 flight conditions?
Anyone who has set RTL before Auto will know what I mean.
Gary - thanks for posting that up. Very interesting read. It goes to show that humans are still the weakest link - whether it's the one at the controls or the one designing the system. The mind boggles at the Iridium system mistake.
Is there anything in there about the ridiculousness of combining a 2-position switch and a 3-position switch to choose from 6 flight conditions? :p
I think the idea is to get to the point where there's absolutely no need for hand flying, or even a pilot in the cockpit.
This is perhaps the human factors poster boy incident for UA http://web.mit.edu/aeroastro/labs/halab/papers/Carrigan_AUVSI.pdf
Good read, thanks for sharing.