Interesting read on the limits and human factors complicating flight automation. I contend that all of these issues could be solved with better sensors and more reliable processing.
From Things With Wings:
"As far back as 1980, renowned aviation human factors guru Earl Wiener (pictured sitting in the captain’s seat of a Northwest Airlines Boeing 757 circa 1992) was asking the question on everyone’s mind after the tragic crash of Asiana Flight 214 in San Francisco earlier this month – Has automation gone too far?
Had he been around and of sound mind, Wiener would surely have weighed in.
However he passed away June 14 at the age of 80, the victim of a long bout with Parkinson’s disease.
Along with his family, his books and scholarly papers, and a new generation of human factors professionals, Wiener left us Wiener's Laws, 15 jewels of wisdom that will keep giving for decades to come because human nature, and hence human error, is not changing all that rapidly.
The laws were sent to me by former Wiener student and co-worker Asaf Degani, Ph.D., now a technical fellow at General Motors. “Some are funny and some are dead serious,” says Degani.
I have no explanation for why Laws 1-16 are "intentionally left blank"...
Which one is your favorite?
(Note: Nos. 1-16 intentionally left blank)
17. Every device creates its own opportunity for human error.
18. Exotic devices create exotic problems.
19. Digital devices tune out small errors while creating opportunities for large errors.
20. Complacency? Don’t worry about it.
21. In aviation, there is no problem so great or so complex that it cannot be blamed on the pilot.
22. There is no simple solution out there waiting to be discovered, so don’t waste your time searching for it.
23. Invention is the mother of necessity.
24. If at first you don’t succeed… try a new system or a different approach.
25. Some problems have no solution. If you encounter one of these, you can always convene a committee to revise some checklist.
26. In God we trust. Everything else must be brought into your scan.
27. It takes an airplane to bring out the worst in a pilot.
28. Any pilot who can be replaced by a computer should be.
29. Whenever you solve a problem you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
30. You can never be too rich or too thin (Duchess of Windsor) or too careful what you put into a digital flight guidance system (Wiener).
31. Today’s nifty, voluntary system is tomorrow’s F.A.R.
Wiener in the early 1980s began researching what happens when humans and computers attempt to coexist on a flight deck. Though his “day job” was professor of management science at the University of Miami, Wiener is widely known for embedding in the jump seats of his airline pilot subjects as part of research projects funded by the NASA Ames Research Center. Wiener would continue performing NASA human factors work for more than two decades. “Earl was an ongoing grantee,” says a NASA co-worker from that time. “He would publish a paper and 25 people would write their masters’ theses or doctoral dissertations on the topic.”
In a 1980 paper he co-wrote with NASA’s Renwick Curry, “Flight-deck automation: promises and problems”, Wiener wrote, “It is highly questionable whether total system safety is always enhanced by allocating functions to automatic devices rather than human operators, and there is some reason to believe that flight-deck automation may have already passed its optimum point.” Compilations of scholarly papers by Wiener and his colleagues resulted in two key human factors books, one of which – Human Factors in Aviation – is still in print today, albeit as a new edition with new editors."