[UPDATE 11/21: The FAA’s new report on flight automation and safety has been released and can be read here.]
At an aviation conference held in Milan in November of 2010, Kathy Abbott, a top human-factors researcher with the Federal Aviation Administration, gave what she described as an early look at a major new report on flight automation. The findings Abbott presented were disturbing. An FAA review of flights between 2001 and 2009 implicated automation-related problems in a large percentage of crashes and dangerous incidents during those years. The Wall Street Journal’s Andy Pasztor summed up Abbott’s remarks:
The study’s conclusions buttress the idea that a significant percentage of airline pilots rely excessively on computerized cockpit aids. Such adherence to computer-assisted piloting — and the confusion that can result when pilots fail to properly keep up with computer changes — increasingly are considered major factors in airliner crashes world-wide. … The errors included inappropriate control inputs by pilots and incorrect responses when trying to recover from aircraft upsets. … Focusing too much on manipulating flight-control computers, according to Ms. Abbott, often “distracts from managing the flight path of the airplane.”
The FAA indicated that the final report on the new research, which was intended as a follow-up to the agency’s landmark 1996 study on cockpit automation, would likely be released in 2011. It never appeared. It didn’t appear in 2012, either. During this time, I began to research the human impacts of computer automation (research that led to my article in the current Atlantic and that forms the basis of my next book, The Glass Cage). I made a couple of attempts to interview Abbott, which were politely but curtly rebuffed. I sensed that the FAA knew its research would be controversial, and it was being meticulous in preparing the report and its rollout.
In what seemed like a preview of the report’s conclusions, the agency released in January of this year a “Safety Alert for Operators”—SAFO 13002—that urged airlines to get their pilots to do more manual flying. Drawing on the ongoing research, the alert contained a warning:
Modern aircraft are commonly operated using autoflight systems (e.g., autopilot or autothrottle/autothrust). Unfortunately, continuous use of those systems does not reinforce a pilot’s knowledge and skills in manual flight operations. Autoflight systems are useful tools for pilots and have improved safety and workload management, and thus enabled more precise operations. However, continuous use of autoflight systems could lead to degradation of the pilot’s ability to quickly recover the aircraft from an undesired state.
Now, at long last, the FAA appears ready to release its full report, according to an article by Pasztor in today’s Journal. Pasztor has read a draft of the nearly 300-page document, and he summarizes its main thrust:
Relying too heavily on computer-driven flight decks — and problems that result when crews fail to properly keep up with changes in levels of automation — now pose the biggest threats to airliner safety world-wide, the study concluded. The results can range from degraded manual-flying skills to poor decision-making to possible erosion of confidence among some aviators when automation abruptly malfunctions or disconnects during an emergency.
The report includes the observation that, thanks to flight automation, pilots have grown “accustomed to watching things happen … instead of being proactive.” The pilot’s new role as “a manager of systems” can intrude on the actual flying of the plane.
None of this is unexpected. By now, there is a very large body of research on flight automation, dating back a couple of decades, that clearly demonstrates the risk of skill erosion as dependency on computers grows. The FAA study promises to play a vital role in bringing this research into the public eye. Beyond the important implications for the aviation profession, it will serve as a timely and general warning about the risks of relying too much on software, both in our work lives and our personal lives.
Image: Airbus A380 “glass cockpit.”
Capt. Chesley Sullenberger has also pointed to inadequate training whenever he has been asked, “How can we train pilots to think more like you in emergencies?”