The recent AP article highlights a draft FAA study (I could not find a source link; please add one in the comments if you find it) that finds that pilots sometimes “abdicate too much responsibility to automated systems.” Despite all of the redundancies and fail-safes built into modern aircraft, a cascade of failures can overwhelm pilots who have only been trained to rely on the equipment.
The study examined 46 accidents and major incidents, 734 voluntary reports by pilots and others, and data from more than 9,000 flights in which a safety official rode in the cockpit to observe pilots in action. It found that in more than 60 percent of accidents, and 30 percent of major incidents, pilots had trouble manually flying the plane or made mistakes with automated flight controls.
A typical mistake was failing to recognize that either the autopilot or the auto-throttle — which controls power to the engines — had disconnected. Other pilots failed to take the proper steps to recover from a stall in flight or to monitor and maintain airspeed.
The investigation cites a fatal 2009 airline crash near Buffalo, New York, in which the actions of the captain and co-pilot combined to cause an aerodynamic stall, and the plane crashed into the ground. Another crash two weeks later in Amsterdam involved the plane’s altimeters feeding incorrect information to the plane’s computers; the auto-throttle reduced speed until the plane lost lift and stalled. The flight’s three pilots had not been closely monitoring the craft’s airspeed and experienced “automation surprise” when they discovered the plane was about to stall.
French crash investigators recently recommended that all pilots receive mandatory training in manual flying and in handling a high-altitude stall. In May, the FAA proposed that pilots be trained in recovering from a stall and be exposed to more realistic problem scenarios.
But other new regulations are going in the opposite direction. Today, pilots are required to use the autopilot when flying at altitudes above 24,000 feet, which is where airliners spend much of their time cruising. The required minimum vertical safety buffer between planes has been reduced from 2,000 feet to 1,000 feet. That means more planes flying closer together, necessitating the kind of precision flying more reliably produced by automation than by human beings.
The same situation is increasingly common closer to the ground.
The FAA is moving from an air traffic control system based on radar technology to more precise GPS navigation. Instead of time-consuming, fuel-burning stair-step descents, planes will be able to glide in more steeply for landings with their engines idling. Aircraft will be able to land and take off closer together and more frequently, even in poor weather, because pilots will know the precise location of other aircraft and obstacles on the ground. Fewer planes will be diverted.
But the new landing procedures require pilots to cede even more control to automation.
These are some of the challenges the airline industry faces as it relies more heavily on automation. The benefits are significant, but automation is also enabling new kinds of catastrophic situations caused by human error.
The benefits of automation are not limited to aircraft. Automobiles are adopting more automation with each passing generation, and operating heavy machinery can benefit as well. Implementing automation in control systems enables more people with less skill and experience to operate those systems without necessarily knowing how to recover from anomalous operating conditions.
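One design response to the handoff problem described above is for the automation to fail loudly rather than silently: when it disengages, it should force the anomaly into the operator's attention instead of quietly returning control. A minimal sketch of that idea, with entirely invented names and thresholds (this is not any real avionics or vehicle API):

```python
from dataclasses import dataclass

# Hypothetical illustration only: an automated controller that refuses to
# disengage silently. All class names and numbers below are invented.

@dataclass
class SensorReading:
    airspeed_knots: float
    valid: bool  # e.g., cross-checked against a redundant sensor

class AutoThrottle:
    STALL_MARGIN_KNOTS = 140.0  # invented threshold for illustration

    def __init__(self):
        self.engaged = True

    def step(self, reading: SensorReading) -> str:
        """Return the message the operator should see on each control cycle."""
        if not reading.valid:
            # Bad sensor data: hand control back, but make the handoff loud.
            self.engaged = False
            return "ALERT: auto-throttle DISENGAGED - sensor disagreement, take manual control"
        if reading.airspeed_knots < self.STALL_MARGIN_KNOTS:
            # Automation can warn the operator even while it stays engaged.
            return "WARNING: airspeed low - monitor throttle"
        return "auto-throttle engaged"

at = AutoThrottle()
print(at.step(SensorReading(220.0, True)))   # normal cruise
print(at.step(SensorReading(130.0, True)))   # low-speed warning
print(at.step(SensorReading(130.0, False)))  # noisy sensor: loud handoff
```

The design choice illustrated here is the same one raised by the Amsterdam crash: the transition from automated to manual control is itself an event that the system must announce, not merely perform.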
Is “automation addiction” a real problem, or is it a symptom of system engineering that has not completely addressed all of the system requirements? As automation moves into more application spaces, a sharp answer to this question becomes more important. Where and how should the line be drawn for recovering from anomalous operating conditions? How much of that responsibility should the control system shoulder versus the operator?
Tags: Autopilot, Human Error, Safety