Pilot error

1994 Fairchild Air Force Base B-52 crash, caused by flying the aircraft beyond its operational limits. Here the aircraft is seen in an unrecoverable bank, moments before the crash. This incident is now used in military and civilian aviation environments as a case study in teaching crew resource management.
Actual flight path (red) of TWA Flight 3 from departure to crash point (controlled flight into terrain). Blue line shows the nominal Las Vegas course, while green is a typical course from Boulder. The pilot inadvertently used the Boulder outbound course instead of the appropriate Las Vegas course.
Location of the crash landing of Varig Flight 254 after it ran out of fuel (navigational error), together with its departure (Marabá) and destination (Belém) airports.
Runway collision caused by taking the wrong taxiing route (red instead of green), as the control tower had not given clear instructions. The accident occurred in thick fog.
The Tenerife airport disaster now serves as a textbook example.[1] Due to several misunderstandings, the KLM flight tried to take off while the Pan Am flight was still on the runway. The airport was accommodating an unusually large number of large aircraft, resulting in disruption of the normal use of taxiways.
The three-pointer altimeter design is one of the most prone to misreading by pilots (a cause of the UA 389 and G-AOVD crashes).

Pilot error (sometimes called cockpit error) is a term once used to describe a decision, action or inaction by a pilot or crew of an aircraft determined to be a cause or contributing factor in an accident or incident. The term included mistakes, oversights, lapses in judgment, gaps in training, adverse habits, and failures to exercise due diligence in a pilot's duties.

Pilot error stems from psychological and physiological human limitations, and various forms of threat and error management have been incorporated into pilot training programs to teach crew members how to deal with impending situations that arise throughout the course of a flight.[2]

Taking a broader view of how human factors fit into a system is now considered standard practice by accident investigators when examining the chain of events that led to an accident.[2][3]

Description

In an accident attributed to pilot error, it is usually assumed that the pilot in command (captain) made the error unintentionally. However, an intentional disregard for a standard operating procedure (or warning) is still considered pilot error, even if the pilot's actions justified criminal charges.

Pilot error is a mistaken decision or action by the pilots, particularly in an emergency. As commander of the aircraft, the pilot is regarded as one of the most important factors: the pilot's decisions affect everything on the craft and can themselves be influenced by countless external elements. Analyses of accident patterns can therefore help improve passenger safety.[4]

The pilot may be a factor even during adverse weather conditions if the investigating body deems that the pilot did not exercise due diligence. The responsibility for the accident in such a case would depend upon whether the pilot could reasonably know of the danger and whether he or she took reasonable steps to avoid the weather problem. Flying into a hurricane (for other than legitimate research purposes) would be considered pilot error; flying into a microburst would not be considered pilot error if it was not detectable by the pilot, or in the time before this hazard was understood. Some weather phenomena (such as clear-air turbulence or mountain waves) are difficult to avoid, especially if the aircraft involved is the first aircraft to encounter the phenomenon in a certain area at a certain time.

Placing pilot error as a cause of an aviation accident has often been controversial. For example, the NTSB ruled that the crash of American Airlines Flight 587 was caused by rudder failure resulting from "unnecessary and excessive rudder pedal inputs" by the co-pilot, who was operating the aircraft at the time. Attorneys for the co-pilot, who was killed in the crash, argued that American Airlines' pilots had never been properly trained concerning extreme rudder inputs. The attorneys also claimed that the rudder failure was actually caused by a flaw in the design of the Airbus A300 aircraft and that the co-pilot's rudder inputs should not have caused the catastrophic rudder failure that led to the accident, which killed 265 people.

Modern accident investigators avoid the words "pilot error", as the scope of their work is to determine the cause of an accident rather than to apportion blame. Furthermore, any attempt to blame pilots ignores that they are part of a broader system, which in turn may be at fault for their fatigue, work pressure, or lack of training.[3] ICAO and its member states therefore adopted the Reason Model in 1993 in an effort to better understand the role of human factors in aviation accidents.[5]

Nevertheless, pilot error remains a major cause of air accidents. In 2004, pilot error was cited as the primary cause of 78.6% of fatal general aviation (GA) accidents, and as the major cause of 75.5% of GA accidents overall, in the US.[6] Pilot errors have multiple causes: decision errors can stem from tendencies, biases, and breakdowns in processing incoming information, and in aviation such errors readily lead not only to further errors but also to fatalities.[7]

Causes of pilot error

Pilots work in complex environments and are routinely exposed to high levels of situational stress in the workplace, inducing pilot error that may result in a threat to flight safety. While aircraft accidents are infrequent, they are highly visible and often involve massive loss of life. For this reason, research on the causal factors of pilot error and on methodologies for mitigating the associated risk is extensive. Pilot error results from physiological and psychological limitations inherent in humans. "Causes of error include fatigue, workload, and fear as well as cognitive overload, poor interpersonal communications, imperfect information processing, and flawed decision making."[8] Throughout the course of every flight, crews are intrinsically subjected to a variety of external threats and commit a range of errors that have the potential to negatively impact the safety of the aircraft.[9]

Threats

The term "threat" is defined as any event "external to flight crew's influence which can increase the operational complexity of a flight."[10] Threats may further be broken down into environmental threats and airline threats. Environmental threats are ultimately out of the hands of crew members and the airline, as they hold no influence on "adverse weather conditions, hazardous , air traffic control shortcomings, bird strikes, and high terrain."[10] Conversely, airline threats are not manageable by the flight crew, but may be controlled by the airline's management. These threats include "aircraft malfunctions, cabin interruptions, operational pressure, ground/ramp errors/events, cabin events and interruptions, ground maintenance errors, and inadequacies of manuals and charts."[10]

Errors

The term "error" is defined as any action or inaction leading to deviation from team or organizational intentions.[8] Error stems from physiological and psychological human limitations such as illness, medication, stress, alcohol/drug abuse, fatigue, emotion etc. Error is inevitable in humans and is primarily related to operational and behavioural mishaps.[11] Errors can vary from incorrect altimeter setting and deviations from flight course to more severe errors such as exceeding maximum structural speeds or forgetting to put down landing or takeoff flaps.

Decision making

Reasons for poor incident reporting include staff being too busy, confusing data entry forms, lack of training and education, lack of feedback to staff on reported data, and punitive organizational cultures.[12] Wiegmann and Shappell developed three cognitive models to analyze approximately 4,000 pilot factors associated with more than 2,000 U.S. Naval aviation mishaps. Although the three models differ slightly in the types of errors they distinguish, all three lead to the same conclusion: judgment errors.[13] These fall into three categories, namely decision-making, goal-setting, and strategy-selection errors, all of which were strongly associated with primary accidents.[13] For example, on December 28, 2014, AirAsia Flight 8501, carrying seven crew members and 155 passengers, crashed into the Java Sea as a result of several fatal mistakes by the captain in poor weather conditions; in this case, the captain chose to change the flight's altitude at an unacceptably high rate.[14]

Psychological illness

Psychological fitness requirements for pilots are set out in aviation law and enforced by individual airlines. Facing multiple special challenges, pilots must exercise control in complicated environments. Psychological illness is typically defined as an unintended physical, mental, or social injury, harm or complication that results in disability, death, or increased use of health care resources.[15] Due to physiological problems such as jet lag, pilots often feel uncomfortable after long flights. Psychological illness is regarded as a serious problem for pilots and has caused several fatal accidents in the past.[16]

SilkAir Flight 185: On 19 December 1997, the aircraft crashed into the Musi River near Palembang in southern Sumatra, killing all 97 passengers and seven crew on board. After investigation, the evidence pointed to the captain, and the crash was concluded to have been a deliberate act of suicide.

EgyptAir Flight 990: On 31 October 1999, the Boeing 767 crashed into the Atlantic Ocean south of Nantucket Island, Massachusetts, killing all 217 people on board. Although never conclusively proved, the crash was considered a deliberate action by the relief first officer.[17]

Germanwings Flight 9525: On 24 March 2015, the aircraft crashed in the French Alps, killing all 144 passengers and six crew members. The co-pilot, Andreas Lubitz, had been treated for suicidal tendencies and declared unfit to work by a doctor, but hid this information from his employer. During the flight, Lubitz locked the captain out of the cockpit and deliberately flew the aircraft into a mountain.

Threat and Error Management (TEM)

TEM involves the effective detection of, and response to, internal or external factors that have the potential to degrade the safety of an aircraft's operations.[9] Methods of teaching TEM stress replicability, or reliability of performance across recurring situations.[18] TEM aims to prepare crews with the "coordinative and cognitive ability to handle both routine and unforeseen surprises and anomalies."[18] The desired outcome of TEM training is the development of 'resiliency', which in this context is the ability to recognize and respond adaptively to disruptions encountered during flight operations. TEM training occurs in various forms, with varying levels of success. Training methods include data collection using the Line Operations Safety Audit (LOSA), implementation of crew resource management (CRM), cockpit task management (CTM), and the integrated use of checklists in both commercial and general aviation. Other resources built into most modern aircraft that help minimize risk and manage threat and error are airborne collision avoidance systems (ACAS) and ground proximity warning systems (GPWS).[19] With the consolidation of onboard computer systems and the implementation of proper pilot training, airlines and crew members look to mitigate the inherent risks associated with human factors.

Line Operations Safety Audit (LOSA)

LOSA is a structured observational program designed to collect data for the development and improvement of countermeasures to operational errors.[20] Through the audit process, trained observers collect information on the normal procedures, protocols, and decision-making processes that flight crews follow when faced with threats and errors during normal operations. This data-driven analysis of threat and error management is useful for examining pilot behavior in relation to situational analysis, and it provides a basis for further implementation of safety procedures or training to help mitigate errors and risks.[10] Observers on audited flights typically record the threats crews encounter, the errors that arise, and how crews detect and manage both.[20]

LOSA was developed to assist crew resource management practices in reducing human error in complex flight operations.[10] LOSA produces beneficial data that reveals how many errors or threats are encountered per flight, the number of errors which could have resulted in a serious threat to safety, and correctness of crew action or inaction. This data has proven to be useful in the development of CRM techniques and identification of what issues need to be addressed in training.[10]
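
The per-flight measures described above (threats and errors per flight, how many errors were potentially consequential, and whether crew responses were correct) amount to a simple aggregation over observer records. The following is a minimal, hypothetical sketch; the record fields and function names are invented and are not part of any real LOSA tooling:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One observer-recorded threat or error on an audited flight (hypothetical schema)."""
    flight_id: str
    kind: str              # "threat" or "error"
    consequential: bool    # could it have become a serious threat to safety?
    managed_correctly: bool

def summarize(observations: list[Observation]) -> dict:
    """Aggregate LOSA-style counts across audited flights."""
    flights = {o.flight_id for o in observations}
    errors = [o for o in observations if o.kind == "error"]
    return {
        "events_per_flight": len(observations) / max(len(flights), 1),
        "consequential_errors": sum(o.consequential for o in errors),
        "errors_managed_correctly": sum(o.managed_correctly for o in errors),
    }

sample = [
    Observation("FL001", "threat", False, True),
    Observation("FL001", "error", True, False),
    Observation("FL002", "error", False, True),
]
print(summarize(sample))
```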

Crew Resource Management (CRM)

CRM is the "effective use of all available resources by individuals and crews to safely and effectively accomplish a mission or task, as well as identifying and managing the conditions that lead to error."[21] CRM training has been integrated into, and made mandatory for, most pilot training programs, and it has become the accepted standard for developing human factors skills for air crews and airlines. Although there is no universal CRM program, airlines usually customize their training to best suit the needs of the organization, and the principles of each program are usually closely aligned. According to the U.S. Navy, there are seven critical CRM skills: decision making, assertiveness, mission analysis, communication, leadership, adaptability/flexibility, and situational awareness.[21]

These seven skills comprise the critical foundation for effective aircrew coordination. With the development and use of these core skills, flight crews "highlight the importance of identifying human factors and team dynamics to reduce human errors that lead to aviation mishaps."[21]

Application and effectiveness of CRM

Since the implementation of CRM circa 1979, following NASA-driven research into resource management, the aviation industry has seen tremendous evolution in the application of CRM training procedures.[22] CRM applications have developed through a series of generations.

Today, CRM is implemented through pilot and crew training sessions, simulations, and interactions with senior-ranked personnel and flight instructors, such as flight briefings and debriefings. Although it is difficult to measure the success of CRM programs, studies have found a correlation between CRM programs and better risk management.[23]

Cockpit Task Management (CTM)

Multiple sources of information can be taken in from one interface: pilots can read attitude, altitude, and airspeed in a single scan.

Cockpit task management (CTM) is the "management level activity pilots perform as they initiate, monitor, prioritize, and terminate cockpit tasks."[24] A 'task' is defined as a process performed to achieve a goal (e.g., fly to a waypoint, descend to a desired altitude).[24] CTM training focuses on teaching crew members how to handle concurrent tasks that compete for their attention: how tasks are initiated and monitored, how they are prioritized against one another, and when they should be interrupted, resumed, or terminated.

The need for CTM training is a result of the limited capacity of human attention and working memory. Crew members may devote more mental or physical resources to a particular task which demands priority or affects the immediate safety of the aircraft.[24] CTM has been integrated into pilot training and goes hand in hand with CRM. Some aircraft operating systems have made progress in aiding CTM by combining instrument gauges onto one screen. An example is a digital attitude indicator that simultaneously shows the pilot the heading, airspeed, rate of descent or ascent, and a plethora of other pertinent information. Implementations such as these allow crews to gather multiple sources of information quickly and accurately, freeing up mental capacity for other, more pressing tasks.
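
The prioritization at the heart of CTM, where concurrent tasks compete for limited attention and the most safety-critical one is serviced first, is analogous to priority scheduling. Below is a minimal sketch of that idea; the task names and priority values are invented for illustration and do not come from any cockpit system:

```python
import heapq

class CockpitTasks:
    """Toy model of CTM-style prioritization: attend to the highest-priority task first."""

    def __init__(self):
        self._queue = []   # entries are (negated priority, insertion order, task name)
        self._counter = 0  # insertion order breaks ties between equal priorities

    def initiate(self, name: str, priority: int) -> None:
        """Add a task; higher numbers mean more safety-critical."""
        heapq.heappush(self._queue, (-priority, self._counter, name))
        self._counter += 1

    def next_task(self) -> str:
        """Pop and return the most urgent outstanding task."""
        return heapq.heappop(self._queue)[2]

tasks = CockpitTasks()
tasks.initiate("monitor fuel state", priority=3)
tasks.initiate("respond to stall warning", priority=9)  # safety-critical, jumps the queue
tasks.initiate("radio position report", priority=2)
print(tasks.next_task())  # -> "respond to stall warning"
```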

Checklists like these ensure that pilots are able to follow operational procedures and aid memory recall.

Checklists

The use of checklists before, during and after flights has established a strong presence in all types of aviation as a means of managing error and reducing the possibility of risk. Checklists are highly regulated and consist of protocols and procedures for the majority of the actions required during a flight.[25] The objectives of checklists include "memory recall, standardization and regulation of processes or methodologies."[25] The use of checklists in aviation has become an industry standard practice, and completing a checklist from memory is considered a violation of protocol and pilot error. Studies have shown that stress and fatigue, both inevitable human factors in commercial aviation, increase errors in judgement and impair cognitive and memory function.[26] The use of checklists in emergency situations also contributes to troubleshooting and to reconstructing the chain of events that may have led to a particular incident or crash.

Apart from checklists issued by regulatory bodies such as the FAA or ICAO, or checklists made by aircraft manufacturers, pilots also have personal qualitative checklists aimed at ensuring their fitness and ability to fly the aircraft. An example is the IM SAFE checklist (illness, medication, stress, alcohol, fatigue/food, emotion), one of a number of qualitative assessments which pilots may perform before or during a flight to ensure the safety of the aircraft and passengers.[25] These checklists, along with a number of other redundancies integrated into most modern aircraft operation systems, help ensure the pilot remains vigilant and, in turn, aim to reduce the risk of pilot error.
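
Viewed as a protocol, a checklist is an ordered set of items, each of which must be explicitly confirmed rather than assumed complete from memory. The sketch below models this using the IM SAFE items quoted above; the class and method names are hypothetical, invented for this example:

```python
class Checklist:
    """Toy ordered checklist: every item must be explicitly confirmed, none from memory."""

    def __init__(self, name: str, items: list[str]):
        self.name = name
        self.items = items
        self.confirmed = [False] * len(items)

    def confirm(self, item: str) -> None:
        """Mark one item complete; raises ValueError if the item is not on the checklist."""
        self.confirmed[self.items.index(item)] = True

    def complete(self) -> bool:
        """True only when every item has been explicitly confirmed."""
        return all(self.confirmed)

# The IM SAFE self-assessment items named in the text above.
im_safe = Checklist("IM SAFE", [
    "illness", "medication", "stress", "alcohol", "fatigue/food", "emotion",
])
for item in im_safe.items:
    im_safe.confirm(item)
print(im_safe.name, "complete:", im_safe.complete())
```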

Notable examples

One of the most famous aircraft disasters attributed to pilot error was the nighttime crash of Eastern Air Lines Flight 401 near Miami, Florida on December 29, 1972. The captain, first officer, and flight engineer had become fixated on a faulty landing gear light and failed to realize that one of the crew had bumped the flight controls, altering the autopilot settings from level flight to a slow descent. Told by ATC to hold over a sparsely populated area away from the airport while they dealt with the problem (and therefore with very few lights visible on the ground to act as an external reference), the distracted flight crew did not notice the plane losing height, and the aircraft eventually struck the ground in the Everglades, killing 101 of the 176 passengers and crew aboard.

The subsequent National Transportation Safety Board (NTSB) report on the incident blamed the flight crew for failing to monitor the aircraft's instruments properly. Details of the incident are now frequently used as a case study in training exercises by aircrews and air traffic controllers.

During 2004 in the United States, pilot error was listed as the primary cause of 78.6% of fatal general aviation accidents, and as the primary cause of 75.5% of general aviation accidents overall.[27] For scheduled air transport, pilot error typically accounts for just over half of worldwide accidents with a known cause.[28]

References

  1. "TENERIFE DISASTER – 27 MARCH 1977: The Utility of the Swiss Cheese Model & other Accident Causation Frameworks". Go Flight Medicine. Retrieved 13 October 2014.
  2. 1 2 "Risk management handbook" (pdf) (Change 1 ed.). Federal Aviation Administration. January 2016. Chapter 2. Human behavior. Retrieved 24 August 2016.
  3. 1 2 Rural and Regional Affairs and Transport References Committee (May 2013). "Aviation Accident Investigations" (pdf). Government of Australia.
  4. Kendig, H.; Mealing, N. "Development of a community care research agenda for Australia". Australasian Journal on Ageing (2011): 37–40. doi:10.1111/j.1741-6612.2010.00471.x. PMID 21395939.
  5. Investigating Human Error: Incidents, Accidents, and Complex Systems. Ashgate Publishing. 2004. ISBN 0754641228.
  6. "Accident statistics". www.planecrashinfo.com. Retrieved 2015-10-21.
  7. Foyle, D. C., & Hooey, B. L. (Eds.). (2007). Human performance modeling in aviation. CRC Press.
  8. 1 2 Helmreich, Robert L. (March 18, 2000). "On Error Management: Lessons From Aviation". BMJ: British Medical Journal. 320-7237: 781–785.
  9. 1 2 Thomas, Matthew J.W. (2004). "Predictors of Threat and Error Management: Identification of Core Nontechnical Skills and Implications for Training Systems Design". The International Journal of Aviation Psychology (14(2)): 207–231.
  10. 1 2 3 4 5 6 Earl, Laurie; Bates, Paul R.; Murray, Patrick S.; Glendon, A. Ian; Creed, Peter A. (January 2012). "Developing a Single-Pilot Line Operations Safety Audit". Aviation Psychology and Applied Human Factors. 2: 49–61. doi:10.1027/2192-0923/a000027. ISSN 2192-0923.
  11. Li, Guohua; Baker, Susan P.; Grabowski, Jurek G.; Rebok, George W. (February 2001). "Factors Associated With Pilot Error in Aviation Crashes". Aviation, Space, and Environmental Medicine (72(1)): 52–58.
  12. Stanhope, N.; Crowley-Murphy, M. (1999). "An evaluation of adverse incident reporting.". Journal of Evaluation in Clinical Practice: 5–12.
  13. 1 2 Wiegmann, D. A., & Shappell, S. A. (2001). Human error perspectives in aviation. The International Journal of Aviation Psychology, 11(4), 341–357.
  14. Stacey, Daniel (15 January 2015). "Indonesian Air-Traffic Control Is Unsophisticated, Pilots Say". The Wall Street Journal. Retrieved 26 January 2015
  15. Sears, N.; Baker, G.R. (2013). "The incidence of adverse events among home care patients". International Journal for Quality in Health Care: 16–28.
  16. Wickens, C. D. (2002). Situation awareness and workload in aviation. Current directions in psychological science, 11(4), 128–133.
  17. Paxson, P (2002). "Have you been injured? The current state of personal injury lawyers' advertising". The Journal of Popular Culture. 36 (2): 191–199. doi:10.1111/1540-5931.00001.
  18. 1 2 Dekker, Sidney; Lundström, Johan (May 2007). "From Threat and Error Management (TEM) to Resilience". Journal of Human Factors and Aerospace Safety (260(70)): 1–10.
  19. Maurino, Dan (April 2005). "Threat and Error Management (TEM)". Canadian Aviation Safety Seminar (CASS); Flight Safety and Human Factors Programme – ICAO.
  20. 1 2 "Line Operations Safety Audit (LOSA)". SKYbrary. Retrieved 24 August 2016.
  21. 1 2 3 Myers, Charles; Orndorff, Denise (2013). "Crew Resource Management: Not Just for Aviators Anymore". Journal of Applied Learning Technology. 3 (3): 44–48.
  22. Helmreich, Robert L.; Merritt, Ashleigh C.; Wilhelm, John A. (1999). "The Evolution of Crew Resource Management Training in Commercial Aviation". The International Journal of Aviation Psychology (9(1)): 19–32.
  23. 1 2 Salas, Eduardo; Burke, Shawn C.; Bowers, Clint A.; Wilson, Katherine A. (2001). "Team Training in the Skies: Does Crew Resource Management (CRM) Training Work?". Human Factors. 43 (4): 641–674. doi:10.1518/001872001775870386. ISSN 0018-7208.
  24. 1 2 3 Chou, Chung-Di; Madhavan, Das; Funk, Ken (1996). "Studies of Cockpit Task Management Errors". The International Journal of Aviation Psychology (6(4)): 307–320.
  25. 1 2 3 Hales, Brigette M.; Pronovost, Peter J. (2006). "The Checklist -- A Tool for Error Management and Performance". Journal of Critical Care (21): 231–235.
  26. Cavanagh, James F.; Frank, Michael J.; Allen, John J.B. (April 2010). "Social Stress Reactivity Alters Reward and Punishment Learning". Social Cognitive and Affective Neuroscience (6(3)): 311–320.
  27. 2005 Joseph T. Nall Report
  28. PlaneCrashInfo.com accident statistics