航空 posted on 2010-5-15 08:36:46

HFACS analysis of commercial aviation accidents (DOT report)


航空 posted on 2010-5-15 08:37:00

A Human Error Analysis of Commercial Aviation Accidents Using the Human Factors Analysis and Classification System (HFACS)

Douglas A. Wiegmann
University of Illinois at Urbana-Champaign, Institute of Aviation, Savoy, IL 61874

Scott A. Shappell
FAA Civil Aeromedical Institute, P.O. Box 25082, Oklahoma City, OK 73125

February 2001
Final Report
DOT/FAA/AM-01/3

U.S. Department of Transportation, Federal Aviation Administration
Office of Aviation Medicine, Washington, D.C. 20591

This document is available to the public through the National Technical Information Service, Springfield, Virginia 22161.

NOTICE
This document is disseminated under the sponsorship of the U.S. Department of Transportation in the interest of information exchange. The United States Government assumes no liability for the contents thereof.

Technical Report Documentation Page (Form DOT F 1700.7 (8-72))
1. Report No.: DOT/FAA/AM-01/3
4. Title and Subtitle: A Human Error Analysis of Commercial Aviation Accidents Using the Human Factors Analysis and Classification System (HFACS)
5. Report Date: February 2001
7. Author(s): Wiegmann, D.A. (1) and Shappell, S.A. (2)
9. Performing Organization Name and Address: (1) University of Illinois at Urbana-Champaign, Institute of Aviation, Savoy, IL 61874; (2) FAA Civil Aeromedical Institute, P.O. Box 25082, Oklahoma City, OK 73125
11. Contract or Grant No.: 99-G-006
12. Sponsoring Agency Name and Address: Office of Aviation Medicine, Federal Aviation Administration, 800 Independence Ave., S.W., Washington, DC 20591
15. Supplemental Notes: Work was accomplished under task # AAM-A-00-HRR-520.
16. Abstract: The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. Specifically, HFACS was applied to commercial aviation accident records maintained by the National Transportation Safety Board (NTSB). Using accidents that occurred between January 1990 and December 1996, it was demonstrated that HFACS reliably accommodated all human causal factors associated with the commercial accidents examined. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena.
17. Key Words: Aviation, Human Error, Accident Investigation, Database Analysis, Commercial Aviation
18. Distribution Statement: Document is available to the public through the National Technical Information Service, Springfield, Virginia 22161
19. Security Classification (of this report): Unclassified
20. Security Classification (of this page): Unclassified
21. No. of Pages: 17
ACKNOWLEDGMENTS

The authors thank Frank Cristina and Anthony Pape for their assistance in gathering, organizing, and analyzing the accident reports used in this study.
A HUMAN ERROR ANALYSIS OF COMMERCIAL AVIATION ACCIDENTS USING THE HUMAN FACTORS ANALYSIS AND CLASSIFICATION SYSTEM (HFACS)

INTRODUCTION

Humans, by their very nature, make mistakes; therefore, it should come as no surprise that human error has been implicated in a variety of occupational accidents, including 70% to 80% of those in civil and military aviation (O'Hare, Wiggins, Batt, & Morrison, 1994; Wiegmann & Shappell, 1999; Yacavone, 1993). In fact, while the number of aviation accidents attributable solely to mechanical failure has decreased markedly over the past 40 years, those attributable at least in part to human error have declined at a much slower rate (Shappell & Wiegmann, 1996). Given such findings, it would appear that interventions aimed at reducing the occurrence or consequences of human error have not been as effective as those directed at mechanical failures. Clearly, if accidents are to be reduced further, more emphasis must be placed on the genesis of human error as it relates to accident causation.

The prevailing means of investigating human error in aviation accidents remains the analysis of accident and incident data. Unfortunately, most accident reporting systems are not designed around any theoretical framework of human error. Indeed, most accident reporting systems are designed and employed by engineers and front-line operators with only limited backgrounds in human factors. As a result, these systems have been useful for identifying engineering and mechanical failures but are relatively ineffective and narrow in scope where human error is concerned. Even when human factors are addressed, the terms and variables used are often ill-defined and archival databases are poorly organized. The end results are post-accident databases that typically are not conducive to a traditional human error analysis, making the identification of intervention strategies onerous (Wiegmann & Shappell, 1997).

The Accident Investigation Process

To further illustrate this point, let us examine the accident investigation and intervention process separately for the mechanical and human components of an accident. Consider first the occurrence of an aircraft system or mechanical failure that results in an accident or injury (Figure 1). A subsequent investigation takes place that includes the examination of objective and quantifiable information, such as that derived from the wreckage and flight data recorder, as well as that from the application of sophisticated analytical techniques like metallurgical tests and computer modeling. This kind of information is then used to determine the probable mechanical cause(s) of the accident and to identify safety recommendations.

Upon completion of the investigation, this "objective" information is typically entered into a highly structured and well-defined accident database. These data can then be periodically analyzed to determine system-wide safety issues and provide feedback to investigators, thereby improving investigative methods and techniques. In addition, the data are often used to guide organizations (e.g., the Federal Aviation Administration, National Aeronautics and Space Administration, Department of Defense, airplane manufacturers, and airlines) in deciding which research or safety programs to sponsor. As a result, these needs-based, data-driven programs, in turn, have typically produced effective intervention strategies that either prevent mechanical failures from occurring altogether, or mitigate their consequences when they do happen. In either case, there has been a substantial reduction in the rate of accidents due to mechanical or systems failures.

In stark contrast, Figure 2 illustrates the current human factors accident investigation and prevention process. This example begins with the occurrence of an aircrew error during flight operations that leads to an accident or incident. A human performance investigation then ensues to determine the nature and causes of such errors. However, unlike the tangible and quantifiable evidence surrounding mechanical failures, the evidence and causes of human error are generally qualitative and elusive. Furthermore, human factors investigative and analytical techniques are often less refined and sophisticated than those used to analyze mechanical and engineering concerns. As such, the determination of the human factors causal to an accident is a tenuous practice at best, all of which makes the information entered in the accident database sparse and ill-defined.

As a result, when traditional data analyses are performed to determine common human factors problems across accidents, the interpretation of the findings and the subsequent identification of important safety issues are of limited practical use. To make matters worse, results from these analyses provide limited feedback to investigators and are of limited use to airlines and government agencies in determining the types of research or safety programs to sponsor. As such, many research programs tend to be intuitively- or fad-driven, rather than data-driven, and typically produce intervention strategies that are only marginally effective at reducing the occurrence and consequences of human error. The overall rate of human-error related accidents, therefore, has remained relatively high and constant over the last several years (Shappell & Wiegmann, 1996).

Addressing the Problem

If the FAA and the aviation industry are to achieve their goal of significantly reducing the aviation accident rate over the next ten years, the primary causes of aviation accidents (i.e., human factors) must be addressed (ICAO, 1993). However, as illustrated in Figure 2, simply increasing the amount of money and resources spent on human factors research is not the solution. Indeed, a great deal of resources and effort is already being expended. Rather, the solution is to redirect safety efforts so that they address important human factors issues. However, this assumes that we know what the important human factors issues are. Therefore, before research efforts can be systematically refocused, a comprehensive analysis of existing databases needs to be conducted to determine those specific human factors responsible for aviation accidents and incidents.
Furthermore, if these efforts are to be sustained, new investigative methods and techniques will need to be developed so that the data gathered during human factors accident investigations can be improved and the analysis of the underlying causes of human error facilitated.

[Figure 1. General process of investigating and preventing aviation accidents involving mechanical or systems failures (flow diagram: mechanical failure, accident investigation, accident database, database analysis, data-driven research, and effective intervention and prevention programs, with feedback).]

To accomplish this improvement, a general human error framework is needed around which new investigative methods can be designed and existing post-accident databases restructured. Previous attempts to do this have met with encouraging, yet limited, success (O'Hare et al., 1994; Wiegmann & Shappell, 1997). This is primarily because performance failures are influenced by a variety of human factors that are typically not addressed by traditional error frameworks. For instance, with few exceptions (e.g., Rasmussen, 1982), human error taxonomies do not consider the potential adverse mental and physiological condition of the individual (e.g., fatigue, illness, attitudes) when describing errors in the cockpit. Likewise, latent errors committed by officials within the management hierarchy, such as line managers and supervisors, are often not addressed, even though it is well known that these factors directly influence the condition and decisions of pilots (Reason, 1990). Therefore, if a comprehensive analysis of human error is to be conducted, a taxonomy that takes into account the multiple causes of human failure must be offered.

Recently, the Human Factors Analysis and Classification System (HFACS) was developed to meet these needs (Shappell & Wiegmann, 1997a, 2000a, and in press). This system, which is based on Reason's (1990) model of latent and active failures, was originally developed for the U.S. Navy and Marine Corps as an accident investigation and data analysis tool. Since its original development, however, HFACS has been employed by other military organizations (e.g., the U.S. Army, Air Force, and Canadian Defense Force) as an adjunct to preexisting accident investigation and analysis systems.
To date, the HFACS framework has been applied to more than 1,000 military aviation accidents, yielding objective, data-driven intervention strategies while enhancing both the quantity and quality of the human factors information gathered during accident investigations (Shappell & Wiegmann, in press). Other organizations, such as the FAA and NASA, have explored the use of HFACS as a complement to preexisting systems within civil aviation in an attempt to capitalize on gains realized by the military (Ford, Jack, Crisp, & Sandusky, 1999). Still, few systematic efforts have examined whether HFACS is indeed a viable tool within the civil aviation arena, even though it can be argued that the similarities between military and civilian aviation outweigh their differences. The purpose of the present study was to empirically address this issue by applying the HFACS framework, as originally designed for the military, to the classification and analysis of civil aviation accident data. Before beginning, however, a brief overview of the HFACS system will be presented for those readers who may not be familiar with the framework (for a detailed description of HFACS, see Shappell and Wiegmann, 2000a and 2001).

[Figure 2. General process of investigating and preventing aviation accidents involving human error (flow diagram: human error, accident investigation, accident database, database analysis, fad-driven research, and ineffective intervention and prevention programs, with feedback).]

HFACS

Drawing upon Reason's (1990) concept of latent and active failures, HFACS describes human error at each of four levels of failure: 1) unsafe acts of operators (e.g., aircrew), 2) preconditions for unsafe acts, 3) unsafe supervision, and 4) organizational influences. A brief description of each causal category follows (Figure 3).

Unsafe Acts of Operators

The unsafe acts of operators (aircrew) can be loosely classified into one of two categories: errors and violations (Reason, 1990). While both are common within most settings, they differ markedly when the rules and regulations of an organization are considered. That is, errors can be described as those "legal" activities that fail to achieve their intended outcome, while violations are commonly defined as behavior that represents the willful disregard for the rules and regulations.
It is within these two overarching categories that HFACS describes three types of errors (decision, skill-based, and perceptual) and two types of violations (routine and exceptional).

[Figure 3. Overview of the Human Factors Analysis and Classification System (HFACS), showing the four levels of failure (unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences) and their categories.]

Errors

One of the more common error forms, decision errors, represents conscious, goal-intended behavior that proceeds as designed; yet, the plan proves inadequate or inappropriate for the situation. Often referred to as "honest mistakes," these unsafe acts typically manifest as poorly executed procedures, improper choices, or simply the misinterpretation or misuse of relevant information.

In contrast to decision errors, the second error form, skill-based errors, occurs with little or no conscious thought. Just as little thought goes into turning one's steering wheel or shifting gears in an automobile, basic flight skills such as stick-and-rudder movements and visual scanning often occur without thinking. The difficulty with these highly practiced and seemingly automatic behaviors is that they are particularly susceptible to attention and/or memory failures. As a result, skill-based errors such as the breakdown of visual scan patterns, inadvertent activation or deactivation of switches, forgotten intentions, and omitted checklist items often appear. Even the manner (or skill) with which one flies an aircraft (aggressive, tentative, or controlled) can affect safety.

While decision and skill-based errors have dominated most accident databases and have therefore been included in most error frameworks, the third and final error form, perceptual errors, has received comparatively less attention. No less important, perceptual errors occur when sensory input is degraded, or "unusual," as is often the case when flying at night, in the weather, or in other visually impoverished environments. Faced with acting on imperfect or incomplete information, aircrew run the risk of misjudging distances, altitude, and descent rates, as well as responding incorrectly to a variety of visual and vestibular illusions.

Violations

Although there are many ways to distinguish among types of violations, two distinct forms have been identified based on their etiology. The first, routine violations, tend to be habitual by nature and are often enabled by a system of supervision and management that tolerates such departures from the rules (Reason, 1990). Often referred to as "bending the rules," the classic example is that of the individual who consistently drives his or her automobile 5-10 mph faster than allowed by law.
While clearly against the law, the behavior is, in effect, sanctioned by local authorities (police), who often will not enforce the law until speeds in excess of 10 mph over the posted limit are observed.

Exceptional violations, on the other hand, are isolated departures from authority, neither typical of the individual nor condoned by management. For example, while driving 65 mph in a 55 mph zone might be condoned by authorities, driving 105 mph in a 55 mph zone certainly would not. It is important to note that, while most exceptional violations are appalling, they are not considered "exceptional" because of their extreme nature. Rather, they are regarded as exceptional because they are neither typical of the individual nor condoned by authority.

Preconditions for Unsafe Acts

Simply focusing on unsafe acts, however, is like focusing on a patient's symptoms without understanding the underlying disease state that caused them. As such, investigators must dig deeper into the preconditions for unsafe acts. Within HFACS, two major subdivisions are described: substandard conditions of operators and the substandard practices they commit.

Substandard Conditions of the Operator

Being prepared mentally is critical in nearly every endeavor; perhaps it is even more so in aviation. With this in mind, the first of three categories, adverse mental states, was created to account for those mental conditions that adversely affect performance. Principal among these are the loss of situational awareness, mental fatigue, circadian dysrhythmia, and pernicious attitudes such as overconfidence, complacency, and misplaced motivation that negatively impact decisions and contribute to unsafe acts.

Equally important, however, are those adverse physiological states that preclude the safe conduct of flight. Particularly important to aviation are conditions such as spatial disorientation, visual illusions, hypoxia, illness, intoxication, and a whole host of pharmacological and medical abnormalities known to affect performance. For example, it is not surprising that, when aircrews become spatially disoriented and fail to rely on flight instrumentation, accidents can, and often do, occur.

Physical and/or mental limitations of the operator, the third and final category of substandard conditions, includes those instances when necessary sensory information is either unavailable or, if available, individuals simply do not have the aptitude, skill, or time to deal with it safely. For aviation, the former often includes not seeing other aircraft or obstacles due to the size and/or contrast of the object in the visual field. However, there are many times when a situation requires such rapid mental processing or reaction time that the time allotted to remedy the problem exceeds human limits (as is often the case during nap-of-the-earth flight).
Nevertheless, even when favorable visual cues or an abundance of time is available, there are instances when an individual simply may not possess the necessary aptitude, physical ability, or proficiency to operate safely.

Substandard Practices of the Operator

Oftentimes, the substandard practices of aircrew will lead to the conditions and unsafe acts described above. For instance, the failure to ensure that all members of the crew are acting in a coordinated manner can lead to confusion (an adverse mental state) and poor decisions in the cockpit. Crew resource mismanagement, as it is referred to here, includes failures of both inter- and intra-cockpit communication, as well as communication with ATC and other ground personnel. This category also includes those instances when crewmembers do not work together as a team, or when individuals directly responsible for the conduct of operations fail to coordinate activities before, during, and after a flight.

Equally important, however, individuals must ensure that they are adequately prepared for flight. Consequently, the category of personal readiness was created to account for those instances when rules such as crew rest requirements and alcohol restrictions are disregarded, or when aircrew self-medicate. However, even behaviors that do not necessarily violate existing rules or regulations (e.g., running ten miles before piloting an aircraft or not observing good dietary practices) may reduce the operating capabilities of the individual and are, therefore, captured here.

Unsafe Supervision

Clearly, aircrews are responsible for their actions and, as such, must be held accountable. However, in many instances, they are the unwitting inheritors of latent failures attributable to those who supervise them (Reason, 1990). To account for these latent failures, the overarching category of unsafe supervision was created, within which four categories (inadequate supervision, planned inappropriate operations, failed to correct known problems, and supervisory violations) are included.

The first category, inadequate supervision, refers to failures within the supervisory chain of command that are a direct result of some supervisory action or inaction. That is, at a minimum, supervisors must provide the opportunity for individuals to succeed. It is expected, therefore, that individuals will receive adequate training, professional guidance, oversight, and operational leadership, and that all will be managed appropriately. When this is not the case, aircrews are often isolated, and the risk associated with day-to-day operations invariably increases.

However, the risk associated with supervisory failures can come in many forms. Occasionally, for example, the operational tempo and/or schedule is planned such that individuals are put at unacceptable risk and, ultimately, performance is adversely affected. As such, the category of planned inappropriate operations was created to account for all aspects of improper or inappropriate crew scheduling and operational planning, which may involve issues such as crew pairing, crew rest, and managing the risk associated with specific flights.

The remaining two categories of unsafe supervision, the failure to correct known problems and supervisory violations, are similar, yet considered separately within HFACS.
The failure to correct known problems refers to those instances when deficiencies among individuals, equipment, training, or other related safety areas are "known" to the supervisor yet are allowed to continue uncorrected. For example, the failure to consistently correct or discipline inappropriate behavior certainly fosters an unsafe atmosphere but is not considered a violation if no specific rules or regulations were broken.

Supervisory violations, on the other hand, are reserved for those instances when existing rules and regulations are willfully disregarded by supervisors when managing assets. For instance, permitting aircrew to operate an aircraft without current qualifications or a license is a flagrant violation that invariably sets the stage for the tragic sequence of events that predictably follows.

Organizational Influences

Fallible decisions of upper-level management can directly affect supervisory practices, as well as the conditions and actions of operators. Unfortunately, these organizational influences often go unnoticed or unreported by even the best-intentioned accident investigators.

Traditionally, these latent organizational failures revolve around three issues: 1) resource management, 2) organizational climate, and 3) operational processes. The first category, resource management, refers to the management, allocation, and maintenance of organizational resources, including human resource management (selection, training, staffing), monetary safety budgets, and equipment design (ergonomic specifications). In general, corporate decisions about how such resources should be managed center around two distinct objectives: the goal of safety and the goal of on-time, cost-effective operations. In times of prosperity, both objectives can be easily balanced and satisfied in full. However, there may also be times of fiscal austerity that demand some give and take between the two. Unfortunately, history tells us that safety is often the loser in such battles, as safety and training are often the first to be cut in organizations experiencing financial difficulties.

Organizational climate refers to a broad class of organizational variables that influence worker performance and is defined as the "situationally based consistencies in the organization's treatment of individuals" (Jones, 1988). One telltale sign of an organization's climate is its structure, as reflected in the chain of command, delegation of authority and responsibility, communication channels, and formal accountability for actions. Just as in the cockpit, communication and coordination are vital within an organization.
However, an organization's policies and culture are also good indicators of its climate. Consequently, when policies are ill-defined, adversarial, or conflicting, or when they are supplanted by unofficial rules and values, confusion abounds and safety suffers within an organization.

Finally, operational process refers to formal processes (operational tempo, time pressures, production quotas, incentive systems, schedules, etc.), procedures (performance standards, objectives, documentation, instructions about procedures, etc.), and oversight within the organization (organizational self-study, risk management, and the establishment and use of safety programs). Poor upper-level management and decisions concerning each of these organizational factors can also have a negative, albeit indirect, effect on operator performance and system safety.
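For readers who want the four levels and their categories laid out compactly before the study itself is described, the sketch below arranges the taxonomy just summarized into a plain Python lookup table. This is only an illustrative convenience based on the category names in the overview above (and in Figure 3); the data structure itself is not part of the published HFACS framework.

# Illustrative sketch only: the four HFACS levels and their categories as
# described in the overview above, arranged as a plain lookup table.
HFACS_LEVELS = {
    "Organizational Influences": [
        "Resource Management",
        "Organizational Climate",
        "Organizational Process",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision",
        "Planned Inappropriate Operations",
        "Failed to Correct Known Problem",
        "Supervisory Violations",
    ],
    "Preconditions for Unsafe Acts": [
        "Adverse Mental States",
        "Adverse Physiological States",
        "Physical/Mental Limitations",
        "Crew Resource Mismanagement",
        "Personal Readiness",
    ],
    "Unsafe Acts": [
        "Decision Errors",
        "Skill-Based Errors",
        "Perceptual Errors",
        "Violations",
    ],
}

def level_of(category):
    """Return the HFACS level that a category belongs to, or None if unknown."""
    for level, categories in HFACS_LEVELS.items():
        if category in categories:
            return level
    return None

# Example: a causal factor coded as "Crew Resource Mismanagement" sits at the
# preconditions level of the framework.
print(level_of("Crew Resource Mismanagement"))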
Summary

The HFACS framework bridges the gap between theory and practice by providing safety professionals with a theoretically based tool for identifying and classifying the human causes of aviation accidents. Because the system focuses on both latent and active failures and their interrelationships, it facilitates the identification of the underlying causes of human error. To date, HFACS has been shown to be useful within the context of military aviation, as both a data analysis framework and an accident investigation tool. However, HFACS has yet to be applied systematically to the analysis and investigation of civil aviation accidents. The purpose of the present research project, therefore, was to assess the utility of the HFACS framework as an error analysis and classification tool within commercial aviation.

The specific objectives of this study were three-fold. The first objective was to determine whether the HFACS framework, in its current form, would be comprehensive enough to accommodate all of the underlying human causal factors associated with commercial aviation accidents, as contained in the accident databases maintained by the FAA and NTSB. In other words, could the framework capture all the relevant human error data, or would a portion of the database be lost because it was unclassifiable? The second objective was to determine whether the process of reclassifying the human causal factors using HFACS was reliable. That is, would different users of the system agree on how causal factors should be coded using the framework? Finally, the third objective was to determine whether reclassifying the data using HFACS would yield a benefit beyond what is already known about commercial aviation accident causation. Specifically, would HFACS highlight any heretofore unknown safety issues in need of further intervention research?

METHOD

Data

A comprehensive review of all accidents involving Code of Federal Air Regulations (FAR) Parts 121 and 135 Scheduled Air Carriers between January 1990 and December 1996 was conducted using database records maintained by the NTSB and the FAA. Of particular interest to this study were those accidents attributable, at least in part, to the aircrew. Consequently, accidents due solely to catastrophic failure, maintenance error, or unavoidable weather conditions such as turbulence and wind shear were not included. Furthermore, only those accidents in which the investigation was completed, and the cause of the accident determined, were included in this analysis. One hundred nineteen accidents met these criteria, including 44 accidents involving FAR Part 121 operators and 75 accidents involving FAR Part 135 operators.

HFACS Classification

The 119 aircrew-related accidents yielded 319 causal factors for further analysis. Each of these NTSB causal factors was subsequently coded independently by both an aviation psychologist and a commercially rated pilot using the HFACS framework. Only those causal factors identified by the NTSB were analyzed; that is, no new causal factors were created during the error-coding process.

RESULTS

HFACS Comprehensiveness

All 319 (100%) of the human causal factors associated with aircrew-related accidents were accommodated using the HFACS framework. Instances of all but two HFACS categories (i.e., organizational climate and personal readiness) were observed at least once in the accident database. Therefore, no new HFACS categories were needed to capture the existing causal factors, and no human factors data pertaining to the aircrew were left unclassified during the coding process.

HFACS Reliability

Disagreements among raters were noted during the coding process and ultimately resolved by discussion. Using the record of agreement and disagreement between the raters, the reliability of the HFACS system was assessed by calculating Cohen's kappa, an index of agreement that has been corrected for chance. The obtained kappa value was .71, which generally reflects a "good" level of agreement according to criteria described by Fleiss (1981).
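For readers unfamiliar with the statistic, Cohen's kappa compares the proportion of items on which two raters actually agree with the proportion of agreement expected by chance, kappa = (p_o - p_e) / (1 - p_e). The following is a minimal, generic sketch of that calculation; the category labels and ratings in the example are invented for illustration and are not the study's data.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters coding the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected proportion of chance agreement, from each rater's marginal counts.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical HFACS codes assigned to six causal factors (not the study's data).
rater_1 = ["Skill-Based Errors", "Decision Errors", "Violations",
           "Skill-Based Errors", "Crew Resource Mismanagement", "Decision Errors"]
rater_2 = ["Skill-Based Errors", "Decision Errors", "Violations",
           "Decision Errors", "Crew Resource Mismanagement", "Decision Errors"]
print(round(cohens_kappa(rater_1, rater_2), 2))  # about 0.77 for this toy example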
HFACS Analyses

Unsafe Acts

Table 1 presents the percentages of FAR Parts 121 and 135 aircrew-related accidents associated with each of the HFACS categories. An examination of the table reveals that, at the unsafe acts level, skill-based errors were associated with the largest percentage of accidents. Approximately 60% of all aircrew-related accidents were associated with at least one skill-based error. This percentage was relatively similar for FAR Part 121 carriers (63.6%) and FAR Part 135 carriers (58.7%). Figure 4, panel A, illustrates that the proportion of accidents associated with skill-based errors remained relatively unchanged over the seven-year period examined in the study. Notably, however, the lowest proportion of accidents associated with skill-based errors was observed in the last two years of the study (1995 and 1996).

Among the remaining categories of unsafe acts, accidents associated with decision errors constituted the next highest proportion (roughly 29% of the accidents examined; Table 1). Again, this percentage was roughly equal across both FAR Part 121 (25.0%) and Part 135 (30.7%) accidents. With the exception of 1994, in which the percentage of aircrew-related accidents associated with decision errors reached a high of 60%, the proportion of accidents associated with decision errors remained relatively constant across the years of the study (Figure 4, panel B).

Table 1. Percentage of accidents associated with each HFACS category.

HFACS Category                        FAR Part 121   FAR Part 135   Total
Organizational Influences
  Resource Management                 4.5 (2)        1.3 (1)        2.5 (3)
  Organizational Climate              0.0 (0)        0.0 (0)        0.0 (0)
  Organizational Process              15.9 (7)       4.0 (3)        8.4 (10)
Unsafe Supervision
  Inadequate Supervision              2.3 (1)        6.7 (5)        5.0 (6)
  Planned Inappropriate Operations    0.0 (0)        1.3 (1)        0.8 (1)
  Failed to Correct Known Problem     0.0 (0)        2.7 (2)        1.7 (2)
  Supervisory Violations              0.0 (0)        2.7 (2)        1.7 (2)
Preconditions for Unsafe Acts
  Adverse Mental States               13.6 (6)       13.3 (10)      13.4 (16)
  Adverse Physiological States        4.5 (2)        0.0 (0)        1.7 (2)
  Physical/Mental Limitations         2.3 (1)        16.0 (12)      10.9 (13)
  Crew Resource Mismanagement         40.9 (18)      22.7 (17)      29.4 (35)
  Personal Readiness                  0.0 (0)        0.0 (0)        0.0 (0)
Unsafe Acts
  Skill-Based Errors                  63.6 (28)      58.7 (44)      60.5 (72)
  Decision Errors                     25.0 (11)      30.7 (23)      28.6 (34)
  Perceptual Errors                   20.5 (9)       10.7 (8)       14.3 (17)
  Violations                          25.0 (11)      28.0 (21)      26.9 (32)

Note: Numbers in the table are percentages of accidents that involved at least one instance of an HFACS category. Numbers in parentheses indicate accident frequencies. Because more than one causal factor is generally associated with each accident, the percentages in the table will not sum to 100%.
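As the note indicates, an accident counts toward a category if at least one of its causal factors was coded into that category, so each cell is a percentage of accidents rather than of causal factors, and the columns need not sum to 100%. Below is a minimal sketch of that counting rule; the accident records in the example are invented for illustration and are not the study's data.

# Invented example records: accident ID -> list of HFACS-coded causal factors.
accidents = {
    "A1": ["Skill-Based Errors", "Crew Resource Mismanagement"],
    "A2": ["Skill-Based Errors", "Skill-Based Errors", "Violations"],
    "A3": ["Decision Errors"],
}

def percent_with_category(records, category):
    """Percentage of accidents with at least one causal factor in `category`."""
    hits = sum(category in factors for factors in records.values())
    return 100.0 * hits / len(records)

# A2 counts only once toward skill-based errors even though it has two such
# factors, so the result is 2 of 3 accidents (about 66.7%).
print(round(percent_with_category(accidents, "Skill-Based Errors"), 1))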
Similar to accidents associated with decision errors, those attributable at least in part to violations of rules and regulations were associated with 26.9% of the accidents examined. Again, no appreciable difference was evident when comparing the relative percentages across FAR Parts 121 (25.0%) and 135 (28.0%). However, an examination of Figure 4, panel C, reveals that the relative proportion of accidents associated with violations increased appreciably, from a low of 6% in 1990 to a high of 46% in 1996.

Finally, the proportion of accidents associated with perceptual errors was relatively low. In fact, only 17 of the 119 accidents (14.3%) involved some form of perceptual error. While it appeared that the relative proportion of Part 121 accidents associated with perceptual errors was higher than that of Part 135 accidents, the low number of occurrences precluded any meaningful comparisons across either the type of operation or calendar year.

Preconditions for Unsafe Acts

Within the preconditions level, CRM failures were associated with the largest percentage of accidents. Approximately 29% of all aircrew-related accidents were associated with at least one CRM failure. A relatively larger percentage of FAR Part 121 aircrew-related accidents involved CRM failures (40.9%) than did FAR Part 135 aircrew-related accidents (22.7%). However, the percentage of accidents associated with CRM failures remained relatively constant over the seven-year period for both FAR Part 121 and 135 carriers (Figure 4, panel D).

The next largest percentage of accidents was associated with adverse mental states (13.4%), followed by physical/mental limitations (10.9%) and adverse physiological states (1.7%). There were no accidents associated with personal readiness issues. The percentage of accidents associated with physical/mental limitations was higher for FAR Part 135 carriers (16%) compared with FAR Part 121 carriers (2.3%), but accidents associated with adverse mental or adverse physiological states were relatively equal across carriers. Again, however, the low number of occurrences in each of these accident categories precluded any meaningful comparisons across calendar year.

[Figure 4. Percentage of aircrew-related accidents associated with skill-based errors (Panel A), decision errors (Panel B), violations (Panel C), and CRM failures (Panel D) across calendar years 1990-96. Lines represent seven-year averages.]

Supervisory and Organizational Factors

Very few of the NTSB reports that implicated the aircrew as contributing to an accident also cited some form of supervisory or organizational failure (see Table 1). Indeed, only 16% of all aircrew-related accidents involved some form of either supervisory or organizational involvement. Overall, however, a larger proportion of aircrew-related accidents involving FAR Part 135 carriers involved supervisory failures (9.3%) than did those accidents involving FAR Part 121 carriers (2.3%). In contrast, a larger proportion of aircrew-related accidents involving FAR Part 121 carriers involved organizational factors (20.5%) than did those accidents involving FAR Part 135 carriers (4.0%).

DISCUSSION

HFACS Comprehensiveness

The HFACS framework was found to accommodate all 319 causal factors associated with the 119 accidents involving FAR Parts 121 and 135 scheduled carriers across the seven-year period examined. This finding suggests that the error categories within HFACS, originally developed for use in the military, are applicable within commercial aviation as well. Still, some of the error factors within the HFACS framework were never observed in this commercial aviation accident database. For example, no instances of factors such as organizational climate or personal readiness were observed. In fact, very few instances of supervisory factors were evident at all in the data.

One explanation for the scarcity of such factors could be that, contrary to Reason's model of latent and active failures upon which HFACS is based, such supervisory and organizational factors simply do not play as large a role in the etiology of commercial aviation accidents as once expected. Consequently, the HFACS framework may need to be pared down or simplified for use with commercial aviation. Another explanation, however, is that these factors do contribute to most accidents, yet they are rarely identified using existing accident investigation processes.
Nevertheless, the results of this study indicate that the HFACS framework was able to capture all existing causal factors, and no new error categories or aircrew cause factors were needed to analyze the commercial accident data.

HFACS Reliability

The HFACS system was found to produce an acceptable level of agreement among the investigators who participated in this study. Furthermore, even after this level of agreement between investigators was corrected for chance, the obtained reliability index was considered "good" by conventional standards. Still, this reliability index was somewhat lower than those observed in studies using military aviation accidents, which, in some instances, have resulted in nearly complete agreement among investigators (Shappell & Wiegmann, 1997b). One possible explanation for this discrepancy is the difference in both the type and amount of information available to investigators across these studies. Unlike the present study, previous analysts using HFACS to analyze military accident data often had access to privileged and highly detailed information about the accidents, which presumably allowed for a better understanding of the underlying causal factors and, hence, produced higher levels of reliability. Another possibility is that the definitions and examples currently used to describe HFACS are too closely tied to military aviation and are therefore somewhat ambiguous to those within a commercial setting. Indeed, the reliability of the HFACS framework has been shown to improve within the commercial aviation domain when efforts are taken to provide examples and checklists that are more compatible with civil aviation accidents (Wiegmann, Shappell, Cristina & Pape, 2000).

HFACS Analysis

Given the large number of accident causal factors contained in the NTSB database, each accident appeared, at least on the surface, to be relatively unique. As such, commonalities or trends in specific error forms across accidents were not readily evident in the data. Still, recoding the data using HFACS did allow similar error forms and causal factors to be identified across accidents, and the major human causes of accidents to be discovered.

Specifically, the HFACS analysis revealed that the highest percentage of all aircrew-related accidents was associated with skill-based errors. Furthermore, this proportion was lowest during the last two years of the study, suggesting that accidents associated with skill-based errors may be on the decline. To some, the finding that skill-based errors were frequently observed among the commercial aviation accidents examined is not surprising, given the dynamic nature and complexity of piloting commercial aircraft, particularly in the increasingly congested U.S. airspace. The question remains, however, as to the driving force behind the possible reduction in such errors. Explanations could include improved aircrew training practices or perhaps better selection procedures. Another possibility might be the recent transition within the regional commuter industry from turboprop to jet aircraft.
Such aircraft are generally more reliable and contain advanced automation to help off-load the attention and memory demands placed on pilots during flight.

Unfortunately, the industry-wide intervention programs and other changes that were made during the 1990s were neither systematically applied nor targeted at preventing specific error types, such as skill-based errors. Consequently, it is impossible to determine whether all, or only a few, of these efforts are responsible for the apparent decline in skill-based errors. Nevertheless, given that an error analysis has now been conducted on the accident data, future intervention programs can be strategically targeted at reducing skill-based errors. Furthermore, the effectiveness of such efforts can be objectively evaluated so that they can be either reinforced or revamped to improve safety. Additionally, intervention ideas can now also be shared across organizations that have performed similar HFACS analyses. One example is the U.S. Navy and Marine Corps, which have recently initiated a systematic intervention program for addressing their growing problem with accidents associated with skill-based errors in the fleet (Shappell & Wiegmann, 2000b). As a result, lessons learned in the military can now be communicated and shared with the commercial aviation industry, and vice versa.

The observation that both CRM failures and decision errors are associated with a large percentage of aircrew-related accidents is also not surprising, given that these findings parallel the results of similar HFACS and human error analyses of both military and civil aviation accidents (O'Hare et al., 1994; Wiegmann & Shappell, 1999). What is surprising, or at least somewhat disconcerting, is the observation that both the percentage and rate of aircrew-related accidents associated with CRM failures and decision errors have remained relatively stable. Indeed, both the FAA and the aviation industry have invested a great deal of resources into intervention strategies specifically targeted at improving CRM and aeronautical decision making (ADM), with apparently little overall effect.

The modest impact that CRM and ADM programs have had on reducing accidents may be due to a variety of factors, including the general lack of systematic analyses of accidents associated with these problems. Consequently, most CRM and ADM training programs use single case studies to educate aircrew, rather than focusing on the fundamental causes of these problems in the cockpit through a systematic analysis of the accident data. Another possible explanation for the general lack of CRM and ADM effectiveness is that many established training programs involve classroom exercises that are not followed up by simulator training that requires CRM and ADM principles to be applied. More recent programs, such as the Advanced Qualification Program (AQP), have been developed to take this next step of integrating ADM and CRM principles into the cockpit. Given that the current HFACS analysis has identified the accidents associated with these problems, at least across a seven-year period, more fine-grained analyses can be conducted to identify the specific problem areas in need of training.
Furthermore, the effectiveness of the AQP program and other ADM training in reducing aircrew accidents associated with CRM failures and decision errors can be systematically tracked and evaluated.

The percentage of aircrew-related accidents associated with violations (e.g., not following federal regulations or a company's standard operating procedures) exhibited a slight increase across the years examined in this study. Some authors (e.g., Geller, 2000) have suggested that violations, such as taking shortcuts in procedures or breaking rules, are often induced by situational factors that reinforce unsafe acts while punishing safe actions. Not performing a thorough preflight inspection because of the pressure to achieve an on-time departure would be one example. However, according to Reason's (1990) model of active and latent failures, such violation-inducing situations are often set up by supervisory and management policies and practices.

Such theories suggest that the best strategy for reducing violations by aircrew is to enforce the rules and to hold both the aircrew and their supervisors and organizations accountable. Indeed, this strategy has been effective within the Navy and Marine Corps in reducing aviation mishaps associated with violations (Shappell et al., 1999). Still, as mentioned earlier, very few of the commercial accident reports examined in this study cited supervisory or organizational factors as accident causes, suggesting that, more often than not, aircrews were the only ones held responsible for the violations. Again, more thorough accident investigations may need to be performed to identify possible supervisory and organizational issues associated with these events.

Although pilots flying with FAR Part 135 scheduled carriers had fewer annual flight hours during the years covered in this study (NTSB, 2000), the overall number of accidents associated with most error types was generally higher for FAR Part 135 scheduled carriers compared with FAR Part 121 scheduled carriers. This finding is likely due, at least in part, to the fact that most pilots flying aircraft operating under FAR Part 135 are younger and much less experienced. Furthermore, such pilots often fly less sophisticated and less reliable aircraft into areas that are less likely to be controlled by ATC. As a result, they may frequently find themselves in situations that exceed their training or abilities. Such a conclusion is supported by the findings presented here, since a larger percentage of FAR Part 135 aircrew-related accidents were associated with the physical/mental limitations of the pilot. However, a smaller percentage of FAR Part 135 aircrew-related accidents were associated with CRM failures, possibly because some FAR Part 135 aircraft are single-piloted, which simply reduces the opportunity for CRM failures.

These differences between FAR Parts 121 and 135 scheduled carriers may be less evident in future aviation accident data, since the federal regulations were changed in 1997.
Those changes require FAR Part 135 carriers operating aircraft that carry ten or more passengers to operate under the more stringent FAR Part 121 rules. Thus, the historical distinction in the database between FAR Part 135 and 121 operators has become somewhat blurred in the years extending beyond the current analysis. Future human-error analyses and comparisons across these different types of commercial operations will therefore need to take these changes into account.

SUMMARY AND CONCLUSIONS

This investigation demonstrates that the HFACS framework, originally developed for and proven in the military, can be used to reliably identify the underlying human factors problems associated with commercial aviation accidents. Furthermore, the results of this study highlight critical areas of human factors in need of further safety research and provide the foundation upon which to build a larger civil aviation safety program. Ultimately, data analyses such as the one presented here will provide valuable insight aimed at the reduction of aviation accidents through data-driven investment strategies and objective evaluation of intervention programs. The HFACS framework may also prove useful as a tool for guiding future accident investigations in the field and developing better accident databases, both of which would improve the overall quality and accessibility of human factors accident data.

Still, the HFACS framework is not the only possible system upon which such programs might be developed. Indeed, there often appear to be as many human error frameworks as there are those interested in the topic (Senders & Moray, 1991). As the need for better applied human error analysis methods has become more apparent, an increasing number of researchers have proposed other comprehensive frameworks similar to HFACS (e.g., O'Hare, in press). Nevertheless, HFACS is, to date, the only system that has been developed to meet a specific set of design criteria, including comprehensiveness, reliability, diagnosticity, and usability, all of which have contributed to the framework's validity as an accident analysis tool (Shappell & Wiegmann, in press). Furthermore, HFACS has been shown to have utility as an error-analysis tool in other aviation-related domains such as ATC (HFACS-ATC; Pounds, Scarborough, & Shappell, 2000) and aviation maintenance (HFACS-ME; Schmidt, Schmorrow, & Hardee, 1998), and is currently being evaluated within other complex systems such as medicine (currently referred to as HFACS-MD). Finally, it is important to remember that neither HFACS nor any other error-analysis tool can "fix" the problems once they have been identified. Such fixes can only be derived by the organizations, practitioners, and human factors professionals who are dedicated to improving aviation safety.

REFERENCES

Bird, F. (1974). Management guide to loss control. Atlanta, GA: Institute Press.

Fleiss, J. (1981). Statistical methods for rates and proportions. New York: John Wiley.

Ford, C., Jack, T., Crisp, V., & Sandusky, R. (1999). Aviation accident causal analysis. Advances in Aviation Safety Conference Proceedings (P-343). Warrendale, PA: Society of Automotive Engineers, Inc.

Geller, E. (March, 2000). Behavioral safety analysis: A necessary precursor to corrective action. Professional Safety, 29-32.
International Civil Aviation Organization (1993). Investigation of human factors in accidents and incidents (Human Factors Digest No. 7). Montreal, Canada.

Jones, A. (1988). Climate and measurement of consensus: A discussion of "organizational climate." In S. Cole, R. Demaree & W. Curtis (Eds.), Applications of Interactionist Psychology: Essays in Honor of Saul B. Sells (pp. 283-290). Hillsdale, NJ: Erlbaum.

National Transportation Safety Board (2000). Aviation accident statistics. Available: www.ntsb.gov/aviation/Stats.htm

O'Hare, D. (in press). The Wheel of Misfortune. Ergonomics.

O'Hare, D., Wiggins, M., Batt, R., & Morrison, D. (1994). Cognitive failure analysis for aircraft accident investigation. Ergonomics, 37, 1855-69.

Pounds, J., Scarborough, A., & Shappell, S. (2000). A human factors analysis of Air Traffic Control operational errors (Abstract). Aviation, Space, and Environmental Medicine, 71, p. 329.

Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4, 311-33.

Reason, J. (1990). Human error. New York: Cambridge University Press.

Schmidt, J., Schmorrow, D., & Hardee, M. (1998). A preliminary human factors analysis of Naval Aviation maintenance related mishaps. Proceedings of the 1998 Airframe/Engine Maintenance and Repair Conference (P329), Long Beach, CA.

Senders, J., & Moray, N. (1991). Human error: Cause, prediction and reduction. Hillsdale, NJ: Erlbaum.

Shappell, S., & Wiegmann, D. (1996). U.S. Naval Aviation mishaps 1977-92: Differences between single- and dual-piloted aircraft. Aviation, Space, and Environmental Medicine, 67, 65-9.

Shappell, S. & Wiegmann, D. (1997a). A human error approach to accident investigation: The taxonomy of unsafe operations. The International Journal of Aviation Psychology, 7, 269-91.

Shappell, S. & Wiegmann, D. (1997b). A reliability analysis of the Taxonomy of Unsafe Operations (Abstract). Aviation, Space, and Environmental Medicine, 69, p. 620.

Shappell, S. & Wiegmann, D. (2000a). The Human Factors Analysis and Classification System (HFACS) (Report No. DOT/FAA/AM-00/7). Washington, DC: Federal Aviation Administration.

Shappell, S. & Wiegmann, D. (2000b). Is proficiency eroding among U.S. Naval aircrews? A quantitative analysis using the Human Factors Analysis and Classification System (HFACS). Proceedings of the 44th meeting of the Human Factors and Ergonomics Society.

Shappell, S. & Wiegmann, D. (2001). Applying Reason: The Human Factors Analysis and Classification System (HFACS). Human Factors and Aerospace Safety, 1, 59-86.

Shappell, S., Wiegmann, D., Fraser, J., Gregory, G., Kinsey, P., & Squier, H. (1999). Beyond mishap rates: A human factors analysis of U.S. Navy/Marine Corps TACAIR and rotary wing mishaps using HFACS (Abstract). Aviation, Space, and Environmental Medicine, 70, 416-7.

Wiegmann, D. & Shappell, S. (1997). Human factors analysis of post-accident data: Applying theoretical taxonomies of human error. The International Journal of Aviation Psychology, 7, 67-81.
Wiegmann, D. & Shappell, S. (1999). Human error and crew resource management failures in Naval aviation mishaps: A review of U.S. Naval Safety Center data, 1990-96. Aviation, Space, and Environmental Medicine, 70, 1147-51.

Wiegmann, D., Shappell, S., Cristina, F. & Pape, A. (2000). A human factors analysis of aviation accident data: An empirical evaluation of the HFACS framework (Abstract). Aviation, Space, and Environmental Medicine, 71, p. 328.

Yacavone, D. W. (1993). Mishap trends and cause factors in Naval aviation: A review of Naval Safety Center data, 1986-90. Aviation, Space, and Environmental Medicine, 64, 392-5.

kevingi posted on 2010-5-22 16:28:56

Is there no Chinese version? Reading the English one is really hard going.

Virgin posted on 2010-5-28 23:39:47

Taking a look; this is useful.
