The Human Factors Analysis and Classification System–HFACS

DOT/FAA/AM-00/7

U.S. Department of Transportation
Federal Aviation Administration
Office of Aviation Medicine, Washington, DC 20591

The Human Factors Analysis and Classification System–HFACS

Scott A. Shappell, FAA Civil Aeromedical Institute, Oklahoma City, OK 73125
Douglas A. Wiegmann, University of Illinois at Urbana-Champaign, Institute of Aviation, Savoy, IL 61874

February 2000
Final Report

This document is available to the public through the National Technical Information Service, Springfield, Virginia 22161.

NOTICE: This document is disseminated under the sponsorship of the U.S. Department of Transportation in the interest of information exchange. The United States Government assumes no liability for the contents thereof.

Technical Report Documentation Page
1. Report No.: DOT/FAA/AM-00/7
4. Title and Subtitle: The Human Factors Analysis and Classification System–HFACS
5. Report Date: February 2000
7. Author(s): Shappell, S.A. (1), and Wiegmann, D.A. (2)
9. Performing Organization Name and Address: (1) FAA Civil Aeromedical Institute, Oklahoma City, OK 73125; (2) University of Illinois at Urbana-Champaign, Institute of Aviation, Savoy, IL 61874
11. Contract or Grant No.: 99-G-006
12. Sponsoring Agency Name and Address: Office of Aviation Medicine, Federal Aviation Administration, 800 Independence Ave., S.W., Washington, DC 20591
13. Type of Report and Period Covered: Final Report
15. Supplemental Notes: This work was performed under task # AAM-A-00-HRR-520
16. Abstract: Human error has been implicated in 70 to 80% of all civil and military aviation accidents. Yet, most accident reporting systems are not designed around any theoretical framework of human error. As a result, most accident databases are not conducive to a traditional human error analysis, making the identification of intervention strategies onerous. What is required is a general human error framework around which new investigative methods can be designed and existing accident databases restructured. Indeed, a comprehensive human factors analysis and classification system (HFACS) has recently been developed to meet those needs. Specifically, the HFACS framework has been used within the military, commercial, and general aviation sectors to systematically examine underlying human causal factors and to improve aviation accident investigations. This paper describes the development and theoretical underpinnings of HFACS in the hope that it will help safety professionals reduce the aviation accident rate through systematic, data-driven investment strategies and objective evaluation of intervention programs.
17. Key Words: Aviation, Human Error, Accident Investigation, Database Analysis
18. Distribution Statement: Document is available to the public through the National Technical Information Service, Springfield, Virginia 22161
19. Security Classif. (of this report): Unclassified
20. Security Classif. (of this page): Unclassified
21. No. of Pages: 18
Form DOT F 1700.7 (8-72) Reproduction of completed page authorized
THE HUMAN FACTORS ANALYSIS AND CLASSIFICATION SYSTEM–HFACS

INTRODUCTION

Sadly, the annals of aviation history are littered with accidents and tragic losses. Since the late 1950s, however, the drive to reduce the accident rate has yielded unprecedented levels of safety, to a point where it is now safer to fly in a commercial airliner than to drive a car or even walk across a busy New York City street. Still, while the aviation accident rate has declined tremendously since the first flights nearly a century ago, the cost of aviation accidents in both lives and dollars has steadily risen. As a result, the effort to reduce the accident rate still further has taken on new meaning within both military and civilian aviation.

Even with all the innovations and improvements realized in the last several decades, one fundamental question remains generally unanswered: "Why do aircraft crash?" The answer may not be as straightforward as one might think. In the early years of aviation, it could reasonably be said that, more often than not, the aircraft killed the pilot. That is, the aircraft were intrinsically unforgiving and, relative to their modern counterparts, mechanically unsafe. However, the modern era of aviation has witnessed an ironic reversal of sorts. It now appears to some that the aircrew themselves are more deadly than the aircraft they fly (Mason, 1993; cited in Murray, 1997). In fact, estimates in the literature indicate that between 70 and 80 percent of aviation accidents can be attributed, at least in part, to human error (Shappell & Wiegmann, 1996). Still, to off-handedly attribute accidents solely to aircrew error is like telling patients they are simply "sick" without examining the underlying causes or further defining the illness.

So what really constitutes that 70-80% of human error repeatedly referred to in the literature? Some would have us believe that human error and "pilot" error are synonymous. Yet, simply writing off aviation accidents merely to pilot error is an overly simplistic, if not naive, approach to accident causation. After all, it is well established that accidents cannot be attributed to a single cause, or in most instances, even a single individual (Heinrich, Petersen, & Roos, 1980). In fact, even the identification of a "primary" cause is fraught with problems. Rather, aviation accidents are the end result of a number of causes, only the last of which are the unsafe acts of the aircrew (Reason, 1990; Shappell & Wiegmann, 1997a; Heinrich, Petersen, & Roos, 1980; Bird, 1974).

The challenge for accident investigators and analysts alike is how best to identify and mitigate the causal sequence of events, in particular that 70-80% associated with human error. Armed with this challenge, those interested in accident causation are left with a growing list of investigative schemes to choose from. In fact, there are nearly as many approaches to accident causation as there are those involved in the process (Senders & Moray, 1991). Nevertheless, a comprehensive framework for identifying and analyzing human error continues to elude safety professionals and theorists alike. Consequently, interventions cannot be accurately targeted at specific human causal factors, nor can their effectiveness be objectively measured and assessed.
Instead, safety professionals are left with the status quo. That is, they are left with interest/fad-driven research resulting in intervention strategies that peck around the edges of accident causation, but do little to reduce the overall accident rate. What is needed is a framework around which a needs-based, data-driven safety program can be developed (Wiegmann & Shappell, 1997).

Reason's "Swiss Cheese" Model of Human Error

One particularly appealing approach to the genesis of human error is the one proposed by James Reason (1990). Generally referred to as the "Swiss cheese" model of human error, Reason describes four levels of human failure, each influencing the next (Figure 1). Working backwards in time from the accident, the first level depicts those Unsafe Acts of Operators that ultimately led to the accident.[1] More commonly referred to in aviation as aircrew/pilot error, this level is where most accident investigations have focused their efforts and, consequently, where most causal factors are uncovered. After all, it is typically the actions or inactions of aircrew that are directly linked to the accident. For instance, failing to properly scan the aircraft's instruments while in instrument meteorological conditions (IMC) or penetrating IMC when authorized only for visual meteorological conditions (VMC) may yield relatively immediate, and potentially grave, consequences. Represented as "holes" in the cheese, these active failures are typically the last unsafe acts committed by aircrew.

[1] Reason's original work involved operators of a nuclear power plant. However, for the purposes of this manuscript, the operators here refer to aircrew, maintainers, supervisors, and other humans involved in aviation.

However, what makes the "Swiss cheese" model particularly useful in accident investigation is that it forces investigators to address latent failures within the causal sequence of events as well. As their name suggests, latent failures, unlike their active counterparts, may lie dormant or undetected for hours, days, weeks, or even longer, until one day they adversely affect the unsuspecting aircrew. Consequently, they may be overlooked by investigators with even the best intentions.

Within this concept of latent failures, Reason described three more levels of human failure. The first involves the condition of the aircrew as it affects performance. Referred to as Preconditions for Unsafe Acts, this level involves conditions such as mental fatigue and poor communication and coordination practices, often referred to as crew resource management (CRM). Not surprisingly, if fatigued aircrew fail to communicate and coordinate their activities with others in the cockpit or individuals external to the aircraft (e.g., air traffic control, maintenance, etc.), poor decisions are made and errors often result.

But exactly why did communication and coordination break down in the first place? This is perhaps where Reason's work departed from more traditional approaches to human error. In many instances, the breakdown in good CRM practices can be traced back to instances of Unsafe Supervision, the third level of human failure.
If, for example, two inexperienced (and perhaps even below average) pilots are paired with each other and sent on a flight into known adverse weather at night, is anyone really surprised by a tragic outcome? To make matters worse, if this questionable manning practice is coupled with the lack of quality CRM training, the potential for miscommunication and, ultimately, aircrew errors is magnified. In a sense then, the crew was "set up" for failure, as crew coordination and ultimately performance would be compromised. This is not to lessen the role played by the aircrew, only that intervention and mitigation strategies might lie higher within the system.

Reason's model didn't stop at the supervisory level either; the organization itself can impact performance at all levels. For instance, in times of fiscal austerity, funding is often cut, and as a result, training and flight time are curtailed. Consequently, supervisors are often left with no alternative but to task "non-proficient" aviators with complex tasks. Not surprisingly then, in the absence of good CRM training, communication and coordination failures will begin to appear, as will a myriad of other preconditions, all of which will affect performance and elicit aircrew errors. Therefore, it makes sense that, if the accident rate is going to be reduced beyond current levels, investigators and analysts alike must examine the accident sequence in its entirety and expand it beyond the cockpit. Ultimately, causal factors at all levels within the organization must be addressed if any accident investigation and prevention system is going to succeed.

In many ways, Reason's "Swiss cheese" model of accident causation has revolutionized common views of accident causation. Unfortunately, however, it is simply a theory with few details on how to apply it in a real-world setting. In other words, the theory never defines what the "holes in the cheese" really are, at least within the context of everyday operations. Ultimately, one needs to know what these system failures or "holes" are, so that they can be identified during accident investigations or, better yet, detected and corrected before an accident occurs.

[Figure 1. The "Swiss cheese" model of human error causation (adapted from Reason, 1990): latent failures at the levels of Organizational Influences, Unsafe Supervision, and Preconditions for Unsafe Acts, and active failures at the level of Unsafe Acts, act as failed or absent defenses that align to produce a mishap.]

The balance of this paper will attempt to describe the "holes in the cheese." However, rather than attempt to define the holes using esoteric theories with little or no practical applicability, the original framework (called the Taxonomy of Unsafe Operations) was developed using over 300 Naval aviation accidents obtained from the U.S. Naval Safety Center (Shappell & Wiegmann, 1997a). The original taxonomy has since been refined using input and data from other military (U.S. Army Safety Center and the U.S. Air Force Safety Center) and civilian organizations (National Transportation Safety Board and the Federal Aviation Administration).
The result was the development of the Human Factors Analysis and Classification System (HFACS).

THE HUMAN FACTORS ANALYSIS AND CLASSIFICATION SYSTEM

Drawing upon Reason's (1990) concept of latent and active failures, HFACS describes four levels of failure: 1) Unsafe Acts, 2) Preconditions for Unsafe Acts, 3) Unsafe Supervision, and 4) Organizational Influences. A brief description of the major components and causal categories follows, beginning with the level most closely tied to the accident, i.e., unsafe acts.

Unsafe Acts

The unsafe acts of aircrew can be loosely classified into two categories: errors and violations (Reason, 1990). In general, errors represent the mental or physical activities of individuals that fail to achieve their intended outcome. Not surprisingly, given the fact that human beings by their very nature make errors, these unsafe acts dominate most accident databases. Violations, on the other hand, refer to the willful disregard for the rules and regulations that govern the safety of flight. The bane of many organizations, the prediction and prevention of these appalling and purely "preventable" unsafe acts continue to elude managers and researchers alike.

Still, distinguishing between errors and violations does not provide the level of granularity required of most accident investigations. Therefore, the categories of errors and violations were expanded here (Figure 2), as elsewhere (Reason, 1990; Rasmussen, 1982), to include three basic error types (skill-based, decision, and perceptual) and two forms of violations (routine and exceptional).

[Figure 2. Categories of unsafe acts committed by aircrews: errors (skill-based, decision, and perceptual) and violations (routine and exceptional).]
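Because these categories form a small, fixed hierarchy, it can help to think of them as a structure that an investigator or an accident database might code causal factors against. The sketch below is illustrative only and is not part of HFACS itself; the Python identifiers and the code_unsafe_act helper are our own shorthand for the unsafe-act categories shown in Figure 2 and, in more detail, in Table 1.

    # Illustrative sketch: the Unsafe Acts level of HFACS as a nested mapping
    # that a coding tool might validate causal-factor entries against.
    UNSAFE_ACTS = {
        "errors": {"skill_based", "decision", "perceptual"},
        "violations": {"routine", "exceptional"},
    }

    def code_unsafe_act(act_type: str, category: str, description: str) -> dict:
        """Return a coded causal-factor record, rejecting labels outside the taxonomy."""
        if category not in UNSAFE_ACTS.get(act_type, set()):
            raise ValueError(f"{act_type}/{category} is not an HFACS unsafe-act category")
        return {"level": "unsafe_acts", "type": act_type,
                "category": category, "description": description}

    # Example: a breakdown in visual scan (Table 1) coded as a skill-based error.
    print(code_unsafe_act("errors", "skill_based", "breakdown in visual scan"))

The same pattern extends naturally to the three higher levels of the framework described in the sections that follow.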
Errors

Skill-based errors. Skill-based behavior within the context of aviation is best described as "stick-and-rudder" and other basic flight skills that occur without significant conscious thought. As a result, these skill-based actions are particularly vulnerable to failures of attention and/or memory. In fact, attention failures have been linked to many skill-based errors such as the breakdown in visual scan patterns, task fixation, the inadvertent activation of controls, and the misordering of steps in a procedure, among others (Table 1). A classic example is an aircraft's crew that becomes so fixated on troubleshooting a burned-out warning light that they do not notice their fatal descent into the terrain. Perhaps a bit closer to home, consider the hapless soul who locks himself out of the car or misses his exit because he was either distracted, in a hurry, or daydreaming. These are both examples of attention failures that commonly occur during highly automatized behavior. Unfortunately, while at home or driving around town these attention/memory failures may be frustrating, in the air they can become catastrophic.

Table 1. Selected examples of Unsafe Acts of Pilot Operators (Note: this is not a complete listing)

ERRORS

Skill-based Errors:
  Breakdown in visual scan
  Failed to prioritize attention
  Inadvertent use of flight controls
  Omitted step in procedure
  Omitted checklist item
  Poor technique
  Over-controlled the aircraft

Decision Errors:
  Improper procedure
  Misdiagnosed emergency
  Wrong response to emergency
  Exceeded ability
  Inappropriate maneuver
  Poor decision

Perceptual Errors (due to):
  Misjudged distance/altitude/airspeed
  Spatial disorientation
  Visual illusion

VIOLATIONS
  Failed to adhere to brief
  Failed to use the radar altimeter
  Flew an unauthorized approach
  Violated training rules
  Flew an overaggressive maneuver
  Failed to properly prepare for the flight
  Briefed unauthorized flight
  Not current/qualified for the mission
  Intentionally exceeded the limits of the aircraft
  Continued low-altitude flight in VMC
  Unauthorized low-altitude canyon running

In contrast to attention failures, memory failures often appear as omitted items in a checklist, place losing, or forgotten intentions. For example, most of us have experienced going to the refrigerator only to forget what we went for. Likewise, it is not difficult to imagine that, when under stress during in-flight emergencies, critical steps in emergency procedures can be missed. However, even when not particularly stressed, individuals have forgotten to set the flaps on approach or lower the landing gear; at a minimum, an embarrassing gaffe.

The third, and final, type of skill-based error identified in many accident investigations involves technique errors. Regardless of one's training, experience, and educational background, the manner in which one carries out a specific sequence of events may vary greatly. That is, two pilots with identical training, flight grades, and experience may differ significantly in the manner in which they maneuver their aircraft. While one pilot may fly smoothly with the grace of a soaring eagle, others may fly with the darting, rough transitions of a sparrow. Nevertheless, while both may be safe and equally adept at flying, the techniques they employ could set them up for specific failure modes. In fact, such techniques are as much a factor of innate ability and aptitude as they are an overt expression of one's own personality, making efforts at the prevention and mitigation of technique errors difficult, at best.

Decision errors. The second error form, decision errors, represents intentional behavior that proceeds as intended, yet the plan proves inadequate or inappropriate for the situation.
Often referred to as "honest mistakes," these unsafe acts represent the actions or inactions of individuals whose "hearts are in the right place," but they either did not have the appropriate knowledge or simply chose poorly.

Perhaps the most heavily investigated of all error forms, decision errors can be grouped into three general categories: procedural errors, poor choices, and problem-solving errors (Table 1). Procedural decision errors (Orasanu, 1993), or rule-based mistakes as described by Rasmussen (1982), occur during highly structured tasks of the sort: if X, then do Y. Aviation, particularly within the military and commercial sectors, is by its very nature highly structured, and consequently, much of pilot decision making is procedural. There are very explicit procedures to be performed at virtually all phases of flight. Still, errors can, and often do, occur when a situation is either not recognized or misdiagnosed and the wrong procedure is applied. This is particularly true when pilots are placed in highly time-critical emergencies like an engine malfunction on takeoff.

However, even in aviation, not all situations have corresponding procedures to deal with them. Therefore, many situations require a choice to be made among multiple response options. Consider the pilot flying home after a long week away from the family who unexpectedly confronts a line of thunderstorms directly in his path. He can choose to fly around the weather, divert to another field until the weather passes, or penetrate the weather hoping to quickly transition through it. Confronted with situations such as this, choice decision errors (Orasanu, 1993), or knowledge-based mistakes as they are otherwise known (Rasmussen, 1986), may occur. This is particularly true when there is insufficient experience, time, or other outside pressures that may preclude correct decisions. Put simply, sometimes we choose well, and sometimes we don't.

Finally, there are occasions when a problem is not well understood, and formal procedures and response options are not available. It is during these ill-defined situations that the invention of a novel solution is required. In a sense, individuals find themselves where no one has been before and, in many ways, must literally fly by the seats of their pants. Individuals placed in this situation must resort to slow and effortful reasoning processes where time is a luxury rarely afforded. Not surprisingly, while this type of decision making is less frequent than other forms, the relative proportion of problem-solving errors committed is markedly higher.

Perceptual errors. Not unexpectedly, when one's perception of the world differs from reality, errors can, and often do, occur. Typically, perceptual errors occur when sensory input is degraded or "unusual," as is the case with visual illusions and spatial disorientation, or when aircrew simply misjudge the aircraft's altitude, attitude, or airspeed (Table 1). Visual illusions, for example, occur when the brain tries to "fill in the gaps" with what it feels belongs in a visually impoverished environment, like that seen at night or when flying in adverse weather.
Likewise, spatial disorientation occurs when the vestibular system cannot resolve one's orientation in space and therefore makes a "best guess," typically when visual (horizon) cues are absent at night or when flying in adverse weather. In either event, the unsuspecting individual is often left to make a decision based on faulty information, and the potential for committing an error is elevated.

It is important to note, however, that it is not the illusion or disorientation that is classified as a perceptual error. Rather, it is the pilot's erroneous response to the illusion or disorientation. For example, many unsuspecting pilots have experienced "black-hole" approaches, only to fly a perfectly good aircraft into the terrain or water. This continues to occur, even though it is well known that flying at night over dark, featureless terrain (e.g., a lake or field devoid of trees) will produce the illusion that the aircraft is actually higher than it is. As a result, pilots are taught to rely on their primary instruments, rather than the outside world, particularly during the approach phase of flight. Even so, some pilots fail to monitor their instruments when flying at night. Tragically, these aircrew and others who have been fooled by illusions and other disorienting flight regimes may end up involved in a fatal aircraft accident.

Violations

By definition, errors occur within the rules and regulations espoused by an organization and typically dominate most accident databases. In contrast, violations represent a willful disregard for the rules and regulations that govern safe flight and, fortunately, occur much less frequently, since they often involve fatalities (Shappell et al., 1999b).

While there are many ways to distinguish between types of violations, two distinct forms have been identified, based on their etiology, that will help the safety professional when identifying accident causal factors. The first, routine violations, tend to be habitual by nature and are often tolerated by governing authority (Reason, 1990). Consider, for example, the individual who consistently drives 5-10 mph faster than allowed by law, or someone who routinely flies in marginal weather when authorized for visual meteorological conditions only. While both are certainly against the governing regulations, many others do the same thing. Furthermore, individuals who drive 64 mph in a 55 mph zone almost always drive 64 in a 55 mph zone. That is, they "routinely" violate the speed limit. The same can typically be said of the pilot who routinely flies into marginal weather.

What makes matters worse, these violations (commonly referred to as "bending" the rules) are often tolerated and, in effect, sanctioned by supervisory authority (i.e., you're not likely to get a traffic citation until you exceed the posted speed limit by more than 10 mph). If, however, the local authorities started handing out traffic citations for exceeding the speed limit on the highway by 9 mph or less (as is often done on military installations), then it is less likely that individuals would violate the rules.
Therefore, by definition, if a routine violation is identified, one must look further up the supervisory chain to identify those individuals in authority who are not enforcing the rules.

On the other hand, unlike routine violations, exceptional violations appear as isolated departures from authority, not necessarily indicative of an individual's typical behavior pattern nor condoned by management (Reason, 1990). For example, an isolated instance of driving 105 mph in a 55 mph zone is considered an exceptional violation. Likewise, flying under a bridge or engaging in other prohibited maneuvers, like low-level canyon running, would constitute an exceptional violation. However, it is important to note that, while most exceptional violations are appalling, they are not considered "exceptional" because of their extreme nature. Rather, they are considered exceptional because they are neither typical of the individual nor condoned by authority. Still, what makes exceptional violations particularly difficult for any organization to deal with is that they are not indicative of an individual's behavioral repertoire and, as such, are particularly difficult to predict. In fact, when individuals are confronted with evidence of their dreadful behavior and asked to explain it, they are often left with little explanation. Indeed, those individuals who survived such excursions from the norm clearly knew that, if caught, dire consequences would follow. Still, defying all logic, many otherwise model citizens have been down this potentially tragic road.

Preconditions for Unsafe Acts

Arguably, the unsafe acts of pilots can be directly linked to nearly 80% of all aviation accidents. However, simply focusing on unsafe acts is like focusing on a fever without understanding the underlying disease causing it. Thus, investigators must dig deeper into why the unsafe acts took place. As a first step, two major subdivisions of unsafe aircrew conditions were developed: substandard conditions of operators and the substandard practices they commit (Figure 3).

[Figure 3. Categories of preconditions of unsafe acts: substandard conditions of operators (adverse mental states, adverse physiological states, and physical/mental limitations) and substandard practices of operators (crew resource mismanagement and personal readiness).]

Substandard Conditions of Operators

Adverse mental states. Being prepared mentally is critical in nearly every endeavor, but perhaps even more so in aviation. As such, the category of Adverse Mental States was created to account for those mental conditions that affect performance (Table 2). Principal among these are the loss of situational awareness, task fixation, distraction, and mental fatigue due to sleep loss or other stressors. Also included in this category are personality traits and pernicious attitudes such as overconfidence, complacency, and misplaced motivation.

Predictably, if an individual is mentally tired for whatever reason, the likelihood increases that an error will occur. In a similar fashion, overconfidence and other pernicious attitudes such as arrogance and impulsivity will influence the likelihood that a violation will be committed.
Clearly then, any framework of human error must account for preexisting adverse mental states in the causal chain of events.

Adverse physiological states. The second category, adverse physiological states, refers to those medical or physiological conditions that preclude safe operations (Table 2). Particularly important to aviation are such conditions as visual illusions and spatial disorientation, as described earlier, as well as physical fatigue and the myriad of pharmacological and medical abnormalities known to affect performance.

The effects of visual illusions and spatial disorientation are well known to most aviators. However, less well known to aviators, and often overlooked, are the effects on cockpit performance of simply being ill. Nearly all of us have gone to work ill, dosed with over-the-counter medications, and have generally performed well. Consider, however, the pilot suffering from the common head cold. Unfortunately, most aviators view a head cold as only a minor inconvenience that can be easily remedied using over-the-counter antihistamines, acetaminophen, and other non-prescription pharmaceuticals. In fact, when confronted with a stuffy nose, aviators typically are only concerned with the effects of a painful sinus block as cabin altitude changes. Then again, it is not the overt symptoms that local flight surgeons are concerned with. Rather, it is the accompanying inner ear infection and the increased likelihood of spatial disorientation when entering instrument meteorological conditions that is alarming, not to mention the side effects of antihistamines, fatigue, and sleep loss on pilot decision making. Therefore, it is incumbent upon any safety professional to account for these sometimes subtle medical conditions within the causal chain of events.

Physical/Mental Limitations. The third, and final, substandard condition involves individual physical/mental limitations (Table 2). Specifically, this category refers to those instances when mission requirements exceed the capabilities of the individual at the controls. For example, the human visual system is severely limited at night; yet, just as when driving a car at night, people do not necessarily slow down or take additional precautions. In aviation, while slowing down isn't always an option, paying additional attention to basic flight instruments and increasing one's vigilance will often increase the safety margin. Unfortunately, when precautions are not taken, the result can be catastrophic, as pilots will often fail to see other aircraft, obstacles, or power lines due to the size or contrast of the object in the visual field.

Similarly, there are occasions when the time required to complete a task or maneuver exceeds an individual's capacity. Individuals vary widely in their ability to process and respond to information. Nevertheless, good pilots are typically noted for their ability to respond quickly and accurately.
It is well documented, however, that if individuals are required to respond quickly (i.e., less time is available to consider all the possibilities or choices thoroughly), the probability of making an error goes up markedly. Consequently, it should be no surprise that, when faced with the need for rapid processing and reaction times, as is the case in most aviation emergencies, all forms of error would be exacerbated.

In addition to the basic sensory and information processing limitations described above, there are at least two additional instances of physical/mental limitations that need to be addressed, albeit they are often overlooked by most safety professionals. These limitations involve individuals who simply are not compatible with aviation, because they are either unsuited physically or do not possess the aptitude to fly. For example, some individuals simply don't have the physical strength to operate in the potentially high-G environment of aviation or, for anthropometric reasons, simply have difficulty reaching the controls. In other words, cockpits have traditionally not been designed with all shapes, sizes, and physical abilities in mind. Likewise, not everyone has the mental ability or aptitude for flying aircraft. Just as not all of us can be concert pianists or NFL linebackers, not everyone has the innate ability to pilot an aircraft, a vocation that requires the unique ability to make decisions quickly and respond accurately in life-threatening situations. The difficult task for the safety professional is identifying whether aptitude might have contributed to the accident causal sequence.

Substandard Practices of Operators

Clearly then, numerous substandard conditions of operators can, and do, lead to the commission of unsafe acts. Nevertheless, there are a number of things that we do to ourselves that set up these substandard conditions. Generally speaking, the substandard practices of operators can be summed up in two categories: crew resource mismanagement and personal readiness.

Crew Resource Mismanagement. Good communication skills and team coordination have been the mantra of industrial/organizational and personnel psychology for decades. Not surprisingly then, crew resource management has been a cornerstone of aviation for the last few decades (Helmreich & Foushee, 1993). As a result, the category of crew resource mismanagement was created to account for occurrences of poor coordination among personnel. Within the context of aviation, this includes coordination both within and between aircraft, with air traffic control facilities and maintenance control, as well as with facility and other support personnel as necessary. But aircrew coordination does not stop with the aircrew in flight. It also includes coordination before and after the flight, with the brief and debrief of the aircrew.

It is not difficult to envision a scenario where the lack of crew coordination has led to confusion and poor decision making in the cockpit, resulting in an accident. In fact, aviation accident databases are replete with instances of poor coordination among aircrew. One of the more tragic examples was the crash of a civilian airliner at night in the Florida Everglades in 1972, as the crew was busily trying to troubleshoot what amounted to a burnt-out indicator light.
Unfortunately, no one in the cockpit was monitoring the aircraft's altitude as the altitude hold was inadvertently disconnected. Ideally, the crew would have coordinated the troubleshooting task, ensuring that at least one crewmember was monitoring basic flight instruments and "flying" the aircraft. Tragically, this was not the case, as they entered a slow, unrecognized descent into the Everglades, resulting in numerous fatalities.

Personal Readiness. In aviation, or for that matter in any occupational setting, individuals are expected to show up for work ready to perform at optimal levels. Nevertheless, in aviation as in other professions, personal readiness failures occur when individuals fail to prepare physically or mentally for duty. For instance, violations of crew rest requirements, bottle-to-brief rules, and self-medicating all will affect performance on the job and are particularly detrimental in the aircraft. It is not hard to imagine that, when individuals violate crew rest requirements, they run the risk of mental fatigue and other adverse mental states, which ultimately lead to errors and accidents. Note, however, that violations that affect personal readiness are not considered "unsafe act" violations, since they typically do not happen in the cockpit, nor are they necessarily active failures with direct and immediate consequences.

Still, not all personal readiness failures occur as a result of violations of governing rules or regulations. For example, running 10 miles before piloting an aircraft may not be against any existing regulations, yet it may impair the physical and mental capabilities of the individual enough to degrade performance and elicit unsafe acts. Likewise, the traditional "candy bar and coke" lunch of the modern businessman may sound good but may not be sufficient to sustain performance in the rigorous environment of aviation. While there may be no rules governing such behavior, pilots must use good judgment when deciding whether they are "fit" to fly an aircraft.

Unsafe Supervision

Recall that, in addition to those causal factors associated with the pilot/operator, Reason (1990) traced the causal chain of events back up the supervisory chain of command. As such, we have identified four categories of unsafe supervision: inadequate supervision, planned inappropriate operations, failure to correct a known problem, and supervisory violations (Figure 4). Each is described briefly below.

Inadequate Supervision. The role of any supervisor is to provide the opportunity to succeed. To do this, the supervisor, no matter at what level of operation, must provide guidance, training opportunities, leadership, and motivation, as well as the proper role model to be emulated. Unfortunately, this is not always the case. For example, it is not difficult to conceive of a situation where adequate crew resource management training was either not provided, or the opportunity to attend such training was not afforded to a particular aircrew member.
Conceivably, aircrew coordination skills would be compromised, and if the aircraft were put into an adverse situation (an emergency, for instance), the risk of an error being committed would be exacerbated and the potential for an accident would increase markedly.

In a similar vein, sound professional guidance and oversight is an essential ingredient of any successful organization. While empowering individuals to make decisions and function independently is certainly essential, this does not divorce the supervisor from accountability. The lack of guidance and oversight has proven to be the breeding ground for many of the violations that have crept into the cockpit. As such, any thorough investigation of accident causal factors must consider the role supervision plays (i.e., whether the supervision was inappropriate or did not occur at all) in the genesis of human error (Table 3).

Planned Inappropriate Operations. Occasionally, the operational tempo and/or the scheduling of aircrew is such that individuals are put at unacceptable risk, crew rest is jeopardized, and ultimately performance is adversely affected. Such operations, though arguably unavoidable during emergencies, are unacceptable during normal operations. Therefore, the second category of unsafe supervision, planned inappropriate operations, was created to account for these failures (Table 3).

Take, for example, the issue of improper crew pairing. It is well known that when very senior, dictatorial captains are paired with very junior, weak co-pilots, communication and coordination problems are likely to occur. Commonly referred to as the trans-cockpit authority gradient, such conditions likely contributed to the tragic crash of a commercial airliner into the Potomac River outside of Washington, DC, in January of 1982 (NTSB, 1982). In that accident, the captain of the aircraft repeatedly rebuffed the first officer when the latter indicated that the engine instruments did not appear normal. Undaunted, the captain continued a fatal takeoff in icing conditions with less than adequate takeoff thrust. The aircraft stalled and plummeted into the icy river, killing the crew and many of the passengers.

Clearly, the captain and crew were held accountable. They died in the accident and cannot shed light on causation; but what was the role of the supervisory chain? Perhaps crew pairing was equally responsible. Although not specifically addressed in the report, such issues are clearly worth exploring in many accidents. In fact, in that particular accident, several other training and manning issues were identified.

Failure to Correct a Known Problem. The third category of unsafe supervision, failure to correct a known problem, refers to those instances when deficiencies among individuals, equipment, training, or other related safety areas are "known" to the supervisor, yet are allowed to continue unabated (Table 3). For example, it is not uncommon for accident investigators to interview the pilot's friends, colleagues, and supervisors after a fatal crash only to find out that they "knew it would happen to him some day." If the supervisor knew that a pilot was incapable of flying safely, and allowed the flight anyway, he clearly did the pilot no favors.
The failure to correct the behavior, either through remedial training or, if necessary, removal from flight status, essentially signed the pilot's death warrant, not to mention that of others who may have been on board.

Likewise, the failure to consistently correct or discipline inappropriate behavior certainly fosters an unsafe atmosphere and promotes the violation of rules. Aviation history is rich with reports of aviators who tell hair-raising stories of their exploits and barnstorming low-level flights (the infamous "been there, done that"). While entertaining to some, they often serve to promulgate a perception of tolerance and "one-upmanship" until one day someone ties the low-altitude flight record of ground level! Indeed, the failure to report these unsafe tendencies and initiate corrective actions is yet another example of the failure to correct known problems.

Supervisory Violations. Supervisory violations, on the other hand, are reserved for those instances when existing rules and regulations are willfully disregarded by supervisors (Table 3). Although arguably rare, supervisors have been known occasionally to violate the rules and doctrine when managing their assets. For instance, there have been occasions when individuals were permitted to operate an aircraft without current qualifications or a license. Likewise, it can be argued that failing to enforce existing rules and regulations or flaunting authority are also violations at the supervisory level. While rare and possibly difficult to cull out, such practices are a flagrant violation of the rules and invariably set the stage for the tragic sequence of events that predictably follows.

Organizational Influences

As noted previously, fallible decisions of upper-level management directly affect supervisory practices, as well as the conditions and actions of operators. Unfortunately, these organizational errors often go unnoticed by safety professionals, due in large part to the lack of a clear framework from which to investigate them. Generally speaking, the most elusive of latent failures revolve around issues related to resource management, organizational climate, and operational processes, as detailed below in Figure 5.

Resource Management. This category encompasses the realm of corporate-level decision making regarding the allocation and maintenance of organizational assets such as human resources (personnel), monetary assets, and equipment/facilities (Table 4). Generally, corporate decisions about how such resources should be managed center around two distinct objectives: the goal of safety and the goal of on-time, cost-effective operations. In times of prosperity, both objectives can be easily balanced and satisfied in full. However, as we mentioned earlier, there may also be times of fiscal austerity that demand some give and take between the two. Unfortunately, history tells us that safety is often the loser in such battles and, as some can attest to very well, safety and training are often the first to be cut in organizations having financial difficulties.
If cutbacks in such areas are too severe, flight proficiency may suffer, and the best pilots may leave the organization for greener pastures.

Excessive cost-cutting could also result in reduced funding for new equipment, or may lead to the purchase of equipment that is suboptimal and inadequately designed for the type of operations flown by the company. Other trickle-down effects include poorly maintained equipment and workspaces, and the failure to correct known design flaws in existing equipment. The result is a scenario involving unseasoned, less-skilled pilots flying old and poorly maintained aircraft under the least desirable conditions and schedules. The ramifications for aviation safety are not hard to imagine.

Climate. Organizational climate refers to a broad class of organizational variables that influence worker performance. Formally, it was defined as the "situationally based consistencies in the organization's treatment of individuals" (Jones, 1988). In general, however, organizational climate can be viewed as the working atmosphere within the organization. One telltale sign of an organization's climate is its structure, as reflected in the chain of command, delegation of authority and responsibility, communication channels, and formal accountability for actions (Table 4). Just as in the cockpit, communication and coordination are vital within an organization. If management and staff within an organization are not communicating, or if no one knows who is in charge, organizational safety clearly suffers and accidents do happen (Muchinsky, 1997).

An organization's policies and culture are also good indicators of its climate. Policies are official guidelines that direct management's decisions about such things as hiring and firing, promotion, retention, raises, sick leave, drugs and alcohol, overtime, accident investigations, and the use of safety equipment. Culture, on the other hand, refers to the unofficial or unspoken rules, values, attitudes, beliefs, and customs of an organization. Culture is "the way things really get done around here."

When policies are ill-defined, adversarial, or conflicting, or when they are supplanted by unofficial rules and values, confusion abounds within the organization. Indeed, there are some corporate managers who are quick to give "lip service" to official safety policies while in a public forum, but then overlook such policies when operating behind the scenes. However, the Third Law of Thermodynamics tells us that "order and harmony cannot be produced by such chaos and disharmony." Safety is bound to suffer under such conditions.

Operational Process. This category refers to corporate decisions and rules that govern the everyday activities within an organization, including the establishment and use of standardized operating procedures and formal methods for maintaining checks and balances (oversight) between the workforce and management. Operational tempo, time pressures, incentive systems, and work schedules, for example, are all factors that can adversely affect safety (Table 4).
As stated earlier, there may be instances when those within the upper echelon of an organization determine that it is necessary to increase the operational tempo to a point that overextends a supervisor's staffing capabilities. Therefore, a supervisor may resort to the use of inadequate scheduling procedures that jeopardize crew rest and produce suboptimal crew pairings, putting aircrew at an increased risk of a mishap. However, organizations should have official procedures in place to address such contingencies, as well as oversight programs to monitor such risks.

Regrettably, not all organizations have these procedures, nor do they engage in an active process of monitoring aircrew errors and human factors problems via anonymous reporting systems and safety audits. As such, supervisors and managers are often unaware of the problems before an accident occurs. Indeed, it has been said that "an accident is one incident too many" (Reinhart, 1996). It is incumbent upon any organization to fervently seek out the "holes in the cheese" and plug them up, before they create a window of opportunity for catastrophe to strike.

CONCLUSION

It is our belief that the Human Factors Analysis and Classification System (HFACS) framework bridges the gap between theory and practice by providing investigators with a comprehensive, user-friendly tool for identifying and classifying the human causes of aviation accidents. The system, which is based upon Reason's (1990) model of latent and active failures (Shappell & Wiegmann, 1997a), encompasses all aspects of human error, including the conditions of operators and organizational failure. Still, HFACS, like any other framework, merely adds to an already burgeoning list of human error taxonomies unless it proves useful in the operational setting. In this regard, HFACS has recently been employed by the U.S. Navy, Marine Corps, Army, Air Force, and Coast Guard for use in aviation accident investigation and analysis. To date, HFACS has been applied to the analysis of human factors data from approximately 1,000 military aviation accidents. Throughout this process, the reliability and content validity of HFACS has been repeatedly tested and demonstrated (Shappell & Wiegmann, 1997c).

Given that accident databases can be reliably analyzed using HFACS, the next logical question is whether anything unique will be identified. Early indications within the military suggest that the HFACS framework has been instrumental in the identification and analysis of global human factors safety issues (e.g., trends in aircrew proficiency; Shappell et al., 1999b), specific accident types (e.g., controlled flight into terrain, CFIT; Shappell & Wiegmann, 1997b), and human factors problems such as CRM failures (Wiegmann & Shappell, 1999). Consequently, the systematic application of HFACS to the analysis of human factors accident data has afforded the U.S. Navy/Marine Corps (for which the original taxonomy was developed) the ability to develop objective, data-driven intervention strategies. In a sense, HFACS has illuminated those areas ripe for intervention, rather than relying on individual research interests not necessarily tied to saving lives or preventing aircraft losses.
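In practice, the data-driven analysis described here amounts to counting how often each HFACS causal category is cited across a set of coded accidents and watching how those counts change over time or after an intervention. The sketch below is a minimal illustration of such a tally; it assumes accident records that have already been coded with HFACS levels and categories, and the record layout and function name are ours rather than a prescribed HFACS format.

    from collections import Counter

    def category_frequencies(coded_accidents):
        """Count how many accidents cite each (level, category) pair at least once."""
        counts = Counter()
        for accident in coded_accidents:
            # An accident may list several causal factors; count each category once per accident.
            cited = {(f["level"], f["category"]) for f in accident["causal_factors"]}
            counts.update(cited)
        return counts

    # Two hypothetical coded accidents, for illustration only.
    accidents = [
        {"id": "A-001", "causal_factors": [
            {"level": "unsafe_acts", "category": "skill_based"},
            {"level": "preconditions", "category": "crew_resource_mismanagement"}]},
        {"id": "A-002", "causal_factors": [
            {"level": "unsafe_acts", "category": "skill_based"},
            {"level": "unsafe_supervision", "category": "inadequate_supervision"}]},
    ]
    for (level, category), n in category_frequencies(accidents).most_common():
        print(f"{level}/{category}: {n} of {len(accidents)} accidents")

Comparing such frequencies before and after a targeted safety program is, in essence, the objective evaluation of intervention programs referred to in the abstract.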
Additionally, the HFACS framework and the insights gleaned from database analyses have been used to develop innovative accident investigation methods that have enhanced both the quantity and quality of the human factors information gathered during accident investigations. Not only are safety professionals better equipped to examine human error in the field, but, using HFACS, they can now track those areas (the holes in the cheese) responsible for the accidents as well. Only now is it possible to track the success or failure of specific intervention programs designed to reduce specific types of human error and subsequent aviation accidents. In so doing, research investments and safety programs can be either readjusted or reinforced to meet the changing needs of aviation safety.

Recently, these accident analysis and investigative techniques, developed and proven in the military, have been applied to the analysis and investigation of U.S. civil aviation accidents (Shappell & Wiegmann, 1999a). Specifically, the HFACS framework is currently being used to systematically analyze both commercial and general aviation accident data to explore the underlying human factors problems associated with these events. The framework is also being employed to develop improved methods and techniques for investigating human factors issues during actual civil aviation accident investigations by Federal Aviation Administration and National Transportation Safety Board officials. Initial results of this project have begun to highlight human factors areas in need of further safety research. In addition, as with its military counterparts, it is anticipated that HFACS will provide the fundamental information and tools needed to develop a more effective and accessible human factors accident database for civil aviation.

In summary, the development of the HFACS framework has proven to be a valuable first step in the establishment of a larger military and civil aviation safety program. The ultimate goal of this, and any other, safety program is to reduce the aviation accident rate through systematic, data-driven investment.

REFERENCES

Bird, F. (1974). Management guide to loss control. Atlanta, GA: Institute Press.

Heinrich, H.W., Petersen, D., & Roos, N. (1980). Industrial accident prevention: A safety management approach (5th ed.). New York: McGraw-Hill.

Helmreich, R.L., & Foushee, H.C. (1993). Why crew resource management? Empirical and theoretical bases of human factors training in aviation. In E.L. Wiener, B.G. Kanki, & R.L. Helmreich (Eds.), Cockpit resource management (pp. 3-45). San Diego, CA: Academic Press.

Jones, A.P. (1988). Climate and measurement of consensus: A discussion of "organizational climate." In S.G. Cole, R.G. Demaree, & W. Curtis (Eds.), Applications of interactionist psychology: Essays in honor of Saul B. Sells (pp. 283-90). Hillsdale, NJ: Erlbaum.
Murray, S.R. (1997). Deliberate decision making by aircraft pilots: A simple reminder to avoid decision making under panic. The International Journal of Aviation Psychology, 7, 83-100.

Muchinsky, P.M. (1997). Psychology applied to work (5th ed.). Pacific Grove, CA: Brooks/Cole Publishing Co.

National Transportation Safety Board. (1982). Air Florida, Inc., Boeing 737-222, N62AF, collision with 14th Street Bridge, near Washington National Airport, Washington, D.C., January 13, 1982 (Tech. Report NTSB-AAR-82-8). Washington, DC: National Transportation Safety Board.

Orasanu, J.M. (1993). Decision-making in the cockpit. In E.L. Wiener, B.G. Kanki, & R.L. Helmreich (Eds.), Cockpit resource management (pp. 137-72). San Diego, CA: Academic Press.

Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4, 311-33.

Reason, J. (1990). Human error. New York: Cambridge University Press.

Reinhart, R.O. (1996). Basic flight physiology (2nd ed.). New York: McGraw-Hill.

Senders, J.W., & Moray, N.P. (1991). Human error: Cause, prediction and reduction. Hillsdale, NJ: Erlbaum.

Shappell, S.A., & Wiegmann, D.A. (1996). U.S. naval aviation mishaps 1977-92: Differences between single- and dual-piloted aircraft. Aviation, Space, and Environmental Medicine, 67, 65-9.

Shappell, S.A., & Wiegmann, D.A. (1997a). A human error approach to accident investigation: The taxonomy of unsafe operations. The International Journal of Aviation Psychology, 7, 269-91.

Shappell, S.A., & Wiegmann, D.A. (1997b). Why would an experienced aviator fly a perfectly good aircraft into the ground? In Proceedings of the Ninth International Symposium on Aviation Psychology (pp. 26-32). Columbus, OH: The Ohio State University.

Shappell, S.A., & Wiegmann, D.A. (1997c). A reliability analysis of the Taxonomy of Unsafe Operations. Aviation, Space, and Environmental Medicine, 68, 620.

Shappell, S.A., & Wiegmann, D.A. (1999a). Human error in commercial and corporate aviation: An analysis of FAR Part 121 and 135 mishaps using HFACS. Aviation, Space, and Environmental Medicine, 70, 407.

Shappell, S., Wiegmann, D., Fraser, J., Gregory, G., Kinsey, P., & Squier, H. (1999b). Beyond mishap rates: A human factors analysis of U.S. Navy/Marine Corps TACAIR and rotary wing mishaps using HFACS. Aviation, Space, and Environmental Medicine, 70, 416-17.

Wiegmann, D.A., & Shappell, S.A. (1997). Human factors analysis of post-accident data: Applying theoretical taxonomies of human error. The International Journal of Aviation Psychology, 7, 67-81.

Wiegmann, D.A., & Shappell, S.A. (1999). Human error and crew resource management failures in Naval aviation mishaps: A review of U.S. Naval Safety Center data, 1990-96. Aviation, Space, and Environmental Medicine, 70, 1147-51.
