The Human Factors Analysis and Classification System–HFACS

1# Posted on 2010-5-17 18:52:23

The Human Factors Analysis and Classification System–HFACS

 


2# Posted on 2010-5-17 18:52:46

DOT/FAA/AM-00/7

The Human Factors Analysis and Classification System–HFACS

Scott A. Shappell
FAA Civil Aeromedical Institute
Oklahoma City, OK 73125

Douglas A. Wiegmann
University of Illinois at Urbana-Champaign
Institute of Aviation
Savoy, IL 61874

February 2000
Final Report

U.S. Department of Transportation
Federal Aviation Administration
Office of Aviation Medicine
Washington, DC 20591

This document is available to the public through the National Technical Information Service, Springfield, Virginia 22161.
NOTICE
This document is disseminated under the sponsorship of the U.S. Department of Transportation in the interest of information exchange. The United States Government assumes no liability for the contents thereof.
Technical Report Documentation Page (Form DOT F 1700.7 (8-72))

1. Report No.: DOT/FAA/AM-00/7
4. Title and Subtitle: The Human Factors Analysis and Classification System—HFACS
5. Report Date: February 2000
7. Author(s): Shappell, S.A. (1) and Wiegmann, D.A. (2)
9. Performing Organization Name and Address: (1) FAA Civil Aeromedical Institute, Oklahoma City, OK 73125; (2) University of Illinois at Urbana-Champaign, Institute of Aviation, Savoy, IL 61874
11. Contract or Grant No.: 99-G-006
12. Sponsoring Agency Name and Address: Office of Aviation Medicine, Federal Aviation Administration, 800 Independence Ave., S.W., Washington, DC 20591
15. Supplemental Notes: This work was performed under task # AAM-A-00-HRR-520
16. Abstract: Human error has been implicated in 70 to 80% of all civil and military aviation accidents. Yet, most accident reporting systems are not designed around any theoretical framework of human error. As a result, most accident databases are not conducive to a traditional human error analysis, making the identification of intervention strategies onerous. What is required is a general human error framework around which new investigative methods can be designed and existing accident databases restructured. Indeed, a comprehensive human factors analysis and classification system (HFACS) has recently been developed to meet those needs. Specifically, the HFACS framework has been used within the military, commercial, and general aviation sectors to systematically examine underlying human causal factors and to improve aviation accident investigations. This paper describes the development and theoretical underpinnings of HFACS in the hope that it will help safety professionals reduce the aviation accident rate through systematic, data-driven investment strategies and objective evaluation of intervention programs.
17. Key Words: Aviation, Human Error, Accident Investigation, Database Analysis
18. Distribution Statement: Document is available to the public through the National Technical Information Service, Springfield, Virginia 22161
19. Security Classif. (of this report): Unclassified
20. Security Classif. (of this page): Unclassified
21. No. of Pages: 18

THE HUMAN FACTORS ANALYSIS AND CLASSIFICATION SYSTEM–HFACS
INTRODUCTION
Sadly, the annals of aviation history are littered with
accidents and tragic losses. Since the late 1950s, however,
the drive to reduce the accident rate has yielded
unprecedented levels of safety to a point where it is now
safer to fly in a commercial airliner than to drive a car or
even walk across a busy New York City street. Still, while
the aviation accident rate has declined tremendously
since the first flights nearly a century ago, the cost of
aviation accidents in both lives and dollars has steadily
risen. As a result, the effort to reduce the accident rate
still further has taken on new meaning within both
military and civilian aviation.
Even with all the innovations and improvements
realized in the last several decades, one fundamental
question remains generally unanswered: “Why do aircraft
crash?” The answer may not be as straightforward
as one might think. In the early years of aviation, it could
reasonably be said that, more often than not, the aircraft
killed the pilot. That is, the aircraft were intrinsically
unforgiving and, relative to their modern counterparts,
mechanically unsafe. However, the modern era of aviation
has witnessed an ironic reversal of sorts. It now
appears to some that the aircrew themselves are more
deadly than the aircraft they fly (Mason, 1993; cited in
Murray, 1997). In fact, estimates in the literature indicate
that between 70 and 80 percent of aviation accidents
can be attributed, at least in part, to human error
(Shappell & Wiegmann, 1996). Still, to off-handedly
attribute accidents solely to aircrew error is like telling
patients they are simply “sick” without examining the
underlying causes or further defining the illness.
So what really constitutes that 70-80 % of human
error repeatedly referred to in the literature? Some
would have us believe that human error and “pilot” error
are synonymous. Yet, simply writing off aviation accidents
merely to pilot error is an overly simplistic, if not
naive, approach to accident causation. After all, it is
well established that accidents cannot be attributed
to a single cause, or in most instances, even a single
individual (Heinrich, Petersen, and Roos, 1980). In
fact, even the identification of a “primary” cause is
fraught with problems. Rather, aviation accidents are
the end result of a number of causes, only the last of
which are the unsafe acts of the aircrew (Reason, 1990;
Shappell & Wiegmann, 1997a; Heinrich, Petersen, &
Roos, 1980; Bird, 1974).
The challenge for accident investigators and analysts
alike is how best to identify and mitigate the causal
sequence of events, in particular that 70-80 % associated
with human error. Armed with this challenge, those
interested in accident causation are left with a growing
list of investigative schemes to choose from. In fact, there
are nearly as many approaches to accident causation as
there are those involved in the process (Senders &
Moray, 1991). Nevertheless, a comprehensive framework
for identifying and analyzing human error continues
to elude safety professionals and theorists alike.
Consequently, interventions cannot be accurately targeted
at specific human causal factors nor can their
effectiveness be objectively measured and assessed. Instead,
safety professionals are left with the status quo.
That is, they are left with interest/fad-driven research
resulting in intervention strategies that peck around the
edges of accident causation, but do little to reduce the
overall accident rate. What is needed is a framework
around which a needs-based, data-driven safety program
can be developed (Wiegmann & Shappell, 1997).
Reason’s “Swiss Cheese” Model of Human Error
One particularly appealing approach to the genesis of
human error is the one proposed by James Reason
(1990). Generally referred to as the “Swiss cheese”
model of human error, Reason describes four levels of
human failure, each influencing the next (Figure 1).
Working backwards in time from the accident, the first
level depicts those Unsafe Acts of Operators that ultimately
led to the accident.¹ More commonly referred to
in aviation as aircrew/pilot error, this level is where most
accident investigations have focused their efforts and
consequently, where most causal factors are uncovered.
1 Reason’s original work involved operators of a nuclear power plant. However, for the purposes of this manuscript, the
operators here refer to aircrew, maintainers, supervisors and other humans involved in aviation.
After all, it is typically the actions or inactions of aircrew
that are directly linked to the accident. For instance,
failing to properly scan the aircraft’s instruments while
in instrument meteorological conditions (IMC) or penetrating
IMC when authorized only for visual meteorological
conditions (VMC) may yield relatively
immediate, and potentially grave, consequences. Represented
as “holes” in the cheese, these active failures are
typically the last unsafe acts committed by aircrew.
However, what makes the “Swiss cheese” model
particularly useful in accident investigation, is that it
forces investigators to address latent failures within the
causal sequence of events as well. As their name suggests,
latent failures, unlike their active counterparts, may lie
dormant or undetected for hours, days, weeks, or even
longer, until one day they adversely affect the unsuspecting
aircrew. Consequently, they may be overlooked by
investigators with even the best intentions.
Within this concept of latent failures, Reason described
three more levels of human failure. The first
involves the condition of the aircrew as it affects performance.
Referred to as Preconditions for Unsafe Acts, this
level involves conditions such as mental fatigue and
poor communication and coordination practices, often
referred to as crew resource management (CRM). Not
surprisingly, if fatigued aircrew fail to communicate and
coordinate their activities with others in the cockpit or
individuals external to the aircraft (e.g., air traffic control,
maintenance, etc.), poor decisions are made and
errors often result.
But exactly why did communication and coordination
break down in the first place? This is perhaps
where Reason’s work departed from more traditional
approaches to human error. In many instances, the
breakdown in good CRM practices can be traced
back to instances of Unsafe Supervision, the third level
of human failure. If, for example, two inexperienced
(and perhaps even below average) pilots are paired
with each other and sent on a flight into known
adverse weather at night, is anyone really surprised by
a tragic outcome? To make matters worse, if this
questionable manning practice is coupled with the
lack of quality CRM training, the potential for miscommunication
and ultimately, aircrew errors, is
magnified. In a sense then, the crew was “set up” for
failure as crew coordination and ultimately performance
would be compromised. This is not to lessen the
role played by the aircrew, only that intervention and
mitigation strategies might lie higher within the system.
Reason’s model didn’t stop at the supervisory level
either; the organization itself can impact performance
at all levels. For instance, in times of fiscal
austerity, funding is often cut, and as a result, training
and flight time are curtailed. Consequently, supervisors
are often left with no alternative but to task
“non-proficient” aviators with complex tasks. Not
surprisingly then, in the absence of good CRM training,
communication and coordination failures will
begin to appear as will a myriad of other preconditions,
all of which will affect performance and elicit
aircrew errors. Therefore, it makes sense that, if the
accident rate is going to be reduced beyond current
levels, investigators and analysts alike must examine
the accident sequence in its entirety and expand it
beyond the cockpit. Ultimately, causal factors at all
levels within the organization must be addressed if
any accident investigation and prevention system is
going to succeed.
In many ways, Reason’s “Swiss cheese” model of
accident causation has revolutionized common views
of accident causation. Unfortunately, however, it is
simply a theory with few details on how to apply it in
a real-world setting. In other words, the theory never
defines what the “holes in the cheese” really are, at
least within the context of everyday operations. Ultimately,
one needs to know what these system failures
or “holes” are, so that they can be identified during
accident investigations or better yet, detected and
corrected before an accident occurs.
[Figure 1. The "Swiss cheese" model of human error causation (adapted from Reason, 1990): layered defenses at the Organizational Influences, Unsafe Supervision, Preconditions for Unsafe Acts, and Unsafe Acts levels, with latent and active failures shown as failed or absent defenses leading to a mishap.]
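To make the layered-defenses idea concrete, the following minimal Python sketch (not part of the original report; the level names come from Figure 1, everything else is illustrative) treats a mishap as possible only when a failure, latent or active, is present at every level:

```python
# Illustrative sketch of Reason's "Swiss cheese" model (adapted from Figure 1).
# The four level names come from the report; the rest is a toy construction.

LEVELS = [
    "Organizational Influences",      # latent failures
    "Unsafe Supervision",             # latent failures
    "Preconditions for Unsafe Acts",  # latent failures
    "Unsafe Acts",                    # active failures
]

def mishap_possible(failures_by_level):
    """A 'window of opportunity' exists only when every defensive layer
    has at least one hole, i.e., a failure is present at every level."""
    return all(failures_by_level.get(level) for level in LEVELS)

# Example failures paraphrased from the scenarios discussed in the text.
failures = {
    "Organizational Influences": ["training funds cut"],
    "Unsafe Supervision": ["inexperienced crew paired together"],
    "Preconditions for Unsafe Acts": ["poor crew coordination", "mental fatigue"],
    "Unsafe Acts": ["penetrated IMC when authorized only for VMC"],
}
print(mishap_possible(failures))  # True: holes line up in every slice of cheese

# Plugging any one hole (e.g., correcting the supervisory failure) breaks the alignment.
failures["Unsafe Supervision"] = []
print(mishap_possible(failures))  # False
```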
The balance of this paper will attempt to describe
the “holes in the cheese.” However, rather than attempt
to define the holes using esoteric theories with
little or no practical applicability, the original framework
(called the Taxonomy of Unsafe Operations) was
developed using over 300 Naval aviation accidents
obtained from the U.S. Naval Safety Center (Shappell
& Wiegmann, 1997a). The original taxonomy has
since been refined using input and data from other
military (U.S. Army Safety Center and the U.S. Air
Force Safety Center) and civilian organizations (National
Transportation Safety Board and the Federal
Aviation Administration). The result was the development
of the Human Factors Analysis and Classification
System (HFACS).
THE HUMAN FACTORS ANALYSIS AND
CLASSIFICATION SYSTEM
Drawing upon Reason’s (1990) concept of latent
and active failures, HFACS describes four levels of
failure: 1) Unsafe Acts, 2) Preconditions for Unsafe
Acts, 3) Unsafe Supervision, and 4) Organizational
Influences. A brief description of the major components
and causal categories follows, beginning with the
level most closely tied to the accident, i.e., unsafe acts.
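For quick reference, the sketch below (added here, not part of the report) collects the level and causal-category names introduced in the sections that follow into a single Python mapping; the names are taken from the text and figures, while the data structure itself is merely one convenient way to hold them:

```python
# HFACS levels and causal categories, as named in this paper (Figures 2-5 and text).
HFACS_CATEGORIES = {
    "Unsafe Acts": [
        # errors
        "Skill-Based Errors", "Decision Errors", "Perceptual Errors",
        # violations
        "Routine Violations", "Exceptional Violations",
    ],
    "Preconditions for Unsafe Acts": [
        # substandard conditions of operators
        "Adverse Mental States", "Adverse Physiological States",
        "Physical/Mental Limitations",
        # substandard practices of operators
        "Crew Resource Mismanagement", "Personal Readiness",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision", "Planned Inappropriate Operations",
        "Failure to Correct a Known Problem", "Supervisory Violations",
    ],
    "Organizational Influences": [
        "Resource Management", "Organizational Climate", "Operational Process",
    ],
}

def level_of(category):
    """Return the HFACS level a causal category belongs to, or None."""
    for level, cats in HFACS_CATEGORIES.items():
        if category in cats:
            return level
    return None

print(level_of("Crew Resource Mismanagement"))  # Preconditions for Unsafe Acts
```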
Unsafe Acts
The unsafe acts of aircrew can be loosely classified
into two categories: errors and violations (Reason,
1990). In general, errors represent the mental or
physical activities of individuals that fail to achieve
their intended outcome. Not surprisingly, given
that human beings by their very nature make
errors, these unsafe acts dominate most accident
databases. Violations, on the other hand, refer to the
willful disregard for the rules and regulations that
govern the safety of flight. The bane of many organizations,
the prediction and prevention of these appalling
and purely “preventable” unsafe acts, continue
to elude managers and researchers alike.
Still, distinguishing between errors and violations
does not provide the level of granularity required of
most accident investigations. Therefore, the categories
of errors and violations were expanded here
(Figure 2), as elsewhere (Reason, 1990; Rasmussen,
1982), to include three basic error types (skill-based,
decision, and perceptual) and two forms of violations
(routine and exceptional).
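Purely as an illustration of how an investigator might record a causal factor against these categories, here is a hypothetical Python record format (the field names and accident identifiers are invented; the example acts and their categories are drawn from the surrounding text and Table 1):

```python
from dataclasses import dataclass

@dataclass
class CodedFactor:
    """One causal factor coded against HFACS (hypothetical record format)."""
    accident_id: str   # invented identifier
    level: str         # HFACS level, here always "Unsafe Acts"
    category: str      # causal category from Figure 2
    description: str   # free-text description of the act

# Example acts drawn from the surrounding text and Table 1:
factors = [
    CodedFactor("A-001", "Unsafe Acts", "Skill-Based Errors",
                "Breakdown in visual scan while in IMC"),
    CodedFactor("A-002", "Unsafe Acts", "Decision Errors",
                "Wrong response to an in-flight emergency"),
    CodedFactor("A-003", "Unsafe Acts", "Routine Violations",
                "Repeatedly flew into marginal weather when authorized for VMC only"),
]

for f in factors:
    print(f"{f.accident_id}: {f.level} / {f.category} - {f.description}")
```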
Errors
Skill-based errors. Skill-based behavior within the
context of aviation is best described as "stick-and-rudder"
and other basic flight skills that occur without
significant conscious thought. As a result, these
skill-based actions are particularly vulnerable to failures
of attention and/or memory. In fact, attention
failures have been linked to many skill-based errors
such as the breakdown in visual scan patterns, task
fixation, the inadvertent activation of controls, and
the misordering of steps in a procedure, among others
(Table 1). A classic example is an aircraft crew that becomes so fixated on trouble-shooting a burned-out warning light that they do not notice their fatal descent into the terrain.
[Figure 2. Categories of unsafe acts committed by aircrews: errors (skill-based, decision, and perceptual) and violations (routine and exceptional).]
TABLE 1. Selected examples of Unsafe Acts of Pilot Operators (Note: This is not a complete listing)
ERRORS
Skill-based Errors
Breakdown in visual scan
Failed to prioritize attention
Inadvertent use of flight controls
Omitted step in procedure
Omitted checklist item
Poor technique
Over-controlled the aircraft
Decision Errors
Improper procedure
Misdiagnosed emergency
Wrong response to emergency
Exceeded ability
Inappropriate maneuver
Poor decision
Perceptual Errors (due to)
Misjudged distance/altitude/airspeed
Spatial disorientation
Visual illusion
VIOLATIONS
Failed to adhere to brief
Failed to use the radar altimeter
Flew an unauthorized approach
Violated training rules
Flew an overaggressive maneuver
Failed to properly prepare for the flight
Briefed unauthorized flight
Not current/qualified for the mission
Intentionally exceeded the limits of the aircraft
Continued low-altitude flight in IMC
Unauthorized low-altitude canyon running
Perhaps a bit closer to home,
consider the hapless soul who locks himself out of the
car or misses his exit because he was either distracted,
in a hurry, or daydreaming. These are both examples
of attention failures that commonly occur during
highly automatized behavior. Unfortunately, while
at home or driving around town these attention/
memory failures may be frustrating, in the air they
can become catastrophic.
In contrast to attention failures, memory failures
often appear as omitted items in a checklist, place
losing, or forgotten intentions. For example, most of
us have experienced going to the refrigerator only to
forget what we went for. Likewise, it is not difficult
to imagine that when under stress during inflight
emergencies, critical steps in emergency procedures
can be missed. However, even when not particularly
stressed, individuals have forgotten to set the flaps on
approach or lower the landing gear – at a minimum,
an embarrassing gaffe.
The third, and final, type of skill-based errors
identified in many accident investigations involves
technique errors. Regardless of one’s training,
experience, and educational background, the manner
in which one carries out a specific sequence of events
may vary greatly. That is, two pilots with identical
training, flight grades, and experience may differ
significantly in the manner in which they maneuver
their aircraft. While one pilot may fly smoothly with
the grace of a soaring eagle, others may fly with the
darting, rough transitions of a sparrow. Nevertheless,
while both may be safe and equally adept at flying, the
techniques they employ could set them up for specific
failure modes. In fact, such techniques are as much a
factor of innate ability and aptitude as they are an
overt expression of one’s own personality, making
efforts at the prevention and mitigation of technique
errors difficult, at best.
Decision errors. The second error form, decision
errors, represents intentional behavior that proceeds
as intended, yet the plan proves inadequate or inappropriate
for the situation. Often referred to as “honest
mistakes,” these unsafe acts represent the actions
or inactions of individuals whose “hearts are in the
right place,” but they either did not have the appropriate
knowledge or just simply chose poorly.
Perhaps the most heavily investigated of all error
forms, decision errors can be grouped into three
general categories: procedural errors, poor choices,
and problem solving errors (Table 1). Procedural
decision errors (Orasanu, 1993), or rule-based mistakes,
as described by Rasmussen (1982), occur during
highly structured tasks of the sort: if X, then do
Y. Aviation, particularly within the military and
commercial sectors, by its very nature is highly structured,
and consequently, much of pilot decision
making is procedural. There are very explicit procedures
to be performed at virtually all phases of flight.
Still, errors can, and often do, occur when a situation
is either not recognized or misdiagnosed, and the
wrong procedure is applied. This is particularly true
when pilots are placed in highly time-critical emergencies
like an engine malfunction on takeoff.
However, even in aviation, not all situations have
corresponding procedures to deal with them. Therefore,
many situations require a choice to be made
among multiple response options. Consider the pilot
flying home after a long week away from the family
who unexpectedly confronts a line of thunderstorms
directly in his path. He can choose to fly around the
weather, divert to another field until the weather
passes, or penetrate the weather hoping to quickly
transition through it. Confronted with situations
such as this, choice decision errors (Orasanu, 1993),
or knowledge-based mistakes as they are otherwise
known (Rasmussen, 1986), may occur. This is particularly
true when there is insufficient experience,
time, or other outside pressures that may preclude
correct decisions. Put simply, sometimes we choose
well, and sometimes we don’t.
Finally, there are occasions when a problem is not
well understood, and formal procedures and response
options are not available. It is during these ill-defined
situations that the invention of a novel solution is
required. In a sense, individuals find themselves
where no one has been before, and in many ways,
must literally fly by the seats of their pants. Individuals
placed in this situation must resort to slow and
effortful reasoning processes where time is a luxury
rarely afforded. Not surprisingly, while this type of
decision making is less frequent than other forms,
the relative proportion of problem-solving errors
committed is markedly higher.
Perceptual errors. Not unexpectedly, when one’s
perception of the world differs from reality, errors
can, and often do, occur. Typically, perceptual errors
occur when sensory input is degraded or “unusual,”
as is the case with visual illusions and spatial disorientation
or when aircrew simply misjudge the aircraft’s
altitude, attitude, or airspeed (Table 1). Visual illusions,
for example, occur when the brain tries to “fill
in the gaps” with what it feels belongs in a visually
impoverished environment, like that seen at night or
when flying in adverse weather. Likewise, spatial
disorientation occurs when the vestibular system
cannot resolve one’s orientation in space and therefore
makes a “best guess” — typically when visual
(horizon) cues are absent at night or when flying in
adverse weather. In either event, the unsuspecting
individual often is left to make a decision that is based
on faulty information and the potential for committing
an error is elevated.
It is important to note, however, that it is not the
illusion or disorientation that is classified as a perceptual
error. Rather, it is the pilot’s erroneous response
to the illusion or disorientation. For example, many
unsuspecting pilots have experienced “black-hole”
approaches, only to fly a perfectly good aircraft into
the terrain or water. This continues to occur, even
though it is well known that flying at night over dark,
featureless terrain (e.g., a lake or field devoid of trees),
will produce the illusion that the aircraft is actually
higher than it is. As a result, pilots are taught to rely
on their primary instruments, rather than the outside
world, particularly during the approach phase of
flight. Even so, some pilots fail to monitor their
instruments when flying at night. Tragically, these
aircrew and others who have been fooled by illusions
and other disorientating flight regimes may end up
involved in a fatal aircraft accident.
Violations
By definition, errors occur within the rules and
regulations espoused by an organization and typically
dominate most accident databases. In contrast,
violations represent a willful disregard for the rules
and regulations that govern safe flight and, fortunately,
occur much less frequently since they often
involve fatalities (Shappell et al., 1999b).
While there are many ways to distinguish between
types of violations, two distinct forms have been identified,
based on their etiology, that will help the safety
professional when identifying accident causal factors.
The first, routine violations, tend to be habitual by
nature and often tolerated by governing authority (Reason,
1990). Consider, for example, the individual who
drives consistently 5-10 mph faster than allowed by law
or someone who routinely flies in marginal weather
when authorized for visual meteorological conditions
only. While both are certainly against the governing
regulations, many others do the same thing. Furthermore,
individuals who drive 64 mph in a 55 mph zone
almost always drive 64 in a 55 mph zone. That is, they
“routinely” violate the speed limit. The same can typically
be said of the pilot who routinely flies into marginal
weather.
What makes matters worse, these violations (commonly
referred to as “bending” the rules) are often
tolerated and, in effect, sanctioned by supervisory authority
(i.e., you’re not likely to get a traffic citation
until you exceed the posted speed limit by more than 10
mph). If, however, the local authorities started handing
out traffic citations for exceeding the speed limit on the
highway by 9 mph or less (as is often done on military
installations), then it is less likely that individuals would
violate the rules. Therefore, by definition, if a routine
violation is identified, one must look further up the
supervisory chain to identify those individuals in authority
who are not enforcing the rules.
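The paragraph above states an investigative rule: a routine violation points the investigator further up the supervisory chain, whereas an exceptional one does not. A hypothetical sketch of that rule (the helper function and its return values are illustrative, not part of HFACS itself):

```python
# Hypothetical helper illustrating the rule stated above: a routine violation
# is, by definition, tolerated by authority, so the investigation should also
# examine the supervisory chain that failed to enforce the rules.

def follow_up_levels(violation_type):
    if violation_type == "Routine":
        # Habitual and effectively sanctioned: look above the cockpit as well.
        return ["Unsafe Supervision"]
    elif violation_type == "Exceptional":
        # Isolated departure, neither typical of the individual nor condoned.
        return []
    raise ValueError(f"Unknown violation type: {violation_type}")

print(follow_up_levels("Routine"))      # ['Unsafe Supervision']
print(follow_up_levels("Exceptional"))  # []
```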
On the other hand, unlike routine violations, exceptional
violations appear as isolated departures from
authority, not necessarily indicative of an individual's typical
behavior pattern nor condoned by management
(Reason, 1990). For example, an isolated instance of
driving 105 mph in a 55 mph zone is considered an
exceptional violation. Likewise, flying under a bridge or
engaging in other prohibited maneuvers, like low-level
canyon running, would constitute an exceptional violation.
However, it is important to note that, while most
exceptional violations are appalling, they are not considered
“exceptional” because of their extreme nature.
Rather, they are considered exceptional because they are
neither typical of the individual nor condoned by authority.
Still, what makes exceptional violations particularly
difficult for any organization to deal with is
that they are not indicative of an individual’s behavioral
repertoire and, as such, are particularly difficult to
predict. In fact, when individuals are confronted with
evidence of their dreadful behavior and asked to
explain it, they are often left with little explanation.
Indeed, those individuals who survived such excursions
from the norm clearly knew that, if caught, dire
consequences would follow. Still, defying all logic,
many otherwise model citizens have been down this
potentially tragic road.
Preconditions for Unsafe Acts
Arguably, the unsafe acts of pilots can be directly
linked to nearly 80 % of all aviation accidents. However,
simply focusing on unsafe acts is like focusing on a fever
without understanding the underlying disease causing
it. Thus, investigators must dig deeper into why the
unsafe acts took place. As a first step, two major subdivisions
of unsafe aircrew conditions were developed:
substandard conditions of operators and the substandard
practices they commit (Figure 3).
[Figure 3. Categories of preconditions of unsafe acts: substandard conditions of operators (adverse mental states, adverse physiological states, and physical/mental limitations) and substandard practices of operators (crew resource mismanagement and personal readiness).]
Substandard Conditions of Operators
Adverse mental states. Being prepared mentally is
critical in nearly every endeavor, but perhaps even
more so in aviation. As such, the category of Adverse
Mental States was created to account for those mental
conditions that affect performance (Table 2). Principal
among these are the loss of situational awareness,
task fixation, distraction, and mental fatigue due to
sleep loss or other stressors. Also included in this
category are personality traits and pernicious attitudes
such as overconfidence, complacency, and misplaced
motivation.
Predictably, if an individual is mentally tired for
whatever reason, the likelihood increases that an error
will occur. In a similar fashion, overconfidence and
other pernicious attitudes such as arrogance and
impulsivity will influence the likelihood that a violation
will be committed. Clearly then, any framework
of human error must account for preexisting adverse
mental states in the causal chain of events.
Adverse physiological states. The second category,
adverse physiological states, refers to those medical or
physiological conditions that preclude safe operations
(Table 2). Particularly important to aviation are
such conditions as visual illusions and spatial disorientation
as described earlier, as well as physical fatigue,
and the myriad of pharmacological and medical
abnormalities known to affect performance.
The effects of visual illusions and spatial disorientation
are well known to most aviators. However, less
well known to aviators, and often overlooked, are the
effects on cockpit performance of simply being ill.
Nearly all of us have gone to work ill, dosed with
over-the-counter medications, and have generally
performed well. Consider however, the pilot suffering
from the common head cold. Unfortunately,
most aviators view a head cold as only a minor
inconvenience that can be easily remedied using
over-the-counter antihistamines, acetaminophen, and
other non-prescription pharmaceuticals. In fact, when
confronted with a stuffy nose, aviators typically are
only concerned with the effects of a painful sinus
block as cabin altitude changes. Then again, it is not
the overt symptoms that local flight surgeons are
concerned with. Rather, it is the accompanying inner
ear infection and the increased likelihood of spatial
disorientation when entering instrument meteorological
conditions that is alarming - not to mention
the side-effects of antihistamines, fatigue, and sleep
loss on pilot decision-making. Therefore, it is incumbent
upon any safety professional to account for these
sometimes subtle medical conditions within the causal
chain of events.
Physical/Mental Limitations. The third, and final,
substandard condition involves individual physical/
mental limitations (Table 2). Specifically, this category
refers to those instances when mission requirements
exceed the capabilities of the individual at the
controls. For example, the human visual system is
severely limited at night; yet, just as when driving a car
at night, people do not necessarily slow down or take additional
precautions. In aviation, while slowing down
isn’t always an option, paying additional attention to
basic flight instruments and increasing one’s vigilance
will often increase the safety margin. Unfortunately,
when precautions are not taken, the result can
be catastrophic, as pilots will often fail to see other
aircraft, obstacles, or power lines due to the size or
contrast of the object in the visual field.
Similarly, there are occasions when the time required
to complete a task or maneuver exceeds an
individual’s capacity. Individuals vary widely in their
ability to process and respond to information. Nevertheless,
good pilots are typically noted for their
ability to respond quickly and accurately. It is well
documented, however, that if individuals are required
to respond quickly (i.e., less time is available
to consider all the possibilities or choices thoroughly),
the probability of making an error goes up markedly.
Consequently, it should be no surprise that when
faced with the need for rapid processing and reaction
times, as is the case in most aviation emergencies, all
forms of error would be exacerbated.
In addition to the basic sensory and information
processing limitations described above, there are at
least two additional instances of physical/mental
limitations that need to be addressed, although they are
often overlooked by most safety professionals. These
limitations involve individuals who simply are not
compatible with aviation, because they are either
unsuited physically or do not possess the aptitude to
fly. For example, some individuals simply don’t have
the physical strength to operate in the potentially
high-G environment of aviation, or for anthropometric
reasons, simply have difficulty reaching the
controls. In other words, cockpits have traditionally
not been designed with all shapes, sizes, and physical
abilities in mind. Likewise, not everyone has the
mental ability or aptitude for flying aircraft. Just as
not all of us can be concert pianists or NFL linebackers,
not everyone has the innate ability to pilot an
aircraft – a vocation that requires the unique ability
to make decisions quickly and respond accurately in
life threatening situations. The difficult task for the
safety professional is identifying whether aptitude might
have contributed to the accident causal sequence.
Substandard Practices of Operators
Clearly then, numerous substandard conditions of
operators can, and do, lead to the commission of
unsafe acts. Nevertheless, there are a number of
things that we do to ourselves that set up these
substandard conditions. Generally speaking, the substandard
practices of operators can be summed up in
two categories: crew resource mismanagement and
personal readiness.
Crew Resource Mismanagement. Good communication
skills and team coordination have been the
mantra of industrial/organizational and personnel
psychology for decades. Not surprisingly then, crew
resource management has been a cornerstone of aviation
for the last few decades (Helmreich & Foushee,
1993). As a result, the category of crew resource
mismanagement was created to account for occurrences
of poor coordination among personnel. Within
the context of aviation, this includes coordination both
within and between aircraft with air traffic control
facilities and maintenance control, as well as with facility
and other support personnel as necessary. But aircrew
coordination does not stop with the aircrew in
flight. It also includes coordination before and after the
flight with the brief and debrief of the aircrew.
It is not difficult to envision a scenario where the
lack of crew coordination has led to confusion and
poor decision making in the cockpit, resulting in an
accident. In fact, aviation accident databases are
replete with instances of poor coordination among
aircrew. One of the more tragic examples was the
crash of a civilian airliner at night in the Florida
Everglades in 1972 as the crew was busily trying to
troubleshoot what amounted to a burnt out indicator
light. Unfortunately, no one in the cockpit was monitoring
the aircraft’s altitude as the altitude hold was
inadvertently disconnected. Ideally, the crew would
have coordinated the trouble-shooting task ensuring
that at least one crewmember was monitoring basic
flight instruments and “flying” the aircraft. Tragically,
this was not the case, as they entered a slow,
unrecognized descent into the Everglades, resulting
in numerous fatalities.
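Purely for illustration, and not as an official classification of that accident, the factors described above might be coded against HFACS categories roughly as follows:

```python
# Illustrative only: one plausible HFACS coding of the factors described in
# the Everglades scenario above. This is NOT an official classification.

everglades_factors = [
    {"level": "Preconditions for Unsafe Acts",
     "category": "Crew Resource Mismanagement",
     "factor": "No crewmember assigned to monitor basic flight instruments "
               "while the crew troubleshot the indicator light"},
    {"level": "Unsafe Acts",
     "category": "Skill-Based Errors",
     "factor": "Breakdown in visual scan and task fixation on a burnt-out "
               "warning light; unnoticed disconnection of the altitude hold"},
]

for f in everglades_factors:
    print(f"{f['level']} -> {f['category']}: {f['factor']}")
```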
Personal Readiness. In aviation, or for that matter
in any occupational setting, individuals are expected
to show up for work ready to perform at optimal
levels. Nevertheless, in aviation as in other professions,
personal readiness failures occur when individuals
fail to prepare physically or mentally for duty.
For instance, violations of crew rest requirements,
bottle-to-brief rules, and self-medicating all will affect
performance on the job and are particularly
detrimental in the aircraft. It is not hard to imagine
that, when individuals violate crew rest requirements,
they run the risk of mental fatigue and other adverse
mental states, which ultimately lead to errors and
accidents. Note however, that violations that affect
personal readiness are not considered unsafe act
"violations" since they typically do not happen in the
cockpit, nor are they necessarily active failures with
direct and immediate consequences.
Still, not all personal readiness failures occur as a
result of violations of governing rules or regulations.
For example, running 10 miles before piloting an
aircraft may not be against any existing regulations,
yet it may impair the physical and mental capabilities
of the individual enough to degrade performance and
elicit unsafe acts. Likewise, the traditional “candy bar
and coke” lunch of the modern businessman may
sound good but may not be sufficient to sustain
performance in the rigorous environment of aviation.
While there may be no rules governing such
behavior, pilots must use good judgment when deciding
whether they are “fit” to fly an aircraft.
Unsafe Supervision
Recall that in addition to those causal factors
associated with the pilot/operator, Reason (1990)
traced the causal chain of events back up the supervisory
chain of command. As such, we have identified
four categories of unsafe supervision: inadequate
supervision, planned inappropriate operations, failure
to correct a known problem, and supervisory
violations (Figure 4). Each is described briefly below.
Inadequate Supervision. The role of any supervisor
is to provide the opportunity to succeed. To do this,
the supervisor, no matter at what level of operation,
must provide guidance, training opportunities, leadership,
and motivation, as well as the proper role
model to be emulated. Unfortunately, this is not
always the case. For example, it is not difficult to
conceive of a situation where adequate crew resource
management training was either not provided, or the
opportunity to attend such training was not afforded
to a particular aircrew member. Conceivably, aircrew
coordination skills would be compromised and if the
aircraft were put into an adverse situation (an emergency
for instance), the risk of an error being committed
would be exacerbated and the potential for an
accident would increase markedly.
In a similar vein, sound professional guidance and
oversight is an essential ingredient of any successful
organization. While empowering individuals to make
decisions and function independently is certainly
essential, this does not divorce the supervisor from
accountability. The lack of guidance and oversight
has proven to be the breeding ground for many of the
violations that have crept into the cockpit. As such,
any thorough investigation of accident causal factors
must consider the role supervision plays (i.e., whether
the supervision was inappropriate or did not occur at
all) in the genesis of human error (Table 3).
Planned Inappropriate Operations. Occasionally,
the operational tempo and/or the scheduling of aircrew
is such that individuals are put at unacceptable
risk, crew rest is jeopardized, and ultimately performance
is adversely affected. Such operations, though
arguably unavoidable during emergencies, are unacceptable
during normal operations. Therefore, the
second category of unsafe supervision, planned inappropriate
operations, was created to account for these
failures (Table 3).
Take, for example, the issue of improper crew
pairing. It is well known that when very senior,
dictatorial captains are paired with very junior, weak
co-pilots, communication and coordination problems
are likely to occur. Commonly referred to as the
trans-cockpit authority gradient, such conditions
likely contributed to the tragic crash of a commercial
airliner into the Potomac River outside of Washington,
DC, in January of 1982 (NTSB, 1982). In that
accident, the captain of the aircraft repeatedly rebuffed
the first officer when the latter indicated that
the engine instruments did not appear normal. Undaunted,
the captain continued a fatal takeoff in icing
conditions with less than adequate takeoff thrust.
The aircraft stalled and plummeted into the icy river,
killing the crew and many of the passengers.
Clearly, the captain and crew were held accountable.
They died in the accident and cannot shed light
on causation, but what was the role of the supervisory
chain? Perhaps crew pairing was equally responsible.
Although not specifically addressed in the report,
such issues are clearly worth exploring in many accidents.
In fact, in that particular accident, several
other training and manning issues were identified.
Failure to Correct a Known Problem. The third
category of unsafe supervision, Failure to Correct
a Known Problem, refers to those instances when
deficiencies among individuals, equipment, training
or other related safety areas are “known” to the
supervisor, yet are allowed to continue unabated
(Table 3). For example, it is not uncommon for
accident investigators to interview the pilot’s friends,
colleagues, and supervisors after a fatal crash only to
find out that they “knew it would happen to him
some day.” If the supervisor knew that a pilot was
incapable of flying safely, and allowed the flight
anyway, he clearly did the pilot no favors. The failure
to correct the behavior, either through remedial training
or, if necessary, removal from flight status, essentially
signed the pilot’s death warrant - not to mention
that of others who may have been on board.
Likewise, the failure to consistently correct or discipline
inappropriate behavior certainly fosters an unsafe
atmosphere and promotes the violation of rules. Aviation
history is rich with reports of aviators who tell
hair-raising stories of their exploits and barnstorming
low-level flights (the infamous “been there, done that”).
While entertaining to some, they often serve to promulgate
a perception of tolerance and "one-upmanship"
until one day someone ties the low altitude flight record
of ground-level! Indeed, the failure to report these
unsafe tendencies and initiate corrective actions is yet
another example of the failure to correct known problems.
Supervisory Violations. Supervisory violations, on the
other hand, are reserved for those instances when existing
rules and regulations are willfully disregarded by
supervisors (Table 3). Although arguably rare, supervisors
have been known occasionally to violate the rules
and doctrine when managing their assets. For instance,
there have been occasions when individuals were
permitted to operate an aircraft without current qualifications
or license. Likewise, it can be argued that
failing to enforce existing rules and regulations or flouting
authority are also violations at the supervisory level.
While rare and possibly difficult to cull out, such
practices are a flagrant violation of the rules and invariably
set the stage for the tragic sequence of events that
predictably follow.
Organizational Influences
As noted previously, fallible decisions of upper-level
management directly affect supervisory practices, as
well as the conditions and actions of operators. Unfortunately,
these organizational errors often go unnoticed
by safety professionals, due in large part to the lack of a
clear framework from which to investigate them. Generally
speaking, the most elusive of latent failures revolve
around issues related to resource management, organizational
climate, and operational processes, as detailed
below in Figure 5.
Resource Management. This category encompasses
the realm of corporate-level decision making regarding
the allocation and maintenance of organizational
assets such as human resources (personnel), monetary
assets, and equipment/facilities (Table 4). Generally,
corporate decisions about how such resources should
be managed center around two distinct objectives –
the goal of safety and the goal of on-time, cost-effective
operations. In times of prosperity, both
objectives can be easily balanced and satisfied in full.
However, as we mentioned earlier, there may also be
times of fiscal austerity that demand some give and
take between the two. Unfortunately, history tells us
that safety is often the loser in such battles and, as
some can attest to very well, safety and training are
often the first to be cut in organizations having
financial difficulties. If cutbacks in such areas are too
severe, flight proficiency may suffer, and the best
pilots may leave the organization for greener pastures.
Excessive cost-cutting could also result in reduced
funding for new equipment or may lead to the purchase
of equipment that is suboptimal and inadequately
designed for the type of operations flown by
the company. Other trickle-down effects include
poorly maintained equipment and workspaces, and
the failure to correct known design flaws in existing
equipment. The result is a scenario involving unseasoned,
less-skilled pilots flying old and poorly maintained
aircraft under the least desirable conditions
and schedules. The ramifications for aviation safety
are not hard to imagine.
Climate. Organizational Climate refers to a broad
class of organizational variables that influence worker
performance. Formally, it was defined as the
“situationally based consistencies in the organization’s
treatment of individuals” (Jones, 1988). In general,
however, organizational climate can be viewed as the
working atmosphere within the organization. One
telltale sign of an organization’s climate is its structure,
as reflected in the chain-of-command, delegation of
authority and responsibility, communication channels,
and formal accountability for actions (Table 4).
Just like in the cockpit, communication and coordination
are vital within an organization. If management
and staff within an organization are not
communicating, or if no one knows who is in charge,
organizational safety clearly suffers and accidents do
happen (Muchinsky, 1997).
An organization’s policies and culture are also
good indicators of its climate. Policies are official
guidelines that direct management’s decisions about
such things as hiring and firing, promotion, retention,
raises, sick leave, drugs and alcohol, overtime,
accident investigations, and the use of safety equipment.
Culture, on the other hand, refers to the
unofficial or unspoken rules, values, attitudes, beliefs,
and customs of an organization. Culture is “the
way things really get done around here.”
When policies are ill-defined, adversarial, or conflicting,
or when they are supplanted by unofficial
rules and values, confusion abounds within the organization.
Indeed, there are some corporate managers
who are quick to give “lip service” to official safety
policies while in a public forum, but then overlook
such policies when operating behind the scenes.
However, as the laws of thermodynamics remind us,
order and harmony cannot be produced from
such chaos and disharmony. Safety is bound to
suffer under such conditions.
Operational Process. This category refers to corporate
decisions and rules that govern the everyday activities
within an organization, including the establishment
and use of standardized operating procedures and formal
methods for maintaining checks and balances (oversight)
between the workforce and management. For
example, such factors as operational tempo, time pressures,
incentive systems, and work schedules are all
factors that can adversely affect safety (Table 4). As
stated earlier, there may be instances when those within
the upper echelon of an organization determine that it
is necessary to increase the operational tempo to a point
that overextends a supervisor’s staffing capabilities.
Therefore, a supervisor may resort to the use of inadequate
scheduling procedures that jeopardize crew rest
and produce suboptimal crew pairings, putting aircrew
at an increased risk of a mishap. However, organizations
should have official procedures in place to
address such contingencies as well as oversight programs
to monitor such risks.
Regrettably, not all organizations have these procedures
nor do they engage in an active process of
monitoring aircrew errors and human factor problems
via anonymous reporting systems and safety
audits. As such, supervisors and managers are often
unaware of the problems before an accident occurs.
Indeed, it has been said that “an accident is one
incident to many” (Reinhart, 1996). It is incumbent
upon any organization to fervently seek out the “holes
in the cheese” and plug them up, before they create a
window of opportunity for catastrophe to strike.
CONCLUSION
It is our belief that the Human Factors Analysis
and Classification System (HFACS) framework
bridges the gap between theory and practice by providing
investigators with a comprehensive, user-friendly
tool for identifying and classifying the human
causes of aviation accidents. The system, which is
based upon Reason’s (1990) model of latent and
active failures (Shappell & Wiegmann, 1997a), encompasses
all aspects of human error, including the
conditions of operators and organizational failure.
Still, HFACS, like any other framework, merely contributes
to an already burgeoning list of human error
taxonomies if it does not prove useful in the operational
setting. In this regard, HFACS has recently
been employed by the U.S. Navy, Marine Corps,
Army, Air Force, and Coast Guard for use in aviation
accident investigation and analysis. To date, HFACS
has been applied to the analysis of human factors data
from approximately 1,000 military aviation accidents.
Throughout this process, the reliability and
content validity of HFACS have been repeatedly tested
and demonstrated (Shappell & Wiegmann, 1997c).
Given that accident databases can be reliably analyzed
using HFACS, the next logical question is
whether anything unique will be identified. Early
indications within the military suggest that the
HFACS framework has been instrumental in the
identification and analysis of global human factors
safety issues (e.g., trends in aircrew proficiency;
Shappell et al., 1999b), specific accident types (e.g.,
controlled flight into terrain, CFIT; Shappell &
Wiegmann, 1997b), and human factors problems
such as CRM failures (Wiegmann & Shappell, 1999).
Consequently, the systematic application of HFACS
to the analysis of human factors accident data has
afforded the U.S. Navy/Marine Corps (for which the
original taxonomy was developed) the ability to develop
objective, data-driven intervention strategies.
In a sense, HFACS has illuminated those areas ripe
for intervention rather than relying on individual
research interests not necessarily tied to saving lives
or preventing aircraft losses.
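As one final, hypothetical illustration of the kind of data-driven analysis described here, the sketch below tallies how often each HFACS category appears across a set of coded accident records; the record format and the data are invented for illustration only:

```python
from collections import Counter

# Hypothetical sketch of a data-driven HFACS analysis: count how often each
# causal category appears across a set of coded accidents to reveal trends
# (e.g., recurring CRM failures). Records and format are illustrative only.

coded_accidents = [
    {"id": "M-001", "categories": ["Skill-Based Errors", "Crew Resource Mismanagement"]},
    {"id": "M-002", "categories": ["Decision Errors", "Inadequate Supervision"]},
    {"id": "M-003", "categories": ["Skill-Based Errors", "Adverse Mental States",
                                   "Crew Resource Mismanagement"]},
]

counts = Counter(cat for acc in coded_accidents for cat in acc["categories"])

# Categories ranked by how many accidents they appear in suggest where
# intervention dollars might be targeted and later re-evaluated.
for category, n in counts.most_common():
    print(f"{category}: {n} of {len(coded_accidents)} accidents")
```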
Additionally, the HFACS framework and the insights
gleaned from database analyses have been used
to develop innovative accident investigation methods
that have enhanced both the quantity and quality
of the human factors information gathered during
accident investigations. However, not only are safety
professionals better suited to examine human error in
the field but, using HFACS, they can now track those
areas (the holes in the cheese) responsible for the
accidents as well. Only now is it possible to track the
success or failure of specific intervention programs
designed to reduce specific types of human error and
subsequent aviation accidents. In so doing, research
investments and safety programs can be either readjusted
or reinforced to meet the changing needs of
aviation safety.
Recently, these accident analysis and investigative
techniques, developed and proven in the military,
have been applied to the analysis and investigation of
U.S. civil aviation accidents (Shappell & Wiegmann,
1999a). Specifically, the HFACS framework is currently
being used to systematically analyze both commercial
and General Aviation accident data to explore
the underlying human factors problems associated
with these events. The framework is also being employed
to develop improved methods and techniques
for investigating human factors issues during actual
civil aviation accident investigations by Federal Aviation
Administration and National Transportation
Safety Board officials. Initial results of this project
have begun to highlight human factors areas in need
of further safety research. In addition, as in the
military, it is anticipated that HFACS
will provide the fundamental information and tools
needed to develop a more effective and accessible
human factors accident database for civil aviation.
In summary, the development of the HFACS
framework has proven to be a valuable first step in the
establishment of a larger military and civil aviation
safety program. The ultimate goal of this, and any
other, safety program is to reduce the aviation accident
rate through systematic, data-driven investment.
REFERENCES
Bird, F. (1974). Management guide to loss control. Atlanta,
GA: Institute Press.
Heinrich, H.W., Petersen, D., & Roos, N. (1980).
Industrial accident prevention: A safety management
approach (5th ed.). New York: McGraw-Hill.
Helmreich, R.L., & Foushee, H.C. (1993). Why crew
resource management? Empirical and theoretical
bases of human factors training in aviation. In
E.L. Wiener, B.G. Kanki, & R.L. Helmreich
(Eds.), Cockpit resource management (pp. 3-45).
San Diego, CA: Academic Press.
Jones, A.P. (1988). Climate and measurement of consensus:
A discussion of “organizational climate.”
In S.G. Cole, R.G. Demaree & W. Curtis, (Eds.),
Applications of Interactionist Psychology: Essays in
Honor of Saul B. Sells (pp. 283-90). Hillsdale, NJ:
Erlbaum.
Murray, S.R. (1997). Deliberate decision making by
aircraft pilots: A simple reminder to avoid decision
making under panic. The International Journal
of Aviation Psychology, 7, 83-100.
Muchinsky, P.M. (1997). Psychology applied to work
(5th ed.). Pacific Grove, CA: Brooks/Cole Publishing
Co.
National Transportation Safety Board. (1982). Air
Florida, Inc., Boeing 737-222, N62AF, Collision
with 14th Street bridge, near Washington National
Airport, Washington, D.C., January 13, 1982
(Tech. Report NTSB-AAR-82-8). Washington:
National Transportation Safety Board.
Orasanu, J.M. (1993). Decision-making in the cockpit.
In E.L. Wiener, B.G. Kanki, and R.L.
Helmreich (Eds.), Cockpit resource management
(pp. 137-72). San Diego, CA: Academic Press.
Rasmussen, J. (1982). Human errors: A taxonomy for
describing human malfunction in industrial installations.
Journal of Occupational Accidents, 4,
311-33.
Reason, J. (1990). Human error. New York: Cambridge
University Press.
Reinhart, R.O. (1996). Basic flight physiology (2nd ed.).
New York: McGraw-Hill.
Senders, J.W., and Moray, N.P. (1991). Human error:
Cause, prediction and reduction. Hillsdale, NJ:
Erlbaum.
Shappell, S.A., and Wiegmann, D.A. (1996). U.S.
naval aviation mishaps 1977-92: Differences between
single- and dual-piloted aircraft. Aviation,
Space, and Environmental Medicine, 67, 65-9.
Shappell, S.A. and Wiegmann D.A. (1997a). A human
error approach to accident investigation: The taxonomy
of unsafe operations. The International
Journal of Aviation Psychology, 7, 269-91.
Shappell, S.A. & Wiegmann, D.A. (1997b). Why
would an experienced aviator fly a perfectly
good aircraft into the ground? In Proceedings of
the Ninth International Symposium on Aviation
Psychology, (pp. 26-32). Columbus, OH: The
Ohio State University.
Shappell, S.A. and Wiegmann, D.A. (1997c). A reliability
analysis of the Taxonomy of Unsafe Operations.
Aviation, Space, and Environmental Medicine,
68, 620.
Shappell, S.A. and Wiegmann, D.A. (1999a). Human
error in commercial and corporate aviation: An
analysis of FAR Part 121 and 135 mishaps using
HFACS. Aviation, Space, and Environmental Medicine,
70, 407.
Shappell, S., Wiegmann, D., Fraser, J., Gregory, G.,
Kinsey, P., and Squier, H. (1999b). Beyond mishap
rates: A human factors analysis of U.S. Navy/
Marine Corps TACAIR and rotary wing mishaps
using HFACS. Aviation, Space, and Environmental
Medicine, 70, 416-17.
Wiegmann, D.A. and Shappell, S.A. (1997). Human
factors analysis of post-accident data: Applying
theoretical taxonomies of human error. The International
Journal of Aviation Psychology, 7, 67-81.
Wiegmann, D.A. and Shappell, S.A. (1999). Human
error and crew resource management failures in
Naval aviation mishaps: A review of U.S. Naval
Safety Center data, 1990-96. Aviation, Space, and
Environmental Medicine, 70, 1147-51.


3# Posted on 2010-12-8 09:57:25

Thanks.

Not bad, very good.


4# Posted on 2011-1-12 10:48:39

Bump.


5# Posted on 2011-3-11 22:26:45

Thanks for sharing.
