HFACS Analysis of Commercial Aviation Accidents (DOT version)


1#
Posted on 2010-5-15 08:36:46

HFACS Analysis of Commercial Aviation Accidents (DOT version)

 




2#
Posted on 2010-5-15 08:37:00

DOT/FAA/AM-01/3

A Human Error Analysis of Commercial Aviation Accidents
Using the Human Factors Analysis and Classification System (HFACS)

Douglas A. Wiegmann
University of Illinois at Urbana-Champaign
Institute of Aviation
Savoy, IL 61874

Scott A. Shappell
FAA Civil Aeromedical Institute
P.O. Box 25082
Oklahoma City, OK 73125

February 2001

Final Report

This document is available to the public through the
National Technical Information Service, Springfield, Virginia 22161.

Office of Aviation Medicine
Washington, D.C. 20591

U.S. Department of Transportation
Federal Aviation Administration
NOTICE
This document is disseminated under the sponsorship of
the U.S. Department of Transportation in the interest of
information exchange. The United States Government
assumes no liability for the contents thereof.
Technical Report Documentation Page

1. Report No.: DOT/FAA/AM-01/3
4. Title and Subtitle: A Human Error Analysis of Commercial Aviation Accidents Using the Human Factors Analysis and Classification System (HFACS)
5. Report Date: February 2001
7. Author(s): Wiegmann, D.A. (1), and Shappell, S.A. (2)
9. Performing Organization Name and Address:
(1) University of Illinois at Urbana-Champaign, Institute of Aviation, Savoy, IL 61874
(2) FAA Civil Aeromedical Institute, P.O. Box 25082, Oklahoma City, OK 73125
11. Contract or Grant No.: 99-G-006
12. Sponsoring Agency Name and Address: Office of Aviation Medicine, Federal Aviation Administration, 800 Independence Ave., S.W., Washington, DC 20591
15. Supplemental Notes: Work was accomplished under task # AAM-A-00-HRR-520.
16. Abstract: The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon Reason’s (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. Specifically, HFACS was applied to commercial aviation accident records maintained by the National Transportation Safety Board (NTSB). Using accidents that occurred between January 1990 and December 1996, it was demonstrated that HFACS reliably accommodated all human causal factors associated with the commercial accidents examined. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena.
17. Key Words: Aviation, Human Error, Accident Investigation, Database Analysis, Commercial Aviation
18. Distribution Statement: Document is available to the public through the National Technical Information Service, Springfield, Virginia 22161
19. Security Classif. (of this report): Unclassified
20. Security Classif. (of this page): Unclassified
21. No. of Pages: 17

ACKNOWLEDGMENTS
The authors thank Frank Cristina and Anthony Pape for their assistance in gathering,
organizing and analyzing the accident reports used in this study.

A HUMAN ERROR ANALYSIS OF COMMERCIAL AVIATION ACCIDENTS USING THE
HUMAN FACTORS ANALYSIS AND CLASSIFICATION SYSTEM (HFACS)
INTRODUCTION
Humans, by their very nature, make mistakes; therefore,
it should come as no surprise that human error has
been implicated in a variety of occupational accidents,
including 70% to 80% of those in civil and military
aviation (O’Hare, Wiggins, Batt, & Morrison, 1994;
Wiegmann and Shappell, 1999; Yacavone, 1993). In
fact, while the number of aviation accidents attributable
solely to mechanical failure has decreased markedly over
the past 40 years, those attributable at least in part to
human error have declined at a much slower rate (Shappell
& Wiegmann, 1996). Given such findings, it would
appear that interventions aimed at reducing the occurrence
or consequences of human error have not been as
effective as those directed at mechanical failures. Clearly,
if accidents are to be reduced further, more emphasis
must be placed on the genesis of human error as it relates
to accident causation.
The prevailing means of investigating human error in
aviation accidents remains the analysis of accident and
incident data. Unfortunately, most accident reporting
systems are not designed around any theoretical framework
of human error. Indeed, most accident reporting
systems are designed and employed by engineers and
front-line operators with only limited backgrounds in
human factors. As a result, these systems have been useful
for identifying engineering and mechanical failures but
are relatively ineffective and narrow in scope where
human error exists. Even when human factors are addressed,
the terms and variables used are often ill-defined
and archival databases are poorly organized. The end
results are post-accident databases that typically are not
conducive to a traditional human error analysis, making
the identification of intervention strategies onerous
(Wiegmann & Shappell, 1997).
The Accident Investigation Process
To further illustrate this point, let us examine the
accident investigation and intervention process separately
for the mechanical and human components of an
accident. Consider first the occurrence of an aircraft
system or mechanical failure that results in an accident or
injury (Figure 1). A subsequent investigation takes place
that includes the examination of objective and quantifiable
information, such as that derived from the wreckage
and flight data recorder, as well as that from the application
of sophisticated analytical techniques like metallurgical
tests and computer modeling. This kind of
information is then used to determine the probable
mechanical cause(s) of the accident and to identify safety
recommendations.
Upon completion of the investigation, this “objective”
information is typically entered into a highly structured
and well-defined accident database. These
data can then be periodically analyzed to determine
system-wide safety issues and provide feedback to investigators,
thereby improving investigative methods and
techniques. In addition, the data are often used to guide
organizations (e.g., the Federal Aviation Administration
[FAA], National Aeronautics and Space Administration
[NASA], Department of Defense [DoD], airplane manufacturers
and airlines) in deciding which research or
safety programs to sponsor. As a result, these needs-based,
data-driven programs, in turn, have typically
produced effective intervention strategies that either
prevent mechanical failures from occurring altogether,
or mitigate their consequences when they do happen. In
either case, there has been a substantial reduction in the
rate of accidents due to mechanical or systems failures.
In stark contrast, Figure 2 illustrates the current
human factors accident investigation and prevention
process. This example begins with the occurrence of an
aircrew error during flight operations that leads to an
accident or incident. A human performance investigation
then ensues to determine the nature and causes of
such errors. However, unlike the tangible and quantifiable
evidence surrounding mechanical failures, the evidence
and causes of human error are generally qualitative
and elusive. Furthermore, human factors investigative
and analytical techniques are often less refined and
sophisticated than those used to analyze mechanical and
engineering concerns. As such, the determination of
human factors causal to the accident is a tenuous practice
at best; all of which makes the information entered in the
accident database sparse and ill-defined.
As a result, when traditional data analyses are performed
to determine common human factors problems
across accidents, the interpretation of the findings and
the subsequent identification of important safety issues
are of limited practical use. To make matters worse,
results from these analyses provide limited feedback to
investigators and are of limited use to airlines and government
agencies in determining the types of research or
safety programs to sponsor. As such, many research
programs tend to be intuitively-, or fad-driven, rather
than data-driven, and typically produce intervention
strategies that are only marginally effective at reducing
the occurrence and consequence of human error. The
overall rate of human-error related accidents, therefore,
has remained relatively high and constant over the last
several years (Shappell & Wiegmann, 1996).
Addressing the Problem
If the FAA and the aviation industry are to achieve
their goal of significantly reducing the aviation accident
rate over the next ten years, the primary causes of aviation
accidents (i.e., human factors) must be addressed (ICAO,
1993). However, as illustrated in Figure 2, simply
increasing the amount of money and resources spent on
human factors research is not the solution. Indeed, a
great deal of resources and efforts are currently being
expended. Rather, the solution is to redirect safety efforts
so that they address important human factors issues.
However, this assumes that we know what the important
human factors issues are. Therefore, before research
efforts can be systematically refocused, a comprehensive
analysis of existing databases needs to be conducted to
determine those specific human factors responsible for
aviation accidents and incidents. Furthermore, if these
efforts are to be sustained, new investigative methods and
techniques will need to be developed so that data gathered
during human factors accident investigations can be
improved and analysis of the underlying causes of human
error facilitated.
To accomplish this improvement, a general human
error framework is needed around which new investigative
methods can be designed and existing post-accident databases
restructured. Previous attempts to do this have met with
encouraging, yet limited, success (O’Hare et al., 1994;
Wiegmann & Shappell, 1997). This is primarily because
performance failures are influenced by a variety of human
factors that are typically not addressed by traditional
error frameworks.

[Figure 1. General process of investigating and preventing aviation accidents involving mechanical or systems failures: a mechanical failure leads to an accident investigation (highly sophisticated techniques; objective, quantifiable information; effective at determining why the failure occurred), whose findings enter a well-defined accident database; frequent database analyses identify common mechanical and engineering safety issues and feed back to investigators, and research sponsors (the FAA, DoD, NASA, and airplane manufacturers) fund needs-based, data-driven research that yields effective intervention and prevention programs.]

For instance, with few
exceptions (e.g., Rasmussen, 1982), human error taxonomies
do not consider the potential adverse mental
and physiological condition of the individual (e.g., fatigue,
illness, attitudes) when describing errors in the
cockpit. Likewise, latent errors committed by officials
within the management hierarchy such as line managers
and supervisors are often not addressed, even though it is
well known that these factors directly influence the
condition and decisions of pilots (Reason, 1990). Therefore,
if a comprehensive analysis of human error is to be
conducted, a taxonomy that takes into account the
multiple causes of human failure must be offered.
Recently, the Human Factors Analysis and Classification
System (HFACS) was developed to meet these needs
(Shappell & Wiegmann, 1997a, 2000a, and in press).
This system, which is based on Reason’s (1990) model of
latent and active failures, was originally developed for the
U.S. Navy and Marine Corps as an accident investigation
and data analysis tool. Since its original development,
however, HFACS has been employed by other military
organizations (e.g., U.S. Army, Air Force, and Canadian
Defense Force) as an adjunct to preexisting accident
investigation and analysis systems. To date, the HFACS
framework has been applied to more than 1,000 military
aviation accidents, yielding objective, data-driven intervention
strategies while enhancing both the quantity and
quality of human factors information gathered during
accident investigations (Shappell & Wiegmann, in press).
Other organizations such as the FAA and NASA have
explored the use of HFACS as a complement to preexisting
systems within civil aviation in an attempt to capitalize
on gains realized by the military (Ford, Jack, Crisp, &
Sandusky, 1999). Still, few systematic efforts have examined
whether HFACS is indeed a viable tool within the
civil aviation arena, even though it can be argued that the
similarities between military and civilian aviation outweigh
their differences. The purpose of the present study
was to empirically address this issue by applying the
HFACS framework, as originally designed for the military,
to the classification and analysis of civil aviation
accident data. Before beginning, however, a brief overview
of the HFACS system will be presented for those readers
who may not be familiar with the framework (for a detailed
description of HFACS, see Shappell and Wiegmann, 2000a
and 2001).

[Figure 2. General process of investigating and preventing aviation accidents involving human error: human error leads to an accident investigation (less sophisticated techniques; qualitative, elusive information; focus on “what” happened but not “why”), whose findings enter a database not designed around any particular human error framework; database analyses are onerous and rarely performed, so research sponsors (the FAA, DoD, NASA, and airlines) fund intuition- and fad-driven research, yielding ineffective intervention and prevention programs.]
HFACS
Drawing upon Reason’s (1990) concept of latent and
active failures, HFACS describes human error at each of
four levels of failure: 1) unsafe acts of operators (e.g.,
aircrew), 2) preconditions for unsafe acts, 3) unsafe
supervision, and 4) organizational influences. A brief
description of each causal category follows (Figure 3).
Unsafe Acts of Operators
The unsafe acts of operators (aircrew) can be loosely
classified into one of two categories: errors and violations
(Reason, 1990). While both are common within most
settings, they differ markedly when the rules and regulation
of an organization are considered. That is, errors can
be described as those “legal” activities that fail to achieve
their intended outcome, while violations are commonly
defined as behavior that represents the willful disregard
for the rules and regulations. It is within these two
overarching categories that HFACS describes three types
of errors (decision, skill-based, and perceptual) and two
types of violations (routine and exceptional).
Errors
One of the more common error forms, decision errors,
represents conscious, goal-intended behavior that proceeds
as designed; yet, the plan proves inadequate or
inappropriate for the situation. Often referred to as
“honest mistakes,” these unsafe acts typically manifest as
poorly executed procedures, improper choices, or simply
the misinterpretation or misuse of relevant information.
[Figure 3. Overview of the Human Factors Analysis and Classification System (HFACS). Four levels of failure: Unsafe Acts (errors: decision, skill-based, and perceptual; violations: routine and exceptional); Preconditions for Unsafe Acts (substandard conditions of operators: adverse mental states, adverse physiological states, physical/mental limitations; substandard practices of operators: crew resource mismanagement, personal readiness); Unsafe Supervision (inadequate supervision, planned inappropriate operations, failed to correct problem, supervisory violations); and Organizational Influences (resource management, organizational climate, organizational process).]

In contrast to decision errors, the second error form,
skill-based errors, occurs with little or no conscious thought.
Just as little thought goes into turning one’s steering wheel
or shifting gears in an automobile, basic flight skills such
as stick and rudder movements and visual
scanning often occur without thinking. The difficulty
with these highly practiced and seemingly automatic
behaviors is that they are particularly susceptible to
attention and/or memory failures. As a result, skill-based
errors such as the breakdown in visual scan patterns,
inadvertent activation/deactivation of switches, forgotten
intentions, and omitted items in checklists often
appear. Even the manner (or skill) with which one flies
an aircraft (aggressive, tentative, or controlled) can
affect safety.
While decision and skill-based errors have dominated
most accident databases and therefore, have been
included in most error frameworks, the third and final
error form, perceptual errors, has received comparatively
less attention. No less important, perceptual errors occur
when sensory input is degraded, or “unusual,” as is often
the case when flying at night, in the weather, or in other
visually impoverished environments. Faced with acting
on imperfect or incomplete information, aircrew run the risk
of misjudging distances, altitude, and descent rates, as well
as responding incorrectly to a variety of visual/vestibular
illusions.
Violations
Although there are many ways to distinguish among
types of violations, two distinct forms have been identified
based on their etiology. The first, routine violations,
tend to be habitual by nature and are often enabled by a
system of supervision and management that tolerates
such departures from the rules (Reason, 1990). Often
referred to as “bending the rules,” the classic example is
that of the individual who drives his/her automobile
consistently 5-10 mph faster than allowed by law. While
clearly against the law, the behavior is, in effect, sanctioned
by local authorities (police) who often will not
enforce the law until speeds in excess of 10 mph over the
posted limit are observed.
Exceptional violations, on the other hand, are isolated
departures from authority, neither typical of the individual
nor condoned by management. For example,
while driving 65 in a 55 mph zone might be condoned by
authorities, driving 105 mph in a 55 mph zone certainly
would not. It is important to note that, while most
exceptional violations are appalling, they are not considered
“exceptional” because of their extreme nature. Rather,
they are regarded as exceptional because they are neither
typical of the individual nor condoned by authority.
Preconditions for Unsafe Acts
Simply focusing on unsafe acts, however, is like focusing
on a patient’s symptoms without understanding the
underlying disease state that caused them. As such, investigators
must dig deeper into the preconditions for unsafe
acts. Within HFACS, two major subdivisions are described:
substandard conditions of operators and the
substandard practices they commit.
Substandard Conditions of the Operator
Being prepared mentally is critical in nearly every
endeavor; perhaps it is even more so in aviation. With
this in mind, the first of three categories, adverse mental
states, was created to account for those mental conditions
that adversely affect performance. Principal among these
are the loss of situational awareness, mental fatigue,
circadian dysrhythmia, and pernicious attitudes such as
overconfidence, complacency, and misplaced motivation
that negatively impact decisions and contribute to
unsafe acts.
Equally important, however, are those adverse physiological
states that preclude the safe conduct of flight.
Particularly important to aviation are conditions such as
spatial disorientation, visual illusions, hypoxia, illness,
intoxication, and a whole host of pharmacological and
medical abnormalities known to affect performance. For
example, it is not surprising that, when aircrews become
spatially disoriented and fail to rely on flight instrumentation,
accidents can, and often do, occur.
Physical and/or mental limitations of the operator, the
third and final category of substandard condition, includes
those instances when necessary sensory information
is either unavailable, or if available, individuals
simply do not have the aptitude, skill, or time to safely
deal with it. For aviation, the former often includes not
seeing other aircraft or obstacles due to the size and/or
contrast of the object in the visual field. However, there
are many times when a situation requires such rapid
mental processing or reaction time that the time allotted
to remedy the problem exceeds human limits (as is often
the case during nap-of-the-earth flight). Nevertheless,
even when favorable visual cues or an abundance of time
is available, there are instances when an individual simply
may not possess the necessary aptitude, physical ability,
or proficiency to operate safely.
Substandard Practices of the Operator
Oftentimes, the substandard practices of aircrew will
lead to the conditions and unsafe acts described above.
For instance, the failure to ensure that all members of the
crew are acting in a coordinated manner can lead to
confusion (adverse mental state) and poor decisions in
the cockpit. Crew resource mismanagement, as it is referred
to here, includes the failures of both inter- and
intra-cockpit communication, as well as communication
with ATC and other ground personnel. This category
also includes those instances when crewmembers do not
work together as a team, or when individuals directly
responsible for the conduct of operations fail to coordinate
activities before, during, and after a flight.
Equally important, however, individuals must ensure
that they are adequately prepared for flight. Consequently,
the category of personal readiness was created to account
for those instances when individuals fail to prepare
adequately, such as by disregarding crew rest requirements,
violating alcohol restrictions, or self-medicating. However, even
behaviors that do not necessarily violate existing rules or
regulations (e.g., running ten miles before piloting an
aircraft or not observing good dietary practices) may
reduce the operating capabilities of the individual and
are, therefore, captured here.
Unsafe Supervision
Clearly, aircrews are responsible for their actions and,
as such, must be held accountable. However, in many
instances, they are the unwitting inheritors of latent
failures attributable to those who supervise them (Reason,
1990). To account for these latent failures, the
overarching category of unsafe supervision was created
within which four categories (inadequate supervision,
planned inappropriate operations, failed to correct known
problems, and supervisory violations) are included.
The first category, inadequate supervision, refers to
failures within the supervisory chain of command that were
a direct result of some supervisory action or inaction.
That is, at a minimum, supervisors must provide the
opportunity for individuals to succeed. It is expected,
therefore, that individuals will receive adequate training,
professional guidance, oversight, and operational leadership,
and that all will be managed appropriately. When
this is not the case, aircrews are often isolated, as the risk
associated with day-to-day operations invariably will
increase.
However, the risk associated with supervisory failures
can come in many forms. Occasionally, for example, the
operational tempo and/or schedule is planned such that
individuals are put at unacceptable risk and, ultimately,
performance is adversely affected. As such, the category
of planned inappropriate operations was created to account
for all aspects of improper or inappropriate crew
scheduling and operational planning, which may focus
on such issues as crew pairing, crew rest, and managing
the risk associated with specific flights.
The remaining two categories of unsafe supervision,
the failure to correct known problems and supervisory
violations, are similar, yet considered separately within
HFACS. The failure to correct known problems refers to
those instances when deficiencies among individuals,
equipment, training, or other related safety areas are
“known” to the supervisor, yet are allowed to continue
uncorrected. For example, the failure to consistently
correct or discipline inappropriate behavior certainly
fosters an unsafe atmosphere but is not considered a
violation if no specific rules or regulations were broken.
Supervisory violations, on the other hand, are reserved
for those instances when existing rules and regulations
are willfully disregarded by supervisors when managing
assets. For instance, permitting aircrew to operate an
aircraft without current qualifications or license is a
flagrant violation that invariably sets the stage for the
tragic sequence of events that predictably follow.
Organizational Influences
Fallible decisions of upper-level management can
directly affect supervisory practices, as well as the conditions
and actions of operators. Unfortunately, these organizational
influences often go unnoticed or unreported by
even the best-intentioned accident investigators.
Traditionally, these latent organizational failures generally
revolve around three issues: 1) resource management,
2) organizational climate, and 3) operational
processes. The first category, resource management, refers
to the management, allocation, and maintenance of
organizational resources, including human resource
management (selection, training, staffing), monetary
safety budgets, and equipment design (ergonomic specifications).
In general, corporate decisions about how
such resources should be managed center around two
distinct objectives – the goal of safety and the goal of on-time,
cost-effective operations. In times of prosperity,
both objectives can be easily balanced and satisfied in
full. However, there may also be times of fiscal austerity
that demand some give and take between the two.
Unfortunately, history tells us that safety is often the loser in
such battles, as safety and training are often the first to be cut
in organizations experiencing financial difficulties.
Organizational climate refers to a broad class of organizational
variables that influence worker performance
and is defined as the “situationally based consistencies in
the organization’s treatment of individuals” (Jones, 1988).
One telltale sign of an organization’s climate is its
structure, as reflected in the chain-of-command, delegation
of authority and responsibility, communication
channels, and formal accountability for actions. Just like
in the cockpit, communication and coordination are
vital within an organization. However, an organization’s
policies and culture are also good indicators of its climate.
Consequently, when policies are ill-defined,
adversarial, or conflicting, or when they are supplanted
by unofficial rules and values, confusion abounds, and
safety suffers within an organization.
Finally, operational process refers to formal processes
(operational tempo, time pressures, production quotas,
incentive systems, schedules, etc.), procedures (performance
standards, objectives, documentation, instructions
about procedures, etc.), and oversight within the
organization (organizational self-study, risk management,
and the establishment and use of safety programs).
Poor upper-level management and decisions concerning
each of these organizational factors can also have a
negative, albeit indirect, effect on operator performance
and system safety.
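For readers who want a compact reference, the four HFACS levels and the categories described above can be summarized as a simple data structure. The Python sketch below is purely illustrative (the names follow Figure 3; it is not part of the HFACS publications or of any HFACS software):

# Illustrative sketch of the HFACS framework as a nested dictionary.
HFACS = {
    "Unsafe Acts": {
        "Errors": ["Decision Errors", "Skill-Based Errors", "Perceptual Errors"],
        "Violations": ["Routine Violations", "Exceptional Violations"],
    },
    "Preconditions for Unsafe Acts": {
        "Substandard Conditions of Operators": [
            "Adverse Mental States",
            "Adverse Physiological States",
            "Physical/Mental Limitations",
        ],
        "Substandard Practices of Operators": [
            "Crew Resource Mismanagement",
            "Personal Readiness",
        ],
    },
    "Unsafe Supervision": {
        "Categories": [
            "Inadequate Supervision",
            "Planned Inappropriate Operations",
            "Failed to Correct Known Problem",
            "Supervisory Violations",
        ],
    },
    "Organizational Influences": {
        "Categories": [
            "Resource Management",
            "Organizational Climate",
            "Organizational Process",
        ],
    },
}

def categories(level):
    """Flatten the category lists under a given HFACS level."""
    return [name for group in HFACS[level].values() for name in group]

print(categories("Unsafe Acts"))
# ['Decision Errors', 'Skill-Based Errors', 'Perceptual Errors',
#  'Routine Violations', 'Exceptional Violations']

In an analysis such as the one reported below, each causal factor cited in an accident report is assigned to one of these lowest-level categories; the frequencies in Table 1 were produced in exactly that way.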
Summary
The HFACS framework bridges the gap between
theory and practice by providing safety professionals
with a theoretically based tool for identifying and classifying
the human causes of aviation accidents. Because the
system focuses on both latent and active failures and their
interrelationships, it facilitates the identification of the
underlying causes of human error. To date, HFACS has
been shown to be useful within the context of military
aviation, as both a data analysis framework and an
accident investigation tool. However, HFACS has yet to
be applied systematically to the analysis and investigation
of civil aviation accidents. The purpose of the present
research project, therefore, was to assess the utility of the
HFACS framework as an error analysis and classification
tool within commercial aviation.
The specific objectives of this study were three-fold.
The first objective was to determine whether the HFACS
framework, in its current form, would be comprehensive
enough to accommodate all of the underlying human
causal-factors associated with commercial aviation accidents,
as contained in the accident databases maintained
by the FAA and NTSB. In other words, could the
framework capture all the relevant human error data or
would a portion of the database be lost because it was
unclassifiable? The second objective was to determine
whether the process of reclassifying the human causal
factors using HFACS was reliable. That is, would different
users of the system agree on how causal factors should
be coded using the framework? Finally, the third objective
was to determine whether reclassifying the data using
HFACS would yield a benefit beyond what is already known
about commercial aviation accident causation. Specifically,
would HFACS highlight any heretofore unknown
safety issues in need of further intervention research?
METHOD
Data
A comprehensive review of all accidents involving
Federal Aviation Regulations (FAR) Parts 121 and
135 Scheduled Air Carriers between January 1990 and
December 1996 was conducted using database records
maintained by the NTSB and the FAA. Of particular
interest to this study were those accidents attributable, at
least in part, to the aircrew. Consequently, not included
were accidents due solely to catastrophic failure, maintenance
error, and unavoidable weather conditions such as
turbulence and wind shear. Furthermore, only those
accidents in which the investigation was completed, and
the cause of the accident determined, were included in
this analysis. One hundred nineteen accidents met these
criteria, including 44 accidents involving FAR Part 121
operators and 75 accidents involving FAR Part 135
operators.
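To make the selection criteria concrete, the sketch below shows how such a filter might be written against a hypothetical flat file of accident records. The file name and column names (far_part, event_date, investigation_complete, aircrew_cited, sole_cause) are invented for illustration and do not reflect the actual NTSB or FAA database schemas.

import pandas as pd

# Hypothetical accident records; real NTSB/FAA field names differ.
accidents = pd.read_csv("accidents.csv", parse_dates=["event_date"])

mask = (
    accidents["far_part"].isin([121, 135])                 # scheduled Part 121/135 carriers
    & accidents["event_date"].between(pd.Timestamp("1990-01-01"),
                                      pd.Timestamp("1996-12-31"))
    & accidents["investigation_complete"]                  # cause of the accident determined
    & accidents["aircrew_cited"]                           # attributable at least in part to aircrew
    & ~accidents["sole_cause"].isin([                      # exclude accidents due solely to...
        "catastrophic_failure", "maintenance_error", "turbulence", "wind_shear",
    ])
)
selected = accidents[mask]
print(len(selected), selected["far_part"].value_counts().to_dict())
# The study identified 119 such accidents: 44 Part 121 and 75 Part 135.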
HFACS Classification
The 119 aircrew-related accidents yielded 319 causal
factors for further analyses. Each of these NTSB causal
factors was subsequently coded independently by both
an aviation psychologist and a commercially-rated pilot
using the HFACS framework. Only those causal factors
identified by the NTSB were analyzed. That is, no new
causal factors were created during the error-coding process.
RESULTS
HFACS Comprehensiveness
All 319 (100%) of the human causal factors associated
with aircrew-related accidents were accommodated using
the HFACS framework. Instances of all but two
HFACS categories (i.e., organizational climate and
personal readiness) were observed at least once in the
accident database. Therefore, no new HFACS categories
were needed to capture the existing causal factors, and no
human factors data pertaining to the aircrew were left
unclassified during the coding process.
HFACS Reliability
Disagreements among raters were noted during the
coding process and ultimately resolved by discussion.
Using the record of agreement and disagreement between
the raters, the reliability of the HFACS system was
assessed by calculating Cohen’s kappa — an index of
agreement that has been corrected for chance. The obtained
kappa value was .71, which generally reflects a
“good” level of agreement according to criteria described
by Fleiss (1981).
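For reference, Cohen's kappa compares the observed proportion of agreement between the two raters, p_o, with the proportion of agreement expected by chance, p_e, given each rater's marginal category frequencies:

    \kappa = \frac{p_o - p_e}{1 - p_e}

Under the benchmarks described by Fleiss (1981), values above roughly .75 indicate excellent agreement beyond chance, values between about .40 and .75 fair to good agreement, and values below .40 poor agreement; the obtained value of .71 therefore falls near the top of the "good" range.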
HFACS Analyses
Unsafe Acts
Table 1 presents percentages of FAR Parts 121 and
135 aircrew-related accidents associated with each of the
HFACS categories. An examination of the table reveals
that at the unsafe acts level, skill-based errors were
associated with the largest percentage of accidents. Approximately
60% of all aircrew-related accidents were
associated with at least one skill-based error. This percentage
was relatively similar for FAR Part 121 carriers
(63.6%) and FAR Part 135 carriers (58.7%). Figure 4,
panel A, illustrates that the proportion of accidents
associated with skill-based errors has remained relatively
unchanged over the seven-year period examined in the
study. Notably, however, the lowest proportion of accidents
associated with skill-based errors was observed in
the last two years of the study (1995 and 1996).
Among the remaining categories of unsafe acts, accidents
associated with decision errors constituted the next
highest proportion (i.e., roughly 29% of the accidents
examined, Table 1). Again, this percentage was roughly
equal across both FAR Part 121 (25.0%) and Part 135
(30.7%) accidents. With the exception of 1994, in which
the percentage of aircrew-related accidents associated
with decision errors reached a high of 60%, the proportion
of accidents associated with decision errors remained
relatively constant across the years of the study
(Figure 4, panel B).
Table 1. Percentage of accidents associated with each HFACS category.

HFACS Category                        FAR Part 121   FAR Part 135   Total

Organizational Influences
  Resource Management                 4.5  (2)       1.3  (1)       2.5  (3)
  Organizational Climate              0.0  (0)       0.0  (0)       0.0  (0)
  Organizational Process              15.9 (7)       4.0  (3)       8.4  (10)

Unsafe Supervision
  Inadequate Supervision              2.3  (1)       6.7  (5)       5.0  (6)
  Planned Inappropriate Operations    0.0  (0)       1.3  (1)       0.8  (1)
  Failed to Correct Known Problem     0.0  (0)       2.7  (2)       1.7  (2)
  Supervisory Violations              0.0  (0)       2.7  (2)       1.7  (2)

Preconditions for Unsafe Acts
  Adverse Mental States               13.6 (6)       13.3 (10)      13.4 (16)
  Adverse Physiological States        4.5  (2)       0.0  (0)       1.7  (2)
  Physical/Mental Limitations         2.3  (1)       16.0 (12)      10.9 (13)
  Crew Resource Mismanagement         40.9 (18)      22.7 (17)      29.4 (35)
  Personal Readiness                  0.0  (0)       0.0  (0)       0.0  (0)

Unsafe Acts
  Skill-based Errors                  63.6 (28)      58.7 (44)      60.5 (72)
  Decision Errors                     25.0 (11)      30.7 (23)      28.6 (34)
  Perceptual Errors                   20.5 (9)       10.7 (8)       14.3 (17)
  Violations                          25.0 (11)      28.0 (21)      26.9 (32)

Note: Numbers in the table are percentages of accidents that involved at least one instance of an HFACS category; numbers in parentheses indicate accident frequencies. Because more than one causal factor is generally associated with each accident, the percentages in the table will not equal 100%.
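To make the table's conventions concrete, each percentage is simply the number of accidents involving at least one instance of a category divided by the number of accidents examined for that column (44 for Part 121, 75 for Part 135, 119 overall). A minimal check, using the frequencies reported in the table rather than the raw NTSB records:

# Recompute a Table 1 row from its accident frequencies.
N_121, N_135 = 44, 75            # aircrew-related accidents examined
N_TOTAL = N_121 + N_135          # 119

def pct(freq, n):
    """Percentage of accidents involving at least one instance of a category."""
    return round(100.0 * freq / n, 1)

# Skill-based errors: 28 Part 121 accidents, 44 Part 135 accidents, 72 overall.
print(pct(28, N_121), pct(44, N_135), pct(72, N_TOTAL))   # 63.6 58.7 60.5

Because a single accident usually involves several causal factors, the same accident can appear in more than one row, which is why the columns do not sum to 100%.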
Similar to accidents associated with decision errors,
those attributable at least in part to violations of rules and
regulations were associated with 26.9% of the accidents
examined. Again, no appreciable difference was evident
when comparing the relative percentages across FAR
Parts 121 (25.0%) and 135 (28.0%). However, an
examination of Figure 4, panel C, reveals that the relative
proportion of accidents associated with violations increased
appreciably from a low of 6% in 1990 to a high
of 46% in 1996.
Finally, the proportion of accidents associated with
perceptual errors was relatively low. In fact, only 17 of the
119 accidents (14.3%) involved some form of perceptual
error. While it appeared that the relative proportion of
Part 121 accidents associated with perceptual errors was
higher than Part 135 accidents, the low number of
occurrences precluded any meaningful comparisons across
either the type of operation or calendar year.
Preconditions for Unsafe Acts
Within the preconditions level, CRM failures were
associated with the largest percentage of accidents. Approximately
29% of all aircrew-related accidents were
associated with at least one CRM failure. A relatively
larger percentage of FAR Part 121 aircrew-related accidents
involved CRM failures (40.9%) than did FAR Part 135
aircrew-related accidents (22.7%). However, the percentage
of accidents associated with CRM failures remained
relatively constant over the seven-year period for
both FAR Part 121 and 135 carriers (Figure 4, Panel D).
The next largest percentage of accidents was associated
with adverse mental states (13.4%), followed by
physical/mental limitations (10.9%) and adverse physiological
states (1.7%). There were no accidents associated
with personal readiness issues. The percentage of
accidents associated with physical/mental limitations was
higher for FAR Part 135 carriers (16%) compared with
FAR Part 121 carriers (2.3%), but accidents associated
with adverse mental or adverse physiological states were
relatively equal across carriers. Again, however, the low
number of occurrences in each of these accident categories
precluded any meaningful comparisons across
calendar year.
[Figure 4. Percentage of aircrew-related accidents associated with skill-based errors (Panel A), decision errors (Panel B), violations (Panel C), and CRM failures (Panel D) across calendar years 1990-96. Each panel plots percentage (0-80) against year; lines represent seven-year averages.]

Supervisory and Organizational Factors

Very few of the NTSB reports that implicated the aircrew
as contributing to an accident also cited some form of
supervisory or organizational failure (see Table 1). Indeed,
only 16% of all aircrew-related accidents
involved some form of either supervisory or organizational
involvement. Overall, however, a larger proportion
of aircrew-related accidents involving FAR Part 135
carriers involved supervisory failures (9.3%) than did
those accidents involving FAR Part 121 carriers (2.3%).
In contrast, a larger proportion of aircrew-related accidents
involving FAR Part 121 carriers involved organizational
factors (20.5%) than did those accidents involving
FAR Part 135 carriers (4.0%).
DISCUSSION
HFACS Comprehensiveness
The HFACS framework was found to accommodate
all 319 causal factors associated with the 119 accidents
involving FAR Parts 121 and 135 scheduled carriers
across the seven-year period examined. This finding
suggests that the error categories within HFACS, originally
developed for use in the military, are applicable
within commercial aviation as well. Still, some of the
error-factors within the HFACS framework were never
observed in this commercial aviation accident database.
For example, no instances of such factors as organizational
climate or personal readiness were observed. In
fact, very few instances of supervisory factors were evident
at all in the data.
One explanation for the scarcity of such factors could
be that, contrary to Reason’s model of latent and active
failures upon which HFACS is based, such supervisory
and organizational factors simply do not play as large a
role in the etiology of commercial aviation accidents as
once expected. Consequently, the HFACS framework
may need to be pared down or simplified for use with
commercial aviation. Another explanation, however, is
that these factors do contribute to most accidents, yet
they are rarely identified using existing accident investigation
processes. Nevertheless, the results of this study
indicate that the HFACS framework was able to capture
all existing causal factors and no new error-categories or
aircrew cause-factors were needed to analyze the commercial
accident data.
HFACS Reliability
The HFACS system was found to produce an acceptable
level of agreement among the investigators who
participated in this study. Furthermore, even after this
level of agreement between investigators was corrected
for chance, the obtained reliability index was considered
“good” by conventional standards. Still, this reliability
index was somewhat lower than those observed in studies
using military aviation accidents which, in some instances,
have resulted in nearly complete agreement
among investigators (Shappell & Wiegmann, 1997b).
One possible explanation for this discrepancy is the
difference in both the type and amount of information
available to investigators across these studies. Unlike the
present study, previous analysts using HFACS to analyze
military accident data often had access to privileged and
highly detailed information about the accidents, which
presumably allowed for a better understanding of the
underlying causal factors and, hence, produced higher
levels of reliability. Another possibility is that the
definitions and examples currently used to describe
HFACS are too closely tied to military aviation and are
therefore somewhat ambiguous to those within a commercial
setting. Indeed, the reliability of the HFACS
framework has been shown to improve within the commercial
aviation domain when efforts are taken to provide
examples and checklists that are more compatible
with civil aviation accidents (Wiegmann, Shappell,
Cristina & Pape, 2000).
HFACS Analysis
Given the large number of accident causal factors
contained in the NTSB database, each accident appeared,
at least on the surface, to be relatively unique. As
such, commonalities or trends in specific error forms
across accidents were not readily evident in the data. Still,
the recoding of the data using HFACS did allow for
similar error-forms and causal factors across accidents to
be identified and the major human causes of accidents to
be discovered.
Specifically, the HFACS analysis revealed that the
highest percentage of all aircrew-related accidents was
associated with skill-based errors. Furthermore, this proportion
was lowest during the last two years of this study,
suggesting that accidents associated with skill-based errors
may be on the decline. To some, the finding that
skill-based errors were frequently observed among the
commercial aviation accidents examined is not surprising
given the dynamic nature and complexity of piloting
commercial aircraft, particularly in the increasingly
congested U.S. airspace. The question remains, however,
as to the driving force behind the possible reduction
in such errors. Explanations could include
improved aircrew training practices or perhaps better
selection procedures. Another possibility might be the
recent transition within the regional commuter industry
from turboprop to jet aircraft. Such aircraft are
generally more reliable and contain advanced automation
to help off-load the attention and memory demands
placed on pilots during flight.
Unfortunately, the industry-wide intervention programs
and other changes that were made during the
1990s were neither systematically applied nor targeted at
preventing specific error types, such as skill-based errors.
Consequently, it is impossible to determine whether all
or only a few of these efforts are responsible for the
apparent decline in skill-based errors. Nevertheless, given
that an error analysis has now been conducted on the
accident data, future intervention programs can be strategically
targeted at reducing skill-based errors. Furthermore,
the effectiveness of such efforts can be objectively
evaluated so that efforts can be either reinforced or
revamped to improve safety. Additionally, intervention
ideas can now also be shared across organizations that
have performed similar HFACS analyses. One example
is the U.S. Navy and Marine Corps, which have recently
initiated a systematic intervention program for addressing
their growing problem with accidents associated with
skill-based errors in the fleet (Shappell & Wiegmann,
2000b). As a result, lessons learned in the military can
now be communicated and shared with the commercial
aviation industry, and vice versa.
The observation that both CRM failures and decision
errors are associated with a large percentage of aircrew-related
accidents is also not surprising, given that these
findings parallel the results of similar HFACS and human
error analyses of both military and civil aviation
accidents (O’Hare et al., 1994; Wiegmann & Shappell,
1999). What is surprising, or at least somewhat disconcerting,
is the observation that both the percentage and
rate of aircrew-related accidents associated with both
CRM and decision errors have remained relatively stable.
Indeed, both the FAA and aviation industry have invested
a great deal of resources into intervention strategies
specifically targeted at improving CRM and
aeronautical decision making (ADM), with apparently
little overall effect.
The modest impact that CRM and ADM programs
have had on reducing accidents may be due to a variety
of factors, including the general lack of systematic
analyses of accidents associated with these problems.
Consequently, most CRM and ADM training programs
use single case studies to educate aircrew, rather
than focus on the fundamental causes of these problems
in the cockpit using a systematic analysis of the
accident data. Another possible explanation for the
general lack of CRM and ADM effectiveness is that
many established training programs involve classroom
exercises that are not followed up by simulator
training that requires CRM and ADM principles to be
applied. More recent programs, such as the Advanced
Qualification Program (AQP), have been developed
to take this next step of integrating ADM and CRM
principles into the cockpit. Given that the current
HFACS analysis has identified the accidents associated
with these problems, at least across a seven-year
period, more fine-grained analyses can be conducted
to identify the specific problem areas in need of
training. Furthermore, the effectiveness of the AQP
program and other ADM training in reducing aircrew
accidents associated with CRM failures and decision
errors can be systematically tracked and evaluated.
The percentage of aircrew-related accidents associated
with violations (e.g., not following federal regulations or
a company’s standard operating procedures) exhibited a
slight increase across the years examined in this study.
Some authors (e.g., Geller, 2000) have suggested that
violations, such as taking short-cuts in procedures or
breaking rules, are often induced by situational factors
that reinforce unsafe acts while punishing safe actions.
Not performing a thorough preflight inspection due to
the pressure to achieve an on-time departure would be
one example. However, according to Reason’s (1990)
model of active and latent failures, such violation-inducing
situations are often set up by supervisory and management
policies and practices.
Such theories suggest that the best strategy for reducing
violations by aircrew is to enforce the rules and to
hold both the aircrew and their supervisors/organizations
accountable. Indeed, this strategy has been effective
with the Navy and Marine Corps in reducing aviation
mishaps associated with violations (Shappell et al., 1999).
Still, as mentioned earlier, very few of the commercial
accident reports examined in this study cited supervisory
or organizational factors as accident causes, suggesting
that more often than not, aircrews were the only ones
responsible for the violations. Again, more thorough
accident investigations may need to be performed to
identify possible supervisory and organizational issues
associated with these events.
Although pilots flying with FAR Part 135 scheduled
carriers had fewer annual flight hours during the years
covered in this study (NTSB, 2000), the overall number
of accidents associated with most error types was generally
higher for FAR Part 135 scheduled carriers, compared
with FAR Part 121 scheduled carriers. This finding
is likely due, at least in part, to the fact that most pilots
flying aircraft operating under FAR Part 135 are younger
and much less experienced. Furthermore, such pilots
often fly less sophisticated and reliable aircraft into areas
that are less likely to be controlled by ATC. As a result,
they may frequently find themselves in situations that
exceed their training or abilities. Such a conclusion is
supported by the findings presented here, since a larger
percentage of FAR Part 135 aircrew-related accidents
were associated with the physical/mental limitations of
the pilot. However, a smaller percentage of FAR Part 135
aircrew accidents were associated with CRM failures,
possibly because some FAR Part 135 aircraft are single-piloted,
which simply reduces the opportunity for
CRM failures.
These differences between FAR Parts 121 and 135
scheduled carriers may be less evident in future aviation
accident data since the federal regulations were changed
in 1997. Such changes require FAR Part 135 carriers
operating aircraft that carry ten or more passengers to
now operate under more stringent FAR Part 121 rules.
Thus, the historical distinction in the database between
FAR Part 135 and 121 operators has become somewhat
blurred in the years extending beyond the current analysis.
Therefore, future human-error analyses and comparisons
across these different types of commercial operations
will need to consider these changes.
SUMMARY AND CONCLUSIONS
This investigation demonstrates that the HFACS
framework, originally developed for and proven in the
military, can be used to reliably identify the underlying
human factors problems associated with commercial
aviation accidents. Furthermore, the results of this study
highlight critical areas of human factors in need of
further safety research and provide the foundation upon
which to build a larger civil aviation safety program.
Ultimately, data analyses such as that presented here will
provide valuable insight aimed at the reduction of aviation
accidents through data-driven investment strategies
and objective evaluation of intervention programs. The
HFACS framework may also prove useful as a tool for
guiding future accident investigations in the field and
developing better accident databases, both of which
would improve the overall quality and accessibility of
human factors accident data.
Still, the HFACS framework is not the only possible
system upon which such programs might be developed.
Indeed, there often appear to be as many human error
frameworks as there are people interested in the topic
(Senders & Moray, 1991). As the need for better
applied human error analysis methods has become more
apparent, an increasing number of researchers have proposed
other comprehensive frameworks similar to HFACS
(e.g., O’Hare, in press). Nevertheless, HFACS is, to date,
the only system that has been developed to meet a specific
set of design criteria, including comprehensiveness, reliability,
diagnosticity, and usability, all of which have
contributed to the framework’s validity as an accident
analysis tool (Shappell & Wiegmann, in press). Furthermore,
HFACS has been shown to have utility as an error-analysis
tool in other aviation-related domains such as
ATC (HFACS-ATC; Pounds, Scarborough, & Shappell,
2000) and aviation maintenance (HFACS-ME; Schmidt,
Schmorrow, & Hardee, 1998), and is currently being
evaluated within other complex systems such as medicine
(currently referred to as HFACS-MD). Finally, it
is important to remember that neither HFACS nor
any other error-analysis tool can “fix” the problems
once they have been identified. Such fixes can only be
derived by those organizations, practitioners and human
factors professionals who are dedicated to improving
aviation safety.
REFERENCES
Bird, F. (1974). Management guide to loss control. Atlanta,
GA: Institute Press.
Fleiss, J. (1981). Statistical Methods for Rates and Proportions.
New York: John Wiley.
Ford, C., Jack, T., Crisp, V. & Sandusky, R. (1999).
Aviation accident causal analysis. Advances in
Aviation Safety Conference Proceedings, (P-343).
Warrendale, PA: Society of Automotive Engineers
Inc.
Geller, E. (March, 2000). Behavioral safety analysis: A
necessary precursor to corrective action. Professional
Safety, 29-32.
International Civil Aviation Organization (1993).
Investigation of human factors in accidents and
incidents (Human Factors Digest #7). Montreal, Canada.
Jones, A. (1988). Climate and measurement of consensus:
A discussion of “organizational climate.” In S.
Cole, R. Demaree, & W. Curtis (Eds.), Applications
of Interactionist Psychology: Essays in Honor of
Saul B. Sells (pp. 283-290). Hillsdale, NJ:
Erlbaum.
National Transportation Safety Board (2000). Aviation
accident statistics. [On-line]. Available:
www.ntsb.gov/aviation/Stats.htm
O’Hare, D. (in press). The Wheel of Misfortune. Ergonomics.
O’Hare, D., Wiggins, M., Batt, R., and Morrison, D.
(1994). Cognitive failure analysis for aircraft accident
investigation. Ergonomics, 37, 1855-69.
Pounds, J., Scarborough, A., & Shappell, S. (2000). A
human factors analysis of Air Traffic Control operational
errors (Abstract). Aviation, Space and
Environmental Medicine, 71, pp. 329
Rasmussen, J. (1982). Human errors: A taxonomy for
describing human malfunction in industrial installations.
Journal of Occupational Accidents, 4,
pp. 311-33.
Reason, J. (1990). Human error. New York: Cambridge
University Press.
Schmidt, J., Schmorrow, D., & Hardee, M. (1998). A
preliminary human factors analysis of Naval Aviation
maintenance related mishaps. Proceedings of
the 1998 Airframe/Engine Maintenance and Repair
Conference (P329), Long Beach, CA.
Senders, J., & Moray, N. (1991). Human error: Cause,
prediction and reduction. Hillsdale, NJ: Erlbaum.
Shappell, S., & Wiegmann, D. (1996). U. S. Naval
Aviation mishaps 1977-92: Differences between
single- and dual-piloted aircraft. Aviation, Space,
and Environmental Medicine, 67, 65-9.
Shappell, S. & Wiegmann D. (1997a). A human error
approach to accident investigation: The taxonomy
of unsafe operations. The International Journal of
Aviation Psychology, 7, pp. 269-91.
Shappell, S. & Wiegmann, D. (1997b). A reliability
analysis of the Taxonomy of Unsafe Operations
(Abstract). Aviation, Space, and Environmental
Medicine, 69, pp. 620.
Shappell, S. & Wiegmann, D. (2000a). The Human
Factors Analysis and Classification System
(HFACS). (Report Number DOT/FAA/AM-00/7).
Washington DC: Federal Aviation Administration.
Shappell, S. & Wiegmann, D. (2000b). Is proficiency
eroding among U.S. Naval aircrews? A quantitative
analysis using the Human Factors Analysis
and Classification System (HFACS). Proceedings
of the 44th meeting of the Human Factors and Ergonomics
Society.
Shappell, S. & Wiegmann, D. (2001). Applying Reason:
The Human Factors Analysis and Classification
System (HFACS). Human Factors and Aerospace
Safety, 1, 59-86.
Shappell, S., Wiegmann, D., Fraser, J., Gregory, G.,
Kinsey, P., & Squier, H. (1999). Beyond mishap
rates: A human factors analysis of U.S. Navy/
Marine Corps TACAIR and rotary wing mishaps
using HFACS (Abstract). Aviation, Space, and
Environmental Medicine, 70, pp. 416-7.
Wiegmann, D. & Shappell, S. (1997). Human factors
analysis of post-accident data: Applying theoretical
taxonomies of human error. The International
Journal of Aviation Psychology, 7, pp. 67-81.
Wiegmann, D. & Shappell, S. (1999). Human error and
crew resource management failures in Naval aviation
mishaps: A review of U.S. Naval Safety Center
data, 1990-96. Aviation, Space, and Environmental
Medicine, 70, pp. 1147-51.
Wiegmann, D., Shappell, S., Cristina, F. & Pape, A.
(2000). A human factors analysis of aviation accident
data: An empirical evaluation of the HFACS
framework (Abstract). Aviation, Space and Environmental
Medicine, 71, pp. 328.
Yacavone, D. W. (1993). Mishap trends and cause
factors in Naval aviation: A review of Naval Safety
Center data, 1986-90. Aviation, Space and Environmental
Medicine, 64, 392-5.

3#
Posted on 2010-5-22 16:28:56
Is there no Chinese version? The English one is really hard to get through.

4#
Posted on 2010-5-28 23:39:47
Taking a look; this is useful.
