Assessing System Safety

1#
Posted 2010-4-6 23:34:54

2#
Posted 2010-4-9 11:55:03
University of Texas at Austin Human Factors Research Project: 257
Helmreich, R.L. (in press). Culture, threat, and error: Assessing system safety. In Safety in Aviation: The Management Commitment: Proceedings of a Conference. London: Royal Aeronautical Society.
CULTURE, THREAT, AND ERROR:
ASSESSING SYSTEM SAFETY
Robert L. Helmreich1
University of Texas Human Factors Research Project
The University of Texas at Austin
Austin, Texas USA
1 Research reported in this paper was supported by Federal Aviation Administration Grants 92-G-017 and 99-G-004.
Abstract
Flightdeck behaviour and flight safety are influenced by the national, organisational, and professional cultures of crewmembers. Natural limitations on human performance and the complexity of the operating environment ensure that error will be an inevitable occurrence. The positive and negative safety implications of the three cultures are described.
Effective action to establish a safety culture in an organisation must be based on data regarding the organisation’s practices and the threats inherent in the operational context. Multiple sources of data are described, including the Line Operations Safety Audit (LOSA) in which expert observers collect data in the cockpit during normal operations. Information collected includes external risks and errors, crew errors, and crew actions to mitigate and manage risk and error. Models of threat and error are presented and are illustrated with data collected during LOSA observations in three airlines. Five types of error are defined: procedural, communication, proficiency, decision making, and intentional non-compliance (violations). The significance of violations for safety is discussed.
The role of Crew Resource Management training (CRM) in organisations is explored. Misconceptions about the definition and evaluation of CRM skills are discussed. The paper concludes with a description of CRM skills as countermeasures against threat and error. Actions necessary for a safety culture are reviewed.
Cultures – the Missing Element
Early CRM programmes and investigations of human error in accidents viewed the cockpit as an isolated universe. With growing sophistication, we now understand that flight operations are part of a complex system that is heavily influenced by cultures. Helmreich and Merritt (Ref 1) have described three intersecting cultures that surround every flight crew – national, organisational, and professional. Geert Hofstede (Ref 2) aptly defines culture as ‘the software of the mind,’ but more technically, culture consists of the shared norms, values, and practices associated with a nation, organisation, or profession. We shall not be concerned with all the facets of national culture, but with those aspects that may influence behaviour in the cockpit.
National Culture. Two related dimensions of national culture identified by Hofstede (Ref 2) have particular relevance for aviation: Individualism-Collectivism (IC) and Power Distance (PD). Those from individualistic, low-PD cultures tend to focus on the self, autonomy, and personal gain, while those from collectivist, high-PD cultures show great concern for the group, harmonious relationships, and deference to leaders. A third relevant dimension has been labelled Rules and Order (Ref 1) and is conceptually similar to Hofstede’s concept of Uncertainty Avoidance. Those high on this attribute believe that rules should not be broken, that written procedures are needed for all situations, and that strict time limits should be observed. Rules and Order has proved to yield large and highly statistically significant differences across national cultures. At the high end, Taiwan and many Asian cultures are the most rule oriented, while the United States, Great Britain, and other current or former members of the British Empire define the other end of the continuum, with much lower concern for rules and written procedures.
Looking at the cockpit environment, national culture influences how juniors relate to seniors, including their willingness to speak up with critical information. It is reflected in the way information is shared. Through Rules and Order, culture influences adherence to SOPs. We have also found, unexpectedly, that culture is strongly associated with liking for automation and attitudes about its appropriate use (Ref 3).
Organisational Culture. Organisations can function within a national culture or can extend across national boundaries. An organisation’s culture reflects its attitudes and policies about human error, the openness of communications between management and flightcrew, and the level of trust between flightcrew and senior management. Organisational culture also influences norms regarding adherence to regulations and SOPs. These factors, along with the willingness to allocate needed resources, are components of an organisation’s safety culture.
Professional Culture. Many professions, including aviation, have strong cultures and develop their own norms and values along with recognisable physical characteristics such as uniforms or badges. The positive aspects of the professional culture are shown in strong motivation to do well and in a high level of professional pride. There is also a negative component that is manifested in a sense of personal invulnerability. In our research we have found that the majority of pilots of all nations agree that their decision making is as good in emergencies as in normal situations, that their performance is not affected by personal problems, that they do not make more errors under high stress, and that they can leave personal problems behind when flying. While the positive aspects of professional culture undoubtedly contribute to aviation’s splendid safety record, the ‘macho’ attitude of invulnerability can lead to risk taking, failure to rely on fellow crewmembers, and error.
Building Safety on the Three Cultures
No national culture is optimal for safety. Cultures that value harmony and teamwork may also endorse autocratic leadership and inhibit input from juniors. Pilots in cultures that do not value adherence to SOPs highly may be creative in dealing with novel situations not covered by procedures. In contrast, belief in strict adherence to rules may be associated with more difficulty in dealing with unforeseen emergencies. Because cultural values are so deeply ingrained, it is unlikely that exhortation, edict, or generic training programmes can modify them. The challenge is to develop organisational initiatives that are congruent with cultures while enhancing safety. We have proposed (Ref 1) that error management can serve as a universally valued goal that can be embraced by individuals from every culture. Under the twin goals of threat and error management, training programmes can be mounted that do not affront cultural values but still lead to desired behaviours. Adherence to SOPs may be increased by concrete examples of error and adverse outcomes associated with violations of rules. An essential component of successful safety initiatives involves changing the sense of personal invulnerability that is associated with pilots’ professional culture. This can be accomplished through training that uses scientific data to demonstrate concretely the limitations of human performance and the inevitability of error. All of these initiatives aid formation of a safety culture and all require accurate information about the national, professional, and organisational culture and customary operational practices.
CRM as Defence Built on Data
Crew Resource Management (CRM) programmes are the logical vehicle for training in safety-related behaviours. CRM in practice provides one of the primary lines of defence against external threat and crew error. Effective CRM training is data driven and curricula are built on knowledge about the cultures that surround the pilot force and the behavioural norms and practices in line operations. There are five critical sources of data that inform training, operations, and safety. Each source provides partial information on some aspect of flight operations. They are: 1) Data from formal evaluations of performance in training and on the line; 2) Incident reports; 3) Surveys of flightcrew perceptions of safety and human factors; 4) Flight Operations Quality Assurance
(FOQA) programmes using flight data recorders to provide information on flight parameters (it should be noted that FOQA data provide a reliable indication of what happens, but not why it happens); and 5) Line Operations Safety Audits (LOSA). I will focus on what we have learned from LOSA.
The nature and value of LOSA. Line Operations Safety Audits are programmes that use expert observers to collect data about crew behaviour and situational factors on normal flights. They are conducted under strict non-jeopardy conditions, meaning that no crews are at risk for observed actions.2 Observers code observed threats to safety and how they are addressed, errors and their management, and specific behaviours that have been associated with accidents and incidents (and that form the basis for contemporary CRM training). Data are collected using the University of Texas Line/LOS Error Checklist (Ref 4). Data from LOSA provide a valid picture of system operations that can guide organisational strategy in safety, operations, and training. A particular strength of LOSA is that the process identifies examples of superior performance that can be reinforced and used as models for training. LOSA data are proactive and can be used immediately to prevent adverse events. The University of Texas project has participated in eight audits with a total of over 3,500 flights observed. In this paper, data from the three most recent audits, which include threat recognition and error management, are discussed. These three LOSA projects covered both U.S. and international operations and involved two U.S. carriers and one non-U.S. carrier.
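As a rough illustration of the kind of record a LOSA observer produces, the sketch below (in Python) models a single observed flight segment. The field names are hypothetical and simply echo the categories described above; the actual instrument is the Line/LOS Error Checklist (Ref 4).

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ThreatEvent:
        description: str   # e.g. "adverse weather on arrival"
        expected: bool     # known before departure, or arising unexpectedly
        managed: bool      # whether the crew recognised and addressed it

    @dataclass
    class ErrorEvent:
        error_type: str    # one of the five types defined later in the paper
        response: str      # "trap", "exacerbate", or "fail to respond"
        outcome: str       # "inconsequential", "undesired aircraft state", "additional error"

    @dataclass
    class LosaObservation:
        segment: str                   # de-identified flight segment label
        threats: List[ThreatEvent] = field(default_factory=list)
        errors: List[ErrorEvent] = field(default_factory=list)
        crm_markers: List[str] = field(default_factory=list)  # behavioural markers observed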
A Model of Threat and Error Management
Data are most valuable when they fit within a theoretical or conceptual framework. Our research group has developed a general model of threat and error management in aviation that is shown below in Figure 1.
2 In practice, members of the University of Texas project have trained observers from participating airlines and also serve as observers. Their presence across all organisations allows us to make valid cross-airline comparisons.

[Figure 1. A model of threat and error management in aviation: external threats (expected and unexpected events and risks, external error) and internal, crew-based errors are met by CRM behaviours for threat recognition, error avoidance, and error detection and response; outcomes are a safe flight, recovery to a safe flight, additional error, or incidents/accidents.]
As the model indicates, risk comes from both expected and unexpected threats. Expected threats include such factors as terrain, predicted weather, and airport conditions, while unexpected threats include ATC commands, system malfunctions, and operational pressures.
Risk can also be increased by errors made outside the cockpit, for example, by ATC, maintenance, and dispatch. External threats are countered by the defences provided by CRM behaviours. When successful, these lead to a safe flight. The response by the crew to recognised external threat or error might be an
error, reinstating the cycle of error detection and response. In addition, crews themselves may err in the absence of any external precipitating factor. Again, CRM behaviours stand as the last line of defence. If the CRM defences are successful, error is managed and there is recovery to a safe flight. If the defences are breached, the result may be additional crew error or an incident or accident.
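Read as pseudocode, the cycle the model describes can be sketched roughly as follows; the function name and probabilities are invented purely for illustration and stand in for the crew behaviours in Figure 1, not for any empirical estimate.

    import random

    def crm_defences_succeed() -> bool:
        # Placeholder for whether CRM behaviours counter a given threat or
        # error; the probability is illustrative only.
        return random.random() < 0.9

    def manage(event: str, depth: int = 0) -> str:
        # One pass through the Figure 1 cycle: a successful defence returns
        # the flight to a safe state; a breached defence yields either an
        # additional error (re-entering the cycle) or an incident/accident.
        if depth > 10:  # guard against unbounded recursion in the sketch
            return "incident/accident"
        if crm_defences_succeed():
            return "recovery to a safe flight"
        if random.random() < 0.8:
            return manage("additional error", depth + 1)
        return "incident/accident"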
In our three-airline database, 72% of all flights faced one or more threats, with a range of 0 to 11. The average was two threats per flight. The most common threats encountered were difficult terrain on 58% of flights, adverse weather on 28%, aircraft malfunctions on 15%, unusual ATC commands on 11%, external errors (including ATC, maintenance, ground handlers, etc.) on 8%, and operational pressures on 8%. Threats occurred most frequently in the descent, approach and landing phase (40%).
A Model of Flightcrew Error Management
Errors made within the cockpit have received the most attention from safety investigations, and crew error has been implicated in around two-thirds of air crashes (Ref 5).3 Our analyses of error have led us to reclassify and redefine error in the aviation context. Operationally, flightcrew error is defined as action or inaction that leads to deviation from crew or organisational intentions or expectations. Our definition classifies five types of error: 1) Intentional non-compliance errors, or violations of SOPs or regulations. Examples include omitting required briefings or checklists; 2) Procedural errors, in which the intention is correct but the execution flawed. These include the usual slips, lapses, and mistakes, such as incorrect data entries or flying the wrong heading; 3) Communication errors, which occur when information is incorrectly transmitted or interpreted. Examples include incorrect readback to ATC or communicating the wrong course to the other pilot; 4) Proficiency errors, which indicate a lack of knowledge or of stick-and-rudder skill; and 5) Operational decision errors, in which crews make a discretionary decision that unnecessarily increases risk. Examples include extreme manoeuvres on approach, choosing to fly into adverse weather, or over-reliance on automation.
3 Early investigations tended to focus on the crew as the sole causal factor. Today, of course, we realise that almost all accidents are system accidents, as discussed by Helmreich & Foushee (Ref 5) and Reason (Ref 9).
Response to error and error outcomes. Three responses to crew error are identified: 1) Trap – the error is detected and managed before it becomes consequential or leads to additional error; 2) Exacerbate – the error is detected but the crew’s action or inaction leads to a negative outcome; 3) Fail to respond – the crew fails to react to the error either because it is undetected or is ignored.
Definition and classification of errors and responses are based on the observable process without consideration of the outcome. There are three outcomes: 1) Inconsequential – the error has no effect on the safe completion of the flight. This is the modal outcome, a fact that demonstrates the robust nature of the aviation system; 2) Undesired aircraft state – the error results in the aircraft being in a condition that increases risk. This includes incorrect vertical or lateral navigation, unstable approaches, low fuel state, and hard or otherwise improper landings. A landing on the wrong runway, at the wrong airport, or in the wrong country would be classified as an undesired aircraft state. 3) Additional error – the response to error, as we have noted, can result in an additional error that again initiates the cycle of response.
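The taxonomy just defined amounts to three small enumerations. A minimal sketch in Python, with the labels taken directly from the definitions above:

    from enum import Enum

    class ErrorType(Enum):
        INTENTIONAL_NONCOMPLIANCE = "violation of SOPs or regulations"
        PROCEDURAL = "correct intention, flawed execution"
        COMMUNICATION = "information incorrectly transmitted or interpreted"
        PROFICIENCY = "lack of knowledge or stick-and-rudder skill"
        OPERATIONAL_DECISION = "discretionary decision that unnecessarily increases risk"

    class ErrorResponse(Enum):
        TRAP = "detected and managed before it becomes consequential"
        EXACERBATE = "detected, but action or inaction leads to a negative outcome"
        FAIL_TO_RESPOND = "undetected or ignored"

    class ErrorOutcome(Enum):
        INCONSEQUENTIAL = "no effect on safe completion of the flight"
        UNDESIRED_AIRCRAFT_STATE = "aircraft in a condition that increases risk"
        ADDITIONAL_ERROR = "initiates a new cycle of error and response"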
Undesired aircraft states can be mitigated, exacerbated, or ignored. For example, recognising an unstable approach and going around would mitigate the situation. Crew actions may instead exacerbate the situation, increasing the severity of the state and the level of risk. Just as with error response, there can also be a failure to respond to the situation. There are three possible resolutions of the undesired aircraft state: 1) Recovery, indicating that the risk has been eliminated; 2) Additional error, where the actions initiate a new cycle of error and management; and 3) Crew-based incident or accident. The full error management model is shown graphically in Figure 2.
The model aids analysis of all aspects of error, response, and outcome. The failure or success of defences such as CRM behaviours can also be evaluated. Errors thus classified can be used not only to guide organisational response but also as scenarios for training, either in the classroom or in LOFT.

[Figure 2. A model of flightcrew error management: the five error types lead to error responses (trap, exacerbate, fail to respond) and error outcomes (inconsequential, additional error, undesired state); undesired state responses (mitigate, exacerbate, fail to respond) resolve to recovery, additional error, or incident/accident.]
Error Results from LOSA
Examination of the aggregate data from the first three LOSAs in which error was measured is instructive (Ref 6). Errors were committed by 73% of the crews observed. The range of errors on any flight was from zero to fourteen, with an average of two per flight. As Figure 3 illustrates, the most frequent type of error in the data from the three-airline study was intentional non-compliance (violations), followed by procedural errors. The high percentage of intentional non-compliance is alarming and will be discussed later. Procedural errors doubtless have multiple causes: they can result from the inherent limitations of humans accomplishing difficult tasks, often under high-workload conditions, or they may indicate that the procedures themselves are sub-optimal.

[Figure 3. Distribution of error types, as a percentage of all errors observed: violations, procedural, communications, operational decision, proficiency.]
Of all the errors observed, 18% were trapped, 5% were exacerbated, and 77% elicited no response. Error types differ in whether or not they become consequential; by consequential, we mean resulting in an additional error or an undesired aircraft state. While proficiency and operational decision errors are least common, they are the most likely to become consequential, as shown in Table 1.
Error Type                   Consequential
Intentional non-compliance   2%
Procedural                   23%
Communication                11%
Proficiency                  69%
Decision                     51%

Table 1. Percentage of each error type becoming consequential.
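Rates like those in Table 1 are simple to derive once errors have been coded. A minimal sketch, assuming records shaped like the ErrorEvent fields above; the sample data are illustrative only, not the study's:

    from collections import defaultdict

    def consequential_rates(errors):
        # `errors` is an iterable of (error_type, outcome) pairs; any outcome
        # other than "inconsequential" counts as consequential.
        totals = defaultdict(int)
        consequential = defaultdict(int)
        for error_type, outcome in errors:
            totals[error_type] += 1
            if outcome != "inconsequential":
                consequential[error_type] += 1
        return {t: consequential[t] / totals[t] for t in totals}

    sample = [
        ("procedural", "inconsequential"),
        ("procedural", "undesired aircraft state"),
        ("proficiency", "additional error"),
    ]
    print(consequential_rates(sample))  # {'procedural': 0.5, 'proficiency': 1.0}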
Of particular significance is the fact that there are very large differences in threat, error, and percentage of errors becoming consequential between fleets within airlines and between airlines as shown in Tables 2 and 3.
This finding indicates the importance of airlines determining the status of their own operations rather than assuming that their organisation conforms to some industry standard.
Threat and Error                     Airline A   Airline B   Airline C
Threats per segment                  3.3         2.5         0.4
Errors per segment                   0.86        1.9         2.5
Error management (% consequential)   18%         25%         7%

Table 2. Threats and errors in three airlines.
Aircraft Type     Intentional non-compliance   Procedural
Advanced Tech 1   40%                          31%
Advanced Tech 2   30%                          44%
Old Tech 1        17%                          55%
Old Tech 2        53%                          20%

Table 3. Percentages of error types within fleets in one airline.
Variability in error and error management can result from differences in the operating environment as well as from differing organisational cultures and subcultures. Note in particular the range in the percentage of errors that became consequential, terminating in an undesired aircraft state. There is certainly differential risk in the three organisations. There are some common factors across the three airlines, but the data that follow are presented primarily to illustrate the type of information that can be obtained and its utility for safety.
Phase of flight is also strongly associated with the occurrence of errors and their consequences. Consistent with analyses of world-wide approach and landing accidents (Ref 7), the highest percentage of errors occurs during this phase of flight with the highest percentage becoming consequential. Clearly, special attention should be directed toward enhancing performance in this phase. The distribution of errors by phase of flight and their consequences is shown in Table 4.
Phase of Flight            Errors   Consequential
Preflight/Taxi             23%      8%
Takeoff/Climb              24%      12%
Cruise                     12%      15%
Descent/Approach/Landing   41%      22%

Table 4. Percent of errors and their consequences by phase of flight.
Specific errors. Although a wide array of error classifications was observed, some major problem areas emerged. Earlier audits pointed to the appropriate use of automated systems as an industry-wide problem (Ref 3, Ref 8). Consistent with these findings, the most frequent classification of error in LOSA involved the operation of automated systems (mode control panel and flight management computer). Errors included wrong settings, wrong modes, and failure to verify settings, along with numerous others. Overall, these accounted for 31% of all errors. The second highest classification (24%) was checklist errors, such as non-standard terminology, procedural errors, performance from memory, and failure to use required challenge-and-response methods. The third highest category consisted of sterile cockpit violations, accounting for 13%. Fourth highest, at 8%, were ATC-related crew errors such as missed calls, omitted information, and accepting instructions that unnecessarily increase risk (e.g., ‘slam dunk’ approaches). The fifth highest category (5%) consisted of briefing errors, such as failure to conduct required briefings or leaving out required information. The remainder of the errors fell into a variety of categories.
Violations Matter
The high proportion of intentional non-compliance errors found in the data dismayed us and the management at collaborating airlines. Several points regarding these violations should be considered. First, as we have noted, there were very large differences between airlines and between fleets within airlines. Hence, one cannot generalise from these data about the overall frequency of violations in the global aviation system. This point is further emphasised by the fact that the three carriers included in the study all came from countries that scored very low on commitment to rules, as measured by our Rules and Order scale (Ref 1). It would be incorrect to assume that pilots from other cultures, especially those high in adherence to procedures, would be equally cavalier in disregarding formal rules. (Our data indicate that pilots from countries that are current or former members of the British Empire score lowest on adherence to rules.) On the other hand, the universal pilot belief in personal invulnerability may foster a generally cavalier disregard for rules. The fact that many rules are broken does not imply that violating pilots have a death wish or contempt for formal requirements. One must also consider the possibility that the proliferation of regulations has created a contradictory, unwieldy, and inefficient operating environment that invites violations (Ref 9).
Although many violations may be committed with the good intention of increasing operational efficiency, organisations cannot and should not tolerate disregard for established procedures. There are several compelling reasons for this. One is, of course, that standardisation of operations cannot be achieved with idiosyncratic adherence to procedures. There is also compelling evidence of the threat to safety associated with violations. First, a Flight Safety Foundation analysis of global approach and landing accidents found that more than 40% involved violations of SOPs (Ref 7). Second, analyses of LOSA data indicate that those who commit intentional non-compliance errors commit 25% more errors of other types. Although the percentage of violations that became consequential was low, it can be concluded that violators place flights at greater risk because of their general propensity to err. Further analyses may give us greater insight into the nature of this relationship.
CRM as Countermeasures
One of the most informative aspects of LOSA data is the ability to link threat recognition and error management with the specific behavioural markers that form the core of CRM. These markers emerge very clearly in observer ratings of the actions taken by effective crews. Those who deal proactively with threat and error exhibit the following behaviours:
• Active Captain leadership
• Briefing known threats
• Asking questions, speaking up
• Decisions made and reviewed
• Operational plans clearly communicated
• Preparing/planning for threats
• Distributing workload and tasks
• Vigilance through monitoring and challenging
Leadership is an overarching behaviour that governs interaction on the flightdeck. Although much of the attention in early CRM programmes was directed toward overcoming the effects of autocratic captains who fail to solicit or accept critical information from junior crewmembers, many current problems seem to be associated with weak leadership and the abdication of authority. While the importance of the identified markers is not surprising, the results do provide important
validation of the value of CRM-related behaviours.
Revisiting CRM
The relevance of core CRM behaviours to threat and error avoidance and management is important at a time when some have asserted that CRM is a failed concept. The contention that CRM has failed its mission because accidents still occur is based on a profound misunderstanding of human capabilities and limitations. By their very nature, humans are inevitably prone to error, and no training, however sophisticated or intensive, can change human nature. Humans will make errors, and accidents and incidents will occur in complex systems.
Although CRM programmes were clearly rooted in efforts to reduce ‘pilot error’ accidents, over the years understanding of the goals of programmes has faded, perhaps in part because of the extension of the training to flight attendants and other personnel (Ref 10). An indication of this misunderstanding can be seen in the work of scientists in the United States who recently defined CRM as ‘Instructional strategies that seek to improve teamwork in the cockpit.’ While effective teamwork is clearly important, it is not the primary goal of CRM training. The following is a more accurate representation of current, effective CRM programmes: CRM consists of the effective utilisation of all available human, informational, and equipment resources toward the goal of safe and efficient flight. More specifically, it is the active process employed by crewmembers to identify existing and potential threats and to develop, communicate, and implement plans and actions to avoid or mitigate perceived threats. CRM also supports the avoidance, management, and mitigation of human errors. The secondary benefits of effective CRM programmes are improved morale and enhanced efficiency of operations.
Criticisms of CRM often fail to recognise the variability in programmes: some are carefully designed and reflect their organisation’s culture, while others are mere exercises in compliance with requirements. Good programmes do have a measurable, positive effect on crew performance and, hence, on safety (Ref 5).
CRM has become a source of contention in the United States recently because some members of pilots’ organisations object to evaluation of CRM, wanting to limit evaluation to technical aspects of flying. Fear of inequitable evaluation is certainly a legitimate concern of pilots whose livelihoods may be threatened. However, contemporary CRM programmes focus on specific and well-defined behaviours. Those behaviours chosen for evaluation in the European Union (which are closely related to the ‘Behavioural Markers’ defined by our research group) are objective and observable (Ref 11).
Some have argued that CRM should ultimately disappear as it becomes fully integrated into technical training. We once supported this notion, but with hindsight we now realise that it is and should be a separate aspect of training. CRM falls at the interface between safety departments, flight training, and flight operations. CRM programmes represent ongoing training driven by objective data illustrating operational issues. CRM is not a one-time intervention, but rather a critical and continuing component of a safety culture.
Data, CRM and Safety Culture
The analysis of data from a variety of sources (training evaluations, incident reports, surveys, and LOSA) aids organisations in diagnosing and understanding their culture and its subcultures. Without an understanding of their own cultures, organisations cannot mount effective programmes to optimise them. Data on how crews deal with threat and avoid and manage error help organisations develop and maintain a safety culture. The LOSA data set can be used by management to set priorities based on the threats that crews face and how effectively they respond to those threats.
LOSA data, in particular, are of enormous value because they are proactive and allow organisations to take appropriate action before accidents and incidents occur. Proactive interventions are a defining characteristic of an effective safety culture.
Data also identify critical areas for ongoing CRM training. However, as noted earlier, CRM is not a universal panacea for safety problems in the aviation system. Accidents and incidents almost always have multiple roots and many cannot be eliminated by training alone.
The two models are useful for training by illustrating concretely the sources of risk and the processes of risk avoidance and error management. The models and the data also demonstrate clearly how the core behaviours of CRM serve risk avoidance and error management. Engaging the models and real data in the training process should increase the acceptance and impact of training. Training should not simply espouse the avoidance of error, which is an impossibility for humans. Rather, it should focus on strategies to reduce the consequences of errors and to mitigate undesired aircraft states. By examining the types of errors committed and the nature of effective and ineffective responses, resources can be deployed more effectively.
Intentional non-compliance errors should signal the need for action since no organisation can function safely with widespread disregard for its rules and procedures. One implication of violations is a culture of complacency and disregard for rules, which calls for strong leadership and positive role models. Another possibility is that procedures themselves are poorly designed and inappropriate, which signals the need for review and revision. More likely, both conditions prevail and require multiple solutions. One carrier participating in LOSA has addressed both with considerable success.
Procedural errors may reflect inadequately designed SOPs or the failure to employ basic CRM behaviours such as monitoring and cross checking as countermeasures against error. The data themselves can help make the case for the value of CRM. Similarly, many communications errors can be traced to inadequate practice of CRM, for example in verifying ATC communications or confirming interactions with automation.
Proficiency errors can indicate the need for more extensive training before pilots are released to the line. LOSA thus provides another checkpoint for the training department in calibrating its programmes, by revealing issues that may not have generalised from the training setting to the line.
Decision errors also signal inadequate CRM as crews may have failed to exchange and evaluate perceptions of threat in the operating environment. They may also be a result of the failure to revisit and review decisions made.
The value of data showing operational areas of strength must be recognised. Training based on positive examples is superior to that based on negatives. It is important for organisations to recognise those things they do particularly well and to reinforce them.
Organisations nurturing a safety culture must deal with the issues identified by LOSA and other data sources. Interventions may include revising procedures, changing the nature and scope of technical training, changing scheduling and rostering practices, establishing or enhancing a safety department, and a variety of other actions (Ref 12).
Summary
There are basic steps that every organisation needs to follow to establish a proactive safety culture that is guided by the best possible data on its operations.
These include the following:
• Establish trust
• Adopt a credible, non-punitive policy toward error (not violation)
• Demonstrate commitment to taking action to reduce error-inducing conditions
• Collect ongoing data that show the nature and types of errors occurring
• Provide training in threat and error management strategies for crews
• Provide training in evaluating and reinforcing threat and error management for instructors and evaluators
Trust is a critical element of a safety culture, since it is the lubricant that enables free communication. It is gained by demonstrating a non-punitive attitude toward error (but not violations) and by showing in practice that safety concerns are addressed. Data collection to support the safety culture must be ongoing, and findings must be widely disseminated.4 CRM training must make clear the ultimate goals of threat and error management. Ancillary benefits such as improved teamwork and morale are splendid, but not the driving force. Finally, instructors and check airmen need special training both in evaluating and reinforcing the concepts and in relating them to specific behaviours.
If all of the needed steps are followed and management’s credibility is established, a true safety culture will emerge and the contribution of CRM to safety will be recognised.
As a final note, I would add that our models of threat and error are quite consistent with James Reason’s justifiably celebrated Swiss cheese model (Ref 13). Our models are embedded in the aviation system and defined to highlight actions taken to manage both threat and error and to delineate multiple outcomes.
4 One airline had its LOSA report bound and placed copies in every aircraft as well as every base for crews to peruse.
References
1. Helmreich, R.L., & Merritt, A.C. (1998). Culture at work in aviation and medicine: National, organisational, and professional influences. Aldershot, U.K.: Ashgate.
2. Hofstede, G. (1980). Culture’s Consequences: International Differences in Work-Related Values. Beverly Hills, CA: Sage.
3. Sherman, P.J., Helmreich, R.L., & Merritt, A.C. (1997). National culture and flightdeck automation: Results of a multination survey. International Journal of Aviation Psychology, 7, 311-329.
4. Helmreich, R.L., Klinect, J.R., Wilhelm, J.A., & Jones, S.G. (1999). The Line/LOS Error Checklist, Version 6.0: A checklist for human factors skills assessment, a log for off-normal external events, and a worksheet for cockpit crew error management. University of Texas Aerospace Crew Research Project Technical Report 99-01.
5. Helmreich, R.L., & Foushee, H.C. (1993). Why Crew Resource Management? Empirical and theoretical bases of human factors training in aviation. In E. Wiener, B. Kanki, & R. Helmreich (Eds.), Cockpit Resource Management (pp. 3-45). San Diego, CA: Academic Press.
6. Klinect, J.R., & Wilhelm, J.A. (1999). Human error in line operations: Data from line audits. In Proceedings of the Tenth International Symposium on Aviation Psychology, The Ohio State University.
7. Khatwa, R., & Helmreich, R. (1998). Analysis of critical factors during approach and landing in accidents and normal flight. Flight Safety Digest, 17, 1-256.
8. Helmreich, R.L., Hines, W.E., & Wilhelm, J.A. (1996). Crew performance in advanced technology aircraft: Observations in 4 airlines. The University of Texas Aerospace Crew Research Project Technical Report 96-8.
9. Reason, J. (1997). Managing the Risks of Organisational Accidents. Aldershot, U.K.: Ashgate.
10. Helmreich, R.L., Merritt, A.C., & Wilhelm, J.A. (1999). The evolution of Crew Resource Management in commercial aviation. International Journal of Aviation Psychology, 9, 19-32.
11. van Avermaete, J.A.G., & Kruijsen, E.A.C. (Eds.). (1998). NOTECHS: The evaluation of non-technical skills of multi-pilot aircrew in relation to the JAR-FCL requirements. National Aerospace Laboratory (NL), Deutsches Zentrum für Luft- und Raumfahrt (DLR), Laboratoire IMASSA (FR), & University of Aberdeen (GB).
12. Helmreich, R.L., Wilhelm, J.A., Klinect, J.R., & Merritt, A.C., (in press). Error and resource management across organizational, professional, and national cultures. In E. Salas, C.A. Bowers, & E. Edens (Eds.), Applying resource management in organizations: A guide for training professionals. Princeton, NJ: Erlbaum.
13. Reason, J. (1990). Human error. New York: Cambridge University Press.
University of Texas Crew Research Project Website:
www.psy.utexas.edu/psy/helmreich/nasaut.htm

3#
Posted 2010-4-28 23:31:29

Reply to 航空's post (1#)

Thanks for sharing!!

4#
Posted 2010-5-10 10:42:53

Thanks for sharing!!!

5#
Posted 2010-7-12 12:58:49

It would be great if this were in Chinese.

