Doc 9906-AN/472
THE QUALITY ASSURANCE MANUAL FOR FLIGHT PROCEDURE DESIGN
VOLUME 2 – FLIGHT PROCEDURE DESIGNER TRAINING
(Development of Flight Procedure Designer Training Programme)

NOTICE TO USERS

This document is an unedited advance version of an ICAO publication as approved, in principle, by the Secretary General, which is made available to the public for convenience. The final edited version may still undergo alterations in the process of editing. Consequently, ICAO accepts no responsibility or liability of any kind should the final text of this publication be at variance with that appearing here.

Advance edition (unedited)

RECORD OF AMENDMENTS AND CORRIGENDA
The issue of amendments is announced regularly in the ICAO Journal and in the supplement to the Catalogue of ICAO Publications and Audio-visual Training Aids, which holders of this publication should consult. The space below is provided to keep a record of such amendments.

AMENDMENTS

No.    Date applicable    Date entered    Entered by

CORRIGENDA

No.    Date of issue    Date entered    Entered by
PREFACE

The Quality Assurance Manual for Flight Procedure Design (Doc 9906) consists of four volumes:

Volume 1 – Flight Procedure Design Quality Assurance System;
Volume 2 – Flight Procedure Designer Training;
Volume 3 – Flight Procedure Design Software Validation; and
Volume 4 – Flight Procedures Design Construction.

Instrument flight procedures based on conventional ground-based navigational aids have always demanded a high level of quality control. However, with the implementation of area navigation and associated airborne database navigation systems, even small errors in data could lead to catastrophic results. This significant change in data quality requirements (accuracy, resolution and integrity) has led to the requirement for a systemic quality assurance process (often part of a State Safety Management System). The Procedures for Air Navigation Services — Aircraft Operations (PANS-OPS, Doc 8168), Volume II, Part I, Section 2, Chapter 4 (Quality assurance), refers to this manual and requires that the State take measures to ‘control’ the quality of the processes associated with the construction of instrument flight procedures. To this end, this manual has been assembled to provide guidance on meeting these stringent requirements for quality assurance in the procedure design process.

All four volumes address crucial areas related to the attainment, maintenance and continual improvement of procedure design quality. Data quality management, procedure designer training and validation of software are all integral elements of a quality assurance programme.

Volume 1 – Flight Procedure Design Quality Assurance System provides guidance for quality assurance in the procedure design process, including procedure design documentation, verification and validation methods, and guidelines for the acquisition and processing of source information/data. It also provides a generic process flow diagram for the design and implementation of flight procedures.

Volume 2 – Flight Procedure Designer Training provides guidance for the establishment of a flight procedure designer training programme. Training is the starting point for any quality assurance programme.

Volume 3 – Flight Procedure Design Software Validation provides guidance for the validation (not certification) of procedure design tools, notably with regard to criteria.

Volume 4 – Flight Procedures Design Construction (to be incorporated later).

Note.— The volumes are independent; in this document, a reference to the term "manual" without any further specification refers to this volume of the Quality Assurance Manual for Flight Procedure Design.

TABLE OF CONTENTS
PREFACE
TABLE OF CONTENTS
ABBREVIATIONS
DEFINITIONS
FOREWORD

CHAPTER 1 – INTRODUCTION
    1.1 General
    1.2 Target audience of the manual
    1.3 Goal of the manual
    1.4 Structure of the manual
    1.5 How to use the manual
    1.6 Use of automation

CHAPTER 2 – GENERAL PROVISIONS FOR COMPETENCY-BASED TRAINING AND ASSESSMENT
    2.1 Introduction
    2.2 Competency-based approach to training and assessment
    2.3 The competency framework
    2.4 Skills, knowledge and attitudes (SKA)
    Attachment A to Chapter 2 – Sample Evidence and Assessment Guide
    Attachment B to Chapter 2 – Procedure Design Process Flow Diagram

CHAPTER 3 – DESIGNING CURRICULUM
    3.1 Introduction
    3.2 Training phases
    3.3 Determining the prerequisite skills, knowledge and attitudes
    3.4 Process to derive training objectives from the competency framework
    3.5 Process of sequencing objectives and organizing modules of training
    3.6 Developing mastery tests
    3.7 Considerations in designing modules and course materials
    Attachment A to Chapter 3 – Example of a flight procedure designer training programme
    Attachment B to Chapter 3 – Test selection criteria

CHAPTER 4 – INSTRUCTOR COMPETENCIES
    4.1 Flight procedure design instructor competencies

CHAPTER 5 – VALIDATION AND POST-TRAINING EVALUATION OF FLIGHT PROCEDURE DESIGNER TRAINING
    5.1 Introduction
    5.2 Purpose of evaluation
    5.3 Evaluation approach
    5.4 Level 1: Evaluation of trainee reaction
    5.5 Level 2: Evaluation of trainee mastery learning
    5.6 Level 3: Evaluation of on-the-job performance
    5.7 Level 4: Evaluation of results/impact
    Attachment A to Chapter 5 – Course module opinion sample survey
    Attachment B to Chapter 5 – Course validation sample survey
ABBREVIATIONS

AIP - Aeronautical Information Publication
AIRAC - Aeronautical information regulation and control
AIS - Aeronautical Information Service
ANSP - Air Navigation Service Provider
ARINC - Aeronautical Radio, Inc.
ARP - Aerodrome reference point
ATC - Air traffic control
ATM - Air traffic management
ATS - Air traffic services
CAA - Civil Aviation Authority
CAT I/II/III - Category of approach
CNF - Computer navigation fix
CRC - Cyclic redundancy check
CTA - Control area
DA(H) - Decision altitude (height)
DME - Distance measuring equipment
ETRF 89 - European Terrestrial Reference Framework 1989
EUROCAE - European Organization for Civil Aviation Equipment
FAA - Federal Aviation Administration
FAF - Final approach fix
FAS - Final approach segment
FMS - Flight management system
FPAP - Flight path alignment point
FPCP - Flight path control point
GBAS - Ground-based augmentation system
GNSS - Global navigation satellite system
GRS 80 - Geodetic Reference System 1980
IAF - Initial approach fix
ICAO - International Civil Aviation Organization
IF - Intermediate fix
IFR - Instrument flight rules
ILS - Instrument landing system
LNAV - Lateral navigation
LTP - Landing threshold point
MAHF - Missed approach holding fix
MAPt - Missed approach point
MDA(H) - Minimum descent altitude/height
MLS - Microwave landing system
MSL - Mean sea level
NAD 83 - North American Datum 1983
NDB - Non-directional radio beacon
NM - Nautical mile
NOTAM - Notice to airmen
OCA(H) - Obstacle clearance altitude/height
PDSP - Procedure design service provider
RNAV - Area navigation (also, random area navigation)
RNP - Required navigation performance
RTCA - RTCA (formerly Radio Technical Commission for Aeronautics)
SARPs - ICAO Standards and Recommended Practices
SID - Standard instrument departure
SKA - Skills, knowledge, attitudes
STAR - Standard terminal arrival
TAA - Terminal arrival area
TACAN - UHF tactical air navigation aid
VFR - Visual flight rules
VNAV - Vertical navigation
VOR - Very high frequency omnidirectional radio range
VORTAC - Combination VOR and TACAN
WGS-84 - World Geodetic System 1984
WP - Waypoint

DEFINITIONS
When the following terms are used in this document, they have the following meanings.

Accuracy. The degree of conformance between the estimated or measured value and its true value.

Aerodrome. A defined area on land or water (including any buildings, installations and equipment) intended to be used either wholly or in part for the arrival, departure and surface movement of aircraft.

Aerodrome data. Data relating to an aerodrome, including the dimensions, co-ordinates, elevations and other pertinent details of runways, taxiways, buildings, installations, equipment, facilities and local procedures.

Aeronautical data. Data relating to aeronautical facts such as, inter alia, airspace structure, airspace classifications (controlled, uncontrolled, Class A, B, C ... F, G), name of controlling agency, communication frequencies, airways/air routes, altimeter transition altitudes/flight levels, colocated instrument procedure (and its airspace as assessed by design criteria), area of magnetic unreliability, and magnetic variation.

AIRAC. An acronym for aeronautical information regulation and control, signifying a system aimed at advance notification, based on common effective dates, of circumstances that necessitate significant changes in operating practices.

Air traffic management (ATM). A generic term relating to the management of ATS.

Air traffic services (ATS). A generic term meaning variously, flight information service, alerting service, air traffic advisory service, and air traffic control service (area control service, approach control service or aerodrome control service).

Cartographic map. A representation of a portion of the Earth, its culture and relief, with properly referenced terrain, hydrographic, hypsometric and cultural data depicted on a sheet of paper.

Civil Aviation Authority (CAA). The relevant aviation authority designated by the State responsible for providing air traffic services in the airspace concerned; sometimes referred to as the ‘State Authority’.

Competency. A combination of skills, knowledge and attitudes required to perform a task to the prescribed standard.

Competency-based training and assessment. Training and assessment that are characterized by a performance orientation, emphasis on standards of performance and their measurement, and the development of training to the specified performance standards.

Competency element. An action that constitutes a task, has a triggering event and a terminating event that clearly define its limits, and has an observable outcome.

Competency framework. A competency framework consists of competency units, competency elements, performance criteria, an evidence and assessment guide, and a range of variables. Competency units, competency elements and performance criteria are derived from job and task analyses of procedure designers and describe observable outcomes.

Competency unit. A discrete function consisting of a number of competency elements.

Datum. Any quantity or set of quantities that may serve as a reference or basis for the calculation of other quantities (ISO 19104*).

Digital elevation model (DEM). The representation of a portion of the Earth’s surface by continuous elevation values at all intersections of a defined grid, referenced to a common datum.

Note.— A digital terrain model (DTM) is sometimes referred to as a DEM.

Enabling objective. A training objective derived from performance criteria in the competency framework. In order to achieve enabling objectives, a trainee requires skills, knowledge and attitudes.

Error. An action or inaction by the designer that leads to deviations from criteria.

Error management. The process of detecting and responding to errors with countermeasures that reduce or eliminate the errors or the consequence of errors.

Evidence and assessment guide. A guide that provides detailed information (e.g. tolerances) in the form of evidence that an instructor or an evaluator can use to determine whether a candidate meets the requirements of the competency standard.

Integrity. A degree of assurance that aeronautical data and its value have not been lost or altered since the data origination or authorized amendment.

Maintenance (continuous). The continuous maintenance of an instrument procedure is an ongoing process triggered by the State AIS through notification of any critical change to the instrument procedure environment that would necessitate timely revision of the instrument procedure design. Examples of critical changes are the erection of an obstacle within a determined radius of an aerodrome reference point, the planned decommissioning of an associated secondary navigation aid, or the planned extension/reduction of a runway. It is assumed that the State AIS would respond by NOTAM to any unplanned critical change to the instrument procedure environment. The State AIS would notify the procedure designer of the NOTAM action and would then expect the procedure designer to take maintenance/corrective action as required.

Maintenance (cyclical). The cyclical maintenance of an instrument procedure is a planned systemic review of the procedure design at a predetermined interval.

Mastery test. A test that evaluates a trainee’s ability to perform a terminal objective. A mastery test should match as closely as possible the conditions, behaviours and standards of terminal objectives.

Material-dependent training. A well-documented and repeatable training package that has been tested and proven to be effective.

Navaid data. Data relating to both ground-based and space-based navigational aids, including service volume, frequency, identification, transmission power and limitations of operation.

Obstacle data. Data relating to any man-made fixed or temporary object which has vertical significance in relation to adjacent and surrounding features and which is considered a potential hazard to the safe passage of aircraft, or to man-made fixed or temporary objects that extend above a defined surface intended to protect aircraft in flight.

Obstacle/terrain data collection surface. A defined surface intended for the purpose of collecting obstacle/terrain data.

Performance criteria. A simple, evaluative statement on a required outcome of the competency element and a description of the criteria used to judge whether the required level of performance has been achieved. Several performance criteria can be associated with a competency element.

Procedure design service provider. A body that provides procedure design services. It may also be a training provider providing procedure designer training.

Progress test. A test that measures a trainee’s ability to meet key enabling objectives.

Range of variables (conditions). The conditions under which the competency units must be performed.

Raster map. An electronic representation of a cartographic map with properly referenced terrain, hydrographic, hypsometric and cultural data.

Recognized source. A source of data that is either recognized by the State or a source that has professional credentials to provide a specific type of data.

Reference geodetic datum. The numerical or geometrical quantity or set of such quantities (mathematical model) which serves as a reference for computing other quantities in a specific geographic region, such as the latitude and longitude of a point; a minimum set of parameters required to define the location and orientation of the local reference system with respect to the global reference system/frame.

Resolution. The number of units or digits to which a measured or calculated value is expressed and used; the smallest difference between two adjacent values that can be represented in a data storage, display or transfer system.

Skills, knowledge, attitudes (SKA). The skills, knowledge and attitudes (SKAs) are what an individual requires to perform an enabling objective derived from performance criteria. A skill is the ability to perform an activity that contributes to the effective completion of a task. Knowledge is the specific information required for the trainee to develop the skills and attitudes for the effective accomplishment of tasks. Attitude is the mental state of a person that influences behaviour, choices and expressed opinions.

Stakeholder. An individual or party with vested interests in an instrument procedure design.

Standard instrument departure (SID). A designated instrument flight rule (IFR) departure route linking the aerodrome or a specified runway of the aerodrome with a specified significant point, normally on a designated ATS route, at which the en route phase of flight can be commenced.

Standard terminal arrival (STAR). A designated instrument flight rule (IFR) arrival route linking a significant point, normally on an ATS route, with a point from which a published instrument approach procedure can be commenced.

Terminal arrival area (TAA). Designated airspace in the vicinity of an aerodrome used for transitioning to an initial approach fix on a GNSS approach without having to navigate on published routes from the en route airways.

Terminal objective. A training objective derived from a competency element in the competency framework, which a trainee will achieve on successfully completing instruction.

Terminating event. A cue or indicator that a task has been completed.

Terrain data. Data pertaining to the natural surface of the earth, excluding man-made obstacles, which can be represented as a cartographic map, an electronic raster map, an electronic vector data map or an electronic digital elevation model (DEM).

Traceability. The degree to which a system or a data product can provide a record of the changes made to that product and thereby enable an audit trail to be followed from the end-user to the data originator.

Training objective. A clear statement that comprises three parts: the desired performance, i.e. what the trainee is expected to be able to do at the end of particular stages of training; the performance standard that must be attained to confirm the trainee’s level of competence; and the conditions under which the trainee will demonstrate competence.

Training provider. In the context of this manual, a body that provides procedure designer training.

Triggering event. A cue or indicator that a task should be initiated.

Validation. Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled (ISO 9000*).

Vector data. The digitized version of graphic or rasterized data, usually having three-dimensional attributes.

Verification. Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled (ISO 9000*).

FOREWORD