
STANDARD MEASUREMENTS FOR FLIGHT SIMULATION QUALITY

ARINC REPORT 433-2

PUBLISHED: April 5, 2013

AN ARINC DOCUMENT
Prepared by FSEMC
Published by
AERONAUTICAL RADIO, INC.
2551 RIVA ROAD, ANNAPOLIS, MARYLAND 21401-7435
DISCLAIMER

THIS DOCUMENT IS BASED ON MATERIAL SUBMITTED BY VARIOUS PARTICIPANTS
DURING THE DRAFTING PROCESS. NEITHER AEEC, AMC, FSEMC NOR ARINC HAS
MADE ANY DETERMINATION WHETHER THESE MATERIALS COULD BE SUBJECT TO
VALID CLAIMS OF PATENT, COPYRIGHT OR OTHER PROPRIETARY RIGHTS BY
THIRD PARTIES, AND NO REPRESENTATION OR WARRANTY, EXPRESS OR
IMPLIED, IS MADE IN THIS REGARD.

ARINC INDUSTRY ACTIVITIES USES REASONABLE EFFORTS TO DEVELOP AND
MAINTAIN THESE DOCUMENTS. HOWEVER, NO CERTIFICATION OR WARRANTY IS
MADE AS TO THE TECHNICAL ACCURACY OR SUFFICIENCY OF THE DOCUMENTS,
THE ADEQUACY, MERCHANTABILITY, FITNESS FOR INTENDED PURPOSE OR
SAFETY OF ANY PRODUCTS, COMPONENTS, OR SYSTEMS DESIGNED, TESTED,
RATED, INSTALLED OR OPERATED IN ACCORDANCE WITH ANY ASPECT OF THIS
DOCUMENT OR THE ABSENCE OF RISK OR HAZARD ASSOCIATED WITH SUCH
PRODUCTS, COMPONENTS, OR SYSTEMS. THE USER OF THIS DOCUMENT
ACKNOWLEDGES THAT IT SHALL BE SOLELY RESPONSIBLE FOR ANY LOSS, CLAIM
OR DAMAGE THAT IT MAY INCUR IN CONNECTION WITH ITS USE OF OR RELIANCE
ON THIS DOCUMENT, AND SHALL HOLD ARINC, AEEC, AMC, FSEMC AND ANY
PARTY THAT PARTICIPATED IN THE DRAFTING OF THE DOCUMENT HARMLESS
AGAINST ANY CLAIM ARISING FROM ITS USE OF THE STANDARD.

THE USE IN THIS DOCUMENT OF ANY TERM, SUCH AS SHALL OR MUST, IS NOT
INTENDED TO AFFECT THE STATUS OF THIS DOCUMENT AS A VOLUNTARY
STANDARD OR IN ANY WAY TO MODIFY THE ABOVE DISCLAIMER. NOTHING
HEREIN SHALL BE DEEMED TO REQUIRE ANY PROVIDER OF EQUIPMENT TO
INCORPORATE ANY ELEMENT OF THIS STANDARD IN ITS PRODUCT. HOWEVER,
VENDORS WHICH REPRESENT THAT THEIR PRODUCTS ARE COMPLIANT WITH
THIS STANDARD SHALL BE DEEMED ALSO TO HAVE REPRESENTED THAT THEIR
PRODUCTS CONTAIN OR CONFORM TO THE FEATURES THAT ARE DESCRIBED AS
MUST OR SHALL IN THE STANDARD.

ANY USE OF OR RELIANCE ON THIS DOCUMENT SHALL CONSTITUTE AN
ACCEPTANCE THEREOF “AS IS” AND BE SUBJECT TO THIS DISCLAIMER.

This document is published information as defined by 15 CFR Section 734.7 of the Export Administration
Regulations (EAR). As publicly available technology under 15 CFR 734.3(b)(3), it is not subject to the EAR
and does not have an ECCN. It may be exported without an export license.
©2013 BY
AERONAUTICAL RADIO, INC.
2551 RIVA ROAD ANNAPOLIS, MARYLAND
21401-7435 USA

ARINC REPORT 433-2

STANDARD MEASUREMENTS FOR FLIGHT SIMULATION QUALITY

Published: April 5, 2013

Prepared by the FSEMC


Report 433 Adopted by the FSEMC Steering Committee April 3, 2001
Summary of Document Supplements
Supplement Adoption Date Published
Report 433-1 October 8, 2007 December 14, 2007
Report 433-2 February 7, 2013 April 5, 2013
A description of the changes introduced by each supplement is included at the end of this document.
FOREWORD

Aeronautical Radio, Inc., the AEEC, and ARINC Standards

ARINC organizes aviation industry committees and participates in related industry
activities that benefit aviation at large by providing technical leadership and guidance.
These activities directly support aviation industry goals: promote safety, efficiency,
regularity, and cost-effectiveness in aircraft operations.

ARINC Industry Activities organizes and provides the secretariat for international aviation
organizations (AEEC, AMC, FSEMC) which coordinate the work of aviation industry
technical professionals and lead the development of technical standards for airborne
electronic equipment, aircraft maintenance equipment and practices, and flight simulator
equipment used in commercial, military, and business aviation. The AEEC, AMC, and
FSEMC develop consensus-based, voluntary standards that are published by ARINC and
are known as ARINC Standards. The use of ARINC Standards results in substantial
technical and economic benefit to the aviation industry.

There are three classes of ARINC Standards:

a) ARINC Characteristics – Define the form, fit, function, and interfaces of avionics
and other airline electronic equipment. ARINC Characteristics indicate to
prospective manufacturers of airline electronic equipment the considered and
coordinated opinion of the airline technical community concerning the requisites of
new equipment including standardized physical and electrical characteristics to
foster interchangeability and competition.

b) ARINC Specifications – Are principally used to define either the physical
packaging or mounting of avionics equipment, data communication standards, or
a high-level computer language.

c) ARINC Reports – Provide guidelines or general information found by the airlines
to be good practices, often related to avionics maintenance and support.

The release of an ARINC Standard does not obligate any organization or ARINC to
purchase equipment so described, nor does it establish or indicate recognition or the
existence of an operational requirement for such equipment, nor does it constitute
endorsement of any manufacturer’s product designed or built to meet the ARINC
Standard.

In order to facilitate the continuous product improvement of this ARINC Standard, two
items are included in the back of this volume:

An Errata Report solicits any corrections to existing text or diagrams that may be
included in a future Supplement to this ARINC Standard.

An ARINC IA Project Initiation/Modification (APIM) form solicits any proposals for
the addition of technical material to this ARINC Standard.

TABLE OF CONTENTS
ARINC REPORT 433

1.0 INTRODUCTION
1.1 Purpose
1.2 Background
1.3 Applicability
1.4 Related Documents
2.0 TERMINOLOGY
2.1 Definitions
3.0 DATA COLLECTION
3.1 Suggested Information to be Collected
3.1.1 Simulator Session Information
3.1.2 Device Quality Rating (Instructor and/or Crew)
3.1.3 STD Discrepancies (Instructor, Crew, and STD Engineering)
3.1.4 Maintenance Activity (Preventive, Planned)
3.1.5 Configuration Control Information
4.0 DATA ANALYSIS
4.1 Introduction
4.2 Formulas and Computation
4.3 Frequency of Useful Collection and Reporting of Information
4.4 Benefits of Data Analysis
4.5 Data Users
4.5.1 Internal Customers
4.5.2 External Customers

ATTACHMENTS
ATTACHMENT 1 EXAMPLE FORMS AND SCREEN DISPLAYS, INFORMATION GATHERING, AND REPORTS
ATTACHMENT 2 EXAMPLE ATA CODES

APPENDICES
APPENDIX A ACRONYMS

1.0 INTRODUCTION
1.1 Purpose
The training industry uses a wide range of training equipment from Full Flight
Simulators (FFS), Maintenance Trainers (MT), and Flat Panel Trainers (FPT) to
Computer Based Training (CBT) and door/cabin trainers. These training devices are
commonly referred to as Synthetic Training Devices (STD). It is vitally important to
all airlines and third party training companies that utilize such STDs that these
devices be available and fully functional in order to fulfill the training mission. An
important tool for meeting this goal is the use of metrics.
Metrics can be defined as a system of parameters for the quantitative, periodic
assessment of a process, together with the procedures for carrying out that
measurement and for interpreting each assessment in light of previous comparable
assessments. This document is intended as a guide on how to apply metrics and
correspondingly measure STD quality. It is hoped that the guidelines set forth in this
document will allow STD operators, manufacturers, suppliers, and other related
businesses to more clearly communicate requirements and assess synthetic
training device performance throughout the device life cycle.
Essential questions that lead to establishing measures for STD quality are:
• Is the STD meeting our operational needs?
• Are our customers satisfied?
• How do we know when STD quality is degrading?
• How can we apply engineering and maintenance effort on the STD to
better meet training needs (i.e., improve quality)?
• To what capacity is the STD being utilized?
This document is intended as a guide on how to provide metrics for a quality plan.
COMMENTARY
Feedback received following a review of the proposed changes to
ARINC Report 433 suggested that some of the data items collected
and/or calculated may not be widely utilized and could be removed
from the report for simplification. This issue was discussed at length
during the update of ARINC Report 433. After further consideration, it
was determined that while some of the users may not use all of the
metrics defined, these data points may be useful to other disciplines,
such as asset management or accounting, in order to make decisions
concerning new STD purchases or upgrades. Retention of these
metrics allows for a broader application of this ARINC Report, while
not diminishing the value of the guidance to the maintenance and
engineering organizations. Attempts have been made throughout the
document to further clarify the metric definitions and improve the
usefulness of the report.
1.2 Background
The FSEMC Synthetic training device Metrics (FSM) Task Group was originally
envisioned to focus on the issue of effective synthetic training device performance
management and methods for measuring synthetic training device quality including
availability, disruptions per training hour, defects per training hour, training hours per
year, etc.
Early on, the task group recognized that a Quality Assurance Program (QAP) tied to
measurement of synthetic training device performance could lead to improving and
maintaining training quality.
The FSM Task Group also focused on identifying what information, in the form of
measurements, should be included to support FAA, EASA, and other regulatory
Quality Assurance Programs. For the synthetic training device industry, this
represents an excellent opportunity to provide timely input to help formulate future
regulatory policy.
The EASA and FAA cosponsored an industry working group in Hoofddorp in 2001,
with a follow-up in Atlanta, tasked with rewriting and updating ICAO Document
9625, Manual of Criteria for the Qualification of Flight Simulators. This document was
subsequently incorporated into JAR-FSTD A and CFR 14 Part 60. Both of these
regulatory documents specify that STD sponsors establish and maintain a Quality
Management System (QMS). Both regulations recognize this document as a
resource acceptable for defining and quantifying the metrics necessary for an
effective QAP. This document can be used in whole or in part at the discretion of the
operator.
1.3 Applicability
This document addresses measures pertaining to and directly associated with
synthetic training devices. The measures set forth herein should also be applicable,
in part or in total, to most other types of STDs, as the user may choose.
1.4 Related Documents
The latest version of the following documents applies:
ARINC Report 434: Synthetic Training Device (STD)—Life Cycle Support
Joint Aviation Requirements JAR-FSTD A: Aeroplane Flight Simulators
European Aviation Safety Agency (EASA)
Part ORA (Organisation Requirements for Aircrew)
Subpart FSTD – Requirements for Organisations Operating
Flight Simulation Training Devices (FSTDs) and the Qualification
of FSTDs
IATA Flight Training Device Support Documentation Requirements, Annex 2
Federal Aviation Administration CFR 14: Aeronautics and Space
Part 60 – Flight Simulation Training Device Initial Qualification
and Continuing Qualification and Use
2.0 TERMINOLOGY
2.1 Definitions

AOG STD unavailable for training (derived from aircraft on ground).


Configuration Time Time used to configure the STD for a training session (Section 4.2).
Device Failure Time The time the STD is unavailable for training due to unscheduled
maintenance and as defined by sim support (i.e., time required to return the
device to a status of available for training as recorded by sim support).
COMMENTARY
Lost Training Time and Device Failure Time are two
measurements of an interrupt duration from a different
perspective.
Device Failure Time is calculated from the start of the event
that prevented potential training from occurring until the
device is ready for training, as recorded by sim support.
Lost Training Time is the total time lost as reported by the
crew due to a training interruption.
Example 1
A crew flying a mission has the visual blank out while on their
approach to land. Sim support is called, they reset the visuals,
and the device is available for training in 10 minutes from the
time the discrepancy occurred. The crew needs to get the sim
reset and restart their approach. This takes an additional 10
minutes. The Device Failure Time is 10 minutes; the Lost
Training Time is 20 minutes.
Example 2
A crew flying a LOFT scenario has completed 2 hours of the 4
hour block when a failure occurs that stops the training. Sim
support is called, and the device is available for training in 40
minutes from the time the discrepancy occurred. The crew
cannot complete the LOFT training scenario in the time
remaining. The Device Failure Time is 40 minutes; the Lost
Training Time is 4 hours.
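
The two examples above can be restated compactly. The following Python sketch is
illustrative only; the class and field names are assumptions, not defined by this report:

```python
# A minimal sketch (names assumed) of the two perspectives on one interrupt.
from dataclasses import dataclass

@dataclass
class InterruptRecord:
    device_failure_min: int  # sim support: fault start until device ready
    lost_training_min: int   # flight crew: total training time lost

# Example 1: visual blanks out on approach; 10 min repair plus 10 min to
# reset the sim and re-fly the approach.
example_1 = InterruptRecord(device_failure_min=10, lost_training_min=20)

# Example 2: LOFT scenario stopped by a 40 min failure; the remaining 2 hours
# of the 4 hour block cannot be completed, so the crew reports 4 hours lost.
example_2 = InterruptRecord(device_failure_min=40, lost_training_min=240)
```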

Diagnosed Category Describes the actual STD systems that caused the event. Defined by
technical personnel.
Discrepancy Any entry recorded against the device. These might be known as log
gripes, complaints, snags, failures, tech. notes, anomalies, or defects.
Engineering Time The time the STD is used by Engineering (not counted as down time)
(Section 4.2).
Facility Down Time The STD is unavailable for any use for reasons beyond the operator’s
control (e.g., facility has lost power, flood, earthquake, etc.) (Section 4.2).
Interrupt An event that suspends a flight crew’s (or other user’s) STD session.
Issue Ageing The length of time an issue remains unresolved.


Lost Training Time Time lost during the training session as defined by the flight crew or users
of the device (e.g., time required to return to the point of training just prior to
the interrupt, as recorded by the flight crew or device user). See comments
under Device Failure Time.
Maintenance Time The time the STD is used by maintenance (not counted as down time)
(Section 4.2).
Open Time The time the STD was available for training, but not utilized.
Other Time The time the STD is used for company demonstrations, tours, non-
accredited training, etc. (Section 4.2).
Out of Service A period of time the STD is shutdown (e.g., for a move or facility work)
(Section 4.2).

Sim Support The personnel that maintain the STD.


STD Down Time Unscheduled unavailability of the STD (Section 4.2).
Support Time The time the STD is used for Maintenance, Engineering, Regulatory,
Configuration activities, and Out of Service Time (Section 4.2).
Symptom Category Describes the STD systems affected from the crew’s perspective. Defined
by discrepancy originator.
Training Day The time that a STD is used for non-support activities (Section 4.2).
Training Time The time the STD is used for training (Section 4.2).
Work Around An operational event which causes the crew or instructor to utilize an
alternate means to successfully complete the training.
3.0 DATA COLLECTION


3.1 Suggested Information to be Collected
In order to provide for accurate and detailed analysis and quality assessment,
specific information pertinent to the event needs to be recorded. The following
subsections and figures provide suggested data items and examples of collection
methods.
3.1.1 Simulator Session Information
• STD Identification number
• Instructor name
• Course and period number (scheduled session time)
• Lost training time
• Session date and start/stop times
• Session use:
o Training
o Engineering
o Maintenance
o Regulatory Authority
o Other (e.g., Demonstrations)
The following Figure 3.1.1-1 provides an example of how this information can be
logged.

Figure 3.1.1-1 – Simulator Session Information


Note: Lost training time in this example is captured in the Training
Quality Assessment and in the STD Discrepancies.
Key to Figure 3.1.1-1


1. STD Identification number
2. Scheduled period
3. Actual session date start/stop times
4. Participant name and employee number
5. Seat the participant will take during the training session (also used to identify
which participant was the instructor during the session)
6. Course number, if the session is being used by Flight Training. Session use,
if not being used by Flight Training
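
A minimal sketch of how these session items might be captured as a record follows.
The class and field names are assumptions; the report suggests items to collect but
does not prescribe a schema:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Literal

# Session uses per Section 3.1.1
SessionUse = Literal["Training", "Engineering", "Maintenance",
                     "Regulatory Authority", "Other"]

@dataclass
class SimulatorSession:
    std_id: str                 # STD identification number
    instructor: str             # instructor name
    course_period: str          # course and period number (scheduled session time)
    start: datetime             # actual session start
    stop: datetime              # actual session stop
    use: SessionUse             # session use
    lost_training_min: int = 0  # lost training time, if any
```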
3.1.2 Device Quality Rating (Instructor and/or Crew)
• Device Quality (Rating Scale of 1 to 5)
o 1 = Unsatisfactory: No training completed
o 2 = Poor: Some training completed
o 3 = Acceptable: All training completed, many workarounds and/or many
interrupts
o 4 = Good: All training completed, few workarounds and/or few interrupts
o 5 = Excellent: All training completed, no workarounds and no interrupts
• Lost training time
• Number of interrupts
The following Figure 3.1.2-1 provides an example of how this information can be
logged. Note that in this example the operator has elected to use only four device
quality ratings, shown as Device Performance.

Figure 3.1.2-1 – Training Session Quality Rating Tool


Key to Figure 3.1.2-1
1. Number of interrupts
2. Lost training time (as perceived by instructor and/or crew)
3. Training effectiveness (rating scale of 1 to 4):


a. 1 = Unsatisfactory: Training could not be completed.
b. 2 = Debriefed Satisfactory: Training was completed in debrief because
workarounds and interrupts precluded further STD training.
c. 3 = Satisfactory: Any workarounds or interrupts did not preclude completion
of training.
d. 4 = Outstanding: All training was completed, no workarounds and no
interrupts.
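
Both rating scales are ordinal values that can be averaged later (see Section 4.2). A
sketch of the five-point device quality scale, with assumed names, follows:

```python
from enum import IntEnum

# Five-point device quality scale per Section 3.1.2 (identifier names assumed)
class DeviceQuality(IntEnum):
    UNSATISFACTORY = 1  # no training completed
    POOR = 2            # some training completed
    ACCEPTABLE = 3      # all training completed; many workarounds and/or interrupts
    GOOD = 4            # all training completed; few workarounds and/or interrupts
    EXCELLENT = 5       # all training completed; no workarounds, no interrupts
```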
3.1.3 STD Discrepancies (Instructor, Crew, and STD Engineering)
• Unique event number
• Name and organization of person who entered the discrepancy
• Date and time the discrepancy was entered
• Description of problem (includes date and time)
• Discrepancy priority (one suggested set of categories)
o STD Down
o Able to work around
o No impact on training
• Type of event: discrepancy or interrupt
• STD Downtime: If the event interrupts the user period, then down time is
calculated using the beginning of event (includes date and time) and end of
event (includes date and time).
• Symptom Category
The assignment of categories to STD Discrepancies, accomplished by the
discrepancy originator, allows the STD operator to analyze which areas of the
device are prone to problems from the crew’s perspective. This category
should be assigned by the user who experiences the discrepancy and
records their perception of what major area of the device was affected. The
following list contains the minimum recommended categories:
o Visual
o Motion
o Control Loading
o IOS
o Instruments
o Input Devices (switches, knobs, etc.)
o Aircraft System
o Aircraft Hardware (seats, upholstery)
o Facility (power, HVAC)
o Other (list item)
• Diagnosed Category
The assignment of categories to the solution of STD Discrepancies by
technical personnel allows the STD operator to access and analyze the
information without the need to read narrative commentary. This category
should be assigned by the person(s) who resolve the discrepancy and
record what area of the device caused the issue. The following should be
considered as a minimum recommended list of categories:
o Host
o Visual (IG, Display, Scene)
o Visual Display
o Motion (Hydraulic, Mechanical, Electronic, Electrical)
o Control Loading (Hydraulic, Mechanical, Electronic, Electrical)
o IOS (UI, Computer, Input Devices)
o Interface
o Instruments (Simulated, Stimulated)
o Input Devices (switches, knobs, etc.)
o Aircraft Hardware (seats, upholstery)
o Hardware
o Software
o Facility
o Other (list item)
COMMENTARY
As an alternative the operator may wish to use ATA codes as means
to track STD discrepancies. In addition to the standard ATA codes for
aircraft systems, IATA Flight Training Device Support Documentation
Requirements, Annex 2 identifies Chapters 115 and 116 for STD
systems. For an example of further defining the use of system codes
for tracking, see Attachment 2.
• Action taken to address problem (includes date, time, and
personnel involved in action)
• Current status of the discrepancy
• Name and organization of person who signed off discrepancy
(indicated discrepancy closed)
• Date and time the discrepancy was signed off (closed)
The following Figure 3.1.3-1 provides an example of how this information can be
logged.
Figure 3.1.3-1 – STD Discrepancies


Key to Figure 3.1.3-1
1. Date and time the discrepancy was entered.
2. Type of event: unscheduled (discrepancy) or troublecall (interrupt).
3. Unique event number (comprised of the STD ID number and an
automatically generated event number).
4. Name and organization of person who entered discrepancy.
5. Category (includes both the category initially assessed and the actual
category into which the event falls).
6. Description of the problem (includes date, time, and name of person who
entered the discrepancy).
7. Actions taken to address problem (includes date, time, and personnel
involved in action).
8. Down time (calculated in hours and minutes from the moment the STD
becomes unavailable for training to the moment it becomes available for
training).
9. Current status of the discrepancy.
10. Name and organization of person who closed (signed off) discrepancy,
including date and time.
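
A minimal sketch of a discrepancy record covering the items above follows. The class
and field names are assumptions, not part of this report; down time is derived from the
event start and end timestamps as described in the key:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Discrepancy:
    event_no: str                # unique event number (STD ID + generated number)
    entered_by: str              # name and organization of originator
    entered_at: datetime
    description: str
    priority: str                # "STD Down", "Able to work around", "No impact on training"
    symptom_category: str        # assigned by originator (crew perspective)
    diagnosed_category: Optional[str] = None  # assigned by technical personnel
    event_start: Optional[datetime] = None    # moment STD became unavailable for training
    event_end: Optional[datetime] = None      # moment STD became available again

    def downtime_minutes(self) -> float:
        # Down time: unavailable-for-training until available-for-training
        if self.event_start and self.event_end:
            return (self.event_end - self.event_start).total_seconds() / 60.0
        return 0.0
```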
3.1.4 Maintenance Activity (Preventive, Planned)


Preventive and planned maintenance activities are required for a quality system. All
operators should refer to their appropriate Regulatory Authorities for any compulsory
requirements.
One reference for the elements of a Quality Management System is described in
detail in CFR 14 Part 60. This system records information about STD maintenance
and repair which is useful for failure analysis, inventory needs, and determining
maintenance costs. While most preventive maintenance tasks are developed using
the manufacturer’s recommendations, the user may desire to change, add or delete
tasks based on special needs or operational experience.
Regulatory testing may be included in a preventive maintenance activity to ensure
the STD meets regulatory requirements.
The following items may be considered by the operator for gathering data on the
maintenance activity:
• Unique event number (also called tracking number or job number)
• Description of activity to be performed (e.g., replace hydraulic filters)
COMMENTARY
Preventive maintenance tasks and schedules need to be constructed
based on the manufacturer’s recommended maintenance schedules.
If inspections are being accomplished, the task could also identify the
parameters or conditions necessary to complete the task.
• What Category of equipment or system is affected? ATA
Chapters or system description could also be used here to
classify preventive maintenance tasks.
• What work has been completed?
• What materials are required to complete the tasks?
• How much time is required?
o In man-hours?
o In maintenance support time?
• What is the task’s present status?
• Information about the equipment or asset being serviced: Part
Number, Serial Number, Description
• Due date of maintenance activity
• Date completed
• Sign-Off, Person, Date and Time
The following Figure 3.1.4-1 provides an example of how this information can be
logged.
Figure 3.1.4-1 – Maintenance Activity


Key to Figure 3.1.4-1
1. Date and time the maintenance activity was entered or generated
2. Type of event: preventive maintenance or scheduled maintenance
3. Unique event number (comprised of the STD ID number and an
automatically generated event number)
4. Name and organization of person who entered the activity
5. Category
6. Description of the activity (includes date, time, and name of person who
entered the activity)
7. Actions taken (includes date, time, and personnel involved in action).
8. Current status of the activity
9. Name and organization of person who closed (signed off) activity, including
date and time
10. List of parts used for maintenance activity including part number, serial
number, description, and dates ordered, due, received, and installed
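
A compact sketch of a maintenance activity record covering these items and the parts
list from the figure key follows; all names here are assumptions for illustration:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class Part:
    part_no: str
    serial_no: str
    description: str

@dataclass
class MaintenanceActivity:
    event_no: str               # unique event/tracking/job number
    description: str            # e.g., "replace hydraulic filters"
    category: str               # affected equipment/system, or ATA chapter
    due: date                   # due date of maintenance activity
    man_hours: float = 0.0
    support_hours: float = 0.0  # maintenance support time
    status: str = "open"
    completed: Optional[date] = None
    parts_used: list[Part] = field(default_factory=list)
```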
3.1.5 Configuration Control Information
Changes to the STD’s configuration should be controlled. Changes can be made by
modification to the hardware, software or firmware design of the STD. All operators
should refer to their appropriate Regulatory Authorities for any compulsory
requirements.
The following items may be considered by the operator for gathering data on the
configuration control activity:
• Unique identifier
• Management approval authority
• Date and time management approved change
• Reason for the change
o Aircraft change
o Aircraft manufacturer data update
o Vendor data update
o Service Bulletin
o STD manufacturer update
o Instructor request
o STD operator’s maintenance/engineering staff
o Regulatory Authority requirement
o STD discrepancy
o Other
• Diagnosed Category (See Section 3.1.3)
• Type of change
o Hardware - Part Number
o Software – Load Number
o Avionics – Part Number and Load Number (if applicable)
o Etc.
• Description of change to be made
• Work completed
• Name and organization of personnel doing work
• Date and time of work
• Current status
• Name and organization of person performing flight check (if required)
• Name and organization of person who indicated the configuration change
was completed
• Date and time the configuration change was completed
• Due Date
• Flight Check of STD required—Yes/No (if yes, date and time of flight check)
• Regulatory Notification required – Yes/No (if yes, the date of notification and
agency(ies))
• Regulatory Approval(s) if required, date of approval and agency(ies)
• Supporting data references for the change
The following Figure 3.1.5-1 provides an example of how this information can be
logged.
Figure 3.1.5-1 – Configuration Control Information


Key to Figure 3.1.5-1
1. Management approval authority
2. Unique event number
3. Date and time the configuration change was completed
4. Name and organization of person who indicated the configuration change
was completed
5. Reason for the change (if from change to aircraft, engineering order is
recorded here)
6. Reason for the change (if from revision to engineering order, revision
number is recorded here)
7. Reason for the change (if from a discrepancy on the STD, the originating
event number is recorded here)
8. Type of change
9. Description of change
10. Work done (summary for display to crews)
11. Work done (full explanation including name and organization of personnel
doing work and date and time of work)
12. Approval status: waiting approval, not applicable, approved
13. Current status of work
14. Date and time simulation management approved change
15. Date and time flight management approved change.


16. Functional check date (i.e., flight test of the STD)
17. In training date
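
A minimal sketch of a configuration change record covering the items above follows;
the class and field names are illustrative assumptions, not part of this report:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConfigurationChange:
    change_id: str               # unique identifier
    reason: str                  # e.g., "Aircraft change", "Service Bulletin", "STD discrepancy"
    change_type: str             # e.g., "Hardware - Part Number", "Software - Load Number"
    description: str             # description of change to be made
    approved_by: str             # management approval authority
    approved_at: Optional[datetime] = None
    flight_check_required: bool = False
    flight_check_at: Optional[datetime] = None
    regulatory_notification: bool = False
    completed_at: Optional[datetime] = None
    status: str = "waiting approval"  # waiting approval / not applicable / approved
```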
4.0 DATA ANALYSIS


4.1 Introduction
This section provides formulas and tools for analyzing the data collected.
4.2 Formulas and Computation
Unless otherwise stated, all formulas and computations are based on a standard 24
hour day. In order to provide proper analysis, planned events need to be compared
against actual results, as suggested in the following formulas.
These formulas and computations may be adapted for weekly, monthly, etc.
evaluations as necessary.
STD quality is not a single measurement; rather, it is a combination of the
performance indicators and formulas provided below.
Planned time refers to what is scheduled on the STD and Actual time is what is
recorded at the end of the day.
Table 4.2-1 – Suggested Formulas Used to Illustrate Data Analysis

Planned Training Day (PTD) = 24 hours a day - Planned Support Time

Planned Support Time = Planned Configuration Time + Planned Engineering Time +
Planned Maintenance Time + Planned Regulatory Time + Planned Out of Service Time

Actual Training Day (ATD) = 24 hours a day - Actual Support Time - STD Down Time

Actual Support Time = Actual Maintenance Time + Actual Engineering Time + Actual
Regulatory Time + Actual Configuration Time + Actual Out of Service Time

Training Availability = (PTD - LTT)/PTD * 100
o Expressed as a percentage of PTD as a function of LTT.
o Training Availability is from a crew perspective.
Example (see Table 4.2-2 for data):
Training Availability = (18 - 4)/18 * 100 = 77.77%

Device Availability = (PTD - STD Down Time)/PTD * 100
o Expressed as a percentage of PTD as a function of STD Down Time.
o Device Availability is from a sim support perspective. This metric takes into
account all events that could affect the availability.
Example (see Table 4.2-2 for data):
Device Availability = (18 - 2)/18 * 100 = 88.88%

Device Reliability = (PTD - Device Failure Time)/PTD * 100
o Expressed as a percentage of PTD as a function of Device Failure Time.
o Device Reliability is a metric that takes into account device-specific events.
Example (see Table 4.2-2 for data):
Device Reliability = (18 - 1)/18 * 100 = 94.44%

STD Utilization = (Actual Training Time + Actual Other Time)/PTD * 100
o Expressed as a percentage of PTD as a function of STD uses.
Note: This could be greater than 100%, thus implying more use of the
simulator than what was planned.

Average Discrepancy Turn Around Time (ADTAT) = Sum of total discrepancy open
times/Total number of discrepancies
where discrepancy open time = the time the discrepancy was removed from the
discrepancy list minus the date it was opened
o Expressed as a number of days, hours, etc.
o Normally an average across a time period, device, or fleet.
o This metric can be expressed several different ways depending on application.
Note: Priority level may be considered when utilizing this metric for
evaluation purposes (see Section 3.1.3 for priority levels).
Example:
Discrepancy   Time Open (hrs)
1             20
2             64
3             180
ADTAT = (20 + 64 + 180)/3 = 88 hours/discrepancy

Number of Interrupts = Count of suspension-of-training events in a given time period
o Expressed as a number of interrupts per day, week, or month.
o Can be evaluated per device or across a fleet.

Number of Discrepancies = Count of entries recorded against a STD
o Expressed as an average number of discrepancies per day, week, or month.
o Can be evaluated per device or across a fleet.
o Can be categorized by priority.

Issue Ageing = Length of time each simulator issue has been unresolved
o Expressed as the number of issues open by different periods of time
(0-30 days, 31-90 days, 91-180 days, 181-365 days, over 1 year).
o Displayed in a bar graph showing trending over 12 months.

Average Hours Between Interrupts = (Actual Training Time + Actual Other Time)/
Number of Interrupts
o Expressed as an average number of hours between interrupts per day,
week, month, etc.
o Can be evaluated per device or across a fleet.

Average Quality Rating = Sum of training session quality ratings/Number of training
session quality ratings
o Expressed as an average quality rating per day, week, month, etc.
o Can be evaluated per device or across a fleet.

Table 4.2-2 – Example Data to Illustrate Data Analysis

Planned Categories              Hours     Actual Categories                       Hours
Total Time Available               24     Total Time Available                       24
Planned Training Day               18     Actual Training Day                        19
  Training Time                    16       Training Time                            18
  Other Time (e.g., Demos)          1       Other Time (e.g., Demos)                  1
  Open Time                         1       Open Time                                 0
Planned Support Time                6     Actual Support Time                         3
  Engineering Time                0.5       Engineering Time                        0.5
  Maintenance Time                  3       Maintenance Time                          0
  Regulatory Time                 0.5       Regulatory Time                         0.5
  Configuration Time                1       Configuration Time                        1
  Out of Service Time               1       Out of Service Time                       1
                                          STD Down Time                               2
                                            Facility Down Time                        1
                                            Device Failure Time                       1
                                          Lost Training Time (See Section 2.1)        4
                                            Crew recorded training interruptions      3
                                            Late start to training                    1
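
The formulas above are simple arithmetic over the recorded times. The following
Python sketch is illustrative only (the variable names are assumptions, not part of this
report); it reproduces the worked examples using the data from Table 4.2-2:

```python
# Illustrative sketch: variable names are assumptions, not defined by
# ARINC Report 433. Values are taken from Table 4.2-2 (hours).
planned_training_day = 18.0   # PTD
lost_training_time = 4.0      # LTT, crew perspective
std_down_time = 2.0           # unscheduled unavailability (sim support)
device_failure_time = 1.0     # device-specific failures only
actual_training_time = 18.0
actual_other_time = 1.0       # e.g., demos
interrupts = 3                # crew recorded training interruptions

training_availability = (planned_training_day - lost_training_time) / planned_training_day * 100
device_availability = (planned_training_day - std_down_time) / planned_training_day * 100
device_reliability = (planned_training_day - device_failure_time) / planned_training_day * 100
std_utilization = (actual_training_time + actual_other_time) / planned_training_day * 100
avg_hours_between_interrupts = (actual_training_time + actual_other_time) / interrupts

# Average Discrepancy Turn Around Time (ADTAT), from the Table 4.2-1 example
open_times_hrs = [20, 64, 180]
adtat = sum(open_times_hrs) / len(open_times_hrs)              # 88.0 hours/discrepancy

print(f"Training Availability: {training_availability:.2f}%")  # 77.78% (report truncates to 77.77%)
print(f"Device Availability: {device_availability:.2f}%")      # 88.89% (report truncates to 88.88%)
print(f"Device Reliability: {device_reliability:.2f}%")        # 94.44%
print(f"STD Utilization: {std_utilization:.2f}%")              # 105.56% (>100%: used more than planned)
print(f"Avg hours between interrupts: {avg_hours_between_interrupts:.2f}")  # 6.33
```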
[Figure 4.2-1 (charts not reproduced) presents the example data as pie charts:
Total Time Available (Planned); Total Time Available (Actual); Planned Training Day;
Actual Training Day; Planned Support Time; Actual Support Time; STD Down Time
(Maintenance Perspective); and Lost Training Time (Crew Perspective).]

Figure 4.2-1 – Example Metrics
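
Issue Ageing, as defined in Table 4.2-1, lends itself to simple bucketing of open
discrepancies by age band. A minimal Python sketch follows; the function and
variable names are assumptions, not part of this report:

```python
# Minimal sketch: bucket open issues into the age bands suggested for the
# Issue Ageing metric. Names are illustrative only.
from datetime import date

AGE_BANDS = [(0, 30), (31, 90), (91, 180), (181, 365)]  # days

def issue_ageing(open_dates: list[date], today: date) -> dict[str, int]:
    counts = {f"{lo}-{hi} days": 0 for lo, hi in AGE_BANDS}
    counts["over 1 year"] = 0
    for opened in open_dates:
        age_days = (today - opened).days
        for lo, hi in AGE_BANDS:
            if lo <= age_days <= hi:
                counts[f"{lo}-{hi} days"] += 1
                break
        else:  # no band matched: older than 365 days
            counts["over 1 year"] += 1
    return counts

# Example: three open issues, aged 12, 100, and 400 days
today = date(2013, 4, 5)
issues = [date(2013, 3, 24), date(2012, 12, 26), date(2012, 3, 1)]
print(issue_ageing(issues, today))
```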


4.3 Frequency of Useful Collection and Reporting of Information
Information should be collected on a real-time basis, where applicable.
Reports should be created and distributed to internal and external customers (e.g.,
regulatory agencies) on an as required basis. Certain functions, such as
maintenance-related reporting, may be preferred monthly, weekly, or perhaps even
daily.
In addition, reports should be available to any user of the STD to display current
information on availability, interrupts, downtime, STD discrepancies, and additional
information that includes but is not limited to:
1. Known problems
2. Training Restrictions
3. Limitations (e.g., not affecting the qualification of the STD)
4.4 Benefits of Data Analysis


The provision of on-line, on-demand, or regularly reported information will help to
better manage the day-to-day operation, quality, and availability of an STD.
Information which is automatically gathered and regularly reported allows operators
to attain established goals, and allows the re-deployment of resources to
concentrate on meeting required objectives.
Tracking of measures will allow a trend analysis to be completed on key
parameters. The system should be capable, for example, of identifying significant
downward trends, before the actual information statistic goes below a pre-
determined value. Thus, STD operators can be more proactive in addressing
potential problems before they have a negative impact on flight crew training.
Self-auditing and meeting accepted measurable goals should demonstrate that an
STD is being maintained to an acceptable quality level. This information could be
used as validation information for presentation to Regulatory Authorities during STD
evaluations, or as substantiating best practices during regulatory spot-checking. As
these systems are developed and instituted, and acceptable access protocols are in
place, sharing of this information via electronic means can be explored.
4.5 Data Users
4.5.1 Internal Customers
• STD Engineering and Maintenance
• Contract Training/Sales
• Aircraft Engineering and Maintenance
• Test Pilots
• Fleet Captains and Training Management
• Flight Standards Management
• Quality Assurance Management
• Instructors and Crews
• Scheduling/Planning
• Accounting
4.5.2 External Customers
• Regulatory Authorities
• STD Manufacturers and Vendors
• Contract Customers
ATTACHMENT 1 EXAMPLE FORMS AND SCREEN DISPLAYS, INFORMATION
GATHERING, AND REPORTS
Data for 8/7/00 to 8/13/00

              Planned     Actual      Daily Actual  Sim Down    Sim Down  'a' sev.  'b' sev.  'c' sev.  Total   Actual Trng
Simulator     Trng (Hrs)  Trng (Hrs)  Trng (Hrs)    Time (Hrs)  Time %    discr.    discr.    discr.    discr.  Time hr/fault
A320#1            85        81.3        11.61         1.3        1.53%      1         1         2         4       20.33
A320#2            72        65.3         9.33         0.0        0.00%      0         0         3         3       21.77
B747-COMBI        64        37.9         5.41         1.0        1.56%      1         2         4         7        5.41
B737#2            56        51.4         7.34         0.0        0.00%      0         0         0         0       51.40
B737-436#1       108       102.1        14.59         0.0        0.00%      4         0         2         6       17.02
B737-436#2       124       116.8        16.69         1.6        1.29%      2         0        13        15        7.79
B747-436#1        81        73.5        10.50         0.0        0.00%      0         0         1         1       73.50
B747-436#2       100        97.9        13.99         0.8        0.80%      5         0         5        10        9.79
B747-436#3       108       106          15.14         0.0        0.00%      1         2         4         7       15.14
B747-436#4        84        76.3        10.90         0.3        0.36%      2         0         5         7       10.90
B757#1            96        82.8        11.83         0.0        0.00%      2         4         0         6       13.80
B757#2            76        76          10.86         0.0        0.00%      0         0         2         2       38.00
B767             125       118.4        16.91         4.3        3.44%      2         1         5         8       14.80
B777#1            56        53.3         7.61         0.0        0.00%      1         1         0         3       17.77
B777#2            50        47.8         6.83         0.0        0.00%      0         0         2         2       23.90
B777#3             0         0           0.00         0.0        0.00%      0         0         0         0        0.00
B1-11             30.6      30.6         4.37         0.0        0.00%      0         0         0         0       30.60
B757-FBS           6.6       6.6         0.94         0.0        0.00%      0         0         0         0        6.60
TOTAL           1322.2    1224         174.86         9.3        0.70%     21        11        48        81       15.11

1. Data Period (Days) 7


Example 1 – Synthetic Training Device (STD) Utilization Information
Columns, left to right: worksheet row number; STD Identifier; ATD (Hrs); Training
Down Time (Mins); No. of Training Sessions; No. of Interrupts; No. of Interrupted
Classes; No. of Maint. Interrupts; Lost Training Time (Mins); No. of Training Session
Quality Ratings (Excellent, Good, Acceptable, Poor, Unsatisfactory, Not Rated);
Total Sim Session Count.
3 400-1 371.02 50 115 33 27 0 321 15 51 13 4 1 32 116
4 400-2 400.65 325 126 38 36 0 488 9 61 19 4 2 32 127
5 400-3 426 120 122 20 17 0 413 11 49 7 1 0 55 123
6 400-4 430.37 195 119 15 11 0 558 2 64 11 3 1 38 119
7 400-5 261.58 50 75 28 19 0 256 0 43 9 1 0 22 75
727-2
8 727-4 195.64 0 63 24 14 0 275 2 14 14 2 0 32 64
9 727-5 443.83 105 141 36 25 0 404 4 50 11 4 1 71 141
10 727-6 519.31 10 158 38 36 1 373 25 63 5 1 1 65 160
11 727-7 525.86 10 162 24 23 0 236 26 59 8 0 1 69 163
12 737-2 40.38 0 13 3 2 0 26 2 3 1 0 0 7 13
13 737-3 430 45 115 41 29 2 1180 10 36 1 1 4 64 116
14 737-4 393.68 60 121 10 7 1 74 4 39 4 1 0 75 123
15 737-5 481.27 0 134 22 22 0 168 25 35 4 0 0 70 134
16 737-6 430.3 10 133 20 19 0 125 19 38 1 1 0 74 133
17 737-7 458.46 0 135 31 30 0 261 22 45 7 0 1 61 136
18 737-8 497.5 115 145 43 43 0 639 9 47 15 4 2 70 147
20 747-1 245.08 110 68 10 6 0 298 17 20 4 2 0 25 68
21 747-2 376.52 215 109 26 25 0 271 7 43 5 1 0 53 109
22 757-1 402.31 80 128 18 17 0 341 26 58 6 0 0 39 129
23 757-2 370.55 7 123 25 21 0 593 24 56 10 0 1 33 124
24 757-3 442.6 200 137 32 27 0 408 24 50 4 3 0 56 137
27 767-1 331.87 0 98 21 15 0 173 9 35 12 2 0 43 101
28 767-2 378.86 30 108 10 8 0 175 15 46 14 1 0 34 110
29 767-3 403.73 295 117 35 33 0 770 4 36 31 11 2 42 126
30 777-1 315.54 180 95 21 20 0 1807 23 31 7 1 1 33 96
31 777-2 260.57 40 78 12 10 0 298 16 17 3 0 0 43 79
32 777-3 281.1 50 77 17 13 1 198 17 13 7 4 0 37 78

Example 2 – Simulator Fleet Monthly Measures


[Chart: Monthly Interrupts per Actual Training Day, goal ≤ 1, all devices, last 12
months (May through April); vertical axis 0.00 to 1.00.]

[Chart: Average Available (percent), goal ≥ 98.5%, all devices, last 12 months (June
through May); vertical axis 95.0 to 100.0.]

Example 3 – Monthly STD Report Charts On All Devices


[Chart: Issue Ageing of a Single STD.]

[Chart: Average Open Discrepancies per simulator.]

Example 3 – Monthly STD Report Charts On All Devices


[Chart: Percent Uninterrupted Simulator Training Sessions, all simulators, with
monthly values from December 1999 through April 2000; vertical axis 0% to 100%.]

[Chart: Root Cause of Interrupts, six-month analysis, all sims; categories: Motion,
Visual, CLU, IOS, Instructor, Host Reload, H/W, Other.]

Example 3 – Monthly STD Report Charts On All Devices


Example 4 – Yearly Report on STD Simulation Availability


Example 5 – Yearly Report on STD Quality Rating


Example 6 – Training Session Quality Rating Summary


1. Track or team whose STD session assessments you want to view
(A300/310, STS Team B, etc.).
2. Range of training days you want to view.
3. STD ID number.
4. STD track (A300/310, MD10/11, etc.).
5. Team assigned to maintain the STD (Team A, B, C, D, etc.).
6. Average rating of training effectiveness [Sum of ratings/Number of
rated sessions]. The higher the rating, the better the overall
performance of the STD.
a. 1 = Unsatisfactory: Training could not be completed.
b. 2 = Debriefed Satisfactory: Training was completed in debrief
because workarounds and interrupts precluded further
STD training.
c. 3 = Satisfactory: Any workarounds or interrupts did not
preclude completion of training.
d. 4 = Outstanding: All training completed, no workarounds and
no interrupts.
7. Number of interrupts recorded in the assessments.
8. Total amount of lost time recorded in the assessments in hours (i.e.,
1 hour and 30 minutes is displayed as 1.5 hours).
9. Percentage of scheduled sessions in which the STD session was
rated [(Rated Sessions/Scheduled Sessions)*100].
10. Track or team whose STD session assessments are displayed.
11. Average rating for STDs in the selected Track/Team [Sum of average
STD ratings for selected STDs/Number of STDs].
12. Total recorded interrupts for selected STDs [Sum of interrupts].


13. Total recorded lost time for selected STDs [Sum of lost time in
hours].
14. Percentage of scheduled sessions in which the STD was rated for all
selected STDs [(Sum of Rated Sessions/Sum of Scheduled
Sessions)*100].
ATTACHMENT 2 EXAMPLE ATA CODES


System Code   Sub Code   Description
0 NONE SELECTED
0 0 None Selected

5 MAINTENANCE CHECKS
5 0 General
5 20 Preventive/Periodic Maintenance
5 21 FAA Qualification Prep
5 43 INS/IRS/GPS Update

6 GENERAL
6 0 General
6 1 Non-STS Responsibilities
6 7 Operator Error - Sim Eng
6 8 Operator Error - STS
6 10 Could Not Duplicate
6 11 Works as Designed/ Works IAW
6 20 Facility Systems
6 21 Facility Power
6 22 Facility Cooling/Heating
6 23 Facility Fire Protection
6 30 Operator Error
6 31 Previously Entered
6 32 Entered in Error
6 40 ECO Accomplishment/Minor Mods
6 41 SPR/Failure Report/ECO Generation
6 60 Data Updates
6 62 Simulator Navigation Data Base (See 80-20)
6 63 Equip/Personnel Safety

11 PLACARDS AND MARKINGS


11 0 General
11 10 Exterior Paint
11 20 Exterior Placards/Markings
11 30 Interior Placards/Markings

21 AIR CONDITIONING
21 0 General
21 8 System Control Panel
21 26 Equipment/Instrument Cooling
21 30 Pressure Control/Indication
21 60 Temp. Control System

22 AUTO FLIGHT
22 0 General
22 10 Control AP/Flt Guide Sys/Auto Flt Sys
22 11 Flight Control Computer/Flight Augmentation C
22 20 Speed/Attitude Correct/Auto Pitch Trim
22 21 LSAS
22 23 Yaw Damper System
22 30 Auto Throttles
22 35 Thrust Rating PNL, Displays, Warnings
22 40 System Monitor, MTP
22 70 Approach Display/FMA

23 0 General
23 COMMUNICATIONS
23 10 VHF Communications
23 11 Cockpit Speaker
23 12 High Freq. (HF) Communication
23 20 SELCAL Function
23 24 ACARS/Printer
23 28 Satellite Communications
23 40 Interphone
23 50 Audio Integrating System
23 51 Crew Headphone
23 52 Microphones
23 70 Voice Recorder/ULB System
23 80 Instructor Communication

24 ELECTRICAL POWER
24 0 General
24 8 Electrical System Control Panel
24 10 Generator Drive(CSD)/Integrated Drive Gen(IDG)
24 20 AC Gen & Control/Monitoring
24 22 Emergency AC Power & ADG/RAT
24 30 DC Gen & Control/Monitoring
24 32 Emergency DC & Battery/Chrgr
24 40 External Power
24 50 AC Load Distribution/Bus Tie Cntrl
24 60 DC Load Distribution

25 EQUIPMENT FURNISHINGS
25 0 General
25 1 APLC/PAT Batteries
25 10 Flight Comp't
25 11 Seats
25 60 Emergency/Evacuation Equipment

26 FIRE PROTECTION
26 0 General
26 10 Engine Fire Detection System
26 11 APU Fire Detection
26 12 Smoke Detection
26 13 Wing/Body Overheat
26 20 Fixed Engine/Cargo Extinguishers
26 21 Portable Extinguishers
26 30 Simulator Fire Protection/Smoke and Overheat

27 FLIGHT CONTROLS
27 0 General
27 1 Takeoff Warning
27 10 Aileron/Tabs/Ind & Control
27 20 Rudder/Tabs/Ind & Control
27 30 Elevator/Tabs/Ind & Control
27 32 Elevator Load Feel
27 40 Horizontal Stabilizer/Speed Trim
27 50 Flaps/Control/Indication
27 60 Spoiler/Spd. Brakes & Indication
27 80 Leading Edge Slats & Flaps/Indication
27 90 Simulator Control Loading System
27 91 Computer/ Reload
27 92 Interface/Cables
27 93 Servo Amp/Buffer Unit
27 94 Load Unit
27 95 Mechanical Linkage
27 96 Calibration/Alignment/Tuning

28 FUEL SYSTEMS
28 0 General
28 8 Fuel Sys Control Panel/Controller
28 20 Fuel Distrib/Refueling & Defueling
28 21 Fuel Boost/Transfer/Pumps/Ind
28 30 Fuel Dump System
28 40 Fuel Qty Indication
28 41 Fuel Schedule/Management System
28 43 Fuel Temperature Indication

29 HYDRAULIC POWER
29 0 General
29 8 Hyd. Sys Control Panel/Controller
29 11 ADP/EDP
29 20 Auxiliary/Standby System
29 21 RMP/Aux Pumps
29 30 Hydraulic Indication Systems
29 40 Simulator Hyd Power (HPU)
29 41 Control Valves
29 42 Filters
29 43 Hoses, Tubing, Fittings
29 44 Accumulators
29 45 Pumps/Motors
29 46 Control/Monitoring/Warning System

30 ICE:RAIN PROTECTION
30 0 General
30 10 Airfoil Anti-Ice
30 30 Pitot/Static/TAT/AOA Heat
30 40 Windshield & Windows
30 80 Ice Detection

31 INSTRUMENTS
31 0 General
31 10 Instrument Panels
31 20 Clocks/Chronometer
31 30 Data Recorders & ULB
31 31 Data Management (FDAU)
31 41 Misc Systems Controller
31 42 Weight & Balance Comp./CGCC
31 43 Versatile Integrated Avionics (AIU/VIA)
31 50 Central Warning Systems/ECAM
31 60 Central Display Sys/ECAM/MCDU
31 61 Electronic Instrument System/EFIS(Douglas)
31 70 Monitoring/PSEU

32 LANDING GEAR
32 0 General
32 1 Ground Sensing
32 10 Main Gear and Doors
32 12 Body Gear and Doors
32 20 Nose Gear and Doors
32 30 Extension & Retraction System
32 43 Brakes/Cooling System
32 44 Anti-Skid System
32 45 Auto Brakes System
32 46 Brake Temp & Tire Press Mont. Sys
32 51 Nose/Body Gear Steering System
32 60 Gear Position Ind/Warning System
32 70 Tail Skid/Tail Stand

33 LIGHTING
33 0 General
33 1 Lamp Test
33 10 Flight Comp't/Panel Lighting
33 11 Master Warning
33 30 External/Service Lighting
33 50 Emergency Lighting
33 60 Simulation Lighting
33 61 Cockpit Lighting
33 62 Maintenance Lighting

34 NAVIGATION
34 0 General
34 1 Simulated Nav Aids Alignment
34 10 Pitot Static Systems
34 11 Altimeter: Stby Altimeter/Indicated Airspeed
34 12 Vertical Speed Indicator
34 13 RAT/OAT/TRS/SAT/TAS/TAT
34 14 Airspeed/MACH Indicator/Overspeed Warning
34 16 Air Data Computing
34 17 Altitude Alerting
34 18 Stall Warning/Stick Shaker
34 19 TAT-EPR Limit/RAT-EPR Limit
34 21 Compass System
34 22 Vertical Gyro/Horizon/Attitude
34 23 Turn and Bank Indicator
34 24 Standby Compass
34 25 Standby Attitude/SAI See 34-11
34 31 Instrument Landing Sys-LOC, G/S, HSI, etc.
34 35 Marker Beacon System
34 41 Weather Radar
34 42 Central Instrument Warning Sys (CIW)
34 43 INS/IRS Nav
34 45 Distance Measuring(DME)
34 48 Radio Altimeter
34 51 VOR/VHF Navigation
34 52 Ground Prox Warning/Windshear
34 53 ADF
34 55 Traffic & Collision Avoidance (TCAS)
34 58 Global Nav/Position Sys (GPS/GNS)
34 61 Performance Data Computer Sys (PDCS)
34 62 Flight Director Systems
34 63 Flight Management Computing
34 73 Electronic Flight Instrument System/EFIS

35 OXYGEN
35 0 General
35 10 Crew Systems
35 30 Portable Oxygen

36 PNEUMATIC SYSTEMS
36 0 General
36 10 Distribution System/Control
36 11 Low/High Bleed
36 12 Pressure Regulation
36 15 Temp Regulation
36 20 Manifold Pressure/Temp. Ind
36 22 Manifold Failure Detection/Ind

38 WATER:WASTE
38 0 General

45 CENTRAL MAINTENANCE SYS (CMS)


45 0 General
45 10 Centralized Fault Display System (CFDS)
45 11 On-Board Maintenance Terminal(OMT)
45 13 Pilot Access Terminal (PAT)

46 ELECTRONIC FLIGHT BAG


46 20 Electronic Flight Bag

49 AIRBORNE AUX. POWER(APU)


49 0 General
49 10 Power Plant
49 17 Inlet Doors & Actuation
49 30 Fuel & Control
49 40 Ignition/Starting
49 50 Bleed Air System
49 60 Engine Controls(ECB/ECU & VTN)
49 70 Indications
49 90 Oil System

52 DOORS
52 0 General
52 10 Crew & Passenger Entry
52 70 Door Warning Systems

53 SIMULATOR MOTION
53 0 General
53 10 Rams/Legs/Actuators
53 11 Servo Valves
53 12 Transducers
53 20 Base Assembly
53 30 Accessway/Gantry/Drawbridge
53 40 Control/Monitoring/Warning System
53 50 Tuning/Alignment
53 60 Computer (Reload)
53 70 Interface

56 WINDOWS
56 0 General
56 10 Flight Comp't Windows
56 11 Cockpit Sliding Windows

73 ENGINE FUEL:CONTROL
73 0 General
73 10 Fuel Distribution/Supply
73 22 Auto Fuel Sys/ PMC/TTC/FADEC
73 30 Indicating Fuel Temp & Heat
73 33 Fuel Flow/Fuel Used/Fuel Press Ind

74 IGNITION SYSTEMS
74 0 General
74 30 Ignition Switching

76 ENGINE CONTROLS
76 0 General
76 10 Power Levers
76 11 Throttle Controls (Cables/Pulleys/Switches)
76 13 Thrust Control Module/TCC
76 15 Fuel Shutoff Levers/Switch
76 20 Emergency Shutdown System/Fire Handles
76 30 Thrust Rev Sys/Actuation/Locking Sys

77 ENGINE INDICATING SYSTEMS


77 0 General
77 10 EPR/PT7, HP Indication
77 11 Alternator Control
77 14 Max Pointer Reset
77 15 N1 RPM Indication
77 16 N2 RPM Indication
77 20 EGT/TGT Indication
77 31 Engine Vibration Indication System

79 ENGINE OIL
79 0 General
79 30 Oil Temp Indication
79 31 Oil Quantity Indication
79 32 Oil Press Indication/Warning
79 34 Oil Differential Pressure Warning

80 SIMULATOR HOST COMPUTER SYSTEM


80 0 General
80 10 Software
80 11 Operating System (MPX,AIX,UNIX)(Reboot)
80 12 Executives/Real Time Control (Reload)
80 13 Real Time Modules
80 14 Utilities
80 15 Diagnostics
80 20 Data Updates
80 21 NDBS/GSD
80 22 INS
80 30 Hardware
80 31 Communications
80 32 Peripherals
80 60 Satellite PCs (Dedicated Function)

81 SIMULATOR ONLY SYSTEMS


81 0 General
81 1 Cables
81 10 Interface/Linkage
81 11 Circuit Board
81 12 Backplane/Buss
81 20 Power Control and Monitoring
81 21 Power Supplies
81 22 400HZ Generation
81 23 Emergency Power/ Battery Backup
81 30 Simulated Instruments
81 40 Simulated A/C Panels
81 50 Sound/Audio

85 VISUAL SYSTEM
85 0 General
85 1 Ambient Lighting
85 5 Cables
85 10 Projection Systems
85 11 Projector
85 12 Projection Tubes
85 13 Mirror and Control Systems
85 14 Alignment System/AutoCal(DRCU, RAU)
85 20 Monitor/CRT Systems
85 21 CRTs
85 22 Optics
85 30 Image Generation
85 31 Computer (Reboot)
85 32 Circuit Cards
85 40 Alignments
85 41 Geometry
85 42 Color
85 43 Visibility
85 50 Models
85 51 New Model Request
85 52 Alignment with Nav Aids
85 53 Airport Lighting/Marking
85 54 Weather Effects

99 INSTRUCTOR OPERATING SYSTEM/IF


99 0 General
99 10 Computer System
99 11 Computer (Reboot)
99 12 Peripherals/Printers
99 20 Display System
99 21 Monitors/Display
99 22 Touchscreen/Input
99 23 Calibration/Alignment
99 30 EL Panels/PCU
99 50 Seats/Movable IOS
99 60 Control Pages
99 61 Page Content
99 62 Page / Malfunction Operation
99 70 Misc Simulator Controls
100 LESSON PLAN
100 0 General
100 10 New Snapshot
100 11 Update Snapshot
100 20 New Scenario
100 21 Update Scenario
100 30 New Lesson
100 31 Update Lesson
100 40 New Menu
100 41 Update Menu
APPENDIX A ACRONYMS
ADTAT Average Discrepancy Turn Around Time
AOG Aircraft on Ground (See Definition Section 2.1)
ATA Air Transport Association
ATD Actual Training Day
CBT Computer-Based Training
FFS Full Flight Simulator
HVAC Heating, Ventilation, and Air Conditioning
IATA International Air Transport Association
IOS Instructor Operating System
LOFT Line Oriented Flight Training
LTT Lost Training Time
MT Maintenance Trainer
PTD Planned Training Day
QAP Quality Assurance Program
STD Synthetic Training Device (e.g., FFS – full flight simulator, FSD – flight simulation
device, FTD – flight training device, simulator, FSTD – flight simulation training
device, etc.)
AERONAUTICAL RADIO, INC.
2551 Riva Road
Annapolis, Maryland 21401-7435

SUPPLEMENT 1

TO

ARINC REPORT 433

STANDARD MEASUREMENTS FOR FLIGHT SIMULATION QUALITY

Published: December 14, 2007

Prepared by the FSEMC

Adopted by the FSEMC Steering Committee: October 8, 2007



A. PURPOSE OF THIS DOCUMENT


This supplement represents a complete revision to ARINC Report 433: Standard
Measurements for Flight Simulation Quality.
B. ORGANIZATION OF THIS SUPPLEMENT
Due to the extensive nature of the changes incorporated into this supplement,
ARINC Report 433 has been reproduced in its entirety. As a result, the modified and
added material is not identified on each page.
Copies of ARINC Report 433, adopted by the FSEMC Steering Committee April 3,
2001, and published May 15, 2001, should be considered obsolete.
AERONAUTICAL RADIO, INC.
2551 Riva Road
Annapolis, Maryland 21401-7435

SUPPLEMENT 2
TO
ARINC REPORT 433
STANDARD MEASUREMENTS FOR FLIGHT SIMULATION QUALITY

Published: April 5, 2013

Prepared by the FSEMC

Adopted by the FSEMC Steering Committee: February 7, 2013



A. PURPOSE OF THIS DOCUMENT


This Supplement provides updates to guidance for measuring the age of open
issues with a flight training device. The terms are defined and examples are given to
graphically illustrate the metrics involved.
B. ORGANIZATION OF THIS SUPPLEMENT
In this document blue bold text is used to indicate those areas of text changed by
the current Supplement only.
C. CHANGES TO ARINC REPORT 433 INTRODUCED BY THIS SUPPLEMENT
This section presents a complete listing of the changes to the document introduced
by this Supplement. Each change is identified by the section number and the title as
it will appear in the complete document. Where necessary, a brief description of the
change is included.
1.4 Related Documents
Added or updated reference documents for regulatory agencies.
2.1 Definitions
Added the term Issue Ageing and associated definition.
4.2 Formulas and Computations
Added formula for measuring the ageing of issues on a flight training device.
Attachment 1 Example Forms and Screen Displays, Information Gathering, and
Reports
In Example 3, added a Pareto chart to show a cumulative total of interrupts tracked
by system.
In Example 3, added a chart representing the ageing of issues on a single flight
training device.
Appendix A Abbreviations
Added acronym for Average Discrepancy Turn-Around-Time (ADTAT).
Changed title of Appendix A to Acronyms for ARINC document standardization.
ARINC Standard – Errata Report
1. Document Title
(Insert the number, supplement level, date of publication, and title of the document with the error)

2. Reference

Page Number: Section Number: Date of Submission:

3. Error
(Reproduce the material in error, as it appears in the standard.)

4. Recommended Correction
(Reproduce the correction as it would appear in the corrected version of the material.)

5. Reason for Correction (Optional)


(State why the correction is necessary.)

6. Submitter (Optional)
(Name, organization, contact information, e.g., phone, email address.)

Please return comments to fax +1 410-266-2047 or standards@arinc.com

Note: Items 2-5 may be repeated for additional errata. All recommendations will be evaluated by the staff. Any
substantive changes will require submission to the relevant subcommittee for incorporation into a subsequent
Supplement.

[To be completed by IA Staff ]

Errata Report Identifier: Engineer Assigned:

Review Status:

ARINC Errata Form


March 2012
ARINC IA Project Initiation/Modification (APIM)
Name of proposed project APIM #: _____
Name for proposed project.
Suggested Subcommittee assignment
Identify an existing group that has the expertise to successfully complete the
project. If no such group is known to exist, a recommendation to form a new
group may be made.
Project Scope
Describe the scope of the project clearly and concisely. The scope should
describe “what” will be done, i.e., the technical boundaries of the project.
Example: “This project will standardize a protocol for the control of printers.
The protocol will be independent of the underlying data stream or page
description language but will be usable by all classes of printers.”
Project Benefit
Describe the purpose and benefit of the project. This section should describe
“why” the project should be done. Describe how the new standard will improve
competition among vendors, giving airlines freedom of choice. This section
provides justification for the allocation of both IA and airline resources.
Example: “Currently each class of printers implements its own proprietary
protocol for the transfer of a print job. In order to provide access to the cockpit
printer from several different avionics sources, a single protocol is needed. The
protocol will permit automatic determination of printer type and configuration to
provide for growth and product differentiation.”
Airlines supporting effort
Name, airline, and contact information for proposed chairman, lead airline, list
of airlines expressing interest in working on the project (supporting airlines), and
list of airlines expressing interest but unable to support (sponsoring airlines). It
is important for airline support to be gained prior to submittal. Other
organizations, such as airframe manufacturers, avionics vendors, etc. supporting
the effort should also be listed.
Issues to be worked
Describe the major issues to be addressed by the proposed ARINC standard.
Recommended Coordination with other groups
Draft documents may have impact on the work of groups other than the
originating group. The APIM writer or, subsequently, The Committee may
identify other groups which must be given the opportunity to review and comment
upon mature draft documents.
Projects/programs supported by work
If the timetable for this work is driven by a new airplane type, major avionics
overhaul, regulatory mandate, etc., that information should be placed in this
section. This information is a key factor in assessing the priority of this proposed
task against all other tasks competing for subcommittee meeting time and other
resources.
Timetable for projects/programs
Identify when the new ARINC standard is needed (month/year).
Documents to be produced and date of expected result
The name and number (if already assigned) of the proposed ARINC standard to
be either newly produced or modified.
Comments
Anything else deemed useful to the committees for prioritization of this work.
Meetings
The following table identifies the number of meetings and proposed meeting days
needed to produce the documents described above.

Activity Mtgs Mtg-Days


Document a # of mtgs # of mtg days
Document b # of mtgs # of mtg days

For IA Staff use


Date Received: IA Staff Assigned:
Potential impact:
(A. Safety B. Regulatory C. New aircraft/system D. Other)
Forward to committee(s) (AEEC, AMC, FSEMC): Date Forward:
Committee resolution:
(0 Withdrawn 1 Authorized 2 Deferred 3 More detail needed 4 Rejected)
Assigned Priority: Date of Resolution:
A. – High (execute first) B. – Normal (may be deferred for A.)
Assigned to SC/WG:
