Software Testing Standard
CHANGE HISTORY
TABLE OF CONTENTS
1.0 Objective
1.1 Purpose
1.2 Scope
2.0 Reference(s)
3.0 Abbreviations and Acronyms
4.0 Definitions
5.0 Procedure
5.1 Objective
5.2 Testing
5.2.1 Testing Approach
5.3 Testing Overview
5.3.1 Design of Test Cases
5.3.2 Order of Verification
5.4 Test Coverage Analysis
5.4.1 Requirement Coverage Analysis
5.4.2 Structural Coverage Analysis
5.4.3 Structural Coverage Analysis Resolution
5.5 Object Oriented Testing
5.6 Regression Testing
5.7 Problem Capturing and Reporting Mechanism
5.8 Checklists
5.9 Software Verification Matrix
5.10 Test Results Summary
5.11 Naming Convention
5.12 Transition Criteria
5.13 DO-178B Compatibility
1.0 Objective
The objective of this document is to define the conventions for Software Testing. The
purpose of this document is to ensure that the testing documents are produced to the
defined level of presentation, both in format and in content, to support common
interpretation.
1.1 Purpose
1.2 Scope
2.0 Reference(s)
4.0 Definitions
Term: Definition

Compiler: Program that translates source code statements of a high-level language into object code.

Complete: All attributes fully defined to ensure full implementation.

Deactivated Code: Executable object code (or data) which by design is not intended to be executed (code) or used (data); for example, code that is enabled by a hardware pin selection or software-programmed options.

Dead Code: Executable object code (or data) which, as a result of a design error, cannot be executed (code) or used (data) in an operational configuration of the target computer environment and is not traceable to a system or software requirement.

Derived Requirements: Additional requirements resulting from the software development processes, which may not be directly traceable to higher-level requirements.
5.0 Procedure
This document lays down the Testing standards for Airborne Safety Critical Software.
5.1 Objective
b. The test philosophy, which has been adopted throughout the Testing Process, addresses the following:
5.2 Testing
a) Testing activities aid in error prevention. They are intended to meet the stringent
requirements of DO-178B such that the testing is complete, traceable and correct.
b) The principal aims of the testing shall be to verify that the unit under test has
met its design specification and to verify that errors do not lead to
unacceptable failure conditions.
d) Tests shall be carried out either by running a test harness that calls the unit
under test and reports calls to external functions and values of global data to a
results file, or by manually changing values of variables and recording test results
on a formal result sheet.
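As an illustrative sketch of the first option, a minimal harness might call the unit under test with each test case and log inputs, expected and actual outputs to a results structure (standing in for a results file). The unit name `saturate` and the record format here are assumptions, not part of this standard.

```python
# Minimal test-harness sketch: drives the unit under test and records
# each call's inputs, expected value, actual value, and pass/fail verdict.
# All names are illustrative assumptions.

def saturate(value, lo, hi):
    """Unit under test: clamp value into the range [lo, hi]."""
    return max(lo, min(hi, value))

def run_harness(cases):
    """Call the unit under test for each case and record the result."""
    results = []
    for args, expected in cases:
        actual = saturate(*args)
        results.append({"inputs": args, "expected": expected,
                        "actual": actual, "pass": actual == expected})
    return results

results = run_harness([((5, 0, 10), 5),
                       ((-3, 0, 10), 0),
                       ((42, 0, 10), 10)])
```

In practice the results list would be written to a formally controlled results file rather than kept in memory.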
6
Software Testing Standard
[Figure: integration hierarchy, with a Driver calling units A, B and C, which in turn call units F, G, H and I]
b) Top Down Approach: The highest-level subprogram shall be tested first.
Subprograms are integrated by moving downward through the control
hierarchy. All internal interfaces shall be replaced with stubs to provide
testing in isolation. Further testing shall be carried out to ensure the unit contains
no errors.
[Figure: top-down integration, with stubs replacing lower-level units F, G, H and I]
c) Isolation Approach: Isolation testing tests each unit in isolation from the units
which call it and the units it calls. Units can be tested in any sequence, because
no unit test requires any other unit to have been tested. Each unit test requires
a test driver and all called units are replaced by stubs.
[Figure: isolation testing, with a Driver calling the unit under test (A, B, C) and Stubs replacing the called units F, G, H and I]
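The driver-and-stub arrangement above can be sketched as follows. The unit name `unit_b` and the stubbed call `read_sensor` are illustrative assumptions, not taken from this standard.

```python
# Isolation-testing sketch: the unit under test is exercised by a driver,
# and the unit it calls is replaced by a stub returning canned values.

def unit_b(read_sensor):
    """Unit under test: reports whether the sensor reading is in range."""
    value = read_sensor()
    return 0 <= value <= 100

def stub_sensor(value):
    """Stub for the called unit: always returns the given canned value."""
    return lambda: value

# Driver: exercise the unit in isolation with controlled stub values.
in_range = unit_b(stub_sensor(50))
out_of_range = unit_b(stub_sensor(150))
```

Because the stub fully controls the called unit's behaviour, the units can be tested in any sequence, as the text notes.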
• Unit Test
• Software Integration Test
• Hardware-Software Integration Test
• System Test
Normal Range Test Cases - To ensure that the Software responds under
normal conditions, including, as required:
• Code Scrutiny
• Development and review of Unit Test procedures, cases, and generation
of results.
• Development and review of Software Integration Test procedures,
cases, and generation of results.
• Development and review of Hardware-Software Integration Test
procedures, cases, and generation of results.
• Development and review of System Test cases and generation of results
b) Configured templates for the Test Script, Module Verification Cases and
Procedures and Test Script Checklist may be used during development as
defined in the respective Planning document.
c) The templates for the following documents are available and should be used if
detailed customer checklists are not available. If needed, these templates
can be customized to project needs, and this shall be
documented in the project planning documents of the specific project.
• Range Checks
• Boundary Values Checks
• Independent Path Coverage
• Robustness Checks
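For the boundary values check above, test inputs are typically derived at and around each range limit. A minimal sketch, assuming a unit whose valid input range is [0, 100] (the range and helper names are illustrative assumptions):

```python
# Boundary-value sketch: derive test inputs at each limit,
# just inside it, and just outside it.

def in_valid_range(x, lo=0, hi=100):
    """Unit under test: accept inputs within [lo, hi]."""
    return lo <= x <= hi

def boundary_cases(lo, hi):
    """Boundary-value inputs: below, at, and above each limit."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

cases = boundary_cases(0, 100)
verdicts = [in_valid_range(x) for x in cases]
```

The two out-of-range inputs double as robustness checks, exercising the unit's response to invalid data.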
The objective of this Test Phase is to verify that the software satisfies
the high-level requirements in the target environment.
System Test Cases and Results: System Testing covers the full
functional and performance requirements of the product. System Test
shall be performed on the actual hardware.
Test data and Test Cases are produced to verify the following:
Equivalence Class:
Any one of the following shall be used for deriving Test Cases:
If an input condition specifies a value, one valid class (the value) and two invalid
classes (outside the lower limit and outside the upper limit of the value) are
defined.
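Deriving one representative test input per equivalence class can be sketched as below; the helper name and the choice of the range midpoint as the valid representative are assumptions for illustration.

```python
# Equivalence-class sketch: for an input condition on the range [lo, hi],
# derive one representative input for the valid class and one for each
# of the two invalid classes.

def equivalence_classes(lo, hi):
    """Return one representative input per equivalence class."""
    return {"valid": (lo + hi) // 2,   # inside the valid range
            "below": lo - 1,           # invalid: outside the lower limit
            "above": hi + 1}           # invalid: outside the upper limit

reps = equivalence_classes(10, 20)
```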
Any one of the following shall be used for deriving Test Cases:
• Statement coverage
• Decision coverage
• Condition coverage
• Condition / Decision coverage
• MCDC
• Multiple condition coverage
• Assertion Coverage
• Call-pair Coverage
• Instruction Coverage
• Object Code Verification
b. Statement Coverage:
Statement coverage verifies that all executable code statements are reachable.
c. Decision Coverage:
Example:
For the decision (A or B), Test Cases (TF) and (FF) will toggle the
decision outcome between true and false.
Unconditional decision coverage: JMP Label. For such a case, the JMP
Label needs to be exercised once.
Conditional decision coverage: JNZ Label. For such a case, the JNZ Label
needs to be exercised twice: once for a ZERO result and once for a
NON-ZERO result.
Example: For the condition (A or B) Test Cases (TF) and (FT) meet the
coverage criteria.
Example: For the decision (A or B), Test Cases (TF), (FT), and (FF)
provide MC/DC
Example on MC/DC:
X = ((A or B) and C)

[Figure: logic gates, with A and B feeding an OR gate whose output is ANDed with C to produce X]

A B C | X
T T T | T
T T F | F
T F T | T
F T T | T
F F T | F
F T F | F
T F F | F
F F F | F
The following four Test Cases provide MC/DC:

A B C | X
T F T | T
F F T | F
F T T | T
F T F | F
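The four Test Cases above can be checked mechanically: consecutive cases differ in exactly one condition while the decision outcome toggles, showing each condition's independent effect. A minimal sketch:

```python
# MC/DC sketch for X = ((A or B) and C): evaluate the four Test Cases
# and confirm the expected outcomes from the table above.

def decision(a, b, c):
    return (a or b) and c

cases = [(True, False, True),    # X = True
         (False, False, True),   # X = False (differs from case 1 in A only)
         (False, True, True),    # X = True  (differs from case 2 in B only)
         (False, True, False)]   # X = False (differs from case 3 in C only)

outcomes = [decision(*t) for t in cases]
```

Four cases suffice here, consistent with the general MC/DC result of n + 1 Test Cases for n conditions.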
Multiple condition coverage requires Test Cases that ensure each possible
combination of inputs to a decision is executed at least once, that is,
multiple condition coverage requires exhaustive Testing of the input
combinations to a decision.
Example: For a decision with n inputs, multiple condition coverage requires 2^n
Tests.
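The exhaustive enumeration can be sketched with the standard library; the decision `(A or B) and C` is reused from the earlier example for illustration.

```python
# Multiple-condition-coverage sketch: enumerate all 2**n input
# combinations for a decision with n = 3 conditions.
from itertools import product

def decision(a, b, c):
    return (a or b) and c

combos = list(product([False, True], repeat=3))   # 2**3 = 8 combinations
outcomes = {t: decision(*t) for t in combos}
```

For realistic decisions the 2^n growth quickly becomes impractical, which is why MC/DC (n + 1 cases) is the criterion required at the highest DO-178B software level.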
h. Assertion Coverage:
j. Instruction Coverage:
Instruction coverage verifies that all the executable instructions are reachable.
If the compiler generates object code that is not directly traceable to the
source code, then identify the untraceable, compiler-generated object code
and verify it.
The guidelines for performing source code to object code traceability and
verification are as follows:
While verifying object code, ensure that the analysis has been
performed on representative source code and object code
applicable to the target environment of the intended airborne
system.
c) Dead code - The dead code shall be identified and reported to the design
team so that it can be removed.
c. System Testing: System testing for object-oriented software is the same as
for non-object-oriented software. The end product is verified against the
requirements specification by applying inputs to the system and analyzing the
outputs.
a. Regression Tests shall be carried out for all baselined Test Cases for which
there is a change in requirements and/or source code. If there is any variation
between the baselined and current results, the corresponding Test Case
is scoped for rework. Regression Testing is carried out for Unit, Integration as well
as System Tests. When Software failures are rectified, not only shall the Test
which identified the failure be repeated, but also a selection of other Tests
should be repeated to verify that the change does not have an adverse effect
elsewhere in the Software.
b. Unit Test: The changed module and all others using it during Unit Test must
be retested.
c. Integration Test: The Integration Test of the asynchronous unit using the
changed module must be re-executed.
5.8 Checklists
The following Checklists templates are available and should be used if detailed
customer checklists for reviews are not available.
• All the checklists and documents generated during the process of testing
shall follow a proper naming convention. Refer Section 5.11 Naming
Convention for details.
A Software Verification Matrix (SVM) shall be produced and updated for all levels of
testing.
All documents, such as the Test Script, Test Plan, Test Procedure, MVCP, SVM,
SVR and Checklists, that are generated during the process of testing shall
follow a standard naming convention. This naming convention will either be
mentioned in the planning document (according to the customer's specifications,
if required) or will be as mentioned below:
Checklists: checklist_<itemname>.doc
Review Report: RR_<itemname>.doc
MVCP: MVCP_<functionname>.doc
Where <functionname> can be any sub-system name
SVM: SVM_<projectname>.doc
SVR: SVR_<projectname>.doc
Where <projectname> is the name of the project
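Conformance of generated file names to the patterns above can be checked automatically; a sketch using regular expressions (the example file names are assumptions):

```python
# Naming-convention sketch: validate file names against the patterns
# checklist_<itemname>.doc, RR_<itemname>.doc, MVCP_<functionname>.doc,
# SVM_<projectname>.doc and SVR_<projectname>.doc.
import re

PATTERNS = [r"checklist_\w+\.doc", r"RR_\w+\.doc", r"MVCP_\w+\.doc",
            r"SVM_\w+\.doc", r"SVR_\w+\.doc"]

def follows_convention(filename):
    """True if the file name matches any of the standard patterns."""
    return any(re.fullmatch(p, filename) for p in PATTERNS)

ok = follows_convention("SVM_flightctrl.doc")
bad = follows_convention("results_final.doc")
```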
The Testing activity shall commence once all the following documents are
available:
The Software Testing Standard (this document) is in full compliance with
Sections 6.3, 11.13 and 11.14 of RTCA/DO-178B. The section-wise compatibility
matrix is as follows: