
WILLOWGLEN (MALAYSIA) SDN BHD

ISCS
Overall Software Test
Specification

Document No. : P205_ISCS_D2.3_OSTS


Rev. No. : 0.3.0
Date Issued : 12 June 2023

Prepared By 1st Reviewed By 2nd Reviewed By Approved By

Wong Jun Kit Norjannah Hazali Liong Zhong Jin Han Chung Siew

Tester Verifier Validator Project Manager

12 June 2023 12 June 2023 12 June 2023 12 June 2023



REVISION HISTORY

Rev. No. Rev. Date Rev. Description Revised By:

0.0.0 20 Oct 2022 First issue of Overall Software Test Specification (OSTS) document. Tamilchelvan

0.0.1 16 Dec 2022 Revised based on comments on DRF_VV_P205_ISCS_D2.3_OSTS_Rev0.0.0: Tamilchelvan
1. Completed the information of author, reviewers and approver in the Cover Page.
2. Updated the revision date in the Revision History.
3. Figure 1.1 rearranged to follow Table 1.1; the figure was redrawn to resolve the confusion.
4. Revision column added to Table 1.4.
5. Updated all ISIT abbreviations to SwIT.
6. Stated the filename of the spreadsheet for test cases: P205_ISCS_D2.3_OSTS_TestCase.
7. Removed Tester Lead/Test Engineer (TST) from Table 3.1 as it duplicated the Test Engineer role.
8. Legend added for the symbols in Figure 3.1. The test setup breakdown is described above the diagram.
9. The reference is from the FDR folder; the document is RTS-SY03-SYS-EIC-DSN-30001_REV03.
10. OSTS covers the Software Integration Test; refer to Sections 4.7 and 7.9.
11. Completed the assigned sub-headings. Provided the definitions of Driver Test and real environment. Added the specifics of SIL2 Safety Function Testing.

0.0.2 18 Jan 2023 1. Revised based on comments on DRF_SWAC_P205_ISCS_D2.3_OSTS_Rev0.0.1: Tamilchelvan
1. Removed unused abbreviations in Table 1-2 in Section 1.3.1.
2. Added new terms in Table 1-3 in Section 1.3.2.
3. Revised Table 1-4 in Section 1.4.1.
4. Revised Sections 3, 4.1 and 7 according to the new Testing Strategy.
2. Revised based on comments on SRS-RTS-ISCS-DRS-008-R00:
1. Revised Section 7.4 to state the evidence of software tools being used according to Clause 6.7 of EN 50128:2011+A2:2020.


0.0.3 3 Feb 2023 1. Revised based on DRF_SWAC_P205_ISCS_D2.3_OSTS_Rev0.0.2: Tamilchelvan
1. Revised the Note of Table 1-1 to include the SIL 2 interfaces of the Tunnel Ventilation System (TVS), Traction Power System (TPS) and Fire Protection System (FPS) under Generic Software development.
2. Added availability of Generic Software (Xentral Safe) in the Test Risks and Mitigation Plan (Table 6-1).
2. New test cases were added and revised in the test cases document to meet all requirements, including SIL 2 requirements.

0.0.4 23 Feb 2023 1. Based on SRS-RTS-ISCS-DRS-008-R01, revised: Darsshan


1. Revised test cases in the P205_ISCS_D2.3_OSTS_Rev0.0.4 Test Cases document and added the Boundary Value Analysis technique.
2. The test cases in which the Boundary Value
Analysis technique is used:

i. OSTS-5.17-0001 (Alarm Acknowledgement)


ii. OSTS-5.27-0016 (Public Announcement)
iii. OSTS-5.27-0027 (Public Announcement)
iv. OSTS-5.27-0028 (Public Announcement)
v. OSTS-5.28-0006 (Passenger Information Display
System)
vi. OSTS-5.28-0011 (Passenger Information Display
System)
vii. OSTS-5.58-0001 (Filtering)
viii. OSTS-5.69-0002 (Database and Management)
ix. OSTS-5.85-0023 (Video Recording Playback)
x. OSTS-5.85-0024 (Video Recording Playback)

0.1.0 7 March 2023 Baseline update to 0.1.0 Darsshan

0.1.1 26 April 2023 1. Revised Section 1.4.2 and removed IEEE 829* and ISO 12207:2008* from the Standards, as both are no longer used; SIL2, BI and testing activities will refer to the EN 50128:2011+A2:2020 standard. Wong Jun Kit
2. Based on the P205_ISCS_D1.2.3_CRF_DOC_00005 Change Request, the P205_ISCS_D2.3_OSTS_TestCase_Rev0.1.1 document was revised to implement changes to the test cases. The test cases involved are listed below:
i. Removed OSTS-5.13-0019 to OSTS-5.13-0026.


ii. Removed OSTS-5.83-0011 to OSTS-5.83-0014.
iii. Removed OSTS-5.37-0001.
iv. Removed OSTS-5.47-0001.
v. Rephrased OSTS-5.42-0022.
vi. Revised the integrity level of OSTS-5.46-0001.
(Changed from [BI] to [SIL2]).
3. Based on the P205_ISCS_D1.2.3_CRF_DOC_00006 Change Request, the P205_ISCS_D2.3_OSTS_TestCase_Rev0.1.1 document was revised to implement changes to the test cases. The test cases involved are listed below:
i. Added a note to the clauses involving OSTS-5.27-0012, OSTS-5.27-0013 and OSTS-5.27-0014.
4. Updated ISCS Software Requirements Specification to
Rev0.2.0.

0.2.0 26 April 2023 Document baseline. Wong Jun Kit

0.2.1 29 May 2023 1. Revised Section 8 Techniques and Measures based on internal audit comments: Wong Jun Kit
1) Added Response Timing and Memory Constraints
to section 8.2 Performance Testing.
2) Added testing strategy regarding manual
techniques of software testing and analytical
techniques under section 3.
2. Updated test case spreadsheet
P205_ISCS_D2.3_OSTS_TestCase_Rev0.3.0 based
on internal audit comments:
1) Revised the existing type of tests.
2) Added Response Timing and Memory Constraints
Technique and Measure to applicable test cases.

0.3.0 12 June 2023 Document baseline. Wong Jun Kit


TABLE OF CONTENTS
1 Introduction .................................................................................................................................. 9
1.1 Purpose of Document ............................................................................................................ 9
1.2 Scope of Work ........................................................................................................................ 9
1.3 Acronyms, Abbreviations and Terms ................................................................................... 13
1.3.1 Acronyms and Abbreviations ..........................................................................................13
1.3.2 Terms .............................................................................................................................16
1.4 Reference ............................................................................................................................. 17
1.4.1 Reference Documents ....................................................................................................17
1.4.2 Standard .........................................................................................................................18
1.5 Updating and Approval......................................................................................................... 19
1.6 Test Case ID ........................................................................................................................ 19
2 Document Structure .................................................................................................................. 20
3 Test Strategy .............................................................................................................................. 21
3.1 Test Objectives .................................................................................................................... 22
3.2 Test Assumptions and Constraints ...................................................................................... 22
3.3 Test Principles ...................................................................................................................... 22
3.4 Data Approach ..................................................................................................................... 22
3.5 Test Coverage and Levels of Testing .................................................................................. 23
3.5.1 Software Component & Application Data/Algorithms Test ............................................23
3.5.2 Software Integration Test ...............................................................................................27
3.5.3 Overall Software Test .....................................................................................................30
4 Execution Strategy .................................................................................................................... 40
4.1 Test Cycles .......................................................................................................................... 40
4.2 Defect Management ............................................................................................................. 40
4.3 Test Criteria and Degree of Test Coverage ......................................................................... 43
4.3.1 Suspension Criteria ........................................................................................................43
4.3.2 Resumption Criteria ........................................................................................................43
4.3.3 Feature Pass/Fail Criteria ...............................................................................................44
5 Test Cases .................................................................................................................................. 45
6 Test Management Process ....................................................................................................... 46
6.1 Test Management Tool ........................................................................................................ 46
6.2 Test Design Process ............................................................................................................ 46
6.3 Test Execution Process ....................................................................................................... 47
6.4 Test Risks and Mitigation Factors ........................................................................................ 48


6.5 Roles and Responsibilities ................................................................................................... 49


7 Test Environment ...................................................................................................................... 51
7.1 Simulated Environment ........................................................................................................ 51
7.1.1 Software Component & Application Data/Algorithms Test and Software Integration Test ..... 51
7.1.2 Overall Software Test .....................................................................................................52
7.2 Real Environment ................................................................................................................. 53
7.3 Hardware Items .................................................................................................................... 54
7.4 Software Items ..................................................................................................................... 56
8 Techniques and Measures........................................................................................................ 58
8.1 Functional/Black Box Testing ............................................................................................... 58
8.2 Performance Testing ............................................................................................................ 61
APPENDIX A TEST STATUS REPORT SAMPLE ............................................................................. 63
APPENDIX B DEFECT TRACKING LIST SAMPLE .......................................................................... 64


FIGURES
Figure 1-1: Simple Architecture of an ISCS Platform on Different Subsystems ...................................... 12
Figure 3-1: Software Component Test on Add-on ISCS Interface Software Components ...................... 23
Figure 3-2: Application Data/Algorithms Test on Xentral Software Platform and Xentral Safe ............... 24
Figure 3-3: Software Integration Test on Xentral Software Platform and Xentral Safe ........................... 27
Figure 4-1: Defect Tracking Process ........................................................................................................ 41
Figure 7-1: Test Environment for Software Component & Application Data/Algorithms Test and Software
Integration Test ......................................................................................................................................... 51
Figure 7-2: Test Environment for Group 1 (Overall Software Test) ......................................................... 52
Figure 7-3: Real Test Environment .......................................................................................................... 53


TABLES
Table 1-1: Summary of ISCS Specific Application Development Scope ................................................. 10
Table 1-2: Acronyms and Abbreviations .................................................................................................. 13
Table 1-3: Terms ...................................................................................................................................... 16
Table 1-4: Reference Documents ............................................................................................................. 17
Table 1-5: Standard Reference Documents ............................................................................................. 18
Table 2-1: Document structure with the section number and content descriptions ................................. 20
Table 3-1: Test Types ............................................................................................................................... 21
Table 3-2: Exit Criteria Defining Methods and Formula ........................................................................... 26
Table 3-3: Exit Criteria Defining Methods and Formula ........................................................................... 28
Table 3-4: Exit Criteria Defining Methods and Formula ........................................................................... 31
Table 3-5: Exit Criteria Defining Methods and Formula ........................................................................... 32
Table 3-6: Exit Criteria Defining Methods and Formula ........................................................................... 34
Table 3-7: Exit Criteria Defining Methods and Formula ........................................................................... 36
Table 3-8: Exit Criteria Defining Methods and Formula ........................................................................... 38
Table 4-1: Defects Category .................................................................................................................... 42
Table 4-2: Test Status .............................................................................................................................. 44
Table 5-1: Test Case Sample ................................................................................................................... 45
Table 6-1: Test Risks and Mitigation Plan ................................................................................................ 48
Table 6-2: Roles and Responsibilities ...................................................................................................... 49
Table 7-1: List of Hardware Items in Simulated and Real Environment .................................................. 54
Table 7-2: List of software items in real environment ............................................................................... 56
Table 8-1: List of selected Techniques and Measures for Functional/Black Box Testing ........................ 58
Table 8-2: Detailed Information of Boundary Value Analysis ................................................................... 59
Table 8-3: Detailed Information of Equivalence Classes and Input Partition Testing .............................. 60
Table 8-4: List of selected Techniques and Measures for Performance Testing ..................................... 61
Table 8-5: Detailed Information of Performance Requirements ............................................................... 61
Table 8-6: Detailed information of Response Timing and Memory Constraints ....................................... 62
Table AA-0-1: Test Status Report Sample ............................................................................................... 63
Table AB-0-1: Defect Tracking List Sample ............................................................................................. 64


1 Introduction
The Government of the Republic of Singapore and the Government of Malaysia have agreed to jointly
develop the RTS Link project to enhance connectivity between Malaysia and Singapore, to benefit
commuters who travel between Singapore and Johor Bahru. The RTS Link will primarily serve as an
alternative mode of transport for commuters currently utilising the Johor Bahru-Singapore Causeway to
cross the border. The RTS Link is intended to be a convenient, safe, and cost-effective system that
integrates well with other transportation services in Woodlands and Johor Bahru.
The RTS Link will be a shuttle link with double tracks that crosses the Straits of Johor via a high bridge.
It will serve two terminal stations, one in Woodlands, Singapore and the other in Bukit Chagar, Johor
Bahru, Malaysia. The proposed link will be approximately 4.6km in length, and the crossing will take
approximately 5-10 minutes. The RTS Link Operator (who will be the Employer) will be required to operate
the RTS Link all year round.

1.1 Purpose of Document


The purpose of the Overall Software Test Specification (OSTS) is to provide the testing approach and overall framework that will drive the testing of the ISCS for the RTS project. The objective is to ascertain the behaviour or performance of the software against the corresponding test specification to the extent achievable by the selected test coverage. Comprehensive instructions are relayed clearly to ensure that the testing activities proceed smoothly and in a coordinated manner.

1.2 Scope of Work


In accordance with the RTS SY03 Works, the RTS Integrated Supervisory and Control System (ISCS) contractor is responsible for the testing of defects during the Defect Liability Period and Other Related Works on the associated equipment necessary to facilitate operation and maintenance of the ISCS system, which includes special tools and testing equipment, spare parts, Operation and Maintenance Manuals and training.
The test plan for the development of add-on ISCS interface software components and application data/algorithms for the Xentral Software Platform is to meet the RTS project's requirements.
There are two categories of development that need to be tested:
1. Development of add-on ISCS Interface Software Components.

2. Development of Application Data/Algorithms, including:


a. [SIL2 if tied to SIL2 objects] GUI configuration
b. [SIL2 if tied to SIL2 objects] IO configuration
c. [SIL2 if tied to SIL2 objects] Logic programming
d. [BI] Scripting


Table 1-1: Summary of ISCS Specific Application Development Scope

No Interface System SIL Level Xentral Software Platform Xentral Safe Add-on GUI I/O Script Logic

1. Communication Backbone Network (CBN) and Railway System LAN BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA
2. Wayside Data Communication System (WDCS) BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA

3. Public Address (PA) System BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA

4. Public Information Display System (PIDS) BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA

5. Private Automatic Branch Exchange (PABX) System BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA

6. Trunk Radio Communication System BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA

7. Multi-channel Voice Recorder System (VRS) BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA

8. Master Clock System (MCS) BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

9. Video Surveillance System (VSS) BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA

10. Cybersecurity System BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

11. Tunnel Lighting System / Viaduct Lighting System BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

12. Tunnel Ventilation System SIL2 NA NA NA ✓ [SIL2] ✓ [SIL2] ✓ [BI] NA

13. Traction Power System (TPS) SIL2 NA NA NA ✓ [SIL2] ✓ [SIL2] ✓ [BI] NA

14. Access Management System (AMS) BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

15. Automatic Fare Collection System (AFC) BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

16. Rolling Stock (TMS) BI NA NA ✓ [BI] ✓ [BI] ✓ [BI] ✓ [BI] NA



17. Signalling System (SS) SIL2 NA NA NA ✓ [BI] ✓ [SIL2] ✓ [BI] NA

18. Depot Equipment, Service Vehicle (TWP) BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

19. Platform Screen Door (PSD) BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

20. Uninterruptable Power System (UPS) BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

21. Fire Fighting System SIL2 NA NA NA ✓ [SIL2] ✓ [SIL2] ✓ [BI] NA

22. High Voltage (HV) System / Integrated Building Management System (iBMS) BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA
23. RTU/PLC BI NA NA NA ✓ [BI] ✓ [BI] ✓ [BI] NA

Note:
NA: Not applicable in ISCS Development Scope

✓ [BI]: Identified scope as Basic Integrity

✓ [SIL2]: Identified scope as Safety Integrity Level 2

The details of development scope are provided in Software Development Plan (P205_ISCS_M4.1_SwDP).

Note: The development of the ISCS Interface Software components for the Tunnel Ventilation System (TVS), Traction Power System (TPS), Signalling System (SS) and Fire Protection System (FPS), which require a SIL 2 interface, will be included in Xentral Software (Generic Software).


The OSTS documents the testing between Xentral and the subsystems to ensure the ISCS performs its interfacing with all subsystems according to the requirements of its specification.

The software diagram below shows a simple overview of the subsystems and their interfaces to the Xentral Software Platform and Xentral Safe.

Figure 1-1: Simple Architecture of an ISCS Platform on Different Subsystems


1.3 Acronyms, Abbreviations and Terms


The following tables contain the definitions of the acronyms, abbreviations and terms used in this document.

1.3.1 Acronyms and Abbreviations

Table 1-2: Acronyms and Abbreviations

Acronyms & Abbreviation Definition

AFC Automatic Fare Collection System

AMS Access Management System

BI Basic Integrity

BMS Building Management System

BOCC Backup Operation Control Centre

CBN Communication Backbone Network

EN European Standards

ESS Emergency Stop Switch

FAT Factory Acceptance Test

FPS Fire Protection System

GS General Specification

HMI Human Machine Interface

HR Highly Recommended

HV High Voltage System

IBMS Integrated Building Management System

ID Identification Number

IEC International Electrotechnical Commission

IFAT Interface Factory Acceptance Test

IP Internet Protocol

ISCS Integrated Supervisory and Control System

ISO International Organization for Standardization


IMP Implementer

INT Integrator

MCS Master Clock System

NA Not Applicable

NMS Network Management System

OCC Operation Control Centre

OSTS Overall Software Test Specification

PA Public Address System

PABX Private Automatic Branch Exchange System

PAT Partial Acceptance Test

PIDS Passenger Information Display System

PLC Programmable Logic Controller

PS Particular Specification System

PSD Platform Screen Door

R Recommended

RS Rolling Stock

RTDB Real Time Database

RTM Requirements Traceability Matrix

RTU Remote Terminal Unit

SAT System Acceptance Test

SIL Safety Integrity Level

SIT System Integration Test

SOP Standard Operating Procedure

SQAP Software Quality Assurance Plan

SRS Software Requirements Specification


SS Signaling System

SwIT Software Integration Test

TBC To Be Completed

TBD To Be Determined

TC Test Case

TCP Transmission Control Protocol

TETRA TETRA Radio System

TMS Train Management System

TOC Table Of Contents

TPS Traction Power System

TST Tester

TVS Tunnel Ventilation System

UPS Uninterruptible Power System

VAL Validator

VRS Voice Recorder System

VSS Video Surveillance System

VER Verifier

VWDS Video Wall Display

WDCS Wireless Data Communication System

WHE Water Handling Equipment

Xentral Xentral Software Platform


1.3.2 Terms

Table 1-3: Terms

Terms Definition

ISCS Contractor Willowglen (Malaysia) Sdn. Bhd.


ISCS Integrated Supervisory and Control System
COMMS WPC Sapura Rail Systems Sdn. Bhd.
Employer Means RTS Operations Pte Ltd, its subsidiary or designated
representative.
Xentral Software ISCS Software Platform (Generic Software) developed by Willowglen
Platform/Xentral MSC Berhad.
Xentral Safe ISCS software modules developed by Willowglen MSC Berhad that comply with the EN 50128 standard. They can be integrated with the Xentral software to perform safety functions.
Backup Operation Control The backup control command centre for operators to monitor and control
Centre field equipment and interfacing systems in a project.
Operation Control Centre The main control command centre for operators to monitor and control
field equipment and interfacing systems in a project.
RESTful API An application programming interface (API or web API) that conforms to
the constraints of REST architectural style and allows for interaction with
RESTful web services.
Server A server is a computer program that provides a service to other computer programs (and their users).
Virtual Machine A computer system created using software on one physical computer to
emulate the functionality of another separate physical computer.
Willowglen MSC Berhad The company that develops Xentral and Xentral Safe.
Workstation The machine from which the user operates the system.
Entry Criteria The entry criteria are the desirable conditions that need to be met in order to start test execution.
Exit Criteria The exit criteria are the desirable conditions that need to be met in order to close a test phase and proceed with the next phase.


1.4 Reference

1.4.1 Reference Documents

Table 1-4: Reference Documents

Document No. Revision Document Title

P205_ISCS_D1.1_SQAP Rev0.3.0 ISCS Software Quality Assurance Plan

P205_ISCS_D1.2_SCMP Rev0.3.0 ISCS Software Configuration Management Plan

P205_ISCS_D1.3_SVVP Rev0.3.0 ISCS Software Verification and Validation Plan

P205_ISCS_D2.2_SRS Rev0.2.0 ISCS Software Requirements Specification

P205_ISCS_M4.1_SwDP Rev0.0.4 Software Development Plan

P205_ISCS_D3.6_SITS Rev0.0.0 ISCS Software Integration Test Specification

RTS-SY03-SYS-EIC-DSN-30001 REV03 Final Design of Integrated Supervisory Control System (ISCS)

P205_ISCS_D4.1_SCADDS Rev0.0.0 Software Component & Application Data/Algorithms Design Specification

P205_ISCS_D4.2_SCADTS Rev0.0.0 Software Component & Application Data/Algorithms Test Specification


1.4.2 Standard

Table 1-5: Standard Reference Documents

Document No. Document Title

EN 50128:2011 + A2:2020 Railway applications. Communication, signalling and processing systems - Software for railway control and protection systems.

IEC61131-3:2013 Programmable controllers – part 3: Programming languages

ISO 9000: All parts Quality Management and Quality Assurance Standards

ISO 9001:2015 Model for quality assurance in design, development, production, installing
and servicing

*Note: The latest version of the standard and applicable test as of the contract amendment date will be
used.


1.5 Updating and Approval


This document will be maintained and updated throughout the Software Development Life Cycle (SDLC). The OSTS will be updated as and when deemed appropriate, whenever modifications involve the scope of ISCS software testing and validation:
• Software Component & Application Data/Algorithms Test
• Software Integration Test
• Overall Software Test
The updating of the OSTS will be carried out by the Tester.

1.6 Test Case ID


Every requirement specification in this document shall be assigned a unique identifier (Test Case ID) for traceability through subsequent documents, including the design, implementation and testing phases. In this document, every requirement shall be assigned an ID in the following format:
Test Case IDs are defined as: OSTS-X.YY-ZZZZ
X - OSTS Main Chapter
YY - Sheet No. of the OSTS Test Case (P205_ISCS_D2.3_OSTS_TestCase)
ZZZZ - 'ID Column' for the OSTS Test Case (P205_ISCS_D2.3_OSTS_TestCase)
For example, OSTS-5.03-0001, OSTS-5.03-0002, etc. The Requirements Traceability Matrix (RTM) sheet between the OSTS and the SRS shall be compiled inside the OSTS test cases file (Sheet name: RTM (SRS-OSTS)).
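For illustration only, the short sketch below (not part of the formal specification) shows how a Test Case ID in the OSTS-X.YY-ZZZZ format described above could be validated and split into its parts; the function name and field names are hypothetical.

```python
import re

# Hypothetical helper illustrating the OSTS-X.YY-ZZZZ Test Case ID format.
TEST_CASE_ID_PATTERN = re.compile(r"^OSTS-(?P<chapter>\d+)\.(?P<sheet>\d{2})-(?P<item>\d{4})$")

def parse_test_case_id(test_case_id: str) -> dict:
    """Split an OSTS Test Case ID into its chapter, sheet and item parts."""
    match = TEST_CASE_ID_PATTERN.match(test_case_id)
    if match is None:
        raise ValueError(f"Not a valid OSTS Test Case ID: {test_case_id}")
    return {
        "chapter": int(match.group("chapter")),  # X    - OSTS main chapter
        "sheet": match.group("sheet"),           # YY   - sheet number in P205_ISCS_D2.3_OSTS_TestCase
        "item": match.group("item"),             # ZZZZ - 'ID Column' value in that sheet
    }

print(parse_test_case_id("OSTS-5.03-0001"))
# -> {'chapter': 5, 'sheet': '03', 'item': '0001'}
```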


2 Document Structure
The following table summarizes the document structure:

Table 2-1: Document structure with the section number and content descriptions

Section Content
Number

Section 1 This section introduces the document by providing domain background, documents
and standard reference to be considered.

Section 2 This section briefs the content of each section in the document.

Section 3 This section describes the test strategy for the overall software test, covering test objectives, test assumptions and constraints, test principles, data approach, and the scope and levels of testing for each test phase.

Section 4 This section describes the test execution strategy by documenting the test cycle, defect
management, test metrics, test criteria and degree of test coverage.

Section 5 This section specifies the test cases for the overall software which will be defined in the
Test Case spreadsheet: P205_ISCS_D2.3_OSTS_TestCase.

Section 6 This section specifies the test management process covering the test management tool,
test design and execution process, test risks and mitigation factors, roles and
responsibilities.

Section 7 This section specifies the test environment for each test phase.

Section 8 This section specifies how the test cases in the OSTS adhere to the techniques and measures defined in the SQAP (P205_ISCS_D1.1_SQAP) to ensure the overall software performs its intended functions.


3 Test Strategy
The selected testing strategy for the ISCS software development life cycle is an analytical technique. In this strategy, requirements-based testing is identified as the appropriate method, with the following process:
1. Tester team will define the testing conditions to be covered after analyzing the test basis. The
test basis is the information from the documentation on which test cases are based, such as
requirements, architecture and design, and interfaces.
2. The requirements are analyzed to derive the test conditions. Then tests are designed,
implemented and executed to meet those requirements.
3. The results are recorded with respect to requirements.

Software testing is the process carried out throughout software development to identify bugs, issues and defects in the software application. Manual testing will be executed by the tester for all the test types listed in Table 3-1 below, without using any automated tools. The purpose of the software testing activity is to ensure that the application is error free and works in conformance to the requirements. The Tester is responsible for ensuring the test cases achieve 100% test coverage. Reported defects shall be fixed by the developers, and re-testing shall be performed by the testers on the fixed defects. The goal is to check the quality of the system and deliver a bug-free product to the customer. The testing team will apply dynamic techniques for testing activities, as they involve test cases and cover functional and non-functional testing. Dynamic techniques execute the software and validate the output against the expected outcome.
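As an illustration of the 100% test coverage responsibility and the RTM (SRS-OSTS) traceability described above, the minimal sketch below checks that every requirement is traced to at least one test case; the requirement and test case IDs are hypothetical examples, not project data.

```python
# Minimal, hypothetical coverage check over an RTM (SRS-OSTS) extract.
# Keys are SRS requirement IDs; values are the OSTS Test Case IDs tracing to them.
rtm = {
    "SRS-0001": ["OSTS-5.03-0001", "OSTS-5.03-0002"],
    "SRS-0002": ["OSTS-5.17-0001"],
    "SRS-0003": [],  # example of a requirement not yet covered
}

uncovered = [req for req, test_cases in rtm.items() if not test_cases]
coverage = 1 - len(uncovered) / len(rtm)

print(f"Requirement coverage: {coverage:.0%}")
if uncovered:
    print("Requirements without test cases:", ", ".join(uncovered))
```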

Table 3-1: Test Types

No. Test Testing Environment


1. Functional/Black Box Test (Appendix A-2 Table A-5, Table A-7, Table A-11 of ISCS SQAP) Real and simulated environment
2. Performance Test (Appendix A-2 Table A-6, Table A-7 of ISCS SQAP) Real and simulated environment
3. Factory Acceptance Test (FAT) Real environment
4. Interface Factory Acceptance Test (IFAT) Real environment
5. Partial Acceptance Test (PAT) Real environment
6. System Acceptance Test (SAT) Real environment
7. System Integration Test (SIT) Real environment


3.1 Test Objectives


The test objective is to ascertain the behaviour or performance of software against the corresponding test
specification to the extent achievable by the selected test coverage.
In this Overall Software Test Specification, the objectives are:
• To test add-on ISCS interface software components.
• To test application data/algorithms developed.
• To confirm the developed ISCS-specific application performs its intended functions and meets its performance requirements in normal and degraded modes.
• To describe how the tester will test the subject matter under test and its interaction with the external interface subsystems and the generic software.
• To ensure related test cases can comply with EN 50128:2011+A2:2020 techniques and measures
in adherence with the requirements established for Test Specification in the Software Quality
Assurance Plan (P205_ISCS_D1.1_SQAP).
• To ensure specification is produced by the tester based on the Software Requirements
Specification (P205_ISCS_D2.2_SRS).

3.2 Test Assumptions and Constraints


Below are the assumptions and constraints for the Overall Software Test Specification (OSTS):

• Subsystems are not available in the test environment and will be replaced with simulators during
the Software Component & Application Data/Algorithms Test, Software Integration Test and
Overall Software Test.
• All intended parameters will be put under test, and the ISCS should be able to receive all inputs (alarms).
• The scope of testing for each stage in Overall Software Test is subject to COMMS Testing and
Commissioning (T&C) approval.
• Generic software (Xentral Software Platform and Xentral Safe) will be tested by Willowglen MSC
Berhad.

3.3 Test Principles

• Testing will be focused on meeting the ISCS requirements and intended functions.
• There will be common, consistent procedures for supporting testing activities.
• Testing processes will be well defined, yet flexible, with the ability to change as needed.
• Testing activities will build upon previous stages to avoid redundancy or duplication of effort.
• Testing environment and data will emulate a real environment as much as possible.
• Testing will be a repeatable, quantifiable, and measurable activity.
• Testing will be divided into distinct phases, each with clearly defined objectives and goals.
• There will be entry and exit criteria.

3.4 Data Approach


The ISCS will contain pre-loaded test data, which is used for testing activities.


3.5 Test Coverage and Levels of Testing

3.5.1 Software Component & Application Data/Algorithms Test

3.5.1.1 Purpose
Software Component Test will be applied to each add-on ISCS interface software component individually. The goal here is to find discrepancies between the add-on ISCS interface software components and the program specifications prior to their integration with other components.

Figure 3-1 indicates the add-on ISCS interface software component under testing.

Figure 3-1: Software Component Test on Add-on ISCS Interface Software Components


Application Data/Algorithms Test will be applied to the Xentral Software Platform and Xentral Safe based on GUI configuration, IO configuration, logic programming and scripting. Simulators for each respective standard protocol are used to simulate the protocols, so that third-party systems can use these interfaces to interact with the Xentral Software Platform and Xentral Safe to exchange data.

Figure 3-2 indicates the application data/algorithms under testing.

Figure 3-2: Application Data/Algorithms Test on Xentral Software Platform and Xentral Safe


3.5.1.2 Test Coverage


Software component testing will be performed for every add-on ISCS interface software component which interfaces with the following subsystems:

• Communication Backbone Network (CBN)


• Wireless Data Communication System (WDCS)
• Public Address (PA) System
• Public Information Display System (PIDS)
• Private Automatic Branch Exchange (PABX) System
• TETRA Radio System (TETRA) Interface
• Video Surveillance System (VSS)
• Multi-channel Voice Recorder System (VRS)
• Rolling Stock (TMS)

Application data/algorithms testing will be performed on add-on ISCS interface software components,
Xentral Software Platform and Xentral Safe based on:
• GUI configuration
• IO configuration
• Logic programming
• Scripting

3.5.1.3 Testers
The test will be carried out by Software Testing Team.

3.5.1.4 Timing
The testing is performed in the Software Component & Application Data/Algorithms Testing phase of the
ISCS V-Model Development Life Cycle.

3.5.1.5 Entry Criteria


Entry criteria for Software Component, Application Data/Algorithm Test are listed below:

• Xentral Software Platform and Xentral Safe fully developed to carry out Application
Data/Algorithms Test.
• All add-on ISCS interface software components have been developed and ready to be tested.
• Software Component & Application Data/Algorithms Test Specification
(P205_ISCS_D4.2_SCADTS) properly reviewed and approved.
• Software Component & Application Data/Algorithms Test Case properly reviewed and approved.
• Availability of test environment of Software Component & Application Data/Algorithms Test.
• Availability of subsystem/protocol simulators of Software Component & Application
Data/Algorithms Test.

3.5.1.6 Exit Criteria


Exit criteria are defined by specifying a targeted run rate and pass rate.


Table 3-2: Exit Criteria Defining Methods and Formula


Indication Method Formula
Run Rate Number of test cases executed / Total number of test cases
Pass Rate Number of test cases passed / Number of test cases executed

The following exit criteria should be considered for completion of a testing phase:

• 100% Run Rate achieved.


• At least 95% Pass Rate achieved.
• Ensuring all SIL2 test cases are passed.
• None of the identified Critical (Severity 1) defects are in Open Status.
• Reports of any defects or issues discovered during test execution recorded in the defect tracking
document.
• Re-testing and closing all the high and critical priority defects to execute corresponding regression
scenarios successfully.
• Test results approved by Validator before starting the next test phase.
• Achieving complete functional coverage in accordance with the specified requirements in
Software Component & Application Data/Algorithms Design Specification
(P205_ISCS_D4.1_SCADDS).
• No discrepancies found between the add-on ISCS interface software components and their specification covered in the SRS (P205_ISCS_D2.2_SRS).
• Application Data/Algorithms configuration on SIL2 subsystems does not have any logical or functional errors.
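For illustration only, the minimal sketch below evaluates the Run Rate and Pass Rate formulas from Table 3-2 together with the exit thresholds listed above; the counts used are assumed sample figures, not actual test results.

```python
# Assumed sample figures for a test phase (illustration only).
total_test_cases = 120   # total test cases planned for the phase
executed = 120           # test cases executed
passed = 116             # executed test cases that passed
sil2_not_passed = 0      # SIL2 test cases that did not pass
open_severity_1 = 0      # Critical (Severity 1) defects still in Open status

run_rate = executed / total_test_cases   # Run Rate  = executed / total
pass_rate = passed / executed            # Pass Rate = passed / executed

exit_criteria_met = (
    run_rate == 1.0           # 100% Run Rate achieved
    and pass_rate >= 0.95     # at least 95% Pass Rate achieved
    and sil2_not_passed == 0  # all SIL2 test cases passed
    and open_severity_1 == 0  # no Critical (Severity 1) defects open
)

print(f"Run Rate: {run_rate:.0%}, Pass Rate: {pass_rate:.1%}, exit criteria met: {exit_criteria_met}")
```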

3.5.1.7 Test Deliverables


Test deliverables provided after the Software Component & Application Data/Algorithms Test is
completed:

• Software Component & Application Data/Algorithms Test Report (P205_ISCS_D6.1_SCADTR)


➢ Test Status Report
➢ Defect Tracking List
• Defects Register


3.5.2 Software Integration Test

3.5.2.1 Purpose
Software Integration Test and Software/Hardware Integration Test will be performed in the Software Integration Phase.
The Software Integration Test is the process of testing the interfaces between software components against the specifications. The goal here is to find errors and defects associated with the interaction between software components when they are integrated.
The Software/Hardware Integration Test will be performed to validate that the developed software and the actual hardware can be integrated to work as a whole to perform the required functions.
Figure 3-3 indicates the integrated software components & application data/algorithms under testing.

Figure 3-3: Software Integration Test on Xentral Software Platform and Xentral Safe


3.5.2.2 Test Coverage


Software undergoing integration will be combined in a controlled manner. The following are tested to
ensure the add-on ISCS interface software components, application data/algorithms and RTU/PLC
integration is performed in accordance with the specifications:

• Integrated Software
➢ Add-on ISCS Interface Software Components
➢ GUI configuration
➢ I/O configuration
➢ Logic programming
➢ Scripting
• Hardware
➢ RTU/PLC

3.5.2.3 Testers
The test will be carried out by Software Testing Team.

3.5.2.4 Timing
The testing is performed in the Software Integration Phase of the ISCS V-Model Development Life Cycle.

3.5.2.5 Entry Criteria


Entry criteria for Software Integration Test are listed below:

• Each add-on ISCS interface software component has gone through Software Component Test.
• GUI configuration, IO configuration, logic programming and scripting have gone through the Application Data/Algorithms Test.
• Critical (Severity 1) and High (Severity 2) defects found during the Software Component & Application Data/Algorithms Test have been fixed and closed.
• Software Integration Test Specification (P205_ISCS_D3.6_SITS) properly reviewed and
approved.
• Software Integration Test Case properly reviewed and approved.
• Availability of test environment of Software Integration Test.
• Availability of subsystem/protocol simulators of Software Integration Test.

3.5.2.6 Exit Criteria


Exit criteria are defined by specifying a targeted run rate and pass rate.

Table 3-3: Exit Criteria Defining Methods and Formula

Indication Method Formula
Run Rate Number of test cases executed / Total number of test cases
Pass Rate Number of test cases passed / Number of test cases executed


The following exit criteria should be considered for completion of a testing phase:

• 100% Run Rate achieved.


• At least 95% pass rate achieved.
• Ensuring all SIL2 Test Cases are passed.
• None of the identified Critical (Severity 1) defects are in Open Status.
• Reports of any defects or issues discovered during test execution recorded in the defect tracking
document.
• Re-testing and closing all the high and critical priority defects to execute corresponding regression
scenarios successfully.
• Test results approved by Validator before starting the next test phase.
• Achieving complete functional coverage in accordance with the specified requirements in
Software Architecture and Design Specification (P205_ISCS_D3.1_SADS) and Software
Interface Specification (P205_ISCS_D3.2_SIS).
• Add-on ISCS interface software components do not have any logical or functional errors when integrated with other interface components.
• Software/Hardware Integration on SIL2 subsystems does not have any logical or functional errors when interfacing with other interface components.

3.5.2.7 Test Deliverables


Test deliverables provided after the Software Integration Test is completed:

• Software Integration Test Report (P205_ISCS_D7.1_SITR)


➢ Test Status Report
➢ Defect Tracking List
• Defects Register


3.5.3 Overall Software Test

The Overall Software Test is divided into 3 groups of tests, each resulting in a formal software release for a different stage of system test (FAT, IFAT, PAT, SAT and SIT), as follows:

• Group 1 (Overall Software Test) which is conducted prior to the FAT.


• Group 2 (Regression Test):
➢ Phase 1 which is conducted prior to the IFAT.
➢ Phase 2 which is conducted prior to the PAT.
➢ Phase 3 which is conducted prior to the SAT.

• Group 3 (Regression Test) which is conducted prior to the SIT.

3.5.3.1 Group 1 (Overall Software Test)


3.5.3.1.1 Purpose
Group 1 (Overall Software Test) is the process of testing the integrated software and hardware to ensure compliance with the ISCS Software Requirements Specification. The goal here is to validate that the ISCS performance meets its functional specification and safety aspects against the PS and the approved final design.
This test is conducted prior to the FAT.

3.5.3.1.2 Test Coverage


The test coverage for Group 1 (Overall Software Test) shall include all the test cases defined for Overall
Software Test.
The test will be performed at ISCS Contractor’s facility (Off-site testing).

3.5.3.1.3 Testers
The test will be carried out by Software Testing Team.

3.5.3.1.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.

3.5.3.1.5 Entry Criteria


Entry criteria for Group 1 (Overall Software Test) are listed below:

• The Add-on ISCS Interface Software Components, Application Data/Algorithms and RTU/PLC
have gone through the Software Integration Test.
• Critical (Severity 1) and High (Severity 2) defects found during the Software Integration Test have been fixed and closed.
• Overall Software Test Specification (P205_ISCS_D2.3_OSTS) properly reviewed and approved.
• Overall Software Test Case properly reviewed and approved.

• Availability of test environment of FAT.

3.5.3.1.6 Exit Criteria


Exit criteria are defined by specifying a targeted run rate and pass rate.

Table 3-4: Exit Criteria Defining Methods and Formula

Indication Method Formula
Run Rate Number of test cases executed / Total number of test cases
Pass Rate Number of test cases passed / Number of test cases executed

The following exit criteria should be considered for completion of a testing phase:

• 100% Run Rate achieved.


• At least 95% pass rate achieved.
• Ensuring all SIL2 Test Cases are passed.
• None of the identified Critical (Severity 1) defects are in Open Status.
• Reports of any defects or issues discovered during test execution recorded in the defect tracking
document.
• Re-testing and closing all the high and critical priority defects to execute corresponding
Regression scenarios successfully.
• Test results approved by Validator before starting the next test phase.
• Achieving complete functional coverage in accordance with the specified requirements in SRS
(P205_ISCS_D2.2_SRS).
• No discrepancies found in the ISCS hardware and ISCS functionalities against the requirement specification.

3.5.3.1.7 Test Deliverables


Test deliverables provided after the Group 1 (Overall Software Test) is completed:

• Overall Software Test Report (P205_ISCS_D8.1_OSTR)


➢ Test Status Report
➢ Defect Tracking List
• Defects Register
• Software Validation Report
• Release Note


3.5.3.2 Phase 1 of Group 2 (Regression Test)


3.5.3.2.1 Purpose
Phase 1 of Group 2 (Regression Test) is the regression test performed after fixing the bugs found in the previous Group 1 (Overall Software Test) and the FAT.
This test is conducted prior to the IFAT.

3.5.3.2.2 Test Coverage

Phase 1 of Group 2 is a regression test that will execute selected relevant test cases from the Overall Software Test Cases. The objective of regression testing is to make sure the software still functions as expected after any code changes, updates or improvements.

The test will be performed at COMMS WPC’s facility (Off-site testing).

3.5.3.2.3 Testers
The test will be carried out by Software Testing Team.

3.5.3.2.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.

3.5.3.2.5 Entry Criteria


Entry criteria for Phase 1 of Group 2 (Regression Test) are listed below:

• Successful completion of Group 1 (Overall Software Test).


• Critical (Severity 1) and High (Severity 2) defects found during Group 1 (Overall Software Test) have been fixed and closed.
• Availability of test environment of IFAT.

3.5.3.2.6 Exit Criteria


Exit criteria are defined by specifying a targeted run rate and pass rate.

Table 3-5: Exit Criteria Defining Methods and Formula

Indication Method Formula
Run Rate Number of test cases executed / Total number of test cases
Pass Rate Number of test cases passed / Number of test cases executed

The following exit criteria should be considered for completion of a testing phase:

• 100% Run Rate achieved.


• At least 95% pass rate achieved.
• Ensuring all SIL2 Test Cases are passed.

• None of the identified Critical (Severity 1) defects are in Open Status.


• Reports of any defects or issues discovered during test execution recorded in the defect tracking
document.
• Re-testing and closing all the high and critical priority defects to execute corresponding
Regression scenarios successfully.
• Test results for each phase approved by Validator before starting the next test phase.
• Achieving complete functional coverage in accordance with the specified requirements in SRS
(P205_ISCS_D2.2_SRS).
• No discrepancies found in the ISCS hardware and ISCS functionalities against the requirement specification.
• Functionalities of the ISCS and its interfaces to all COMMS subsystems do not have any logical or functional errors.

3.5.3.2.7 Test Deliverables


Test deliverables provided after the Phase 1 of Group 2 (Regression Test) is completed:

• Overall Software Test Report (P205_ISCS_D8.1_OSTR)


➢ Test Status Report
➢ Defect Tracking List
• Defects Register
• Software Validation Report
• Release Note


3.5.3.3 Phase 2 of Group 2 (Regression Test)


3.5.3.3.1 Purpose
Phase 2 of Group 2 (Regression Test) is the regression test performed after fixing the bugs found in the previous Phase 1 of Group 2 (Regression Test) and the IFAT.
This test is conducted prior to the PAT.

3.5.3.3.2 Test Coverage

Phase 2 of Group 2 is a regression test that will execute selected relevant test cases from the Overall Software Test Cases. The objective of regression testing is to make sure the software still functions as expected after any code changes, updates or improvements.

The test will be performed at site locally (On-site testing).

3.5.3.3.3 Testers
The test will be carried out by Software Testing Team.

3.5.3.3.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.

3.5.3.3.5 Entry Criteria


Entry criteria for Phase 2 of Group 2 (Regression Test) are listed below:

• Successful completion of Phase 1 of Group 2 (Regression Test).


• Critical (Severity 1) and High (Severity 2) defects found during Phase 1 of Group 2 (Regression Test) have been fixed and closed.
• Availability of test environment of PAT.

3.5.3.3.6 Exit Criteria


Exit criteria are defined by specifying a targeted run rate and pass rate.

Table 3-6: Exit Criteria Defining Methods and Formula

Indication Method Formula
Run Rate Number of test cases executed / Total number of test cases
Pass Rate Number of test cases passed / Number of test cases executed

The following exit criteria should be considered for completion of a testing phase:

• 100% Run Rate achieved.


• At least 95% pass rate achieved.
• Ensuring all SIL2 Test Cases are passed.
• None of the identified Critical (Severity 1) defects are in Open Status.

• Reports of any defects or issues discovered during test execution recorded in the defect tracking
document.
• Re-testing and closing all the high and critical priority defects to execute corresponding
Regression scenarios successfully.
• Test results for each phase approved by Validator before starting the next test phase.
• Achieving complete functional coverage in accordance with the specified requirements in SRS
(P205_ISCS_D2.2_SRS).
• No discrepancies found in the functionalities of the ISCS and its interfaces to all COMMS subsystems on site in a local environment.
• Functionalities of the Xentral Software Platform and Xentral Safe and their interfaces to all SIL2 subsystems do not have any logical or functional errors on site in a local environment.
• No discrepancies found in the ISCS hardware and ISCS functionalities against the requirement specification.
• Functionalities of the ISCS and its interfaces to all COMMS subsystems do not have any logical or functional errors.

3.5.3.3.7 Test Deliverables


Test deliverables provided after Phase 2 of Group 2 (Regression Test) is completed:
• Overall Software Test Report (P205_ISCS_D8.1_OSTR)
➢ Test Status Report
➢ Defect Tracking List
• Defects Register
• Software Validation Report
• Release Note


3.5.3.4 Phase 3 of Group 2 (Regression Test)


3.5.3.4.1 Purpose
Phase 3 of Group 2 (Regression Test) is the regression test performed after fixing the defects found in the preceding Phase 2 of Group 2 (Regression Test) and PAT.
This test is conducted prior to SAT.

3.5.3.4.2 Testers
The test will be carried out by Software Testing Team.

3.5.3.4.3 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.

3.5.3.4.4 Test Coverage

Phase 3 of Group 2 is a regression test that executes selected relevant test cases from the Overall Software Test Cases. The objective of regression testing is to make sure the software still functions as expected after any code changes, updates or improvements.

The test will be performed at site with OCC/BOCC connection (On-site testing).

3.5.3.4.5 Entry Criteria


Entry criteria for Phase 3 of Group 2 (Regression Test) are listed below:

• Successful completion of Phase 2 of Group 2 (Regression Test).


• Critical (Severity 1) and High (Severity 2) defects found during Phase 2 of Group 2 (Regression Test) have been fixed and closed.
• Availability of test environment of SAT.

3.5.3.4.6 Exit Criteria


Exit criteria are defined by specifying a targeted run rate and pass rate.

Table 3-7: Exit Criteria Defining Methods and Formula

Indication Methods | Formula

Run Rate | Number of test cases executed / Total number of test cases
Pass Rate | Number of test cases passed / Number of test cases executed

The following exit criteria should be considered for completion of a testing phase:

• 100% Run Rate achieved.


• At least 95% pass rate achieved.
• Ensuring all SIL2 Test Cases are passed.

• None of the identified Critical (Severity 1) defects are in Open Status.


• Reports of any defects or issues discovered during test execution recorded in the defect tracking
document.
• All Critical and High priority defects re-tested and closed so that the corresponding regression scenarios execute successfully.
• Test results for each phase approved by Validator before starting the next test phase.
• Complete functional coverage achieved in accordance with the specified requirements in SRS (P205_ISCS_D2.2_SRS).
• No discrepancy found between ISCS hardware and ISCS functionalities and their requirement specification.
• Functionalities of ISCS and its interfaces to all COMMS subsystems do not have any logical or functional errors.

3.5.3.4.7 Test Deliverables


Test deliverables provided after the Phase 3 of Group 2 (Regression Test) is completed:
• Overall Software Test Report (P205_ISCS_D8.1_OSTR)
➢ Test Status Report
➢ Defect Tracking List
• Defects Register
• Software Validation Report
• Release Note


3.5.3.5 Group 3 (Regression Test)


3.5.3.5.1 Purpose
Group 3 (Regression Test) is the regression test performed after fixing the defects found in the preceding Phase 3 of Group 2 (Regression Test) and SAT.
This test is conducted prior to SIT.

3.5.3.5.2 Test Coverage

Group 3 is a regression test that executes selected relevant test cases from the Overall Software Test Cases. The objective of regression testing is to make sure the software still functions as expected after any code changes, updates or improvements.

The test will be performed at site with full operation of all subsystems (On-site testing).

3.5.3.5.3 Testers
The test will be carried out by Software Testing Team.

3.5.3.5.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.

3.5.3.5.5 Entry Criteria


Entry criteria for Group 3 (Regression Test) are listed below:

• Successful completion of Phase 3 of Group 2 (Regression Test).


• Critical (Severity 1) and High (Severity 2) defects found during Phase 3 of Group 2 (Regression Test) have been fixed and closed.
• Availability of test environment of SIT.

3.5.3.5.6 Exit Criteria


Exit criteria are defined by specifying a targeted run rate and pass rate.

Table 3-8: Exit Criteria Defining Methods and Formula

Indication Methods | Formula

Run Rate | Number of test cases executed / Total number of test cases
Pass Rate | Number of test cases passed / Number of test cases executed

The following exit criteria should be considered for completion of a testing phase:

• 100% Run Rate achieved.


• At least 95% pass rate achieved.
• Ensuring all SIL2 Test Cases are passed.

• None of the identified Critical (Severity 1) defects are in Open Status.


• Reports of any defects or issues discovered during test execution recorded in the defect tracking
document.
• All Critical and High priority defects re-tested and closed so that the corresponding regression scenarios execute successfully.
• Test results approved by Validator before starting the next test phase.
• Complete functional coverage achieved in accordance with the specified requirements in SRS (P205_ISCS_D2.2_SRS).
• No discrepancy found between ISCS hardware and ISCS functionalities and their requirement specification.
• Functionalities of ISCS and its interfaces to all subsystems do not have any logical or functional errors.

3.5.3.5.7 Test Deliverables


Test deliverables provided after Group 3 (Regression Test) is completed:

• Overall Software Test Report (P205_ISCS_D8.1_OSTR)


➢ Test Status Report
➢ Defect Tracking List
• Defects Register
• Software Validation Report
• Release Note


4 Execution Strategy
4.1 Test Cycles
The test cycle includes:

• Software Component & Application Data/Algorithms Test


• Software Integration Test
• Overall Software Test
➢ Group 1 (Overall Software Test)
➢ Group 2 (Regression Test):
❖ Phase 1
❖ Phase 2
❖ Phase 3
➢ Group 3 (Regression Test)

4.2 Defect Management


It is expected that INT & TST execute all the test cases in each test phase. Additional testing can be done if INT & TST identify a possible gap in the test cases. If a gap is identified, the test cases and the requirement traceability matrix will be updated.
It is the responsibility of the Tester (INT & TST) to register the defects via the Defect Register form, link them to the corresponding test case, assign an initial severity and status, retest and close the defect.
The defects will be tracked through the Defect Tracking List. The implementation team will gather information from the defect tracking list and request additional details from INT & TST. The implementation team will work on fixes.
It is also the responsibility of the Test Lead (INT & TST) to review the severity of the defects. The Test Lead will coordinate with the implementation team regarding the fix and its implementation, communicate with testers when the test can continue or should be halted, request the tester to retest, and modify the defect status as needed.


Figure 4-1 depicts the defect tracking process.

Figure 4-1: Defect Tracking Process


Defects found during the Testing will be categorized as depicted in Table 4-1.
Table 4-1: Defects Category

Category: Major
Description: An event affecting the functionality being tested in the session. The fault shall be rectified before recommencing testing.
Severity 1 (Critical), Impact:
▪ This bug is critical enough to crash the system, cause file corruption, or cause potential data loss.
▪ It causes an abnormal return to the operating system (crash or a system failure message appears).
▪ It causes the application to hang and requires re-booting the system.
Severity 2 (High), Impact:
▪ It causes a lack of vital program functionality with workaround.

Category: Minor
Description: An event not affecting the functionality being tested in that session. Testing may be continued.
Severity 3 (Medium), Impact:
▪ This bug will degrade the quality of the System. However, there is an intelligent workaround for achieving the desired functionality, for example through another screen.
▪ This bug prevents other areas of the product from being tested. However, other areas can be independently tested.
Severity 4 (Low), Impact:
▪ There is an insufficient or unclear error message, which has minimum impact on product use.
Severity 5 (Cosmetic), Impact:
▪ There is an insufficient or unclear error message that has no impact on product use.
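To make the defect registration workflow described in Section 4.2 and the severity scheme in Table 4-1 concrete, the following minimal Python sketch models a single Defect Register entry; the field names, the test case ID and the example values are illustrative assumptions, not the actual layout of the Defect Register form.

from dataclasses import dataclass

# Severity levels as defined in Table 4-1 (1 = Critical ... 5 = Cosmetic).
SEVERITIES = {1: "Critical", 2: "High", 3: "Medium", 4: "Low", 5: "Cosmetic"}

@dataclass
class DefectRecord:
    defect_id: str          # identifier assigned in the Defect Register
    test_case_id: str       # linked test case, e.g. a hypothetical "OSTS-TC-0001"
    category: str           # "Major" (Severity 1-2) or "Minor" (Severity 3-5)
    severity: int           # 1..5 per Table 4-1
    status: str = "Open"    # remains "Open" until re-tested and verified fixed

    def close(self) -> None:
        # Called by the tester after the fix has been re-tested successfully.
        self.status = "Closed"

# Hypothetical usage: register a High-severity defect against a test case, then close it.
defect = DefectRecord("DEF-0001", "OSTS-TC-0001", "Major", 2)
defect.close()
print(defect.defect_id, SEVERITIES[defect.severity], defect.status)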


4.3 Test Criteria and Degree of Test Coverage


This section covers the standards and rules on which the test procedure and judgement are based.

4.3.1 Suspension Criteria

Suspension criteria specify the conditions under which all or a portion of the testing activities are suspended. If the suspension criteria are met during testing, the active test cycle will be suspended until the underlying issues are resolved. The suspension criteria are as follows:

• Any major category defect which highly impacts the testing progress.
• Hardware or software resources are not available as per the requirements.

• Testing process limited by defects in the build.


• The problem related to network connectivity.
• If more than 20% of the test cases are incomplete, testing should be suspended until the Implementation team fixes all the failed cases (illustrated in the sketch below).
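A minimal Python sketch of the 20% threshold check in the last criterion is given below; the counts used are hypothetical examples only.

# Hypothetical check of the 20% incomplete-test-case suspension threshold.
total_planned = 150
incomplete = 36      # test cases not yet executed to a Pass/Fail result

incomplete_ratio = incomplete / total_planned
suspend_testing = incomplete_ratio > 0.20

print(f"Incomplete: {incomplete_ratio:.0%} -> suspend testing: {suspend_testing}")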

4.3.2 Resumption Criteria

Resumption criteria define when the testing process restarts after a suspension has been invoked. They involve verification of the defect for which the suspension was invoked during the testing process.
The resumption criteria are satisfied when the defect that caused the suspension of the testing process has been fixed and the fix has been verified by the Software Testing Team.
Criteria to resume the testing process:

• The major category defect that caused the suspension has been resolved.
• Hardware or software resources are available as per the requirements.
• Defects in the build that limited the testing process have been resolved.
• The network connectivity problem has been resolved.
• The Implementation team has fixed all the failed cases that caused the suspension.


4.3.3 Feature Pass/Fail Criteria

The pass/fail criteria are determined based upon contractual and approved design requirements as well as the equipment's characteristics/features/specifications/attributes. The tests shall be considered as passed/approved when all test parameters and pre-determined criteria are met.
Before test case execution, the Integration and/or Testing Engineer shall prepare the corresponding signature sheets for the record. All the related in-process records shall be saved for each test case. After execution of each test case, the Integration and/or Testing Engineer should mark the test case as Pass (P), Fail (F), Incomplete (I) or Blocked (B). The detailed definition of the four states is described below:
Table 4-2: Test Status

Test Status | Test Record

Pass (P): The test is successfully performed; the actual results are in compliance with the contractual requirement as well as the approved design requirement. The test criteria are met.

Fail (F): The test is performed; the actual results do NOT comply with the contractual requirement as well as the approved design requirement. The test criteria are NOT met. All failed tests shall be reported as anomalies in the test report. When a solution is available, the test shall be performed again (re-tested).

Incomplete (I): The test has been performed but the actual results cannot be verified to satisfy:
a) the test criteria and/or
b) the contractual requirement and/or
c) the approved design document.
Under this category, further investigation is needed. This state is only a "temporary" state and a repeat test shall be conducted, which should eventually lead to "Passed" or "Failed" status.

Blocked (B): The test case is unable to run because a pre-requisite for its execution is not fulfilled.


5 Test Cases
Test cases for the overall software test are defined in the Test Case spreadsheet P205_ISCS_D2.3_OSTS_TestCase. Each test case is defined with an OSTS test case ID and traces all [BI] and [SIL2] tagged requirements from the SRS. A requirement can be tested by multiple OSTS test cases. The [SIL2] tag denotes a SIL2 requirement, while the [BI] tag denotes a non-SIL2 (basic integrity) requirement.
Test cases will be created for each test cycle.

Table 5-1: Test Case Sample

Item No | Test Case ID | Test Case Objective | Prerequisite | Test Data | Test Step | Expected Result | Test Status | Remarks

<Item No> | <Test Case ID> | <Test Case Objective> | <Prerequisite> | <Test Data> | <Test Step> | <Expected Result> | <P/F/I/B> | <Remarks>
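For illustration, the sample columns above can be modeled as a simple record. The following Python sketch is an assumption about how a test case row might be represented when results are processed; it is not a description of the actual spreadsheet.

from dataclasses import dataclass, field
from typing import List

# Possible test statuses as defined in Table 4-2.
TEST_STATUSES = ("P", "F", "I", "B")   # Pass, Fail, Incomplete, Blocked

@dataclass
class TestCase:
    item_no: int
    test_case_id: str          # e.g. a hypothetical "OSTS-TC-0001"
    objective: str
    prerequisite: str
    test_data: str
    test_steps: List[str] = field(default_factory=list)
    expected_result: str = ""
    status: str = "I"          # Incomplete until executed
    remarks: str = ""

    def record_result(self, status: str, remarks: str = "") -> None:
        # Record the execution outcome using the P/F/I/B convention.
        assert status in TEST_STATUSES, "status must be one of P, F, I, B"
        self.status = status
        self.remarks = remarks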


6 Test Management Process


6.1 Test Management Tool
Microsoft Excel Spreadsheet will be used as a tool for Test Management. All testing artifacts such as test
cases and test results are updated in the Excel Test Documentation.
• Integration and/or Testing Team will be provided with Read/Write access to add/modify Test
cases in Excel Test Documentation.
• During the Test Design phase, all test cases are written directly into Excel Test Documentation.
Any changes to the test case will be directly updated in the Excel Test Documentation.
• Integration and/or Testing Team will directly access respective assigned test cases and update
the status of each executed step in the Excel Test Documentation directly.

• Any defect encountered will be raised in Excel test documentation linking to the particular test
case/test step.

• During Defect fix testing, defects are re-assigned back to the tester to verify the defect fix. The
tester verifies the defect fix and updates the status directly in Excel Test Documentation and
defect tracking list.

• Various reports can be generated from Excel Test Documentation to provide status of test
execution in the test status report.
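As an illustration of the kind of roll-up such a report can provide, the following Python sketch tallies hypothetical per-test-case statuses into the summary fields used in the Test Status Report (Appendix A); the status codes and counts are example assumptions only.

from collections import Counter

# Example statuses extracted from the Excel Test Documentation; "IP" is used here for "in progress".
statuses = ["P", "P", "F", "P", "I", "B", "P", "IP"]

counts = Counter(statuses)
executed = counts["P"] + counts["F"]
planned = executed + counts["I"] + counts["IP"] + counts["B"]

print("PASSED:             ", counts["P"])
print("FAILED:             ", counts["F"])
print("TOTAL TEST EXECUTED:", executed)
print("INCOMPLETE:         ", counts["I"])
print("IN PROGRESS:        ", counts["IP"])
print("BLOCKED:            ", counts["B"])
print("TOTAL TEST PLANNED: ", planned)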

6.2 Test Design Process

Understanding Requirements → Establishing Traceability → Preparation of Test Cases → Peer Review of Test Cases → Incorporating Review Comments in Test Cases

• The tester will understand each requirement and prepare corresponding test case to ensure all
requirements are covered.
• Each Test case will be mapped to Software Requirements Specification (P205_ISCS_D2.2_SRS)
as part of Requirement Traceability Matrix.

• Each of the test cases will undergo review by the testing team and the review defects are captured
and shared to the implementation team. The implementation team will rework on the review
defects and finally obtain approval and sign-off.

• During the test case preparation phase, tester will use the Software Requirement Specification
(P205_ISCS_D2.2_SRS), use case and functional specification to write step by step test cases.

• Any subsequent changes to the test case will be directly updated in Excel Test Documentation.

6.3 Test Execution Process

Execute each of the test steps in the test case → Mark status as Pass/Fail/Incomplete/Blocked in Excel Test Documentation → Raise defects for the failed test cases in Excel Test Documentation → Participate in the Defect Triage cycle and explain the defects → Send the test status report to the Project Manager → Complete the test execution of all the test cases

• Once all test cases are approved and the test environment is ready for testing, the tester will start each test phase to ensure the application is stable for testing.

• The Integration and/or Testing Team runs all test cases and updates results directly in Excel Test Documentation.

• Testers shall ensure they have the necessary access to the testing environment and to Excel Test Documentation for updating test status and raising defects. Any access issues will be escalated to the Project Manager.
• Any Critical (Severity 1) defects and High (Severity 2) defects during Software Component,
Application Data and Algorithm test will be escalated to the respective implementation team for
fixes.

• Each tester performs step by step execution and updates the test status. The tester enters Pass,
Fail, Incomplete or Blocked Status for each of the step directly in Excel Test Documentation.

• If any failures, defect will be raised as per severity guidelines in Excel Test Documentation
detailing steps to simulate along with screenshots if appropriate.

• Daily test execution status as well as defect status will be reported to all stakeholders.
• Testing team will participate in defect triage meetings in order to ensure all test cases are
executed with either pass/fail category.
• Testing process is repeated until all test cases are executed fully with Pass/Fail status.
• During the subsequent cycle, any defect fixes applied will be re-tested and the results will be updated in Excel Test Documentation during that cycle.

• The final sign-off and project completion process will be followed as per the defined process.


6.4 Test Risks and Mitigation Factors


The following table lists the test risks with their probability and impact, followed by a mitigation plan.

Table 6-1: Test Risks and Mitigation Plan

Risk: SCHEDULE. Testing schedule is tight. If the start of the testing is delayed due to design tasks, the test cannot be extended beyond the testing scheduled date.
Probability: High
Impact: High
Mitigation Plan: The testing team can control the preparation tasks (in advance) and the early communication with involved parties. Some buffer has been added to the schedule for contingencies, although not as much as best practices advise.

Risk: Unavailability of the Generic Software (Xentral Safe).
Probability: Medium
Impact: High
Mitigation Plan: Due to the unavailability of the Generic Software, the schedule gets impacted and will lead to a delayed start of test execution.

Risk: DEFECTS. Defects are found at a late stage of the cycle or at a late cycle; defects discovered late are most likely due to unclear specifications and are time consuming to resolve.
Probability: Medium
Impact: High
Mitigation Plan: A defect management plan is in place to ensure prompt communication and fixing of issues.

Risk: SCOPE. Scope completely defined.
Probability: Medium
Impact: Medium
Mitigation Plan: Scope is well defined but the changes in functionality are not yet finalized or keep on changing.

Risk: Non-availability of independent test environment and accessibility.
Probability: Medium
Impact: High
Mitigation Plan: Due to non-availability of the environment, the schedule gets impacted and will lead to a delayed start of test execution.

Risk: Delayed testing due to new issues.
Probability: Medium
Impact: High
Mitigation Plan: During testing, there is a good chance that some "new" defects may be identified and may become an issue that will take time to resolve. There are defects that can be raised during testing because of unclear document specification. These defects can yield an issue that will need time to be resolved. If these issues become showstoppers, it will greatly impact the overall project schedule. If new defects are discovered, the defect management and issue management procedures are in place to immediately provide a resolution.

6.5 Roles and Responsibilities


The following table lists the roles and responsibilities for test specification plan, design, execution, and
evaluation.
Table 6-2: Roles and Responsibilities

Software Assurance Team Verifier/Validator (VER, VAL):
• Manages and ensures the Verification and Validation process of the software in accordance with EN 50128:2011+A2:2020 requirements.
• Manages the verification process (review, integration, and testing) and ensures independence of activities as required.
• Conducts internal software quality audits, inspections, or reviews on the overall project as appropriate in various phases of software development.
• Develops and maintains records on the verification activities.
• Develops a Software Verification & Validation Report.

Integration & Testing Team (INT & TST):
• Ensures a good-functioning system when ISCS integrates with other systems in terms of software deployment.
• Maintains the traceability of system integration activities to software requirement specifications.
• Develops and maintains records on system integration activities.
• Identifies integration anomalies, records and communicates these anomalies to the relevant Change Management body for evaluation and decision.
• Coordinates and participates in integration activities.
• Ensures the integration work is completed according to schedule.
• Ensures the testing activities done by the Test Team are according to schedule.
• Develops the test specification.
• Plans test activities.
• Develops the overall software test specification with test objectives and test cases.
• Ensures the traceability of test objectives against the specified software requirements and of test cases against the specified test objectives.
• Ensures test plans are implemented and the specified tests are carried out.
• Identifies deviations from expected results and records them in test reports.
• Communicates deviations to the authority responsible for change management for evaluation and decision making.
• Records the test reports with results.
• Selects the tool or equipment for testing ISCS.
• Communicates deviations with the relevant Change Management body for evaluation and decision.
• Documents test results into reports.

Implementation Team (IMP):
▪ Ensures all software customization work adheres to practices defined in ISO 12207 & EN 50128:2011+A2:2020.
▪ Responsible for the project software development work.
▪ Ensures the software development work is completed according to schedule.
▪ Ensures all software functions developed comply with the contract document.
▪ Develops and maintains the implementation documents comprising the applied methods, data types, and listings.
▪ Develops all application data & system work required in this project.
▪ Ensures the system development work is completed according to schedule.
▪ Develops all software customization work required in this project.


7 Test Environment
7.1 Simulated Environment
This section describes the representative mock-up of the RTS test environment.

7.1.1 Software Component & Application Data/Algorithms Test and Software


Integration Test
The Simulator Workstation allows monitoring of the values returned by simulators for protocol requests and the generation of protocol traps on demand for testing the software components. The ISCS HMI will show the operational user interface to verify that the required functions are performed accordingly.
Figure 7-1 shows the test environment for both Software Component & Application Data/Algorithms Test
and Software Integration Test.

Figure 7-1: Test Environment for Software Component & Application Data/Algorithms Test and Software
Integration Test


7.1.2 Overall Software Test

Figure 7-2 shows the test environment for the Group 1 (Overall Software Test).

[Diagram: BKCS, WDLS and WAHD equipment, including Hyper-V, network service, database and training servers, operator/engineering/trainer workstations, mimic display workstations, RTUs A-D, marshalling panel, KVM, and redundant network switches #1 and #2.]
Figure 7-2: Test Environment for Group 1 (Overall Software Test)



7.2 Real Environment


Figure 7-3 shows the real test environment setup at sites as defined in the Final Design of Integrated Supervisory Control System (ISCS) (RTS-SY03-SYS-
EIC-DSN-30001).
[Diagram: site equipment at Bukit Chagar Station (Concourse, OCC and CE/ISCS rooms), Woodlands North Station (CIQ Level 1, Mezzanine, CE/ISCS and network rooms) and Wadi Hana Depot (depot equipment room and Backup Depot Control Centre), including operator and engineering workstations, mimic displays, video wall, servers, RTUs, marshalling panel and interfaces to subsystems SY01-SY08.]

Figure 7-3: Real Test Environment



7.3 Hardware Items


This section specifies the list of hardware items used in each test phase, in the simulated and real environments, for the ISCS Overall Software Test Specification (P205_ISCS_D2.3_OSTS).
Table 7-1 identifies the list of hardware items to be used in the different testing phases:
Table 7-1: List of Hardware Items in Simulated and Real Environment

No | Items | Purpose | Simulated Environment | Real Environment

1. ISCS Hyper-V Server To host ISCS workload. ✓ ✓

2. ISCS Training Server For Training at Bukit Chagar Station. NA ✓

3. ISCS Database Server To host ISCS Database Management System and store all the NA ✓
historical data in ISCS system.

4. ISCS Network Service Server ISCS Backup & Print Server. NA ✓

5. ISCS Desktop Tower Workstation To host the integrated ISCS HMI to enable operator to perform control ✓ ✓
and monitoring of various subsystems.

6. ISCS Monitor Monitor for HMI Workstation. ✓ ✓

7. ISCS USB Headset Voice communication thru ISCS (Radio & PABX). ✓ ✓

8. Stereo Speaker Audio Output for ISCS HMI. ✓ ✓

9. ISCS Desktop Tower Workstation To drive ISCS workload to video wall display controller to achieve NA ✓
(for Mimic Display) Mimic Display function.

10. Keyboard, Video and Mouse (KVM) Switch with LCD monitor and keyboard & mouse Input devices for Servers. NA ✓

11. Marshalling Panel Gather multiple wires and cables, provide cross wiring functionality NA NA
between the control room cabinet and field instruments.

12. Network Switch It is a networking hardware that connects devices on a computer ✓ ✓


network by using packet switching to receive and forward data to the
destination device.

13. RTU/PLC To collect data, encode the data into a format that is transmittable, and transmit the data back to an ISCS server. ✓ ✓

14. Training Workstation To be used to configure and test the system using the runtime NA ✓
environment.

15. Dummy Circuit Breaker An electrical safety device designed to protect an electrical circuit from damage caused by an overcurrent or short circuit. NA ✓


7.4 Software Items


This section specifies the list of software tools selected for each testing phase in the simulated and real environments, and the justification for their use in the ISCS Overall Software Test Specification (P205_ISCS_D2.3_OSTS).
Software tools consist of a series of supporting tools that aid and contribute to the development of the project. Such tools include testing and debugging tools.
These tools are classified in accordance with the EN 50128:2011+A2:2020 standard’s tools criteria as the following:
1. T1 - generates no outputs which can directly or indirectly contribute to the executable code (including data) of the software.
2. T2 - supports the test or verification of the design or executable code, where errors in the tool can fail to reveal defects but cannot directly create errors
in the executable software.
3. T3 - generates outputs which can directly or indirectly contribute to the executable code (including data) of the safety related system.
A Tools Validation Report will be produced to comply with Clause 6.7 of EN 50128:2011+A2:2020.
Table 7-2 identifies the list of software items to be used in different test phases for generating, recording and analysing the test cases.

Table 7-2: List of Software Items in Simulated and Real Environment

No | Tools | Purpose | Tool Class | Simulated Environment | Real Environment

1. POSTMAN To test and simulate all intended REST API services. T2 ✓ NA

2. Draw.io Test Specifications. T1 ✓ ✓

3. Microsoft 365 Apps for Test Specifications. T1 ✓ ✓


Business

4. Microsoft Excel 365 To save all the test cases and record all the test results. T1 ✓ ✓
Also used to record all faults and defects found during test
execution.

5. XManager Used to configure and manage the Xentral system components and application. Also provides system diagnostic and maintenance functions. T1 ✓ ✓

6. Training Simulator Offers operator training in a simulated runtime environment. Provides the capability to simulate data changes and playback from live/history data records, allowing the trainer to create various training scenarios. T1 NA ✓

7. Config Safe Configuration of Xentral Safe data. T3 NA ✓

8. Google Chrome Web browser to launch Xentral and Xentral Safe. T2 ✓ ✓

9. Vinci Protocol Analyzer To simulate IEC 60870-5-104 protocol for testing T2 ✓ ✓


configuration on interfaces.

10. Modbus Slave Simulator To simulate Modbus TCP/RTU protocol for testing T2 ✓ ✓
configuration on interfaces.

11. iReasoning SNMP Agent To simulate SNMP Protocol. T2 ✓ ✓


Simulator

12. NAS Simulator To simulate HTTP Protocol, for TMS testing only. T2 ✓ ✓

13. JMS API Signaling Simulator To test all intended JMS API services. T2 ✓ ✓


8 Techniques and Measures


Test cases specified in the OSTS adhere to the techniques and measures defined in the SQAP (P205_ISCS_D1.1_SQAP) to ensure that the overall software performs its intended functions. The techniques and measures in each testing phase are established in adherence with the SIL 2 requirements of EN 50128:2011+A2:2020.

8.1 Functional/Black Box Testing


The techniques and measures used in OSTS for Functional/Black Box Testing are Boundary Value
Analysis, Equivalence Classes and Input Partition Testing.

Table 8-1: List of selected Techniques and Measures for Functional/Black Box Testing

A-14 Functional/Black Box Test

Technique/Measure | BI | SIL 2 | Status (Applied/Not Applied) | Details

Boundary Value Analysis | R | HR | Applied | D.4 Boundary Value Analysis. Note: For testing, integration and overall software testing activities.

Equivalence Classes and Input Partition Testing | R | HR | Applied | D.18 Equivalence Classes and Input Partition Testing. Note: For testing, integration and overall software testing activities.

Below is the detailed information on each of the chosen techniques and measures for Functional/Black Box Testing:


Table 8-2: Detailed Information of Boundary Value Analysis

Boundary Value Analysis

Aim
To remove software errors occurring at parameter limits or boundaries.

Description
The input domain of the program is divided into several input classes. The tests should cover the boundaries and extremes of the classes. The tests check that the boundaries in the input domain of the specification coincide with those in the program. The use of the value zero, in a direct as well as in an indirect translation, is often error-prone and demands special attention:
• zero divisor
• non-printing control characters
• empty stack or list element
• null matrix
• zero table entry.
Normally the boundaries for input have a direct correspondence to the boundaries for the output range. Test cases should be written to force the output to its limited values. Consider also whether it is possible to specify a test case which causes the output to exceed the specification boundary values.
If the output is a sequence of data, for example a printed table, special attention should be paid to the first and the last elements and to lists containing none, 1 and 2 elements.
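To illustrate how this technique translates into concrete test values, the following Python sketch derives boundary test inputs for a hypothetical integer parameter whose specification limits it to the range 0-100; the parameter, the range and the validation function are assumptions for illustration only, not requirements from the SRS.

# Hypothetical boundary value analysis for an input specified as an integer in [0, 100].
MIN_VALUE, MAX_VALUE = 0, 100

def is_valid(value: int) -> bool:
    # Assumed validation rule under test: accept values inside the specified range.
    return MIN_VALUE <= value <= MAX_VALUE

# Boundary values: the limits themselves, the values just inside, and the values just outside.
boundary_cases = {
    MIN_VALUE - 1: False,   # just below the lower boundary
    MIN_VALUE:     True,    # lower boundary (zero is singled out in the description above)
    MIN_VALUE + 1: True,    # just above the lower boundary
    MAX_VALUE - 1: True,    # just below the upper boundary
    MAX_VALUE:     True,    # upper boundary
    MAX_VALUE + 1: False,   # just above the upper boundary
}

for value, expected in boundary_cases.items():
    actual = is_valid(value)
    status = "P" if actual == expected else "F"
    print(f"input={value:>4}  expected={expected}  actual={actual}  status={status}")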


Table 8-3: Detailed Information of Equivalence Classes and Input Partition Testing

Equivalence Classes and Input Partition Testing

Aim
To test the software adequately using a minimum of test data. The test data is obtained by selecting
the partition of the input domain required to exercise the software.

Description
This testing strategy is based on the equivalence relation of the inputs, which determines a partition
of the input domain.
Test cases are selected with the aim of covering all subsets of this partition. At least one test case is
taken from each equivalence class.
There are two basic possibilities for input partitioning which are:
● Equivalence classes may be defined on the specification. The interpretation of the specification
may be either input oriented, for example the values selected are treated in the same way or
output oriented, for example the set of values leading to the same functional result, and
● Equivalence classes may be defined on the internal structure of the program. In this case the
equivalence class results are determined from static analysis of the program, for example the set
of values leading to the same path being executed.
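As an illustration of specification-based partitioning, the following Python sketch selects one representative test value from each equivalence class of a hypothetical command input; the classes and the handler are assumptions for illustration only, not taken from the SRS.

# Hypothetical equivalence partitioning of a numeric command value.
# Assumed specification: values 1-50 select a station page, 51-99 select a depot page,
# and anything else is rejected. Each partition is represented by one test value.
def select_page(value: int) -> str:
    if 1 <= value <= 50:
        return "station"
    if 51 <= value <= 99:
        return "depot"
    return "rejected"

# One representative per equivalence class (valid classes plus the invalid class).
partitions = {
    "valid: station range (1-50)":  (25, "station"),
    "valid: depot range (51-99)":   (75, "depot"),
    "invalid: outside both ranges": (150, "rejected"),
}

for name, (value, expected) in partitions.items():
    actual = select_page(value)
    status = "P" if actual == expected else "F"
    print(f"{name}: input={value}, expected={expected}, actual={actual}, status={status}")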


8.2 Performance Testing

The techniques and measures used in OSTS for Performance Testing are Performance Requirements and Response Timing and Memory Constraints.
Table 8-4: List of selected Techniques and Measures for Performance Testing

A-18 Performance Testing

Technique/Measure | BI | SIL 2 | Status (Applied/Not Applied) | Details

Performance Requirements | - | HR | Applied | D.40 Performance Requirements. Note: For overall software testing activities.

Response Timing and Memory Constraints | - | HR | Applied | D.45 Response Timing and Memory Constraints. Note: For overall software testing activities.

Below is the detailed information on each of the chosen techniques and measures for Performance
Testing:
Table 8-5: Detailed Information of Performance Requirements

Performance Requirements

Aim
To establish that the performance requirements of a software have been satisfied.

Description
An analysis is performed of both the system and the Software Requirements Specifications to identify
all general and specific, explicit, and implicit performance requirements.
Each performance requirement is examined in turn to determine:
• The success criteria to be obtained,
• Whether a measure against the success criteria can be obtained,
• The potential accuracy of such measurements,
• The project stages at which the measurements can be estimated,
• The project stages at which the measurements can be made.
The practicability of each performance requirement is then analyzed in order to obtain a list of
performance requirements, success criteria and potential measurements. The main objectives are:
• Each performance requirement is associated with at least one measurement.
• Where possible, accurate and efficient measurements are selected which can be used
as early in the development process as possible.
• Essential and optional performance requirements and success criteria are identified.
• Where possible, advantage shall be taken of the possibility of using a single measurement for more than one performance requirement.

Table 8-6: Detailed information of Response Timing and Memory Constraints

Response Timing and Memory Constraints

Aim
To ensure that the system will meet its temporal and memory requirements.

Description
The requirements specification for the system and the software includes memory and response
requirements for specific functions, perhaps combined with constraints on the use of total system
resources. An analysis is performed which will identify the distribution demands under average and
worst-case conditions. This analysis requires estimates of the resource usage and elapsed time of
each system function. These estimates can be obtained in several ways, for example, comparison with an existing system or the prototyping and benchmarking of time-critical systems.
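As a minimal illustration of checking a response-time constraint, the Python sketch below times a call against an assumed 2-second requirement; the function, the limit and the measurement approach are illustrative assumptions, not figures from the requirements specification.

import time

# Assumed response-time requirement for a hypothetical function under test (seconds).
RESPONSE_TIME_LIMIT_S = 2.0

def function_under_test() -> None:
    # Placeholder for the operation whose response time is being measured.
    time.sleep(0.1)

# Measure elapsed wall-clock time for a single invocation.
start = time.perf_counter()
function_under_test()
elapsed = time.perf_counter() - start

status = "P" if elapsed <= RESPONSE_TIME_LIMIT_S else "F"
print(f"elapsed={elapsed:.3f}s  limit={RESPONSE_TIME_LIMIT_S}s  status={status}")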


APPENDIX A TEST STATUS REPORT SAMPLE


The Test Status Report contains a summary of all test activities and final test results for the RTS project. It is an assessment of how well the testing for each test phase was performed. Based on the test report, the quality of the tested software components can be evaluated.

Table AA-0-1: Test Status Report Sample

Test Cycle: <Test Cycle>

EXECUTED
  PASSED: <No.>
  FAILED: <No.>
TOTAL TEST EXECUTED (PASSED + FAILED): <No.>
INCOMPLETE: <No.>
IN PROGRESS: <No.>
BLOCKED: <No.>
TOTAL TEST PLANNED (EXECUTED + INCOMPLETE + IN PROGRESS + BLOCKED): <No.>


APPENDIX B DEFECT TRACKING LIST SAMPLE


The Defect Tracking List records any defects or issues discovered during test execution.

Table AB-0-1: Defect Tracking List Sample

Item No | Test ID | Equipment Type | Description of Defect | Category | Severity | Tester | Expected Output | Tested Output | Estimated Repair Time (Date) | Comments | Status

<No> | <ID> | <Software/Hardware> | <Description of defect> | <Major/Minor> | <1/2/3/4/5> | <Tester> | <Expected Output> | <Tested Output> | <Estimated Repair Time (Date)> | <Add Comments> | <Open/Closed>
