P205 - ISCS - D2.3 - OSTS Rev0.3.0
ISCS
Overall Software Test Specification
Wong Jun Kit, Norjannah Hazali, Liong Zhong Jin, Han Chung Siew
REVISION HISTORY
Rev 0.0.0 | 20 Oct 2022 | Tamilchelvan
First issue of the Overall Software Test Specification (OSTS) document.

Rev 0.1.1 | 26 April 2023 | Wong Jun Kit
1. Revised Section 1.4.2 and removed IEEE 829* and ISO 12207:2008* from the Standards, as both will no longer be used; SIL2, BI and testing activities will refer to the EN 50128:2011+A2:2020 standard.
2. Based on the P205_ISCS_D1.2.3_CRF_DOC_00005 Change Request, the P205_ISCS_D2.3_OSTS_TestCase_Rev0.1.1 document was revised to implement changes to the test cases. The test cases involved are listed below:
   i. Removed OSTS-5.13-0019 to OSTS-5.13-0026.

Rev 0.2.1 | 29 May 2023 | Wong Jun Kit
1. Revised Section 8 Techniques and Measures based on internal audit comments:
   1) Added Response Timing and Memory Constraints to Section 8.2 Performance Testing.
   2) Added a testing strategy regarding manual techniques of software testing and analytical techniques under Section 3.
2. Updated the test case spreadsheet P205_ISCS_D2.3_OSTS_TestCase_Rev0.3.0 based on internal audit comments:
   1) Revised the existing types of tests.
   2) Added the Response Timing and Memory Constraints technique and measure to applicable test cases.
TABLE OF CONTENTS
1 Introduction
1.1 Purpose of Document
1.2 Scope of Work
1.3 Acronyms, Abbreviations and Terms
1.3.1 Acronyms and Abbreviations
1.3.2 Terms
1.4 Reference
1.4.1 Reference Documents
1.4.2 Standard
1.5 Updating and Approval
1.6 Test Case ID
2 Document Structure
3 Test Strategy
3.1 Test Objectives
3.2 Test Assumptions and Constraints
3.3 Test Principles
3.4 Data Approach
3.5 Test Coverage and Levels of Testing
3.5.1 Software Component & Application Data/Algorithms Test
3.5.2 Software Integration Test
3.5.3 Overall Software Test
4 Execution Strategy
4.1 Test Cycles
4.2 Defect Management
4.3 Test Criteria and Degree of Test Coverage
4.3.1 Suspension Criteria
4.3.2 Resumption Criteria
4.3.3 Feature Pass/Fail Criteria
5 Test Cases
6 Test Management Process
6.1 Test Management Tool
6.2 Test Design Process
6.3 Test Execution Process
6.4 Test Risks and Mitigation Factors
FIGURES
Figure 1-1: Simple Architecture of an ISCS Platform on Different Subsystems
Figure 3-1: Software Component Test on Add-on ISCS Interface Software Components
Figure 3-2: Application Data/Algorithms Test on Xentral Software Platform and Xentral Safe
Figure 3-3: Software Integration Test on Xentral Software Platform and Xentral Safe
Figure 4-1: Defect Tracking Process
Figure 7-1: Test Environment for Software Component & Application Data/Algorithms Test and Software Integration Test
Figure 7-2: Test Environment for Group 1 (Overall Software Test)
Figure 7-3: Real Test Environment
TABLES
Table 1-1: Summary of ISCS Specific Application Development Scope
Table 1-2: Acronyms and Abbreviations
Table 1-3: Terms
Table 1-4: Reference Documents
Table 1-5: Standard Reference Documents
Table 2-1: Document structure with the section number and content descriptions
Table 3-1: Test Types
Table 3-2: Exit Criteria Defining Methods and Formula
Table 3-3: Exit Criteria Defining Methods and Formula
Table 3-4: Exit Criteria Defining Methods and Formula
Table 3-5: Exit Criteria Defining Methods and Formula
Table 3-6: Exit Criteria Defining Methods and Formula
Table 3-7: Exit Criteria Defining Methods and Formula
Table 3-8: Exit Criteria Defining Methods and Formula
Table 4-1: Defects Category
Table 4-2: Test Status
Table 5-1: Test Case Sample
Table 6-1: Test Risks and Mitigation Plan
Table 6-2: Roles and Responsibilities
Table 7-1: List of Hardware Items in Simulated and Real Environment
Table 7-2: List of software items in real environment
Table 8-1: List of selected Techniques and Measures for Functional/Black Box Testing
Table 8-2: Detailed Information of Boundary Value Analysis
Table 8-3: Detailed Information of Equivalence Classes and Input Partition Testing
Table 8-4: List of selected Techniques and Measures for Performance Testing
Table 8-5: Detailed Information of Performance Requirements
Table 8-6: Detailed Information of Response Timing and Memory Constraints
Table AA-0-1: Test Status Report Sample
Table AB-0-1: Defect Tracking List Sample
1 Introduction
The Government of the Republic of Singapore and the Government of Malaysia have agreed to jointly
develop the RTS Link project to enhance connectivity between Malaysia and Singapore, to benefit
commuters who travel between Singapore and Johor Bahru. The RTS Link will primarily serve as an
alternative mode of transport for commuters currently utilising the Johor Bahru-Singapore Causeway to
cross the border. The RTS Link is intended to be a convenient, safe, and cost-effective system that
integrates well with other transportation services in Woodlands and Johor Bahru.
The RTS Link will be a shuttle link with double tracks that crosses the Straits of Johor via a high bridge.
It will serve two terminal stations, one in Woodlands, Singapore and the other in Bukit Chagar, Johor
Bahru, Malaysia. The proposed link will be approximately 4.6km in length, and the crossing will take
approximately 5-10 minutes. The RTS Link Operator (who will be the Employer) will be required to operate
the RTS Link all year round.
Table 1-1: Summary of ISCS Specific Application Development Scope

No | Interface System | SIL Level | Xentral Software Platform | Xentral Safe | Add-on | GUI | I/O | Script | Logic
1. | Communication Backbone Network (CBN) and Railway System LAN | BI | NA | NA | ✓ [BI] | ✓ [BI] | ✓ [BI] | ✓ [BI] | NA
2. | Wayside Data Communication System (WDCS) | BI | NA | NA | ✓ [BI] | ✓ [BI] | ✓ [BI] | ✓ [BI] | NA
5. | Private Automatic Branch Exchange (PABX) System | BI | NA | NA | ✓ [BI] | ✓ [BI] | ✓ [BI] | ✓ [BI] | NA
11. | Tunnel Lighting System / Viaduct Lighting System | BI | NA | NA | NA | ✓ [BI] | ✓ [BI] | ✓ [BI] | NA
22. | High Voltage (HV) System / Integrated Building Management System (iBMS) | BI | NA | NA | NA | ✓ [BI] | ✓ [BI] | ✓ [BI] | NA
23. | RTU/PLC | BI | NA | NA | NA | ✓ [BI] | ✓ [BI] | ✓ [BI] | NA
Note:
NA: Not applicable in ISCS Development Scope
The details of the development scope are provided in the Software Development Plan (P205_ISCS_M4.1_SwDP).
Note: The development of the ISCS interface software components to the Tunnel Ventilation System (TVS), Traction Power System (TPS), Signaling System (SS) and Fire Protection System (FPS), which require a SIL 2 interface, will be included in the Xentral Software (Generic Software).
The OSTS documents the testing between Xentral and the subsystems to ensure that the ISCS, in its interfaces with all subsystems, performs to the requirements of its specification.
Figure 1-1 below shows a simple overview of the subsystems and their interfaces to the Xentral Software Platform and Xentral Safe.
1.3.1 Acronyms and Abbreviations
Table 1-2: Acronyms and Abbreviations
BI | Basic Integrity
EN | European Standards
GS | General Specification
HR | Highly Recommended
ID | Identification Number
IP | Internet Protocol
IMP | Implementer
INT | Integrator
NA | Not Applicable
R | Recommended
RS | Rolling Stock
SS | Signaling System
TBC | To Be Completed
TBD | To Be Determined
TC | Test Case
TST | Tester
VAL | Validator
VER | Verifier
1.3.2 Terms
Table 1-3: Terms
Terms | Definition
1.4 Reference
1.4.2 Standard
Table 1-5: Standard Reference Documents
ISO 9000 (all parts) | Quality Management and Quality Assurance Standards
ISO 9001:2015 | Model for quality assurance in design, development, production, installation and servicing
*Note: The latest version of the standard and applicable tests as of the contract amendment date will be used.
2 Document Structure
The following table summarizes the document structure:
Table 2-1: Document structure with the section number and content descriptions
Section Number | Content
Section 1 | This section introduces the document by providing the domain background and the document and standard references to be considered.
Section 2 | This section summarizes the content of each section in the document.
Section 3 | This section describes the test strategy for the overall software test, covering test objectives, test assumptions and constraints, test principles, data approach, and the scope and levels of testing for each test phase.
Section 4 | This section describes the test execution strategy by documenting the test cycle, defect management, test metrics, and test criteria and degree of test coverage.
Section 5 | This section specifies the test cases for the overall software, which are defined in the Test Case spreadsheet: P205_ISCS_D2.3_OSTS_TestCase.
Section 6 | This section specifies the test management process, covering the test management tool, the test design and execution processes, test risks and mitigation factors, and roles and responsibilities.
Section 7 | This section specifies the test environment for each test phase.
Section 8 | This section specifies how the test cases in the OSTS adhere to the techniques and measures defined in the SQAP (P205_ISCS_D1.1_SQAP) to ensure the overall software performs its intended functions.
3 Test Strategy
The selected testing strategy for ISCS's software development life cycle is an analytical technique. In this strategy, requirements-based testing is identified as the appropriate method, with the following process:
1. The test team defines the testing conditions to be covered after analyzing the test basis. The test basis is the information from the documentation on which test cases are based, such as requirements, architecture and design, and interfaces.
2. The requirements are analyzed to derive the test conditions. Tests are then designed, implemented and executed to meet those requirements.
3. The results are recorded with respect to the requirements.
Software testing is the process, carried out throughout software development, of identifying bugs, issues and defects in the software application. Manual testing will be executed by testers in all the test types listed in Table 3-1 below, without using any automated tools. The purpose of the software testing activity is to ensure that the application is error-free and works in conformance to the requirements. The tester is responsible for ensuring that the test cases achieve 100% test coverage. Reported defects shall be fixed by developers, and re-testing shall be performed by testers on the fixed defects. The goal is to check the quality of the system and deliver a bug-free product to the customer. The testing team will practice dynamic techniques for testing activities, as these involve test cases and cover functional and non-functional testing. Dynamic techniques execute the software and validate the output against the expected outcome.
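As a minimal illustration of this dynamic, requirements-based approach, the sketch below (Python; the unit under test, its scaling rule and the requirement are hypothetical, not taken from the project) executes the software and validates its output against the expected outcome:

```python
# Minimal sketch of a dynamic, requirements-based test: execute the software
# and validate the output against the expected outcome derived from a
# requirement. The function, scaling rule and requirement are hypothetical.

def scale_to_engineering_units(raw: int) -> float:
    """Hypothetical unit under test: map a 12-bit raw I/O count to 0..10."""
    return raw * 10.0 / 4095

def test_requirement_example() -> None:
    # Expected outcome per the hypothetical requirement: full-scale raw input
    # (4095) shall read exactly 10.0 engineering units.
    assert scale_to_engineering_units(4095) == 10.0
    assert scale_to_engineering_units(0) == 0.0

test_requirement_example()
print("dynamic test passed: output matches the expected outcome")
```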
• Subsystems are not available in the test environment and will be replaced with simulators during
the Software Component & Application Data/Algorithms Test, Software Integration Test and
Overall Software Test.
• All intended parameters will be put under test, and ISCS should be able to receive all inputs (alarms).
• The scope of testing for each stage in Overall Software Test is subject to COMMS Testing and
Commissioning (T&C) approval.
• Generic software (Xentral Software Platform and Xentral Safe) will be tested by Willowglen MSC
Berhad.
• Testing will be focused on meeting the ISCS requirements and intended functions.
• There will be common, consistent procedures for supporting testing activities.
• Testing processes will be well defined, yet flexible, with the ability to change as needed.
• Testing activities will build upon previous stages to avoid redundancy or duplication of effort.
• Testing environment and data will emulate a real environment as much as possible.
• Testing will be a repeatable, quantifiable, and measurable activity.
• Testing will be divided into distinct phases, each with clearly defined objectives and goals.
• There will be entry and exit criteria.
3.5.1.1 Purpose
Software Component Test will be applied on each add-on ISCS interface software component individually. The goal is to find discrepancies between the add-on ISCS interface software components and the program specifications prior to their integration with other components.
Figure 3-1 indicates the add-on ISCS interface software component under testing.
Figure 3-1: Software Component Test on Add-on ISCS Interface Software Components
Application Data/Algorithms Test will be applied on the Xentral Software Platform and Xentral Safe based on GUI configuration, I/O configuration, logic programming and scripting. A simulator for each respective standard protocol is used to simulate the protocol, so that third-party systems can use these interfaces to interact with the Xentral Software Platform and Xentral Safe to exchange data.
Figure 3-2: Application Data/Algorithms Test on Xentral Software Platform and Xentral Safe
Application data/algorithms testing will be performed on add-on ISCS interface software components,
Xentral Software Platform and Xentral Safe based on:
• GUI configuration
• IO configuration
• Logic programming
• Scripting
3.5.1.3 Testers
The test will be carried out by the Software Testing Team.
3.5.1.4 Timing
The testing is performed in the Software Component & Application Data/Algorithms Testing phase of the
ISCS V-Model Development Life Cycle.
• The Xentral Software Platform and Xentral Safe are fully developed, to carry out the Application Data/Algorithms Test.
• All add-on ISCS interface software components have been developed and are ready to be tested.
• The Software Component & Application Data/Algorithms Test Specification (P205_ISCS_D4.2_SCADTS) has been properly reviewed and approved.
• The Software Component & Application Data/Algorithms Test Case has been properly reviewed and approved.
• Availability of the test environment for the Software Component & Application Data/Algorithms Test.
• Availability of subsystem/protocol simulators for the Software Component & Application Data/Algorithms Test.
The following exit criteria should be considered for completion of a testing phase:
3.5.2.1 Purpose
The Software Integration Test and the Software/Hardware Integration Test will be performed in the Software Integration Phase.
The Software Integration Test is the process of testing the interfaces between software components against the specifications. The goal is to find errors and defects in the interaction between software components when they are integrated.
The Software/Hardware Integration Test will be performed to validate that the developed software and the actual hardware can be integrated to work as a whole and perform the required functions.
Figure 3-3 indicates the software components & application data/algorithms under testing.
Figure 3-3: Software Integration Test on Xentral Software Platform and Xentral Safe
• Integrated Software
➢ Add-on ISCS Interface Software Components
➢ GUI configuration
➢ I/O configuration
➢ Logic programming
➢ Scripting
• Hardware
➢ RTU/PLC
3.5.2.3 Testers
The test will be carried out by the Software Testing Team.
3.5.2.4 Timing
The testing is performed in the Software Integration Phase of the ISCS V-Model Development Life Cycle.
• Each add-on ISCS interface software component has gone through the Software Component Test.
• GUI configuration, I/O configuration, logic programming and scripting have gone through the Application Data/Algorithms Test.
• Critical (Severity 1) and High (Severity 2) defects found during the Software Component & Application Data/Algorithms Test have been fixed and closed.
• The Software Integration Test Specification (P205_ISCS_D3.6_SITS) has been properly reviewed and approved.
• The Software Integration Test Case has been properly reviewed and approved.
• Availability of the test environment for the Software Integration Test.
• Availability of subsystem/protocol simulators for the Software Integration Test.
The following exit criteria should be considered for completion of a testing phase:
The Overall Software Test is divided into three groups of tests, each resulting in a formal software release for a different stage of system test (FAT, IFAT, PAT, SAT and SIT), as follows:
3.5.3.1.3 Testers
The test will be carried out by the Software Testing Team.
3.5.3.1.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.
• The Add-on ISCS Interface Software Components, Application Data/Algorithms and RTU/PLC have gone through the Software Integration Test.
• Critical (Severity 1) and High (Severity 2) defects found during the Software Integration Test have been fixed and closed.
• The Overall Software Testing Specification (P205_ISCS_D2.3_OSTS) has been properly reviewed and approved.
• The Overall Software Test Case has been properly reviewed and approved.
The following exit criteria should be considered for completion of a testing phase:
Phase 1 of Group 2 is a regression test that will execute selected relevant test cases from the Overall Software Test Case. The objective of the regression test is to make sure the software still functions as expected after any code changes, updates or improvements.
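As a sketch of how such "selected relevant test cases" might be chosen (the tags, IDs and changed area below are hypothetical; the actual selection is made by the testing team against the OSTS Test Case spreadsheet):

```python
# Minimal sketch of regression test-case selection: re-run every test case
# whose tagged functional area overlaps the areas affected by a code change.
# Tags, test case IDs and the changed area are hypothetical.

test_case_tags = {
    "OSTS-5.13-0001": {"alarm_handling"},
    "OSTS-5.13-0002": {"alarm_handling", "gui"},
    "OSTS-5.13-0003": {"reporting"},
}

changed_areas = {"alarm_handling"}  # areas touched by the change (assumed)

regression_suite = sorted(
    tc for tc, tags in test_case_tags.items() if tags & changed_areas
)
print(regression_suite)  # ['OSTS-5.13-0001', 'OSTS-5.13-0002']
```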
3.5.3.2.3 Testers
The test will be carried out by the Software Testing Team.
3.5.3.2.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.
The following exit criteria should be considered for completion of a testing phase:
Phase 2 of Group 2 is a regression test that will execute selected relevant test cases from the Overall Software Test Case. The objective of the regression test is to make sure the software still functions as expected after any code changes, updates or improvements.
3.5.3.3.3 Testers
The test will be carried out by the Software Testing Team.
3.5.3.3.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.
The following exit criteria should be considered for completion of a testing phase:
• Reports of any defects or issues discovered during test execution are recorded in the defect tracking document.
• Re-testing and closing of all high and critical priority defects, so that the corresponding regression scenarios execute successfully.
• Test results for each phase are approved by the Validator before starting the next test phase.
• Complete functional coverage is achieved in accordance with the specified requirements in the SRS (P205_ISCS_D2.2_SRS).
• No discrepancy is found in the functionalities of ISCS and its interfaces to all COMMS subsystems on site in a local environment.
• The functionalities of the Xentral Software Platform and Xentral Safe and their interfaces to all SIL2 subsystems do not have any logical or functional errors on site in a local environment.
• No discrepancy is found in ISCS hardware and ISCS functionalities against its requirement specification.
• The functionalities of ISCS and its interfaces to all COMMS subsystems do not have any logical or functional errors.
3.5.3.4.2 Testers
The test will be carried out by the Software Testing Team.
3.5.3.4.3 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.
Phase 3 of Group 2 is a regression test that will execute selected relevant test cases from the Overall Software Test Case. The objective of the regression test is to make sure the software still functions as expected after any code changes, updates or improvements.
The test will be performed at site with OCC/BOCC connection (On-site testing).
The following exit criteria should be considered for completion of a testing phase:
Group 3 is a regression test that will execute selected relevant test cases from the Overall Software Test Case. The objective of the regression test is to make sure the software still functions as expected after any code changes, updates or improvements.
The test will be performed at site with full operation of all subsystems (On-site testing).
3.5.3.5.3 Testers
The test will be carried out by the Software Testing Team.
3.5.3.5.4 Timing
The testing is performed in the Software Validation Phase of the ISCS V-Model Development Life Cycle.
The following exit criteria should be considered for completion of a testing phase:
4 Execution Strategy
4.1 Test Cycles
The test cycle includes:
4.2 Defect Management
Defects found during testing will be categorized as depicted in Table 4-1.
Table 4-1: Defects Category

Major: An event affecting the functionality being tested in the session. The fault shall be rectified before recommencing testing.
  Severity 1 (Critical):
  ▪ The bug is critical enough to crash the system, cause file corruption, or cause potential data loss.
  ▪ It causes an abnormal return to the operating system (a crash or a system failure message appears).
  ▪ It causes the application to hang and requires re-booting the system.
  Severity 2 (High):
  ▪ It causes a lack of vital program functionality, with a workaround.

Minor: An event not affecting the functionality being tested in that session. Testing may be continued.
  Severity 3 (Medium):
  ▪ The bug degrades the quality of the system; however, there is an intelligent workaround for achieving the desired functionality, for example through another screen.
  ▪ The bug prevents other areas of the product from being tested; however, other areas can be independently tested.
  Severity 4 (Low):
  ▪ There is an insufficient or unclear error message, which has minimum impact on product use.
  Severity 5 (Cosmetic):
  ▪ There is an insufficient or unclear error message that has no impact on product use.
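A minimal sketch of how the Table 4-1 classification might be represented when recording defects (the mapping of severities to categories follows Table 4-1; the field names, IDs and text are hypothetical):

```python
# Minimal sketch of a defect record per Table 4-1: severities 1-2 fall under
# the Major category, severities 3-5 under Minor. IDs and text are made up.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    CRITICAL = 1
    HIGH = 2
    MEDIUM = 3
    LOW = 4
    COSMETIC = 5

class Category(Enum):
    MAJOR = "Major"   # fault shall be rectified before recommencing testing
    MINOR = "Minor"   # testing may be continued

@dataclass
class Defect:
    defect_id: str
    severity: Severity
    description: str

    @property
    def category(self) -> Category:
        return Category.MAJOR if self.severity.value <= 2 else Category.MINOR

defect = Defect("DEF-0001", Severity.HIGH, "Vital function lost, workaround exists")
assert defect.category is Category.MAJOR
```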
4.3.1 Suspension Criteria
Suspension criteria specify the conditions under which all or a portion of the testing activities will be suspended. If the suspension criteria are met during testing, the active test cycle will be suspended until the underlying issues are resolved. The criteria for suspension are:
• Any Major category defect which highly impacts the testing progress.
• Hardware or software resources are not available as per the requirements.

4.3.2 Resumption Criteria
Resumption criteria govern the restart of the testing process after the suspension criteria have been met. They involve verification of the defect that triggered the suspension of the testing process.
Resumption is valid once the defect which caused the suspension of the testing process has been fixed and the fix has been verified by the Software Testing Team.
Criteria to resume the testing process:
Incomplete (I) | Under this category, further investigation is needed. This state is only a "temporary" state; a repeat test shall be conducted, which should eventually lead to "Passed" or "Failed" status.
Blocked (B) | Test case is unable to run because a prerequisite for its execution is not fulfilled.
5 Test Cases
Test cases for the overall software test are defined in the Test Case spreadsheet: P205_ISCS_D2.3_OSTS_TestCase. Each test case is defined with an OSTS test case ID and traces the [BI] and [SIL2] tagged requirements from the SRS. A requirement can be tested by multiple OSTS test cases. A [SIL2] tag refers to a SIL2 requirement, while a [BI] tag refers to a non-SIL2, or basic integrity, requirement.
A Test Case spreadsheet will be created for each test cycle.
Table 5-1: Test Case Sample
Item No | Test Case ID | Test Case Objective | Prerequisite | Test Data | Test Step | Expected Result | Test Status | Remarks
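As an illustration of the Table 5-1 layout and of tracing a test case back to tagged SRS requirements, here is a minimal sketch (all field values and the requirement tag are hypothetical; real cases live in the P205_ISCS_D2.3_OSTS_TestCase spreadsheet):

```python
# Minimal sketch of one Table 5-1 record. The OSTS test case ID format follows
# the IDs cited in this document (e.g. OSTS-5.13-0019); all values are made up.
test_case = {
    "Item No": 1,
    "Test Case ID": "OSTS-5.13-0001",
    "Test Case Objective": "Verify an alarm is raised on loss of the RTU link",
    "Prerequisite": "RTU simulator connected; operator logged in to the HMI",
    "Test Data": "RTU link status = DOWN",
    "Test Step": "Disconnect the RTU link and observe the HMI alarm list",
    "Expected Result": "Loss-of-link alarm is displayed",
    "Test Status": "Pass",   # Pass / Fail / Incomplete / Blocked
    "Remarks": "Traces a hypothetical [BI] tagged SRS requirement",
}
print(test_case["Test Case ID"], "->", test_case["Test Status"])
```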
6 Test Management Process
6.1 Test Management Tool
• Any defect encountered will be raised in the Excel test documentation, linked to the particular test case/test step (see the logging sketch after this list).
• During defect fix testing, defects are re-assigned back to the tester to verify the fix. The tester verifies the defect fix and updates the status directly in the Excel Test Documentation and the defect tracking list.
• Various reports can be generated from the Excel Test Documentation to provide the status of test execution in the test status report.
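Since the process above is Excel-based, a minimal sketch of appending a defect row with the openpyxl library is shown below (the file name, sheet layout and column order are assumptions, not the project's actual workbook):

```python
# Minimal sketch of logging a defect into an Excel tracking list via openpyxl.
# File name and columns are hypothetical; openpyxl must be installed.
from pathlib import Path
from openpyxl import Workbook, load_workbook

def log_defect(path: str, row: list) -> None:
    if Path(path).exists():
        wb = load_workbook(path)            # reuse the existing tracking list
        ws = wb.active
    else:
        wb = Workbook()                     # first run: create it with a header
        ws = wb.active
        ws.append(["Defect ID", "Test Case ID", "Category", "Severity",
                   "Description", "Status"])
    ws.append(row)                          # one defect per row
    wb.save(path)

log_defect("defect_tracking_list.xlsx",
           ["DEF-0001", "OSTS-5.13-0001", "Major", 2,
            "HMI alarm not raised on RTU link loss", "Open"])
```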
6.2 Test Design Process
The test design process follows these steps: Understanding Requirements → Establishing Traceability → Preparation of Test Cases → Peer Review of Test Cases → Incorporating Review Comments in Test Cases.
• The tester will understand each requirement and prepare corresponding test cases to ensure all requirements are covered.
• Each test case will be mapped to the Software Requirements Specification (P205_ISCS_D2.2_SRS) as part of the Requirement Traceability Matrix (see the sketch after this list).
• Each of the test cases will undergo review by the testing team, and the review defects are captured and shared with the implementation team. The implementation team will rework the review defects and finally obtain approval and sign-off.
• During the test case preparation phase, the tester will use the Software Requirements Specification (P205_ISCS_D2.2_SRS), use cases and functional specifications to write step-by-step test cases.
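As referenced in the traceability bullet above, the sketch below (hypothetical IDs) inverts the test-case-to-requirement mapping into a simple Requirement Traceability Matrix and flags any requirement left untraced:

```python
# Minimal sketch of a Requirement Traceability Matrix check (hypothetical IDs):
# invert the test-case -> requirements mapping and flag untraced requirements.

tc_to_reqs = {
    "OSTS-5.13-0001": ["SRS-001"],
    "OSTS-5.13-0002": ["SRS-002", "SRS-003"],
}
srs_requirements = ["SRS-001", "SRS-002", "SRS-003", "SRS-004"]

rtm: dict[str, list[str]] = {req: [] for req in srs_requirements}
for tc, reqs in tc_to_reqs.items():
    for req in reqs:
        rtm.setdefault(req, []).append(tc)

for req, tcs in rtm.items():
    print(req, "->", tcs if tcs else "NOT TRACED")  # SRS-004 is flagged here
```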
6.3 Test Execution Process
• Any subsequent changes to the test cases will be updated directly in the Excel Test Documentation.
• Once all test cases are approved and the test environment is ready, the tester will start each test phase and ensure the application is stable for testing.
• The Integration and/or Testing Team runs all test cases and updates the results directly in the Excel Test Documentation.
• Testers are to ensure they have the necessary access to the testing environment and to the Excel Test Documentation for updating test status and raising defects; any issues will be escalated to the Project Manager.
• Any Critical (Severity 1) and High (Severity 2) defects found during the Software Component, Application Data and Algorithm test will be escalated to the respective implementation team for fixes.
• Each tester performs step-by-step execution and updates the test status, entering Pass, Fail, Incomplete or Blocked status for each step directly in the Excel Test Documentation.
• For any failure, a defect will be raised as per the severity guidelines in the Excel Test Documentation, detailing the steps to reproduce along with screenshots where appropriate.
• Daily test execution status as well as defect status will be reported to all stakeholders.
• The testing team will participate in defect triage meetings to ensure all test cases end with either a pass or fail outcome.
• The testing process is repeated until all test cases are executed fully with a Pass/Fail status.
• During subsequent cycles, any applied defect fixes will be tested and the results updated in the Excel test documentation during the cycle.
Table 6-2: Roles and Responsibilities

Software Assurance Team: Verifier / Validator (VER, VAL)
• Manages and ensures the Verification and Validation process of the software in accordance with EN 50128:2011+A2:2020 requirements.
• Manages the verification process (review, integration and testing) and ensures independence of activities as required.
• Conducts internal software quality audits, inspections or reviews on the overall project as appropriate in the various phases of software development.
• Develops and maintains records of the verification activities.
• Develops a Software Verification & Validation Report.

Integration & Testing Team (INT & TST)
• Ensures a well-functioning system when ISCS integrates with other systems in terms of software deployment.
• Maintains the traceability of system integration activities to the software requirement specifications.
• Develops and maintains records of system integration activities.
• Identifies integration anomalies, records them and communicates them to the relevant Change Management body for evaluation and decision.
• Coordinates and participates in integration activities.
• Ensures the integration work is completed according to schedule.
• Ensures the testing activities done by the Test Team are according to schedule.
• Plans test activities and develops the overall software test specification with test objectives and test cases.
• Ensures the traceability of test objectives against the specified software requirements and of test cases against the specified test objectives.
• Ensures test plans are implemented and the specified tests are carried out.
• Identifies deviations from expected results and records them in test reports.
• Communicates deviations to the authority responsible for change management for evaluation and decision making.
• Records the test reports with results.
• Selects the tools or equipment for testing ISCS.

Implementation Team (IMP)
▪ Ensures all software customization work adheres to the practices defined in ISO 12207 and EN 50128:2011+A2:2020.
▪ Is responsible for the project software development work.
▪ Ensures the software development work is completed according to schedule.
▪ Ensures all software functions developed comply with the contract document.
▪ Develops and maintains the implementation documents comprising the applied methods, data types and listings.
▪ Develops all application data and system work required in this project.
▪ Ensures the system development work is completed according to schedule.
▪ Develops all software customization work required in this project.
7 Test Environment
7.1 Simulated Environment
This section describes the representative mock-up of the RTS test environment.
Figure 7-1: Test Environment for Software Component & Application Data/Algorithms Test and Software
Integration Test
Figure 7-2 shows the test environment for Group 1 (Overall Software Test).
[Figure 7-2: Test Environment for Group 1 (Overall Software Test). Diagram: BKCS and WDLS Hyper-V service servers, network servers, database servers, training server, RTUs B and C, network switches with redundant network cables, KVM, marshalling panel, ISCS workstation and PSC Operator Workstation #1, with a connection to CIQ Level 1.]
Table 7-1: List of Hardware Items in Simulated and Real Environment

No | Items | Purpose | Simulated Environment | Real Environment
3. | ISCS Database Server | To host the ISCS Database Management System and store all the historical data in the ISCS system. | NA | ✓
5. | ISCS Desktop Tower Workstation | To host the integrated ISCS HMI to enable the operator to perform control and monitoring of various subsystems. | ✓ | ✓
7. | ISCS USB Headset | Voice communication through ISCS (Radio & PABX). | ✓ | ✓
9. | ISCS Desktop Tower Workstation (for Mimic Display) | To drive the ISCS workload to the video wall display controller to achieve the Mimic Display function. | NA | ✓
10. | Keyboard, Video and Mouse (KVM) Switch with LCD monitor and keyboard & mouse | Input devices for servers. | NA | ✓
11. | Marshalling Panel | Gathers multiple wires and cables; provides cross-wiring functionality between the control room cabinet and field instruments. | NA | NA
13. | RTU/PLC | To collect data, encode the data into a transmittable format and transmit the data back to an ISCS server. | ✓ | ✓
14. | Training Workstation | To be used to configure and test the system using the runtime environment. | NA | ✓
15. | Dummy Circuit Breaker | An electrical safety device designed to protect an electrical circuit from damage caused by an overcurrent or short circuit. | NA | ✓
Table 7-2: List of software items in real environment

No | Tools | Purpose | Tool Class | Simulated Environment | Real Environment
4. | Microsoft Excel 365 | To save all the test cases and record all the test results. Also used to record all faults and defects found during test execution. | T1 | ✓ | ✓
… | … | … diagnostic and maintenance functions. | … | … | …
10. | Modbus Slave Simulator | To simulate Modbus TCP/RTU protocol for testing configuration on interfaces. | T2 | ✓ | ✓
12. | NAS Simulator | To simulate HTTP Protocol, for TMS testing only. | T2 | ✓ | ✓
13. | JMS API Signaling Simulator | To test all intended JMS API services. | T2 | ✓ | ✓
8 Techniques and Measures
8.1 Functional/Black Box Testing
Table 8-1: List of selected Techniques and Measures for Functional/Black Box Testing
Below is the detailed information on each of the chosen techniques and measures for Functional/Black Box Testing:
Table 8-2: Detailed Information of Boundary Value Analysis
Boundary Value Analysis
Aim
To remove software errors occurring at parameter limits or boundaries.
Description
The input domain of the program is divided into several input classes. The tests should cover the boundaries and extremes of the classes. The tests check that the boundaries in the input domain of the specification coincide with those in the program. The use of the value zero, in a direct as well as in an indirect translation, is often error-prone and demands special attention:
• zero divisor
• non-printing control characters
• empty stack or list element
• null matrix
• zero table entry
Normally the boundaries for input have a direct correspondence to the boundaries for the output range. Test cases should be written to force the output to its limit values. Consider also whether it is possible to specify a test case which causes the output to exceed the specification boundary values.
If the output is a sequence of data, for example a printed table, special attention should be paid to the first and the last elements and to lists containing none, 1 and 2 elements.
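To make the technique concrete, here is a minimal sketch of boundary value analysis in Python (the unit under test and its valid range 1..5 are hypothetical; only the technique itself follows the description above):

```python
# Minimal sketch of boundary value analysis. Hypothetical unit under test:
# a validator whose valid input class is 1..5. Test values sit on and just
# beyond the class boundaries, with special attention to the error-prone zero.

def validate_severity(level: int) -> bool:
    """Accept severity levels 1..5; reject anything else (hypothetical)."""
    return 1 <= level <= 5

boundary_cases = {
    0: False,   # below the lower boundary, and the error-prone value zero
    1: True,    # lower boundary
    2: True,    # just inside the lower boundary
    4: True,    # just inside the upper boundary
    5: True,    # upper boundary
    6: False,   # just above the upper boundary
}

for value, expected in boundary_cases.items():
    assert validate_severity(value) is expected, f"boundary case {value} failed"
print("all boundary cases passed")
```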
Table 8-3: Detailed Information of Equivalence Classes and Input Partition Testing
Aim
To test the software adequately using a minimum of test data. The test data is obtained by selecting the partitions of the input domain required to exercise the software.
Description
This testing strategy is based on the equivalence relation of the inputs, which determines a partition of the input domain.
Test cases are selected with the aim of covering all subsets of this partition. At least one test case is taken from each equivalence class.
There are two basic possibilities for input partitioning:
• Equivalence classes may be defined on the specification. The interpretation of the specification may be either input oriented, for example the values selected are treated in the same way, or output oriented, for example the set of values leading to the same functional result; and
• Equivalence classes may be defined on the internal structure of the program. In this case the equivalence class results are determined from static analysis of the program, for example the set of values leading to the same path being executed.
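A minimal sketch of the same idea in Python, reusing the hypothetical validator from the boundary value sketch (the partition and its representative values are assumptions), selecting one test case per equivalence class of the input domain:

```python
# Minimal sketch of equivalence class / input partition testing. The input
# domain of a hypothetical validator partitions into three classes; one
# representative value is tested per class.

def validate_severity(level: int) -> bool:
    return 1 <= level <= 5   # hypothetical valid input class

equivalence_classes = {
    "below valid range": ([-10, 0], False),  # members treated the same way
    "valid severities":  ([1, 3, 5], True),
    "above valid range": ([6, 99], False),
}

for name, (members, expected) in equivalence_classes.items():
    representative = members[0]              # at least one case per class
    assert validate_severity(representative) is expected, name
print("one representative per equivalence class passed")
```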
8.2 Performance Testing
The techniques and measures used in the OSTS for Performance Testing are Performance Requirements and Response Timing and Memory Constraints.
Table 8-4: List of selected Techniques and Measures for Performance Testing
Below is the detailed information on each of the chosen techniques and measures for Performance Testing:
Table 8-5: Detailed Information of Performance Requirements
Performance Requirements
Aim
To establish that the performance requirements of the software have been satisfied.
Description
An analysis is performed of both the system and the Software Requirements Specifications to identify all general and specific, explicit and implicit performance requirements.
Each performance requirement is examined in turn to determine:
• the success criteria to be obtained,
• whether a measure against the success criteria can be obtained,
• the potential accuracy of such measurements,
• the project stages at which the measurements can be estimated, and
• the project stages at which the measurements can be made.
The practicability of each performance requirement is then analyzed in order to obtain a list of performance requirements, success criteria and potential measurements. The main objectives are:
• each performance requirement is associated with at least one measurement;
• where possible, accurate and efficient measurements are selected which can be used as early in the development process as possible;
• essential and optional performance requirements and success criteria are identified; and
• where possible, advantage shall be taken of the possibility of using a single measurement for more than one performance requirement.
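A minimal sketch of these objectives in Python (the requirement ID, limit and workload are hypothetical): each performance requirement is paired with at least one measurement and checked against its success criterion:

```python
# Minimal sketch: pair each performance requirement with a measurement and a
# success criterion. The requirement ID, limit and workload are hypothetical.
import time

def measure_response_time() -> float:
    """Placeholder measurement: elapsed time of a stand-in operation."""
    start = time.perf_counter()
    sum(range(100_000))                      # stand-in for the real function
    return time.perf_counter() - start

performance_requirements = [
    # (requirement ID, measurement callable, success criterion in seconds)
    ("PERF-001", measure_response_time, 2.0),
]

for req_id, measurement, limit in performance_requirements:
    observed = measurement()
    verdict = "Pass" if observed <= limit else "Fail"
    print(f"{req_id}: {observed:.4f} s (limit {limit} s) -> {verdict}")
```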
Table 8-6: Detailed Information of Response Timing and Memory Constraints
Response Timing and Memory Constraints
Aim
To ensure that the system will meet its temporal and memory requirements.
Description
The requirements specification for the system and the software includes memory and response requirements for specific functions, perhaps combined with constraints on the use of total system resources. An analysis is performed to identify the distribution of demands under average and worst-case conditions. This analysis requires estimates of the resource usage and elapsed time of each system function. These estimates can be obtained in several ways, for example by comparison with an existing system or by the prototyping and benchmarking of time-critical systems.
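A minimal sketch of measuring both constraints for one function in Python (the workload and both limits are hypothetical; the standard-library tracemalloc module tracks peak memory while perf_counter measures elapsed time):

```python
# Minimal sketch of a response timing and memory constraint check for one
# system function. Workload and limits are hypothetical stand-ins.
import time
import tracemalloc

def function_under_test() -> list:
    return [i * i for i in range(50_000)]    # stand-in workload

TIME_LIMIT_S = 0.5                 # assumed response-time constraint
MEMORY_LIMIT_BYTES = 8_000_000     # assumed memory constraint

tracemalloc.start()
start = time.perf_counter()
function_under_test()
elapsed = time.perf_counter() - start
_, peak = tracemalloc.get_traced_memory()    # (current, peak) bytes traced
tracemalloc.stop()

print(f"elapsed {elapsed:.4f} s (limit {TIME_LIMIT_S} s); "
      f"peak memory {peak} B (limit {MEMORY_LIMIT_BYTES} B)")
assert elapsed <= TIME_LIMIT_S and peak <= MEMORY_LIMIT_BYTES
```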
Table AB-0-1: Defect Tracking List Sample
Item No | Test ID | Equipment Type | Description of Defect | Category | Severity | Tester | Expected Output | Tested Output | Estimated Repair Time (Date) | Comments | Status
<No> | <ID> | <Software/Hardware> | <Description of defect> | <Major/Minor> | <1/2/3/4/5> | <Tester> | <Expected Output> | <Tested Output> | <Estimated Repair Time (Date)> | <Add Comments> | <Open/Closed>