
Test Strategy

OneOoredoo

Project Identification

Project Name: OneOoredoo
Customer Name: Ooredoo

Document Identification

Author: Srinivas Mannapuram
Document Location (repository/path/name):

Version | Status | Date (YYYY-MM-DD) | Document Classification
0.1 | Initial Draft | 2023-03-31 | Internal
0.2 | Review Comments | 2023-04-17 | Internal
Revision history

Version | Date | Description | Updated by
0.1 | March 31, 2023 | Initial Draft | Srinivas Mannapuram
0.2 | April 17, 2023 | Included review comments from Mobolaji | Srinivas Mannapuram
0.3 | May 11, 2023 | Included review comments; updated defect flows, timelines and diagram | Anil Kumar
0.4 | May 25, 2023 | Review comments included; updated exit criteria, upstream/downstream diagram, UAT scope at OpCo and SI level, defect triage audience and TCR details | Anil Kumar

Approval

Approver Name | Date | Email confirmation

OneOoredoo IT Enablement

EY Programme Director

EY Test Manager

Table of Contents
1 Introduction
2 Objectives
3 Requirements
4 Testing Scope
4.1 Types of Testing
4.2 Testing Approach
4.3 Testing Strategy and Approach
4.4 Testing Structure
4.5 Testing Cycles
4.6 SIT
4.6.1 SIT approach
4.7 Test Environment
4.8 UAT
4.8.1 UAT Approach
4.9 Post-Go Live Regression
4.9.1 Post-Go Live Regression approach
4.10 Test Data
4.11 Environment Management
4.12 Test Execution
4.13 High Level Test Execution Plan
4.14 RASCI
4.15 Defect Management
4.16 Testing Scope
4.17 Test Deliverables
4.18 Testing Process
4.19 Testing Environments
4.20 Schedule
4.21 Testing Tools
5 Training
6 Defect Management
6.1 Defect resolution timelines
7 Testing Roles & Responsibilities
8 Assumptions, Dependencies and Risks

List of Figures
Figure 1. Dependencies between the types of testing
Figure 2. Value Model
Figure 3. Configuration Process
Figure 4. OneOoredoo Testing Structure
Figure 5. Kuwait Application Architecture
Figure 6. Qatar Application Architecture
Figure 7. Algeria Application Architecture
Figure 8. Test Environment
Figure 9. Landscape
Figure 10. Testing Process
Figure 11. Illustration of General Testing Process
Figure 12. Severity 1 & 2 Defects escalation process to remedy test defect

1 Introduction
This document defines the OneOoredoo high-level testing strategy and approach and serves as the
"guiding principles" and common framework for testing. This high-level document covers the
following topics:
▪ Project Testing Objectives
▪ Requirements
▪ Testing Scope
▪ Types of Testing
▪ Testing Approach
▪ Testing Deliverables
▪ Testing Tools
▪ Defect Management
▪ Roles and Responsibilities

Definition of Testing
Testing is an activity aimed at evaluating an attribute or capability of a computer program or system
to determine that it meets the defined business requirements.

2 Objectives
The objectives of testing are:
▪ Ensure that the system (Hire to Retire, Source to Settle and Finance) meets all the business
requirements determined to be in scope.
▪ Agree on the depth of Testing approach in each cycle.
▪ Identify any missing design components.
▪ Ensure that the system meets technical requirements and meets service levels for application
response time and throughput.

3 Requirements
Following requirements should be fulfilled according to the Test Strategy document:
▪ Design (global and delta) requirements acceptance and approval is completed before
Realization Phase begins.
▪ Unit Testing is complete before Integration Testing commences.
▪ The overall program Plan is in place before Integration Testing commences – governs changes
to the system, design decisions, documentation, etc. Specifically:
o Scope Management Plan
o Integration Management Plan and related procedures
o Integration Change Control Process
o Issue Management Process
o Risk Management Plan
o Integration Testing conducted in iterations with the progressive addition of security & data
conversions.

4 Testing Scope
The scope of testing includes business and technical requirements specified in the respective business
design documents, as signed off.

4.1 Types of Testing
Unit Testing – validates that individual functions are configured and/or developed to appropriately
translate technical and functional requirements. This would include testing of individual configuration
elements, process steps associated with business transactions, and custom development objects.
Unit testing should include:
▪ Positive Testing – validates that the function under test works correctly by inputting a known-good
value and verifying that the returned data/view is as expected.
▪ Negative Testing – validates failure handling by inputting a value known to be incorrect and
verifying that the component or test case fails as expected. This confirms that failures can be
identified and that the target application responds correctly by displaying the appropriate warning
message (see the sketch after this list).
▪ Unit Test Data – fabricated or customer-specific master data will be manually entered as
required for unit testing and used by other teams where appropriate.
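
As an illustration, the following is a minimal sketch of a positive and a negative unit test in pytest style. The validate_iban() function and its rules are hypothetical stand-ins used only to make the example self-contained, not part of the OneOoredoo solution.

def validate_iban(iban: str) -> bool:
    """Toy validator, used only to make the example self-contained."""
    return iban.isalnum() and 15 <= len(iban) <= 34 and iban[:2].isalpha()

def test_positive_known_good_input():
    # Positive test: a known-good value must be accepted.
    assert validate_iban("KW81CBKU0000000000001234560101") is True

def test_negative_known_bad_input():
    # Negative test: a known-bad value must be rejected, not crash.
    assert validate_iban("not-an-iban") is False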

Functional Testing – focuses on the functionality of an application, validating outputs against
selected inputs; it consists of Unit Testing, Business Process (String) Testing, and Scenario (Integration)
Testing.

Business Process (String) Testing – validates the full operability of interconnected functions, methods,
or objects within the functional areas of an SAP Solution (e.g., Sales).
▪ Includes a set of logically related activities or business process steps to achieve a defined
business process.
▪ Includes business processes that cross functional areas (e.g., Sales and Finance).
During subsequent integration testing activities these business process (string) tests are combined to
build end-to-end integration test scenarios.

System Integration Testing – validates a set of business processes that define a business scenario in
a comprehensive and self-contained manner on a macro level.
▪ Integration testing is recommended to be done in multiple iterations.
▪ The initial iteration of integration testing concentrates on testing all important business
processes within the SAP components of the implemented solution, starting with touch point
scenarios, and ending with end-to-end scenarios.
▪ The final iteration of integration testing focuses on cross-functional business scenarios with
non-SAP systems and applications, custom development objects and converted data.

Data Conversion Testing – validates the conversion programs that load production data; the loaded
data is used to drive test scenarios and provides the information entities required to support manual testing.

Authorization/IAG (identity access governance) Testing - is performed to ensure that all the security
profiles and roles are being implemented as designed. A security profile is designed and built based on
the job role (i.e., positions) of the end users. Security roles are assigned at the business transaction level.
Security testing seeks to ensure that:
▪ Each security profile and role works as designed.
▪ Access to “sensitive transactions or reports” (e.g., personnel information, payroll data, supplier
bank information) is restricted to only those roles that require access to this information.

▪ Segregation of duties has been addressed when designing each security profile and when
profiles are grouped into a role, especially for high-risk areas (e.g., payments, payroll, and
master data creation vs. transaction data entry).

User Acceptance Testing (UAT) – users test the complete, end-to-end business processes to verify that
the implemented solution performs the intended functions and satisfies the business requirements.

The following illustration represents the dependencies between the types of testing executed during
the Realization Phase of the project.

Figure 1. Dependencies between the types of testing

As part of testing, it is important to simulate daily, weekly, and monthly business events and activities
(e.g., daily batch processes, generation of key reports, and execution of the financial monthly close)
during business process (string) testing and scenario (integration) testing in the quality assurance
environment. Additionally, the project team will rehearse cutover build activities during business
process (string) testing and scenario (integration) testing iterations, using the QA environment to
prepare for and simulate the production build of the SAP Solution.

4.2 Testing Approach
The following Value Model (V-Model) diagram illustrates how testing in the Realization through Go Live
and support phases relates to business needs and requirements defined in the Design Phase of the
program. The right side of the model consists of all the testing cycles and traceability back to the
business design and requirements on the left side of the diagram.

[Figure: V-Model. The left side descends from business objectives and goals through business
scenario requirements, business process requirements and initial design, and solution design/technical
specifications to the business blueprint and golden configuration/code build. The right side ascends
through code review and unit test cases, business process string tests, and integration test scenarios
and acceptance testing to operations and continual improvement, with each test level tracing back to
the corresponding design level on the left.]
Figure 2. Value Model

Testing is designed to validate business requirements and provide traceability back to the design
requirements used to build the SAP solution. The illustration below reflects the relationship between
the configuration process (business process framework) and the testing types.

Figure 3. Configuration Process

As illustrated in the Configuration Process – Business Process Framework and Testing Activities,
functional testing is defined at the lowest level, starting with unit testing at the lowest business
transaction level, progressing to business process step level, to business process (string) testing and
will include:
▪ Testing of configurable transactions
o Test non-ERP transactions or steps performed as part of the configurable transaction.
o Test manual transactions or steps performed as part of the configurable transaction.
▪ Testing of development objects. These tests will further include:
o Testing of the code within the development object. These tests will be based on the
Technical Specifications documents. This activity will be owned and conducted by the
Technical Team.
o Testing the functional aspects of the development object. These tests will be based on
the Functional Specifications documents & Business Process Procedures and will be
conducted by the Functional Teams with the assistance of the Technical Team.
▪ Testing of business process steps (configuration transactions) to achieve a defined business
outcome within a module.

Business process (String) testing validates the full operability of interconnected functions, methods, or
objects within the functional areas of an SAP Solution.

Scenario (Integration) testing will be designed at the business scenario level. Since business scenarios
are collections of business processes and process steps, Integration Tests will include a collection of unit
tests. Integration tests may also include other transactions including manual transactions, custom
transactions, security steps, etc. as defined in the Design.

4.3 Testing Strategy and Approach
EY in collaboration with Ooredoo will be adopting the following strategy and approach for testing.

Unit Testing
▪ Purpose/Objectives: Validates that individual functions are configured and/or developed to appropriately translate technical and functional requirements.
▪ Owner/Testers: SAP Functional and Technical Team
▪ Data: Sample data sets
▪ Outcome: Configured solution is validated and critical/high defects resolved.

Business Process String Test
▪ Purpose/Objectives: Validates the full operability of interconnected functions, methods, or objects within the functional areas of an SAP Solution.
▪ Owner/Testers: Key Users
▪ Data: Sample data sets
▪ Outcome: Business processes execute per requirements.

System Integration Testing
▪ Purpose/Objectives: Validates a set of business processes that define a business scenario in a comprehensive and self-contained manner on a macro level.
▪ Owner/Testers: Key Users
▪ Data: Combination of user-created data and sample migrated data
▪ Outcome: End-to-end scenarios function per requirements.

Data Conversion Testing
▪ Purpose/Objectives: Tests the conversion programs that load production data used to drive test scenarios.
▪ Owner/Testers: Key Users and End Users
▪ Data: Migrated data (full set)
▪ Outcome: Data loads are processed per requirements.

Security Testing
▪ Purpose/Objectives: Performed to ensure that all the security profiles and roles are being implemented as designed.
▪ Owner/Testers: Key Users and End Users
▪ Data: Sample data sets
▪ Outcome: Data loads are processed per requirements.

UAT
▪ Purpose/Objectives: Users test the complete, end-to-end business processes to verify that the implemented solution performs the intended functions and satisfies the business requirements.
▪ Owner/Testers: Key Users and End Users
▪ Data: Combination of user-created data and sample migrated data
▪ Outcome: End-to-end scenarios function per requirements.
4.4 Testing Structure

Figure 4. OneOoredoo Testing Structure

4.5 Testing Cycles


There are 3 cycles in SIT:
▪ SIT 1: Silo testing with no integrations tested. Mock 1 will use masked data, and it will be encrypted.
▪ SIT 2: Silo testing with no integrations tested. Mock 2 will use masked data, and it will be encrypted.
▪ SIT 3: Complete end-to-end testing with all third-party systems.
▪ Test scenarios to be developed/documented.
▪ Map business requirements as per BPDs/RTM Tracker to test scenarios and ensure complete
coverage (a coverage-check sketch follows the note below).

Note: Unit Testing is done in development systems by developers/FCs, whereas SIT is done in the
Quality system. Testers will capture screenshots in Jira for most steps during SIT, whereas in Unit
Testing not all screenshots will be captured.
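
The RTM-to-scenario mapping above can be checked mechanically. The following is a hypothetical sketch of such a coverage check; the requirement IDs and scenario mappings are illustrative assumptions, not actual OneOoredoo artifacts.

# Requirements extracted from the BPDs / RTM Tracker (illustrative IDs).
rtm_requirements = {"HTR-001", "HTR-002", "STS-010", "FIN-120"}

# Which requirements each approved test scenario covers (illustrative).
test_scenarios = {
    "SIT3-SC-01": {"HTR-001", "HTR-002"},
    "SIT3-SC-02": {"STS-010"},
}

covered = set().union(*test_scenarios.values())
uncovered = rtm_requirements - covered
if uncovered:
    print(f"Requirements without a test scenario: {sorted(uncovered)}")
else:
    print("All RTM requirements are mapped to at least one scenario.")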

4.6 SIT
System Integration Test Scenarios will be identified. These will be run End to End as a prerequisite to
UAT. The Business processes will be agreed along with Acceptance Criteria by the Business. They will be
fully defined as a part of the SIT (and UAT) plans.

4.6.1 SIT approach


This is the test approach for the System and Integration Testing Stage (SIT).

High-Level Objectives: System and Integration Testing of the in-scope systems, processes and
functionality as detailed in the Blueprint documents, catalogues, and functional and non-functional
requirements specification documents.

Scope: Verify that all individual systems behave as specified and according to requirements. Early
execution of the business processes to ensure that components work with each other and transfer
data end to end.

Entry Criteria:
▪ Exit criteria met from Build and Unit Testing.
▪ System Test environment in place.
▪ Data creation/data load into the SIT client completed / mock data load completed.
▪ Test plan covering all requirements and test scenarios agreed.
▪ No Critical (P1) or Severe (P2) defects from Build and Unit Testing.
▪ Planned tests cover all in-scope test scenarios and agreed requirements.
▪ Test inventory and approval flow: SIT test scenarios will be identified and shared with the business
for approval. The final scenario list after business approval is referred to as the test inventory.
OpCos and functional leads will have the opportunity to review and approve the test inventory.

Exit Criteria:
▪ Planned scenarios executed.
▪ All proof of tests is captured in Jira (screenshots will be taken with the unique document number
generated in SAP).
▪ No Critical (P1) or Severe (P2) defects from development testing.
▪ Number of outstanding P3 and P4 defects to be agreed.
▪ TCR (Test Completion Report) complete.
▪ Disposition of all exceptions to the above agreed by the Test Manager, Functional Leads and
Project Manager.
▪ A management decision may be taken on whether the program can proceed to UAT even if the SIT
exit criteria are not fully met (e.g., the Icertis step is not working, but the finance step can still be
tested).

Test Types:
▪ System and functional testing for expected input data.
▪ Functional testing of individual systems and sub-systems.
▪ Functional testing of unhandled messages and data.
▪ End-to-end integration of systems, interfaces, and modules across the in-scope platform to verify
that interfaces work with each other and transfer data successfully from source to target systems
and vice versa.
▪ Data validation and transformation.

Roles and Responsibilities: The Test Manager is responsible for test management, high-level status
reporting, defect monitoring, triage and the SIT test plan. Workstream leads, consultants and the
development team are responsible for test execution, defect identification and fixes, and sign-off on
each scenario.

Data: Data as defined by workstream leads.

Note: The TCR (Test Completion Report) will be published to the test leads from each OpCo and the
OneOoredoo program team, and its content will cover the details below:
▪ Journey of the program so far.
▪ Scenarios identified for testing and the testing period.
▪ Details of total test cases prepared and executed.
▪ Defect details based on severity, and the agreed approach on P1/P2/P3/P4 tickets and change
requests.
▪ Major challenges faced and solutions provided.
▪ Workarounds implemented for any open defects.
▪ Lessons learned.

Note: Approach for upstream/downstream systems testing and third-party systems integration testing:
testing between upstream and downstream systems (S4H, Ariba, SF, Icertis, SAC and Concur) will be
part of the test cases, and a testing environment will be set up for integration testing. Integration
between SAP and third-party systems will be tested as part of SIT 3, where timelines will be shared
with OpCos so that third-party teams are available for defect fixing and verification of test results.

Figure 5. Kuwait Application Architecture

Figure 6. Qatar Application Architecture

Figure 7. Algeria Application Architecture

4.7 Test Environment


Please refer to Landscape document for reference:
https://ooredoogroup.sharepoint.com/sites/One/Shared%20Documents/General/IT%20Enablement/C-
%20ALM/Landscape%20Guidelines.pptx?web=1

Figure 8. Test Environment

4.8 UAT
User Acceptance Test Scenarios will be identified, and new test scenarios for UAT will be created
where needed. These will be run end to end as a prerequisite to Post-Go Live. The business processes
will be agreed along with acceptance criteria by the Business. They will be fully defined as part of the
UAT plans. The Jira tool will be used to store proof of testing (screenshots); for a passed test case,
UAT testers can note the document number (e.g., the unique PO number generated). A failed test
case will follow the Defect Management cycle until it ultimately passes, at which point a document
number will again be generated and captured.
UAT test execution for parallel runs for the Payroll and Finance Period End Close cycles will follow the
approach below:
❖ Parallel Payroll Run
• The integrated test scenario (payroll posting to FI) can be validated during SIT 3 to confirm
readiness for UAT.
• The payroll run will be executed in a separate system, not in the UAT client; it will be executed
on 100% actual data, and the results will be compared with legacy data.
• The payroll parallel run will start after completion of Payroll UAT.
❖ Finance Period End Close
• The finance period end close will be part of UAT and will be executed in the UAT client on the
transaction data generated during UAT. The complete sequence and steps involved will be covered
in a separate document, which will be shared before the start of UAT.

4.8.1 UAT Approach
This is the test approach for the User Acceptance Testing Stage (UAT).

High-Level Objectives: User Acceptance Testing of the in-scope systems, processes and functionality
as detailed in the Blueprint documents, catalogues, and functional and non-functional requirements
specification documents.

Scope: Verify that all systems behave as specified and according to requirements.

Entry Criteria:
• Exit criteria met from SIT.
• System Test environment in place.
• Smoke tests finalized and ready to start UAT.
• Testers have been identified.
• Upskilling activities for UAT have been conducted.
• UAT schedule has been agreed and published with the OpCos.
• Infrastructure is ready for UAT.
• Business roles are created and tested for UAT.
• System access is provided to all UAT testers.
• Jira tool is set up and ready for defect management and UAT test execution reports.
• Data creation/data load into the UAT client completed / mock data load completed.
• Test plan covering all requirements and test scenarios agreed.
• Any test scenario that has not been tested in SIT will not be tested in UAT; however, a test
scenario that was tested in SIT and failed with a P3/P4 defect will be discussed and, if approved
by management, will be tested in UAT.
• Test inventory and approval flow: UAT test scenarios will be identified and shared with the
business for approval. The final scenario list after business approval is referred to as the test
inventory. OpCos and functional leads will have the opportunity to review and approve the test
inventory.
• No Critical (P1) or Severe (P2) defects from SIT testing.
• Planned tests cover all in-scope test scenarios and agreed requirements.

Exit Criteria:
• Planned scenarios executed.
• No Critical (P1) or Severe (P2) defects from testing.
• Number of outstanding P3 and P4 defects to be agreed.
• Any defect which is not part of the RTM (original requirements) needs to be put forward to the
OneOoredoo program as a change request.
• If any P1 or P2 defects remain open at the end of UAT, a workaround should be proposed and
approved by the Business and GPO.
• UAT closure memo (Test Completion Report) to be signed off by functional leads and the OpCo
SPOC.
• Disposition of all exceptions to the above agreed by the Test Manager, Functional Leads and
Project Manager.
Test Types:
• System and functional testing for expected input data.
• Functional testing of individual systems and sub-systems.
• Functional testing of unhandled messages and data.
• Systems, interfaces, and modules across the in-scope platform to verify that interfaces work with
each other and transfer data successfully from source to target systems and vice versa.
• Data validation and transformation.

Roles and Responsibilities: The Test Manager is responsible for test management, high-level status
reporting, defect monitoring, triage and the UAT test plan. Workstream leads, consultants and the
development team are responsible for test execution, defect identification and fixes, and sign-off on
each scenario.

Data: Data as defined by workstream leads.

4.9 Post-Go Live Regression


Post-Go Live Regression Test Scenarios will be identified. The business processes will be agreed along
with acceptance criteria by the Business. They will be fully defined as part of the Post-Go Live
Regression plans.

4.9.1 Post-Go Live Regression approach


This is the test approach for the Post-Go Live Regression Testing Stage.

High-Level Objectives: Post-Go Live Regression Testing of the in-scope systems, processes and
functionality as detailed in the Blueprint documents, catalogues, and functional and non-functional
requirements specification documents.

Scope: Verify that all systems behave as specified and according to requirements.

Entry Criteria:
▪ Exit criteria met from UAT.
▪ System Test environment in place.
▪ Test plan covering all requirements and test scenarios agreed.
▪ No Critical (P1) or Severe (P2) defects from UAT testing.
▪ Planned tests cover all in-scope test scenarios and agreed requirements.

Exit Criteria:
▪ Planned scenarios executed.
▪ No Critical (P1) or Severe (P2) defects from testing.
▪ Number of outstanding P3 and P4 defects to be agreed.
▪ Test Completion Report complete.
▪ Disposition of all exceptions to the above agreed by the Test Manager, Functional Leads and
Project Manager.

Test Types:
▪ System and functional testing for expected input data.
▪ Functional testing of individual systems and sub-systems.
▪ Functional testing of unhandled messages and data.
▪ Systems, interfaces, and modules across the in-scope platform to verify that interfaces work with
each other and transfer data successfully from source to target systems and vice versa.
▪ Data validation and transformation.

Roles and Responsibilities: The Test Manager is responsible for test management, high-level status
reporting, defect monitoring, triage and the Post-Go Live Regression test plan. Workstream leads,
consultants and the development team are responsible for test execution, defect identification and
fixes, and sign-off on each scenario.

Data: Data as agreed by the Program.

4.10 Test Data


▪ The functional team will identify the test data requirements.
▪ Master data (Employee, Vendors, Materials, GL, COA, etc.) is to be loaded into the testing
environment by the EY team.
▪ Master data will be a controlled set based on specific scenario variants.
▪ Mock 1 and Mock 2 will use masked data; data needs to be masked/encrypted so that, for example,
Kuwait cannot see/view Algeria data and vice versa, and this will be verified during testing.

Please refer to Data migration document: Data Migration Strategy-One Ooredoo V1.5.docx

4.11 Environment Management


▪ All necessary transports and validation will be completed by the Basis and Functional team
members before test cycles can be initiated.

▪ Landscape details shown below:

Figure 9. Landscape

4.12 Test Execution


▪ SIT will be executed by consultants using their existing access to the development system. Roles
will be tested by the SI team during the SIT phase, and role-based testing will be done in SIT before
handover to business users.
▪ UAT: Ooredoo will be responsible for managing and executing the UAT with the expectations below:
o OpCo Level:
▪ Identification of testers for each workstream to be done by OneOoredoo, along
with backups for their absence.
▪ Availability of testers during the UAT timelines.
▪ OpCos should provide a lead for each workstream.
▪ The OpCo SPOC needs to ensure that defects are logged in Jira on a daily basis.
▪ Jira training will be given to testers before the start of UAT by the OpCo SPOC.
▪ Upskilling of testers on how to use the systems.
o SI Level:
▪ During the UAT period (4-5 hrs. duration), a war room will be set up where
testers will participate and run test cases; from the SI side, functional and
technical consultants will be made available to support UAT along with the
Test Team.
▪ Functional and technical consultants who will support the UAT will be identified.
User IDs of testers and FCs/TCs will be validated before UAT; credentials along
with URLs are to be shared before the start of UAT.
▪ Jira training will be given to testers before the start of UAT by the testing leads.

4.13 High Level Test Execution Plan

4.14 RASCI
For the program-level RASCI, please refer to the link:
https://ooredoogroup.sharepoint.com/:x:/r/sites/One/Shared%20Documents/General/TMO/Deliverabl
es/20220821_OneOoredoo_RACI_MASTER_V1.4_for%20opco%20sharing.xlsx?d=weca101b4443b4eb69
9b8fec00a768c38&csf=1&web=1&e=UTjaVx

4.15 Defect Management


Excel and Jira will be the tools in which test execution and defects are managed. The SAP standard
defect management and reporting process will be followed, and each defect will include details such as
transactional data, an error screenshot, a clear description, severity, priority, environment details, etc.
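
For illustration, the following is a hedged sketch of logging a defect programmatically via the Jira Cloud REST API (POST /rest/api/2/issue). The instance URL, credentials, project key, and field values are assumptions that would depend on the OneOoredoo Jira configuration; defects can equally be logged through the Jira UI.

import requests

JIRA_URL = "https://example.atlassian.net"   # placeholder instance
AUTH = ("tester@example.com", "api-token")   # placeholder credentials

payload = {
    "fields": {
        "project": {"key": "ONEO"},          # hypothetical project key
        "issuetype": {"name": "Bug"},
        "summary": "PO approval fails for amounts above 10,000 KWD",
        "description": ("Transactional data, environment details and an "
                        "error screenshot attached per this section."),
        "priority": {"name": "High"},        # severity mapped to priority
    }
}

resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
resp.raise_for_status()
print("Created defect:", resp.json()["key"])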

4.16 Testing Scope


Hire to Retire:
▪ Payroll
▪ PA -OM-Time
▪ Employee Central
▪ RCM/RMK
▪ ONB
▪ Compensation
▪ Concur
▪ Workforce Analytics

Source to Settle:
▪ Sourcing
▪ Supplier Lifecycle Management
▪ Commerce Automation
▪ Buying
▪ Spend Analysis
▪ Supplier Catalogue
▪ Supply Chain Collaboration
▪ Materials Management
▪ Inventory Management

Finance:
▪ Financials and Controlling
▪ Vendor Invoice Management
▪ Treasury & Risk Management
▪ Real Estate Management - Flexible
▪ Cash & Liquidity Management
▪ Revenue Accounting & Reporting
▪ Funds Management
▪ Project Systems (PS)
▪ Project and Portfolio Management (PPM)
▪ SAP Analytics Cloud (SAC)
▪ Group Reporting

Sales and Distribution:


▪ Order Management

Icertis:
▪ Contract Management

4.17 Test Deliverables


There will be four types of testing reports produced during testing of the Ooredoo solution. All reports
will be generated from Excel; however, program-specific status reports for presentations will be
prepared with the data from Excel.

On-Demand Reports:
▪ Overall test status across the program
▪ Overall defect status
▪ Test execution progress for each active test phase and test cycle
▪ Highlights for open Severity 1 and Severity 2 defects.
▪ Detailed defect management information

Daily Status Reports: the daily status reports will contain the test progress and defect information for
the currently active test phases and test cycles.

Weekly Status Reports: the weekly status reports will summarize progress to date, activities completed
and planned, and risks and issues. The test progress information will be produced by the dashboard
toolset.

Test Completion / Summary Reports: Test Completion / Summary Reports will be produced within 3 days
of a test phase completing and will provide full testing coverage, status, outstanding items (defects,
tests, etc.) and business sign-off of the test phase where required.
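
Since the reports are generated from Excel, the daily numbers can be derived directly from the execution log. The following is an illustrative sketch; the workbook name and column names are assumptions, not a prescribed format.

import pandas as pd

# Hypothetical execution log maintained by the test team.
df = pd.read_excel("test_execution_log.xlsx")

# Test execution progress: count of test cases per status.
by_status = df.groupby("Status")["Test Case ID"].count()

# Highlight open Severity 1 and Severity 2 defects.
open_sev12 = df[(df["Type"] == "Defect")
                & (df["Severity"].isin([1, 2]))
                & (df["Status"] != "Closed")]

print("Execution status counts:\n", by_status)
print("Open Severity 1/2 defects:", len(open_sev12))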
4.18 Testing Process
Regardless of the project implementation approach and methodology used, there are common
fundamentals to testing. Detailed throughout this document are guidelines for initiating, planning,
executing, monitoring, controlling, and closing project testing.

[Figure: the testing process spans five phases. Initiate: assess the customer's testing process,
methodology and toolsets; identify SOW testing services and toolsets; harmonize testing toolsets.
Plan: test strategy and approach, test plan, test cases, test schedule, test scripts, and the project
team training plan. Execute: train the project team, perform testing activities, manage defect
issues to closure, and collect testing data and report progress and performance. Monitoring &
Controlling: compare actual testing performance against the test plan, assess whether corrective or
preventive actions are necessary, and provide metrics to support progress and performance reporting.
Close: customer acceptance and sign-off.]
Figure 10. Testing Process

Initiate test planning at the beginning of the program:
▪ Assess the existing testing process, methodology, and testing tools used by the customer.
▪ Identify testing services and testing toolsets that are part of the program or project Statement of
Work (SOW).
o When applicable, contact the Testing Services team to assist with planning and execution
of testing services and installation of testing tools.
▪ Inventory and consolidate testing processes, methodology, and toolsets to harmonize the
project testing strategy and approach and test plan during the Blueprint Phase.
▪ These activities should occur while the overall SAP implementation is in the Project Prep phase.

Plan a project testing strategy and approach regardless of the technology or toolsets used to support
manual testing.
▪ Create the testing strategy and approach, outlining the key elements of the testing methodology
that the project will use as the "guiding principles" and common framework for testing.
▪ Create the test plan to document testing scope, test cycles, timeline, entrance criteria, exit criteria,
test management, test documentation, testing roles and responsibilities, project system
environments to be used, and completion criteria that will be used by the project.
▪ Develop the testing schedule.
▪ Design and develop testing deliverables for manual testing.
▪ A test case contains the detailed step-by-step instructions and criteria for completing a test
(functional) to support manual testing.
▪ Finalize testing tools for the project; set up and install them.
▪ Jira can support quality assurance and testing regardless of the application technology.
▪ Plan and develop project team training for testing processes, methodology, and deliverables.

Execute test activities to complete testing deliverables and accomplish program or project
requirements.
▪ Project team training on the testing strategy and approach, test plan, test deliverables and test
tools.
▪ Execute test cases (manual tests).
▪ Create and execute test scripts.
▪ Resolve defects.
▪ Collect testing data; report progress and performance.

Monitoring and controlling of testing activities consists of observing testing performance so that
potential defects can be identified in a timely manner and corrective action can be taken, along with
controlling changes through an integrated change control process.

Close testing activities and deliverables with formal customer acceptance.

4.19 Testing Environments


For every project there should be an exact listing of system environments that will be used by type of
testing. An example for such a list is shown below.

Project System Environment

Test Iteration or Cycle | Development Environment (DEV) | Quality Assurance Environment (QA) | Production Environment (PROD)
Functional Testing:
Unit Testing for Baseline Configuration | Yes | No | No
Business Process (String) Testing | No | Yes | No
Unit Testing for Final Configuration | Yes | No | No
System Integration Testing | No | Yes | No
User Acceptance Testing | No | Yes | No
Other Non-Functional Testing:
Data Conversion Testing | No | Yes | No
IAG (Identity Access Governance) Testing | No | Yes | No
Regression Testing | No | Yes | No
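
A listing like this can also be encoded as configuration so that test tooling can refuse to run a phase against the wrong system. The following sketch is illustrative under that assumption; the helper and phase names are not a mandated tool.

# Environment per test phase, mirroring the matrix above.
PHASE_ENVIRONMENTS = {
    "Unit Testing": "DEV",
    "Business Process (String) Testing": "QA",
    "System Integration Testing": "QA",
    "User Acceptance Testing": "QA",
    "Data Conversion Testing": "QA",
    "IAG Testing": "QA",
    "Regression Testing": "QA",
}

def check_environment(phase: str, current_env: str) -> None:
    """Raise if a test phase is about to run in the wrong environment."""
    expected = PHASE_ENVIRONMENTS.get(phase)
    if expected != current_env:
        raise RuntimeError(f"{phase} must run in {expected}, not {current_env}")

check_environment("System Integration Testing", "QA")  # passes silently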

4.20 Schedule
Please refer to Program Schedule document for updated timelines: OneOoredoo (sharepoint.com)
Note: SuccessFactors Regression Testing in SIT Cycle 1: Talent essentials are closely connected with the
modules going live in Wave 1. A round of testing will be performed to ensure that there is no impact on
the functionality in production.
4.21 Testing Tools
Manual testing is the most common form of testing used on projects. A test case is used to execute
manual testing and continues to be used to perform Unit Testing, Business Process (String) Testing, and
Scenario (Integration) Testing on projects. The following further describes manual testing:

Manual Testing:
▪ A test case contains the detailed step-by-step instructions and criteria for completing a test
(functional) to support manual testing.
▪ After the creation of the test case, the actual testing is performed manually by a project team
member.
▪ The results of each test case are recorded manually.
▪ No test automation tools are being used.
▪ Test management tools can be deployed for test administration and test organization.
▪ Test defects are tracked and monitored manually.
For each project, an exact list of the test tools that will be used should be defined in the project-specific
test strategy.

5 Training
Training for testing processes and methodology will be planned, developed and delivered as part of the
project team training to ensure team members involved in testing cycles have the knowledge and skills
required to perform testing.

6 Defect Management
A defect is a test problem or error that must be corrected. Defects will be written for solutions, legacy
applications, and manual steps that are part of the testing scope of the program or project. A defect is
formally documented in a program or tool for managing test defects. Defects can become project issues
if the solution does not meet the business or technical requirements of the customer.

For each project, the defect management tool that will be used for tracking and monitoring testing
defects during project implementation should be defined. The following represents the general testing
process during a testing cycle when a defect is found:

[Figure: general defect flow. A test case is created and assigned to a tester; testing starts. If a test
fails, the test team logs a defect and categorizes it by type and severity. The defect then goes through
triage: duplicates are closed; items that are not defects are closed or, if they represent a new
requirement, routed via the SPOC/SI/Ooredoo PMO as a change request linked to the original defect.
Valid defects are assigned by the test team/test lead to a developer (or workstream configuration
lead/functional consultant), fixed and tested in the development environment, then assigned back to
the test team for retest in the test environment. Defects that pass retest are closed; failed retests are
re-opened by the test team.]
Figure 11. Illustration of General Testing Process

The tester assigned to a test case or test script is responsible for entering the defect into the Defect
Repository by specifying the following (a data-structure sketch follows the list):
▪ Defect ID
▪ Status
▪ Severity is the prioritization of the defect:
o Severity 1 (1-Critical) – serious errors that prevent or stop testing of a particular function,
or serious data type errors (i.e., system locks up).
o Severity 2 (2-High) – serious or incorrect functionality errors, incorrect data, or significant
load problems that may make the application unusable (i.e., login takes over 5 minutes;
a query takes 10 minutes, etc.).
o Severity 3 (3-Medium) – defects that do not prevent or hinder the functionality or load
of the system (i.e., an incomplete phone number string is returned).
o Severity 4 (4-Low) – defects that do not prevent or hinder functionality of the system,
normally confined to the user interface (i.e., a misspelled word).
o Note: when a tester identifies a potential Severity One (1) or Two (2) defect, the tester will
contact the test lead immediately to implement the appropriate corrective action.
▪ Summary (brief description of the defect)
▪ Project (cycle of testing)
▪ Project Area (i.e., Functional/Technical Team)
▪ Description (detailed description of the defect)
▪ R&D Comments (update the history associated with a defect)
▪ Attachments (i.e., pictures)
▪ Detected On Date
▪ Close Date
▪ Defect Type – use the appropriate value to categorize the defect:
o Code: defect is within the code.
o Data: bad data or data errors.
o Enhancement: used with a Change Control Procedure process to request a change.
o Other: use only once you receive approval from the Test Lead.
o Specification: requirements are not correct.
o UI/Cosmetic: UI error (i.e., a misspelled label).
o Usability: unable to perform a task.
▪ Defect Status – use the appropriate value to define the tracking status of a defect:
o Closed: the identified test case with a defect passes all applicable regression tests.
o Duplicate-Closed: the Test Lead sets this status if a test step/process is determined to
be no longer valid.
o Fixed: the developer corrected and unit tested the defect, and it is ready for retest.
o Deferred: further investigation is required to identify whether this is a true defect.
o Fixed Pending Build: the defect is fixed and the testing SRF is pending the update.
o New: the defect was entered into the system and has not yet been validated by the Test
Lead for assignment.
o Open: the defect is not assigned to anyone to fix.
o Re-open: the defect is found again after it was closed.
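
As an illustration, these fields can be captured in a simple record structure. The sketch below is a hypothetical representation of the defect record, not the schema of the actual defect repository.

from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class Defect:
    defect_id: str
    status: str                  # New, Open, Fixed, Closed, Re-open, ...
    severity: int                # 1-Critical .. 4-Low
    summary: str
    project: str                 # cycle of testing, e.g. "SIT 3"
    project_area: str            # functional/technical team
    description: str
    defect_type: str             # Code, Data, Enhancement, Specification, ...
    detected_on: date
    rd_comments: list = field(default_factory=list)
    attachments: list = field(default_factory=list)
    close_date: Optional[date] = None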

Figure 12. Severity 1 & 2 Defects escalation process to remedy test defect

The development/configuration team is responsible for updating the defect information in the Defect
Repository by specifying the following:
▪ Assigned To (developer's name)
▪ Status
▪ Assigned Date
▪ Closing Date
▪ R&D Comments (what action was taken to correct the defect)

The Integration Lead or Test Lead is responsible for periodically checking the Defect Repository to
validate when a defect is fixed and ready for retest, and for escalating when Severity 1 or 2 defects are
impacting the test schedule. Once a defect is successfully tested (and relevant regression testing has
taken place), the tester will update the status in the Defect Repository.

6.1 Defect resolution timelines

Sev1 – Very High (resolution goal: 1 day)
▪ The defect is in a critical application and there is no manual workaround.
▪ The defect would cause the business to be halted if it is not corrected.
▪ Multiple users and/or departments could be affected if the defect is not corrected.
▪ The defect is holding up a significant number of test cases from being executed.

Sev2 – High (resolution goal: 2-4 days)
▪ The defect is in a critical application, but there is a manual workaround.
▪ Multiple users and/or departments are affected, but there is a manual workaround.
▪ A critical application would be partially unavailable, and users' productivity would be impacted if
the defect is not corrected.

Sev3 – Medium (resolution goal: 4-10 days)
▪ Defects have a low impact on critical business functions if the defect is not corrected.
▪ The end user is still able to perform his/her job function.
▪ Multiple users are affected, but a workaround exists.

Sev4 – Low (resolution goal: prior to end of SIT cycle)
▪ The defect is cosmetic and causes no loss of functionality to the application.
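
These resolution goals can be applied mechanically when reporting defect aging. The following sketch maps the table above to an overdue check; the helper is illustrative, and Sev4 is judged against the SIT cycle end date rather than a fixed number of days.

from datetime import date, timedelta

# Resolution goals in days, per the table above (Sev2 uses its upper bound).
RESOLUTION_GOAL_DAYS = {1: 1, 2: 4, 3: 10}

def is_overdue(severity: int, detected_on: date, today: date) -> bool:
    goal = RESOLUTION_GOAL_DAYS.get(severity)
    if goal is None:
        return False  # Sev4: compare against the SIT cycle end date instead
    return today > detected_on + timedelta(days=goal)

print(is_overdue(1, date(2023, 5, 1), date(2023, 5, 3)))  # True: Sev1 past 1 day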

7 Testing Roles & Responsibilities


An Integration Lead or Test Lead (TL) is responsible for managing the test team. A TL has thorough
knowledge of functional testing using structured testing techniques and their application across
differing projects. Additional responsibilities and characteristics include the following:
▪ Develops test strategy, assessments and test plans.
▪ Oversees that projects are successful and on schedule.
▪ Works daily with customers and acts as the Point of Contact (PoC) for escalations.
▪ Provides direction and support for the test team.
▪ Test Management includes tracking and reporting of testing progress, defects, risks & issues.
▪ Team Leadership
▪ Report test results, status reviews & lead defect management process
▪ Identification and escalation of issues and risks

A Test Architect (TA) provides technical guidance and has thorough knowledge of testing using
structured testing techniques. Responsibilities and characteristics include the following:
▪ Team leadership.
▪ Communicates with the business to obtain more detail for the individual business processes.
▪ Ensures that the exact business process is defined at the step level.
▪ Assists in identifying system requirements.
▪ Creates and manages test scripts (virtual project teams and/or line responsibility).
▪ Point of contact for upper management and project management when there is no Test
Manager.
▪ Represent test team in periodic review meetings (defect review, build, etc.)
▪ Participate in definition and development of test plans, test cases and participate in manual
functional testing.
▪ Review team’s results, defects and test cases to ensure adherence to established testing
standards.

Project Team members are responsible for testing and execution.


▪ Create test scripts using Word and Excel.
▪ Define test execution scenarios.
▪ Execute manual tests, report issues and defects.
▪ Establish, conduct and control testing scenarios and predictive outcomes.
▪ Report project status to the Test Lead
▪ Track metrics on defects, test results, etc.
▪ Comply with Change Management requirements
▪ Record test results
▪ Provides direction & requirements during test assessments and scoping

8 Assumptions, Dependencies and Risks


S.no | Description
1 | The Jira tool is built as per the workflow/process diagram shared with Algeria.
2 | Jira licenses for UAT users are taken care of by Ooredoo.
3 | UAT testers are well trained and have knowledge of the SAP processes of their OpCo.
4 | There will be a testing lead from each OpCo to coordinate with the testers of that OpCo.
5 | UAT testers will dedicate sufficient time from their BAU for testing.
6 | If testers are going on leave, they should nominate a backup.
7 | The testing environment is ready in terms of configuration, integration, and data load.
