
TESTING STRATEGY

PROJECT IDENTIFICATION

Project Name | CPI/Project Number | Project Type (Business Consulting, Implementation, Upgrade, Internal, Other)

Customer Name | Customer Number | Planned Start/Finish

Project Sponsor | Program Manager | Project Manager (Customer)

Project Manager (SAP) | SAP Service Partner(s) | Project Manager (Service Partner)

Table of contents

Introduction
Purpose
Scope
Data Migration Business Object Scope
Data Migration Process Definition
Migration Solution Steps
Test Cycles Definition
Test Cycle Definitions and Descriptions
Roles and Responsibilities
Project Roles
Project Responsibilities
Data Migration Testing Strategy
Project Role Assignments
Unit Testing
System Testing
Integration Testing
User Acceptance Testing
Cutover / Performance Testing
Introduction
Purpose

The purpose of this Testing Strategy document is to define the strategy used to establish an effective testing approach for the legacy data migration solution. The actual migration solution will vary among projects due to the deployment and availability of software tools, but the need for a structured testing approach remains constant regardless of the level of automation. The migration test strategy forms part of, and is an input to, the project-level test planning and execution process.

This document provides the framework for the detailed planning of the tasks required to validate the legacy data migration solution. The objectives are:

- To define the test cycles for the data migration work stream and their purpose
- To describe the dependencies and integration with the project-level test strategy
- To define the roles and responsibilities required to successfully test and validate the data migration solution
- To identify the steps within the migration process and determine specific test objectives
- To provide a basis for the test planning for each of the data migration test cycles
- To ensure the data migration solution meets the documented functional and technical requirements

Scope

For the purposes of this document, it is assumed that the project will contain the following test cycles:

- Unit Testing
- System Testing
- Integration Testing
- User Acceptance Testing
- Cutover / Performance Testing

The scope of this document is the definition of the strategy required to test the migration
process, which is defined by the movement of data from a legacy source system to the SAP
Application. It is not intended to define an approach or strategy for the testing of application
master or transactional data. It is assumed that once the data is loaded into the SAP
Application, the testing strategy and test scenarios developed for the master and transactional
data within the Testing work stream will be utilized to validate application functionality and
business scenario requirements.

Copyright/Trademark
Data Migration Business Object Scope

The following table lists the Business Objects that will be migrated as part of the SAP
Application implementation.

ID | SAP Business Object | Source Application | Source Table / File | Manual / Automated | Data Volume
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
9 |
10 |

Data Migration Process Definition


Migration Solution Steps

Regardless of the actual automated migration solution, the following steps are performed to support the movement of data from a legacy source system to the SAP Application:

1. Data Extraction – extraction of a specific data set from the source application or system to a specified format
2. Data Transformation – the validation, harmonization and enrichment of legacy data based on specified business and validation rules
3. Generation of Load Ready Data – the generation of load-ready data in a specified format
4. Loading of Data into the SAP Application – the loading of load-ready data into the SAP Application using standard SAP load utilities such as LSMW
5. Validation of Migrated Data – the validation and reconciliation of migrated data to ensure data completeness and usability
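The five steps above can be sketched end to end. The following is a minimal, hypothetical illustration only: the field names (CUSTOMER_ID, COUNTRY), the pipe-delimited load format and the rules are assumptions, and the load step is a stand-in for a standard SAP utility such as LSMW.

```python
# Hypothetical sketch of the five migration steps on in-memory records.
# Field names, formats and rules are illustrative assumptions.

def extract(source_rows):
    """Step 1: Data Extraction - pull a data set into a common format."""
    return [dict(r) for r in source_rows]

def transform(records):
    """Step 2: Data Transformation - validate, harmonize and enrich."""
    clean, rejected = [], []
    for r in records:
        # Example harmonization rule: normalize the country code.
        r = dict(r, COUNTRY=r.get("COUNTRY", "").strip().upper())
        # Example validation rule: the key field must be populated.
        (clean if r.get("CUSTOMER_ID") else rejected).append(r)
    return clean, rejected

def generate_load_file(records, fields):
    """Step 3: Generation of Load Ready Data - emit the load format."""
    lines = ["|".join(fields)]
    lines += ["|".join(str(r.get(f, "")) for f in fields) for r in records]
    return "\n".join(lines)

def load(load_file):
    """Step 4: Loading - stand-in for an SAP load utility such as LSMW."""
    header, *rows = load_file.splitlines()
    return len(rows)  # number of records loaded

def validate(source_count, rejected_count, loaded_count):
    """Step 5: Validation - reconcile counts for completeness."""
    return source_count == rejected_count + loaded_count

legacy = [{"CUSTOMER_ID": "100", "COUNTRY": " de "},
          {"CUSTOMER_ID": "", "COUNTRY": "US"}]
clean, rejected = transform(extract(legacy))
loaded = load(generate_load_file(clean, ["CUSTOMER_ID", "COUNTRY"]))
assert validate(len(legacy), len(rejected), loaded)
```

In a real project each step is a separate program or tool run; the value of the sketch is the reconciliation contract between steps, which the test cycles below repeatedly exercise.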

Test Cycles Definition


Test Cycle Definitions and Descriptions

This document assumes that the test cycles listed below are included in the overall testing strategy. The migration test strategy may need to be adjusted if the implementation testing strategy differs. The table below provides a description of each test cycle and its dependency on the overall project.

Test Cycle: Unit
Dependency: Self-contained within the Migration track
Purpose: Test the individual migration programs and processes.

Test Cycle: System
Dependency: Self-contained within the Migration track
Purpose: Test the complete set of migration programs and processes, along with data quality and usability.

Test Cycle: Integration
Dependency: Integrated with project-level Integration testing
Purpose: Test the complete set of migration programs and processes, along with execution times, data quality and usability. Populate the SAP Application Test environment to support project-level integration testing.

Test Cycle: User Acceptance
Dependency: Integrated with project-level User Acceptance testing
Purpose: Test the complete set of migration programs and processes, along with execution times, data quality and usability. Populate the SAP Application Test environment to support project-level user acceptance testing.

Test Cycle: Cutover / Performance
Dependency: Integrated with project-level Cutover / Performance testing
Purpose: Simulate the full production load to capture execution times, tune the process and confirm execution sequences.

Roles and Responsibilities

This section defines the project roles required to support the testing of the data migration process and identifies the project members who will be responsible for fulfilling these roles.

Project Roles

The table below contains recommended project roles and descriptions. This information should be adjusted to align with the individual project structure and requirements.

1. Migration Lead (Consulting): Responsible for the planning and execution of the migration track. Works directly with the Testing Track Lead to coordinate testing schedules.
2. Migration Developer (Consulting): Responsible for the design, development and testing of migration programs and processes.
3. Report Developer (Consulting): Responsible for the design, development and testing of quality reports for monitoring test results.
4. Technical Architect (Consulting): Responsible for establishing the technical platform for the migration solution, if a software solution is being deployed.
5. Functional Expert (Consulting): Responsible for providing SAP business object and application expertise for the design and testing of the migration solution.
6. Data Owner (Client): The owner of the data from the business side who understands the use and business rules associated with the data from a business perspective.
7. Business User (Client): The business user of the legacy application where the data is being used.
8. Legacy Application Administrator (Client): The administrator of the legacy application where the legacy data resides. Responsible for extracting and mapping the legacy data.
9. Legacy Data Analyst (Client): The technical or business analyst who understands the underlying business rules for the business object and data elements from an application or technical perspective. Responsible for mapping the legacy data to the target SAP application.
10. Testing Track Lead (Consulting / Client): The project-level lead for the testing track.
11. Cutover Lead (Consulting): The project-level lead for the cutover track.

Project Responsibilities

The table below maps the project roles to the individuals assigned to those roles for each business object or migration process.

ID | Business Object | Project Role | Responsible Resource

- | n/a | Migration Lead |
- | n/a | Testing Track Lead |
- | n/a | Cutover Track Lead |
- | n/a | Technical Architect |

1 | <Business Object #1> | Migration Developer |
  |                      | Report Developer |
  |                      | Functional Expert |
  |                      | Data Owner |
  |                      | Business User |
  |                      | Legacy Application Administrator |
  |                      | Legacy Data Analyst |

2 | <Business Object #2> | Migration Developer |
  |                      | Report Developer |
  |                      | Functional Expert |
  |                      | Data Owner |
  |                      | Business User |
  |                      | Legacy Application Administrator |
  |                      | Legacy Data Analyst |

Data Migration Testing Strategy

This section describes the approach, goals and objectives for each of the test cycles. As the project progresses, the migration solution matures with each test cycle, and the focus of testing shifts from the underlying solution to the quality and availability of migrated legacy data. The migration team needs to align the testing tasks, goals and objectives to the overall project requirements to ensure an efficient solution that satisfies the project-level data requirements and milestones.

Project Role Assignments

The table below contains recommendations on the assignment and allocation of project roles
across the various test cycles.

ID | Project Role | Unit | System | Integration | User Acceptance | Cutover / Performance
1 | Migration Lead | Yes | Yes | Yes | Yes | Yes
2 | Migration Developer | Yes | Yes | Yes | Yes | Yes
3 | Report Developer | Yes | Yes | No | No | No
4 | Technical Architect | Yes | Yes | Yes | Yes | Yes
5 | Functional Expert | No | Yes | Yes | Yes | No
6 | Data Owner | No | No | Yes | Yes | No
7 | Business User | No | No | Yes | Yes | No
8 | Legacy Application Administrator | Yes | Yes | Yes | Yes | Yes
9 | Legacy Data Analyst | Yes | Yes | Yes | Yes | Yes
10 | Testing Track Lead | No | No | Yes | Yes | Yes
11 | Cutover Track Lead | No | No | No | No | Yes

Unit Testing

The Unit Test cycle is self-contained within the migration track and is designed to test and validate the underlying migration platform, programs and processes. There is no dependency on the main project and no requirement to deliver legacy data to other project tracks from this test cycle.

1. Platform, Programs and Processes
   Objective:
   - Validate connectivity of the platform
   - Validate file format and data content of legacy extracts
   - Exercise validation and business rules in the migration programs and processes
   - Validate file format and data content of load-ready data
   - Validate SAP load utilities
   Strategy: Develop unit test plans that describe the execution steps and results. Plans are designed to validate the design specifications for the programs and processes and the underlying platform.

2. Data Requirements
   Objective:
   - No need for production-level data volumes
   - For extracts, data necessary to validate format and content
   - For programs and processes, data necessary to validate and exercise business rules
   - For load-ready data, data necessary to validate format and content and to execute load utilities
   Strategy: Different resources will focus on the various migration solution components. Each resource is responsible for managing the data requirements to satisfy their test plan.

3. Application Environment and Execution Plan
   Objective:
   - Perform in the Migration environment
   - Tests can be independent of each other
   Strategy: Establish a separate application environment dedicated to the migration track to eliminate dependencies on other project tracks.

4. Application Business Object Unit Testing: Not required during this test cycle.

5. Application Business Scenario Testing: Not required during this test cycle.

7. Timing Statistics Tracking: Not required during this test cycle.

8. Data Quality Tracking: Not required during this test cycle.

9. Test Results Tracking
   Objective: Track results at the migration program and process component level for each migration flow.
   Strategy: Establish a spreadsheet or project plan to track tasks. Issue tracking owned by the individual migration team member.

10. Success Criteria
    Objective: Successful execution of each migration program and process component based on the design specifications.
    Strategy: Signoff of individual test plans or completion of individual testing tasks on the tracking sheet or plan.
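A unit test plan of the kind described above can be expressed directly as assertions against a single program component. The sketch below is illustrative only: harmonize_country and its expected values are hypothetical stand-ins for a rule taken from a design specification.

```python
# Hypothetical unit-test sketch for one transformation rule; the rule and the
# expected values are illustrative assumptions, not a prescribed design.

def harmonize_country(code):
    """Business rule under test: trim whitespace and upper-case legacy country codes."""
    return code.strip().upper()

def test_trims_and_uppercases():
    # A legacy value with padding and lower case is normalized.
    assert harmonize_country(" de ") == "DE"

def test_clean_value_unchanged():
    # An already-clean value passes through untouched.
    assert harmonize_country("US") == "US"

# A unit test plan would enumerate such cases for each program and
# process component, one plan per migration flow component.
test_trims_and_uppercases()
test_clean_value_unchanged()
```

The same pattern scales to the extract, load-ready generation and load utility components: each unit test plan names the component, the input data, the execution steps and the expected result.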

System Testing

The System Test cycle is self contained within the migration track. It is designed to expand
on the Unit test by directing focus on the execution sequence, execution times and the quality
of legacy data. There is no dependency to the main project or requirement to deliver legacy
data to other project tracks from this test cycle.

1. Platform, Programs and Processes
   Objective:
   - Validate connectivity of the platform
   - Validate platform sizing requirements
   - Validate file format and data content of legacy extracts
   - Exercise validation and business rules in the migration programs and processes
   - Validate file format and data content of load-ready data
   - Validate SAP load utilities
   Strategy: Same as the Unit test. The data volumes will be larger, so the underlying solution should continue to be tested.

2. Data Requirements
   Objective:
   - Close to production-level data volumes are required for capturing timing statistics
   - The data set should contain most data anomalies to properly exercise the validation and business rules
   Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

3. Application Environment and Execution Plan
   Objective:
   - Perform in the Migration environment
   - The migration flows need to be executed in the required sequence
   Strategy: Establish a separate application environment dedicated to the migration track to eliminate dependencies on other project tracks. The execution sequence needs to be validated and confirmed to account for interdependencies.

4. Application Business Object Unit Testing
   Objective: Business object unit testing to be performed once the data is loaded to validate usability.
   Strategy: Utilize the business object test plans developed within the Test track for this testing.

5. Application Business Scenario Testing
   Objective: Business scenario testing to be performed once the data is loaded to validate usability.
   Strategy: Utilize the business scenario test plans developed within the Test track for this testing.

7. Timing Statistics Tracking
   Objective: Capture the execution times for each execution step within the migration flow for each business object.
   Strategy: Create a spreadsheet to capture the execution times by execution run.

8. Data Quality Tracking
   Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
   Strategy: Create a spreadsheet to capture the record-count results by execution run. Capture data issues in an issues log.

9. Test Results Tracking
   Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
   Strategy: Establish a spreadsheet or project plan to track metrics at the business object flow level. Issue tracking owned by the migration track lead.

10. Success Criteria
    Objective: Successful execution of the business object migration flow, a high percentage of quality data and functional usability of the migrated data.
    Strategy: Signoff on completion of individual business object migration flows.
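The record-count tracking described above reduces to a simple reconciliation per execution run. The sketch below is a hypothetical illustration: the field names and the completeness check (source = rejected + loaded) are assumptions a project would adapt to its own tracking sheet, which serves the same purpose.

```python
# Hypothetical data quality summary for one execution run of one business
# object; field names are illustrative assumptions only.

def quality_summary(run):
    """Reduce a run's record counts to the key tracking metrics."""
    source = run["source_count"]
    rejected = run["rejected_count"]
    loaded = run["loaded_count"]
    return {
        "business_object": run["business_object"],
        # Percentage of source records that reached the SAP Application.
        "load_pct": round(100.0 * loaded / source, 1) if source else 0.0,
        # Completeness check: every source record is either loaded or rejected.
        "reconciled": source == rejected + loaded,
    }

run = {"business_object": "Customer Master", "source_count": 1000,
       "rejected_count": 25, "loaded_count": 975}
summary = quality_summary(run)
# summary: {"business_object": "Customer Master", "load_pct": 97.5, "reconciled": True}
```

A run that fails the reconciliation check, or whose load percentage falls below the project's quality threshold, is logged in the issues log for the migration track lead.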

Integration Testing

The planning and execution of the Integration Test cycle is the responsibility of the Testing
Track Lead. The migration team is responsible for providing migrated legacy data to support
the end-to-end testing. By this test cycle, the migration solution and execution sequence
needs to be stable, so the focus continues to be on the execution times and the quality of
legacy data.

1. Platform, Programs and Processes
   Objective:
   - Assume a stable environment
   - Monitor throughput performance
   Strategy: React to issues encountered with migration programs and processes; no specific testing performed. Analyze and address performance bottlenecks.

2. Data Requirements
   Objective:
   - Close to production-level data volumes are required for capturing timing statistics
   - The data set should contain most data anomalies to properly exercise the validation and business rules
   Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

3. Application Environment and Execution Plan
   Objective:
   - Perform in the Test environment
   - The migration flows need to be executed in the required sequence
   Strategy: Execute the migration process in the Test environment.

4. Application Business Object Unit Testing
   Objective: Business object unit testing to be performed once the data is loaded to validate usability.
   Strategy: Testing Team responsible for the execution of the test scripts.

5. Application Business Scenario Testing
   Objective: Business scenario testing to be performed once the data is loaded to validate usability.
   Strategy: Testing Team responsible for the execution of the test scripts.

7. Timing Statistics Tracking
   Objective: Capture the execution times for each execution step within the migration flow for each business object.
   Strategy: Capture the execution times in the tracking sheet created during the System test cycle.

8. Data Quality Tracking
   Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
   Strategy: Capture the record-count results in the tracking sheet created during the System test cycle.

9. Test Results Tracking
   Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
   Strategy: Project plan needs to track metrics at the business object flow level. Issue tracking owned by the Testing Track Lead.

10. Success Criteria
    Objective: Successful execution of the business object migration flow, a high percentage of quality data and functional usability of the migrated data.
    Strategy: Testing Team responsible for signoffs and approvals.

User Acceptance Testing

The planning and execution of the User Acceptance Test cycle is the responsibility of the
Testing Track Lead. The migration team is responsible for providing migrated legacy data to
support the end-to-end testing. By this test cycle, the migration solution and execution
sequence needs to be stable, so the focus continues to be on the execution times and the
quality of legacy data.

1. Platform, Programs and Processes
   Objective:
   - Assume a stable environment
   - Monitor throughput performance
   Strategy: React to issues encountered with migration programs and processes; no specific testing performed. Analyze and address performance bottlenecks.

2. Data Requirements
   Objective:
   - Production-level data volumes are required for capturing timing statistics
   - The data set should contain most data anomalies to properly exercise the validation and business rules
   Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

3. Application Environment and Execution Plan
   Objective:
   - Perform in the Test environment
   - The migration flows need to be executed in the required sequence
   Strategy: Execute the migration process in the Test environment.

4. Application Business Object Unit Testing
   Objective: Business object unit testing to be performed once the data is loaded to validate usability.
   Strategy: Testing Team responsible for the execution of the test scripts.

5. Application Business Scenario Testing
   Objective: Business scenario testing to be performed once the data is loaded to validate usability.
   Strategy: Testing Team responsible for the execution of the test scripts.

7. Timing Statistics Tracking
   Objective: Capture the execution times for each execution step within the migration flow for each business object.
   Strategy: Capture the execution times in the tracking sheet created during the System test cycle.

8. Data Quality Tracking
   Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
   Strategy: Capture the record-count results in the tracking sheet created during the System test cycle.

9. Test Results Tracking
   Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
   Strategy: Project plan needs to track metrics at the business object flow level. Issue tracking owned by the Testing Track Lead.

10. Success Criteria
    Objective: Successful execution of the business object migration flow, a high percentage of quality data, functional usability of the migrated data and close-to-acceptable execution time for the migration process.
    Strategy: Testing Team responsible for signoffs and approvals.

Cutover / Performance Testing

The planning and execution of the Cutover / Performance Test cycle is the responsibility of
the Cutover Track Lead. The test is designed to simulate the production cutover process, so
the focus for the migration team is on execution times and data quality. The migration team is
responsible for executing the complete migration process with production volumes and
address performance and data quality issues.

1. Platform, Programs and Processes
   Objective:
   - Assume a stable environment
   - Monitor throughput performance
   Strategy: React to issues encountered with migration programs and processes; no specific testing performed. Analyze and address performance bottlenecks.

2. Data Requirements
   Objective: A full production data set is required for capturing timing statistics and validating data quality.
   Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

3. Application Environment and Execution Plan
   Objective:
   - Perform in the Cutover environment
   - The migration flows need to be executed in the required sequence
   Strategy: Execute the migration process in the Cutover environment.

4. Application Business Object Unit Testing: Project dependent.

5. Application Business Scenario Testing: Project dependent.

7. Timing Statistics Tracking
   Objective: Capture the execution times for each execution step within the migration flow for each business object.
   Strategy: Capture the execution times in the tracking sheet created during the System test cycle.

8. Data Quality Tracking
   Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
   Strategy: Capture the record-count results in the tracking sheet created during the System test cycle.

9. Test Results Tracking
   Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
   Strategy: Project plan needs to track metrics at the business object flow level. Issue tracking owned by the Cutover Track Lead.

10. Success Criteria
    Objective: Successful execution of the business object migration flow, a high percentage of quality data, functional usability of the migrated data and acceptable execution time for the migration process.
    Strategy: Cutover Team responsible for signoffs and approvals.
