SAP BW Project Plan Methodology
Project
Methodology
for SAP BW
The text throughout the document in red indicates SAP content. © 2002 SAP AG. All rights reserved. SAP, the SAP
logo, R/3, Accelerated SAP and ASAP are trademarks of SAP AG.
Table of Contents
INTRODUCTION.........................................................................................................................................................4
PROJECT PREPARATION........................................................................................................................................7
A  Project Start-Up............................................................................................................................................10
B  Data Warehousing Project Abstract.............................................................................................................24
C  Change Integration.......................................................................................................................................47
D  Prototyping...................................................................................................................................................53
E  Technology Planning, Support and Cutover.................................................................................................65
F  Project Preparation Checkpoint.................................................................................................................101
BUSINESS BLUEPRINT.........................................................................................................................................110
G  Analysis and Decision Process Definitions
H  Analytical Processing Data Definition
I  Definition Checkpoint
J  Data Transformation System Design
K  Presentation System Design
L  Data Design
M  User Documentation
N  Training
O  Acceptance Testing
P  Design Checkpoint
REALIZATION STAGE..........................................................................................................................................260
Q  Custom Extract System Construction
R  BW Administrators Workbench Construction
S  BW Business Explorer Construction
T
U  Realization Checkpoint
V  System Implementation...............................................................................................................................360
W  Production Support.....................................................................................................................................369
X  Post-Implementation Review......................................................................................................................375
Introduction
Ready access to information about an organization's business, products and customers is a key element
in supporting decision making. When searching for information to support business decisions in today's
dynamic global business environment, many business users find that the traditional sources of data - transaction-based systems - are inadequate in content, accessibility, form, performance and availability.
The problem often lies not with the data or its source, but in the limitations of current technology to bring
together information from many disparate systems. Increasingly, the solution to these problems is data
warehousing.
A data warehouse (DW) can be defined as an orderly and accessible repository of known facts and
related data that is used as a basis for making better management decisions. The DW provides a unified
repository of consistent data for decision making that is subject-oriented, integrated, time-variant and nonvolatile. The data can also be characterized as accessible, transformed and management-oriented.
Data warehousing provides a multidimensional view of an organization's operational data that is designed
to be accessible and valuable to decision-makers. Those users that benefit most from better data access
and analysis capabilities are likely to be executives, analysts, knowledge workers and front-line staff.
Data warehousing is a concept that also implies the existence of an underlying discipline, structure and
organization, which ensures that business users will be able to find the correct information when they
need it. Technologies that support this concept allow organizations to extract and store information in a
specialized database that can be accessed by flexible, intuitive tools.
Today, data warehousing is considered the most effective way to transform "data" into "information". This
information is increasing in importance, as organizations need to adapt continually to changes resulting
from competitive pressures, shrinking business cycles, a global market and a transforming business
environment.
The value of data warehousing lies in its ability to help users make more informed, faster decisions. Users
can make quicker and more accurate decisions through the analysis of the key trends and events that
affect business. As a result, users spend less time finding and accumulating data and more time
analyzing relevant information and working to implement solutions.
DWs are often built starting with a few of the most valuable data areas and required infrastructure.
Additional and more complex data, infrastructure and tools are often added over time as users learn more
about their requirements and then identify additional data needs.
As an organization's data increases in both volume and complexity, the DW will become an even more
critical repository of timely and accurate data for decision making. DWs can also address the data
overload problem experienced by many executives.
Analytical Processing Versus Transaction Processing
Information systems traditionally have been developed around functional requirements associated with
the day-to-day running of the business. Separate operational systems accept orders, schedule
installations and service, generate bills, do the accounting and process the payroll.
Operational systems generally capture and generate only the information necessary to process
transactions and manage the clerical aspects of their respective functional areas. Sometimes
management reporting components of these systems may summarize the results of transaction
processing for analytical purposes. However, the management reporting components of most operational
systems are secondary afterthoughts to the transaction processing functions and often relate only to the
narrow functional area within the scope of a particular system. For all the investments that organizations
have made in their information systems, the world remains a "data rich but information poor" place.
The concept of a DW has been developed to focus on analytical processing. It integrates data from
various internal transaction processing systems and external data sources to meet The firm's various
analytical processing information needs. In addition, a DW can also help to improve the performance of
transaction processing systems by removing some of the operational reporting requirements of these
systems.
Objectives of the Data Warehousing Methodology
The BW Project Plan Methodology presents a structured approach to planning and developing a data
warehousing system using SAP's Business Information Warehouse product. The objectives of the BW
Methodology include:
To develop a business-driven knowledge management solution that meets The firm's analytical
processing information needs, considering the characteristics and needs of the:
business environment,
organizational environment, and
technology environment;
To determine analytical processing information requirements based on an analysis of The firm's
performance measurement and decision analysis processes including the uses of quantitative
analysis methods (e.g., in data mining);
[Figure: BW Project Plan Methodology overview, mapping the project stages to CPDEP phases and phases A-X]
Project Preparation (CPDEP Phase 2): A Project Start-up; B DW Project Abstract; C Change Integration; D Prototyping
Business Blueprint (CPDEP Phase 3): G Analysis and Decision Process Definitions; H Analytical Processing Data Definition; I Definition Checkpoint; J Data Transformation System Design; K Presentation System Design; L Data Design; M User Documentation; N Training; O Acceptance Testing; P Design Checkpoint
Realization (CPDEP Phase 4): Q Custom Extract System Construction; R BW Administrators Workbench Construction; S BW Business Explorer Construction; U Realization Checkpoint
Final Preparation and Go Live & Support (CPDEP Phase 5): V System Implementation; W Production Support; X Post-Implementation Review
Project Management spans all stages.
Project Preparation
The purpose of this stage is to provide planning and preparation for the BW project. It is
concerned with understanding the business requirements or aspects of the planned DW system - "what" the business does and "what it needs to do" - to either resolve current business problems or
improve the way it does business at The firm. There is also a focus on understanding the project
environment and defining the requirements or scope for the planned data warehousing system.
Although each project will have its own unique objectives, scope and priorities, the steps in
Project Preparation will help confirm and plan the principal topics that will need to be considered.
Key Deliverables:
1) Project Charter - A clear definition of issues relating to The firm and implementation of the
BW system. This includes objectives, scope, implementation strategy, deadlines and
responsibilities. The project manager, as part of the Project Preparation stage, draws up the
Project Charter. The contents of the Project Charter include:
a) Project Scope - To ensure that there is a precise understanding of the expectations and
objectives for the project.
b) Key Assumptions
c) Critical Success Factors - Those key areas that have a specific impact on the
implementation process. Typical factors include: executive sponsorship, change
management and control, resources (appropriate, sufficient and committed), issue
resolution, user involvement, clear objectives and scope.
d) Project Team Organization - Roles and responsibilities.
e) Team Processes - Status meetings, deliverable review, issue resolution.
2) Project Abstract
a) Data Abstract - High-level overview of the key data and system relationships.
b) Process Abstract (High-level Data Flow Diagrams) - High-level understanding of the
data, process and technology of the proposed system. Provides a high-level summary of
the purpose, function and overall structure of the potential business analytical processes
to be investigated. It defines, in general terms, the functions of the proposed DW system
and provides an initial boundary for the scope of the project.
c) Source System Abstract - Provides an overview of the source systems relevant to the
scope of the project. The source systems include not just internal systems but also any
relevant external supplier systems, customer systems, syndicated data systems or
business partner systems.
d) Technology Abstract - Documents the current technology environment as well as the
proposed technology environment for the new system. The proposed environment will
often be based on the current BW environment supplemented with the planned
implementation of any new technologies or tools.
i) Include analysis of growth requirements and of the number and types of users.
ii) Where non-R/3 sources are used or where non-standard technology is employed
(e.g., a 3rd-party ETL tool or a front-end tool other than Cognos), the technology
environment analysis needs to be present.
e) Data Conversion Abstract - High-level understanding of the scope and complexity of the
different tasks necessary to create the initial data for the project.
3) Identify the Guidance Review Team
4) Identify the key stakeholders
5) High-level Project Plan - To identify the phases, tasks and steps to be used to successfully
complete the project.
6) Develop a Class 1 Estimate
Project Start-up
During Project Start-up, a clear agreement on the scope of the project between management and
the project team is obtained. All standards, techniques and tools to be used should be defined
and the organizational structure of the project as well as the project reporting mechanisms should
be confirmed and agreed. A project risk assessment is completed. A Project Charter for the
project is also established and agreed between the business users and the project team.
Data Warehousing Project Abstract
The DW Project Abstract provides a high-level summary of the purpose, function and overall
structure of the DW system to be developed. It is built on an understanding of The firm's critical
success factors (CSFs) and related performance measures and uses the CSFs to validate the
business processes and business events to be included in the project. The DW Project Abstract
consists of the following major components:
Data Abstract which provides a high-level overview of the key data entities to be included
within the scope of the DW system;
Process Abstract which identifies the potential key analytical or decision processes within
the scope of the project and may also include definition of some of the key outputs from
the proposed DW system;
Source System Abstract which provides an overview of the source systems providing
inputs to the DW system;
Technology Abstract which provides an overview of the relevant current and target
technology environments for the planned DW system; and
Data Conversion Abstract which provides an overview of the scope and complexity of the
data conversion effort for the project.
In certain cases, a technology or "proof-of-concept" pilot will be required and, where this is
appropriate, it may be completed as part of the Technology Abstract definition. Additional
deliverables:
Alternative development options are identified and evaluated.
Identification of stakeholders and GRT
A Class 1 Estimate and high-level project plan are completed as part of the evaluation in
deriving a recommended development approach.
An initial assessment of the project risk is also completed.
Cross-Life Cycle Phases
During the Project Preparation Stage, several Cross-Life Cycle Phases may be started. These
include: Change Integration, Prototyping, and Technology Planning, Support and Cutover. These
phases are briefly described in the paragraphs below.
Change Integration
Change Integration is a Cross-Life Cycle Phase that spans all stages of the DW life cycle. Its
main purpose is to ensure that all relevant business and organization issues relating to the
development and roll-out of the DW system are properly addressed. Opportunities for
performance improvement may also be identified in the course of analyzing and developing the
DW. As appropriate, separate change integration projects may be initiated or the scope of the DW
project may be expanded to address these change issues.
Prototyping
The Prototyping Phase stretches across the DW life cycle. Depending on the development
strategy identified and adopted in the Project Start-up Phase, prototyping may be:
Performed exclusively in one phase; or
Repeated across the life cycle, potentially many times.
Prototyping may be used in various ways on a DW project such as:
Defining user requirements for the presentation systems (i.e., requirements prototypes);
Proving the viability of a particular technology or suite of technologies (i.e., proof-of-concept prototypes); or
Incrementally developing the DW system (i.e., evolutionary prototypes).
A.1
Project Start-Up
Purpose
To plan for the effective and efficient conduct of the project.
Overview
All of the different tasks necessary to formally begin the project are completed in this phase. The
tasks and steps are intended to reduce the risk of unplanned scope adjustments, significantly
revised work products, misunderstandings and cost overruns. All projects, no matter how small or
unique, can benefit from an appropriate degree of preliminary planning.
The scope of the project, as defined in the proposal and any other contractual documents, is formally
agreed.
The project management structure and standards are created together with detailed work plans
for all of the different resources.
The project team staffing skills and resources and organization structure are defined and
orientation sessions completed.
The Project Charter is prepared and agreed. The standard contents of the Project Charter
include:
Project background
Scope of the project
Project risk assessment and management approach
Project organization and staffing
Roles, responsibilities, skills required, staffing and training
Deliverables and third-party commitments
Change and issues resolution procedures
Estimates for the project
Project work plan
Detailed work plans
This list of contents includes all aspects of project management that need to be addressed by
management. Some sections of the document, such as the project work plans, may be placed in
separate documents for ease of maintenance and to keep the size of the Project Charter
manageable. Any such stand-alone documents are referenced within the main body of the Project
Charter to ensure that the Project Charter remains a single reference point for the management
of the project.
The Project Charter is a living document that will be continually updated throughout the life cycle
of the project. Amendments will be made during each stage as changes are identified and the
results of quality reviews are included. At the end of each stage, a major revision of the Project
Charter is required and this is the main focus of the Checkpoint Phases.
Revisions to the plan fall into three broad categories:
Minor changes to static information;
Major revisions to existing sections; and
The addition of new information not previously recorded.
As the project progresses, there will be a continual need to monitor and report on project status,
to identify and address scope issues and to conduct appropriate Quality Management Reviews.
The structure established in the Project Start-up Phase is maintained during the Checkpoint
Phases and the deliverables initiated in this phase are refined as the project moves towards
completion.
Project Start-Up Task Relationships

[Figure: Project Start-Up task flow]
Inputs: Project Opportunity; Roles and Responsibilities
A1 Confirm and Agree Project Objective (outputs: Project Mission Statement, Business Drivers, High-level Project Scope)
A2 Define Project Management Organization Structure and Standards (outputs: Project Management Structure, issue resolution and scope control procedures, project status reporting procedures)
A3 Develop Detailed Work Plans and Schedules (outputs: Project Risk Assessment, Project Work Plan)
A4 Finalize Project Team Organization, Staffing and Logistics (output: Project Team Organization)
A5 Develop Project Charter (output: Project Charter)
An organizational effort to integrate and share disparate data (e.g., as the result of a
business merger).
In determining the project drivers, it is also useful to determine whether the DW project is:
An organization-wide effort or focused on selected business segments; and
Driven top-down by senior management or proposed by lower level functional or
operational managers.
Understanding the business drivers helps in defining and focusing the objectives and scope of the
DW project.
Common Design
BW Security
Frontend Tools Selection
and Strategy
Long-term strategy for BW use
Transition to BW On-going Support
CAB Process During Development
DBA Support
Consulting - Short-term / Long Term - Testing Strategies
Performance Testing
Integration Testing
Training Development / Documentation Standards
BW Project Unique Training
Consistency of Deliverables
Ownership of Common Data - CC/WBS Hierarchies
BW Design Architecture
[Figure: Project team roles across the Business Blueprint, Realization, Final Preparation and Go Live & Support stages: Project Management, DW Architecture, Business Design Team, Business Analysts, SAP Process Training, Technical Team, Data Specialist, SAP BASIS, SAP ABAP]
THE SERVICE ARM OF THE FIRM Data Warehousing BW Project Plan Methodology
A.2
Data Warehousing Project Abstract
Purpose
To scope the BW project based on a high-level understanding of the data, process and
technology requirements of the proposed data warehousing system.
Overview
The Data Warehousing Project Abstract provides a high-level summary of the purpose, function
and overall structure of the potential business analytical processes to be investigated. It defines,
in general terms, the functions of the proposed DW system and provides an initial boundary for
the scope of the project.
The organization's current and "to-be" (as appropriate) business is analyzed to obtain a proper
context and perspective in planning and developing the DW system. The focus is to align the DW
system development effort with The firm's business objectives. The project requirements are
captured in a Data Warehousing Project Abstract consisting of the following major components:
Data Abstract;
Process Abstract;
Source System Abstract;
Technology Abstract; and
Data Conversion Abstract.
The Data Abstract is a high-level overview of the key data and system relationships. It is derived
primarily from an understanding of the existing system and from interviews with users concerning
the proposed system. In these interviews, any existing or new areas for which data will need to be
recorded should be defined. The Data Abstract also describes the relationships between the
potential kernel data entities in the system in the form of an Entity Relationship Model (ERM).
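As a sketch of what a Data Abstract might capture (the entity names and relationships below are hypothetical, not taken from the methodology), the kernel entities and their relationships can be recorded in a simple structure ahead of formal ER modelling:

```python
# Hypothetical sketch of a high-level Data Abstract: kernel entities and
# the relationships between them, as a precursor to a formal ER model.
from dataclasses import dataclass, field


@dataclass
class Entity:
    name: str
    description: str = ""


@dataclass
class Relationship:
    parent: str  # the "one" side of the relationship
    child: str   # the "many" side
    verb: str    # how the parent relates to the child, e.g. "places"


@dataclass
class DataAbstract:
    entities: list = field(default_factory=list)
    relationships: list = field(default_factory=list)

    def add_relationship(self, parent, child, verb):
        # A relationship may only reference entities already within scope.
        names = {e.name for e in self.entities}
        if parent not in names or child not in names:
            raise ValueError("relationship references an out-of-scope entity")
        self.relationships.append(Relationship(parent, child, verb))


abstract = DataAbstract()
abstract.entities += [
    Entity("Customer", "Party that purchases products"),
    Entity("Order", "A confirmed request for products"),
    Entity("Product", "An item offered for sale"),
]
abstract.add_relationship("Customer", "Order", "places")
abstract.add_relationship("Product", "Order", "appears on")
```

A formal ERM tool would replace a structure like this, but the same discipline applies: every relationship must connect entities that are within the project scope.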
The Process Abstract graphically describes the potential key processes of the system and clearly
identifies the scope of the project. The Management Overview Diagram is the technique used for
this graphical description together with some narrative describing the processes. It also shows
the expected inputs and outputs and interfaces with other existing or future systems and business
processes and should also indicate activities that are outside of the project scope.
The Source System Abstract provides an overview of the source systems relevant to the scope of
the DW project. The source systems include not just internal systems but also any relevant
external supplier systems, customer systems, syndicated data systems or business partner
systems.
The Technology Abstract documents the current technology environment as well as the proposed
technology environments for the new system. The proposed environment will often be based on
the current environment supplemented with the planned implementation of any new technologies.
The Data Conversion Abstract provides a high-level understanding of the scope and complexity of
the different tasks necessary to create the initial data for the DW project. This may include:
Purging/cleansing existing data;
Balancing/reconciling existing data;
Creating new data; and
Converting existing data.
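These tasks can be illustrated with a minimal sketch (field names, record layout and totals are all hypothetical) of the purge/cleanse and balance/reconcile steps applied to one batch of legacy records:

```python
# Hypothetical sketch: purge/cleanse a batch of legacy records, then
# reconcile the surviving total against the source-system control total.

def cleanse(records):
    """Purge records with missing keys and normalize amount fields."""
    cleaned = []
    for rec in records:
        if not rec.get("customer_id"):  # purge: record is unusable
            continue
        cleaned.append(dict(rec, amount=round(float(rec["amount"]), 2)))
    return cleaned


def reconcile(cleaned, control_total, tolerance=0.01):
    """Balance the cleansed batch against the legacy control total."""
    loaded_total = round(sum(r["amount"] for r in cleaned), 2)
    return abs(loaded_total - control_total) <= tolerance, loaded_total


source = [
    {"customer_id": "C1", "amount": "100.00"},
    {"customer_id": None, "amount": "50.00"},  # rejected during cleansing
    {"customer_id": "C2", "amount": "25.50"},
]
cleaned = cleanse(source)
ok, total = reconcile(cleaned, control_total=125.50)
```

In practice the rejected records and the reconciliation result would both be reported, since write-offs arising from cleansing may need management and audit approval, as discussed later in this stage.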
In some cases there may be a need for a Technology Pilot to provide a "proof-of-concept" that the
proposed technology components are a viable solution to the business problems identified.
Depending on the specific project circumstances, the objectives of a Technology Pilot may
include one or more of the following:
To assess and determine the capabilities of the various proposed hardware and software
components of the new technology environment;
To demonstrate the connectivity and compatibility of the proposed hardware and software
components;
To simulate performance/capacity definition before embarking on a major project; or
To determine the organizational impact and training needs for both the IS and user
personnel in the new technology environment.
The rationale for the Technology Pilot and its impact on the project scope must be clearly
determined and documented in the project plans.
Alternative approaches to developing the DW system should be identified and evaluated to select
or confirm the most appropriate development approach to use. As needed, high-level Cost/Benefit
analyses for the alternative approaches may be completed to support the evaluation process.
The completed Data Warehousing Project Abstract is presented, discussed and formally
approved by senior management.
[Figure: Data Warehousing Project Abstract task relationships]
Inputs: Project Opportunity; Chevron Information Access Strategy; High-level Project Scope
B1 Define Critical Success Factors (outputs: CSFs, Performance Measures)
B2 Prepare Data Abstract (output: Entity Relationship Model)
B3 Prepare Process Abstract (output: Management Overview Diagram)
B4 Prepare Source System Abstract (output: Overview of Source Systems)
B5 Prepare Technology Abstract (outputs: Inventory of current systems, Technology Architecture Diagram, Technology Pilot Requirements)
B6 Prepare Data Conversion Abstract (output: Data Conversion Abstract)
B7 Evaluate Data Warehousing Development Options (input: Project Risk Assessment; outputs: Cost/Benefit Analysis, Data Warehousing Development Options, Recommended Data Warehousing Development Option, Updated Project Risk Assessment)
B8 Submit Data Warehousing Project Abstract Deliverable (output: Data Warehousing Project Abstract Deliverable)
CSFs should be identified at the level of tangible activities that can be associated with specific
performance measures, i.e., they must be capable of measurement.
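As a toy illustration (the CSFs and measures below are hypothetical), the measurability rule amounts to checking that every CSF carries at least one concrete performance measure:

```python
# Hypothetical CSFs mapped to performance measures; a CSF with no
# measure fails the "capable of measurement" test and is flagged.
csfs = {
    "On-time delivery": ["% of orders delivered by the promised date"],
    "Customer retention": ["12-month repeat-purchase rate"],
    "Brand awareness": [],  # no measure yet - must be reworked or dropped
}

unmeasurable = [name for name, measures in csfs.items() if not measures]
```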
A.2.19 Identify the systems that are outside the scope of the project.
Develop a list of the internal and external systems that are outside the scope of the project. For
each of these systems, determine as appropriate the impact of excluding the system from the
project on the proposed DW system in terms of any constraints or limitations placed on the
capabilities of the DW system.
An Interface Abstract which shows further levels of detail for the different interfaces
shown on the MOD(s) e.g., an interface may be shown as a single item on the MOD but
may be composed of different sub-systems and programs; and
A 3rd Party Report Tool Abstract which shows details of report production at a high level
e.g., where additional software or processing is required to generate outputs from the BW
system via ODBO or downloaded from the InfoCube. It is important that the estimated
volumes of data to be processed are also included as part of this Abstract.
As well as what is included within the project scope, statements of items/areas that are
specifically excluded from scope may also need to be prepared.
Experience in developing and implementing technology changes has shown that a project can
diminish or transfer the risks associated with using complex client/server and emerging
technologies early on through the use of a Technology Pilot.
As a Technology Pilot is sometimes difficult to estimate before a project commences, it may be
useful to treat the pilot as a distinct sub-project that is proposed for and managed separately from
the main project.
With an understanding of the proposed technology environment, determine the need to test and
validate the proposed environment through a Technology Pilot. Document the decision, the
reasons for the decision and a work plan for conducting the Technology Pilot.
Resolve as many issues as possible or agree that further investigation will be completed during
the Technology Planning, Support and Cutover Phase of the project. Ensure that further
investigations are included within the scope of the Project Preparation Stage.
knowledge of the current system and will certainly require input from experienced
business users; and
The timing and duration of the project may differ (as discussed above).
During data conversion clean-up, data may be found that is incomplete or incorrect. This then
may require adjustments or write-offs that need senior management and internal/external audit
approval. This has even greater significance where these adjustments affect statutory financial
statements.
At the conclusion of a data conversion project, there must be formally approved documents,
signed-off by senior management, that contain the reconciliation(s) between the old and the new
systems and that comply with the formal acceptance criteria established to determine completion
of the data conversion project.
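A sketch of the underlying check (account keys, balances and the tolerance are hypothetical): reconcile each balance between the old and new systems and surface the exceptions that the sign-off documents would need to explain:

```python
# Hypothetical sketch: compare balances between the legacy system and
# the new system, returning only the differences that exceed tolerance.

def reconcile_systems(old_balances, new_balances, tolerance=0.005):
    """Return per-key differences (new minus old) above tolerance."""
    exceptions = {}
    for key in sorted(set(old_balances) | set(new_balances)):
        diff = new_balances.get(key, 0.0) - old_balances.get(key, 0.0)
        if abs(diff) > tolerance:
            exceptions[key] = round(diff, 2)
    return exceptions


old = {"ACCT-100": 1500.00, "ACCT-200": 250.00}
new = {"ACCT-100": 1500.00, "ACCT-200": 245.00, "ACCT-300": 10.00}
exceptions = reconcile_systems(old, new)
```

An empty exception list (or one where every exception has an approved explanation) is what the formal acceptance criteria would ultimately require.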
Compare these estimates against the anticipated benefits from implementing the new system.
Wherever possible, the users of the system should be assigned responsibility for providing details
of the anticipated benefits from the proposed system.
The costs and benefits can only be approximated but they are needed to provide an indication of
the scale of costs involved and to provide some indication of how the costs vary across the
different system development options.
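As a toy example (option names and all figures are hypothetical), even rough build and running costs can be tabulated per option to show how total costs vary across the development options:

```python
# Hypothetical order-of-magnitude cost comparison across development
# options, using a simple three-year total as the comparison basis.
options = {
    "Single big-bang project": {"build": 900_000, "run_per_year": 150_000},
    "Phased by subject area": {"build": 650_000, "run_per_year": 180_000},
}


def three_year_cost(option):
    """Build cost plus three years of running cost."""
    return option["build"] + 3 * option["run_per_year"]


ranked = sorted(options, key=lambda name: three_year_cost(options[name]))
```

The absolute figures matter less than the relative scale, which is all this comparison is intended to indicate.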
project broken into several smaller projects. Alternatively, project staffing or funding may require
changes.
A.3
Change Integration
Purpose
To identify and address change integration issues across the DW development life cycle as the
organization envisions the management of organizational knowledge.
Overview
Changes may occur at two levels in the process of developing an integrated architectural view of
knowledge management for an organization:
Changes at the organization's strategic level. These are the strategic changes associated
with the management of knowledge and business information across organizational
boundaries addressing issues such as competitive drivers, strategic opportunities and
organizational structural, procedural and cultural changes; and
Changes at the organization's tactical level. These are the changes associated with
developing an information architecture that promotes and facilitates information sharing
within an organization. There are many areas within an organization that can benefit from
sharing the same piece of data, information or knowledge. However, an organization's
data resource is often not managed from an enterprise-wide, strategic perspective; rather, it is
maintained by individual organizational units to meet local business needs.
Thus, to truly manage the data resource as an organizational asset, an enterprise-wide data resource
management program is needed to provide a proper business focus and a framework for
coordinating the implementation and integration of the organization's data resources. Many
change issues must be addressed at the tactical level such as the organizational structure,
authority, responsibility and technology of the data resource management program.
To deliver the greatest value, a DW development initiative must be an integral part of a
broader organizational knowledge management program, one that not only addresses technology
matters but, more importantly, aligns technology with the organizational structures and
business processes.
While, in practice, many of the change integration tasks discussed in this phase may themselves
be separate organizational initiatives or projects, the focus of this phase is on determining the
impact of implementing a DW on the organizational structure, business processes and
other technologies and the improvement opportunities which a DW offers.
As an organizational initiative, the roll-out of a data warehousing program must be carefully
planned addressing various organizational issues such as stakeholder interest, knowledge
transfer, training and technical support.
The scope of a typical DW project often does not include addressing change integration issues. However,
these issues are often critical for the ultimate success of a DW program. Furthermore, additional
improvement opportunities may be identified in the course of analyzing or developing the DW. Thus, the
project team must bring to the attention of management any significant change integration issues
identified. As appropriate, separate change integration projects may be initiated or the scope of the DW
project may be expanded to address these issues. Any scope changes must be discussed with
management and formal written approval of the changes obtained before any work commences.
The task flow for this phase is:
C1 Assess Data Warehouse Impact on Organization (outputs: Organizational Impact, Business Process Impact, Technology Impact)
C2 Develop Change Plan (optional) (outputs: Implementation strategy, Change Plan)
C3 Integrate Changes (output: Implemented changes)
A.3.12 Integrate Changes
Purpose
To coordinate and integrate various change actions taking place on the project.
Overview
Many change activities are taking place on a typical DW project - both technical and
organizational.
The coordination of the technical tasks (e.g., coordinating the data, process and technology
design activities) is the responsibility of project management.
The broader change projects such as those identified in the change plan developed in Task C2
are often considered as separate project initiatives and are outside the scope of the DW project.
Thus, the focus of this task is on the organizational activities that need to take place in support of
the DW program being developed.
A.4
Prototyping
Purpose
To create a common understanding of the requirements of the proposed DW system to facilitate
decisions relating to the "look and feel" of the new system and to confirm that the proposed
interface will appropriately support the roles and responsibilities of the business users.
Overview
Prototyping may be completed on a DW project as an approach for:
Defining user requirements for the presentation systems (i.e., requirements prototypes);
Proving the viability of a particular technology or suite of technologies (i.e., proof-of-concept prototypes); or
Incrementally developing the DW system (i.e., evolutionary prototypes).
The key activity of this phase is the analysis and design of the user interface. Prototyping is a
technique often used to assist in the development and approval of the user interface but, even if a
traditional prototype is not going to be developed, these tasks are still relevant to help define how
the users will ultimately interact with the system being built.
The Prototyping Phase stretches across the DW life cycle. Depending on the development
strategy identified and adopted in the Project Start-up Phase, the tasks in this phase may be:
Completed exclusively in one phase; or
Repeated across the life cycle, potentially many times.
For example, if a waterfall development approach is adopted, the initial set of tasks for defining
acceptance criteria, standards, development and navigation strategies and building the
preliminary prototype would all be Business Blueprint tasks, as would the walk-through of the
prototype with the users. The goal of these tasks would be to use the prototype to identify and
confirm the organization's DW requirements. The further revisions and development of the
prototype would also potentially be Business Blueprint tasks, where the actual interaction with the
application system would be designed and documented.
However, if a spiral development strategy were adopted, it would be viable to complete all of
the tasks in the phase a number of times within what would be the Realization Stage of
development.
If an iterative approach was adopted, all of the tasks may be completed for each iteration of the
life cycle.
The decision on which strategy to follow should be made in the Data Warehousing Project
Abstract Phase and this will largely dictate how the tasks in this phase will be used.
The tasks defined in this phase will also be influenced by decisions made in the technology-based
phases of the life cycle. For instance, specific tools selected for the project may either constrain
or direct the developer towards appropriate standards and development techniques. However, the
outputs from the Prototyping Phase should drive the technical design of the application instead of
vice versa.
The task flow for this phase, with the CSFs, Project Abstract, Technology Pilot requirements and Data Transformation Requirements as inputs, is:
D1 Define Prototype (outputs: Prototype strategy, Business scenarios, Prototype acceptance criteria)
D2 Prepare Prototype Data (output: Prototype data)
D3 Prototype Data Transformation (output: Data Transformation Prototype)
D4 Prototype End-User Presentation (outputs: End-User Presentation Prototype, Analysis and Decision Process Definition)
D5 Review and Present Prototypes (output: Prototype presentation)
D6 Refine Prototype
D7 Evaluate and Submit the Prototype Deliverable (output: Prototype Deliverable)
While many users may be involved in the development of the prototype, one person should be
the ultimate authority. This person needs to be able to resolve conflicts between competing
requirements or expectations.
Ensure that all of the algorithms and formulae are fully documented and that manual examples of
the calculations exist to confirm that they are accurately replicated in the system.
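One lightweight way to confirm this is to keep the manually worked example in executable form next to the implementation so the two can be compared automatically. The margin formula and figures below are a hypothetical illustration, not a formula from this project:

```python
# Hypothetical documented formula: gross margin % = (revenue - cost) / revenue * 100.
def gross_margin_pct(revenue: float, cost: float) -> float:
    return (revenue - cost) / revenue * 100.0

# Manually calculated example taken from the (hypothetical) formula documentation:
# revenue 1,250.00, cost 875.00 -> margin 30.00 %
manual_example = {"revenue": 1250.00, "cost": 875.00, "expected_pct": 30.00}

result = gross_margin_pct(manual_example["revenue"], manual_example["cost"])
assert abs(result - manual_example["expected_pct"]) < 0.005, "system diverges from manual example"
```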
A.4.26 Refine Prototype
Purpose
To iteratively refine the prototype.
Overview
The Business Scenarios created earlier in this phase represent realistic depictions of the
business world. They are developed in narrative form so that non-technical users may read and
understand them.
This task involves refining and finalizing the Business Scenarios that will be used as a
communications device as well as a support tool for the testing process.
A.5
Technology Planning, Support and Cutover
Purpose
To define, establish, and maintain the detailed IT environments in which the proposed Business
Information Warehouse system will be developed, tested and implemented.
Overview
Using the Technology Abstract as input, the IT infrastructure necessary to support the new DW
system is defined and designed in this phase.
The IT environment is often characterized as a complex integration of heterogeneous hardware,
software and communications components which must be flexible and scalable to meet the
future needs of the organization and to take advantage of advances in technologies.
The IT infrastructure consists of hardware, peripherals, software and communications
components that may be transitioned and/or migrated from the existing environment and/or
procured from hardware and software suppliers.
Certain elements of the development environment such as components used in the Technology
Pilot tasks may need to be established very early in the project. It is important that project team
members with responsibility for these tasks communicate regularly with project team members
with responsibility for completing the DW Project Abstract and Prototyping Phases.
There are a number of distinct environments created during the implementation of a DW project
including development, staging and production environments. These separate environments are
built by analyzing the technology determining factors of the desired (production) system,
identifying a set of IT infrastructure alternatives and selecting the desired alternative. The choice
of the infrastructure components, in concurrence with the selection of primary suppliers, will drive
the requirements for the different environments.
The creation of each of the environments follows a set of steps, although the emphasis and the
content of the steps may differ from environment to environment. Many of these steps occur
concurrently within the creation of each environment, e.g., the design of the development
environment can proceed before analyzing all of the requirements.
Furthermore, the creation of each environment can occur concurrently with respect to the other
environments. Analyzing the requirements for the test environment may begin concurrently with
the requirements for the development environment. There is also a continual information
exchange between the life cycle of each environment, e.g., features of the development
environment may influence the requirements for the test environment.
The work in this phase can be discussed in terms of three areas:
Technology definition. The IT infrastructure support requirements for the target DW
system are defined based on an understanding of:
the current IT environment, and
the technology infrastructure requirements for the DW system, addressing not just
technology issues but also organizational and business concerns.
Strategies and standards are developed to guide the definition, design and implementation of the
different IT environments to be developed for the DW system. IT infrastructure and product
selection criteria are also determined.
Alternative IT infrastructures are identified and refined during the task to Identify IT Infrastructure
Alternatives. The selected infrastructure is defined in the IT infrastructure abstract which is
submitted to management for approval in the Select IT Infrastructure Alternative task;
Technology Pilot. Conduct Technology Pilot is an optional task and may be completed in
some project circumstances where the IT components and environment may be complex
and the use of a Technology Pilot is warranted. In general, a Technology Pilot may be
considered in project situations:
where the IT infrastructure will form a significant component of the project (e.g., a
high proportion of the total project cost),
where there are a large number of IT components that are new to an organization,
where there is a need to prove the effectiveness or compatibility of different IT
components or explore their function/use/applicability (e.g., adoption of new
peripheral components such as hand-held data input or recording devices), or
where there is a large technology training need or culture adjustment to be made
(e.g., the organization is a slow adopter of new technologies); and
Technology design. Based on the approved technology definition, the development, test
and production environments are designed. The completed IT environment designs are
then assembled into a Technology Design Deliverable for management approval.
While the methodology discusses only the design of the development, test and production
environments, the work steps may be adapted to design other IT environments as required by a
particular project (e.g., the Technology Pilot, prototyping or training environments).
The task flow for this phase, with the overall work plan, Technology Architecture, Technology Abstract, existing BW Basis standards, existing hardware/software components, project work plan, Process Definition and Data Definition as inputs, is:
E1 Develop Technology Implementation Work and Staffing Plans (output: Technology Implementation work plan)
E2 Refine Understanding of the Current Systems
E3 Analyze IT Infrastructure Requirements (output: IT Infrastructure Requirements)
E4 Develop BW Environment Landscape (output: BW Environment Landscape)
E5 Design and Establish the Development Environment (outputs: Development Environment Requirements, Strategy, Design)
E6 Design and Establish the Staging Environment (outputs: Staging Environment Requirements, Strategy, Design)
E7 Design and Establish the Production Environment (outputs: Production Environment Requirements, Strategy, Design)
E8 Install and Configure Technology Components (output: Tested Environment)
E9 Submit Technology Design Deliverable (output: Technology Design Deliverable)
E10 Support Technology Environments (outputs: Help Desk procedures, Back-up/Archive procedures)
E11 Planning for Production Cutover (outputs: Production Support Team, Cutover plan)
A.5.6 Discuss and confirm the work plan and staffing plans.
Discuss the technology implementation work plan and staffing plans. Resolve any questions or
issues and make changes as necessary. Obtain formal written approval.
A.5.12 Analyze the business issues that may impact upon the technology
infrastructure.
Determine business issues that may have an impact upon the technology infrastructure:
Identify categories of information and information access that are relevant to the overall
organizational DW program but are out of scope for this project.
This is a critical step in managing scope and user expectations for a DW project.
These information/data sources should be viewed with the understanding that they may be
integrated into the warehouse at a later time, and therefore, may add stress to the infrastructure.
Some examples of information sources may include: syndicated, public domain or other
corporate/internal information. Some examples of alternative categories of information access
may include: WWW/Internet, private networks or information sharing consortiums;
Determine the types of analytical/decision processes supported by the DW system
including:
performance measurement,
analysis and decision processes, or
quantitative analysis including data mining or profiling applications;
As the business environment evolves, the scope and complexity of the DW environments will
also change. Thus, the business environment should be assessed on a continuous basis to
determine the impact of any changes on the DW technology infrastructure.
A.5.14 Analyze the external factors that may impact upon the technology
infrastructure.
Analyze the external factors that may impact upon the technology infrastructure such as:
Regulatory factors that may influence the design of the technology infrastructure (e.g.,
financial reporting compliance, product development regulatory controls or privacy
considerations);
Industry standards that may have an impact on the design of the technology
infrastructure either currently or in the anticipated future, e.g., compliance with
industry-established data standards is important in conducting electronic data interchange (EDI)
or electronic commerce (EC) transactions with an organization's trading partners.
With the increasing use of inter-enterprise systems (electronic partnering), the compliance with
industry standards in managing an organization's data resource (e.g., data definition) is becoming
an important business requirement;
Any potential or pending mergers or acquisitions and their impact on the design of the
technology infrastructure; and
Opportunities for selling the organization's data as a "product" (e.g., customer mailing list)
and their impact on the design of the technology infrastructure (e.g., privacy law
compliance, security issues, copyright issues, intellectual property issues).
WAN or dial-up connections, line types, Internet/intranet connectivity and the system
distribution strategy;
Current and forecast geographic distribution requirements; and
Current and forecast network, Internet and intranet requirements.
As the technology factors evolve, the scope and complexity of the DW environments will change.
The above stress factors should be assessed on a continuous basis to determine their impact on
the DW technology infrastructure.
up), partial (where only critical data is backed-up) or incremental (where only data that has been
affected since the last back-up is backed-up).
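The three back-up scopes above can be sketched as selection rules over a file inventory. The file records and timestamps here are invented for illustration:

```python
# Sketch of full / partial / incremental back-up selection.
# Each record: (path, is_critical, last_modified as epoch seconds) - all invented.
files = [
    ("/bw/data/cube_sales.dat", True,  1_000),
    ("/bw/data/cube_hr.dat",    True,    400),
    ("/bw/logs/extract.log",    False,   900),
]
last_backup_time = 500  # time of the previous back-up run

full        = [f for f, _, _ in files]                                   # everything
partial     = [f for f, critical, _ in files if critical]                # critical data only
incremental = [f for f, _, mtime in files if mtime > last_backup_time]   # changed since last run
```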
Consider the disaster planning back-up needs that are compatible with the organization's disaster
contingency approach (e.g., remote hot-site or cold-site facilities).
This information will impact upon the identification of back-up and recovery standards, strategies,
hardware and tools (e.g., it may be necessary to include disk mirroring in the configuration).
Determine the criteria, frequency and media for purging and archiving data.
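As a sketch of such criteria, purge and archive candidates might be selected by data age; the retention periods and record ages below are invented, and real thresholds would come from the organization's retention policy:

```python
# Hypothetical purge/archive selection by data age (in days).
ARCHIVE_AFTER_DAYS = 365      # move to archive media after one year (assumed policy)
PURGE_AFTER_DAYS   = 7 * 365  # remove entirely after seven years (assumed policy)

# (record name, age in days) - invented example data
records = [("2001-q1-sales", 2_900), ("2001-q4-sales", 400), ("2002-q1-sales", 90)]

archive = [r for r, age in records if ARCHIVE_AFTER_DAYS <= age < PURGE_AFTER_DAYS]
purge   = [r for r, age in records if age >= PURGE_AFTER_DAYS]
```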
A.5.23 Submit Landscape
Submit the Environmental Landscape for approval to the Environment Management group and
the Technical Team.
application development. This type of network traffic can increase the network bandwidth
requirements during the development effort;
Estimate volume of network traffic. Using inputs such as type and frequency of network
traffic and the number of users, estimate the volume requirements for network traffic. This
information is a key input into the selection of networking technology and capacity for the
development environment;
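A back-of-the-envelope estimate can be derived from those inputs. Every figure below is a hypothetical planning assumption, not a measured value:

```python
# Rough peak-hour network volume estimate for the development environment.
# All inputs are invented planning assumptions.
developers           = 25
queries_per_dev_hour = 12    # interactive queries per developer per hour
avg_result_set_kb    = 800   # average result set size in KB
overhead_factor      = 1.3   # protocol and retransmission overhead

peak_kb_per_hour = developers * queries_per_dev_hour * avg_result_set_kb * overhead_factor
peak_mbit_per_s  = peak_kb_per_hour * 8 / 1024 / 3600  # KB/hour -> Mbit/s

print(f"estimated peak load: {peak_mbit_per_s:.2f} Mbit/s")
```

Even a crude figure like this is enough to decide whether existing LAN capacity suffices or networking technology must be upgraded for the development effort.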
Determine developer access requirements. Determine what access developers will need
to data and applications, both locally and remotely. If remote access is needed, access
methods must be provided such as WAN connections or dial-up access. Local access
requirements will help to determine LAN configuration and may affect platform and
environment decisions;
Do initial hardware sizing separately for the development system and the technical
experimentation system
Determine how many applications go live at the same time
Determine the requirements for backup, recovery, and reset of the development system
Consider the following:
What hardware platform will you use later for quality assurance, and for your
production system?
What language and time zone dependencies do you need to consider on the
development system?
What software development projects of your own do you intend to carry out in the future?
What enhancements do you plan to make to the standard BW system?
What development system growth do you expect?
Technical experimentation system:
What system platform will you use later for quality assurance, and for your production
system?
What time window do you expect for system recovery in the future?
What time window will you allow for a data backup in the future?
How will you organize data backup in the BW system in future?
What BW interfaces will you have to use or administer in the future?
Desktop Level:
Standard PCs available, allowing for future network requirements and office products
Applications Server Level:
Memory sizing, reflecting BW sizing model
Disk distribution on the application server
Scalability of the application server (memory, CPU, disk space)
Database Server Level:
Memory sizing, reflecting BW sizing model
Database disk distribution
Growth path for the database server
Delivery lead times for additional disks, backup media, and similar
Scalability of the database server (memory, CPU, disk space)
Possible migration paths to more powerful database servers from the same hardware
vendor
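Initial sizing must follow the vendor's BW sizing model; the simplified calculation below is only a stand-in to show the shape of such an estimate, with every constant invented:

```python
# Simplified, hypothetical memory-sizing sketch for an application server.
# Real sizing must follow the vendor's BW sizing model; these constants are placeholders.
named_users      = 200
concurrency_rate = 0.20    # fraction of named users active at once (assumed)
mb_per_user      = 10      # working memory per active user in MB (assumed)
base_mb          = 4_096   # OS + BW base footprint in MB (assumed)
growth_headroom  = 1.5     # 50 % headroom for expected growth (assumed)

active_users = int(named_users * concurrency_rate)
required_mb  = (base_mb + active_users * mb_per_user) * growth_headroom
print(f"{active_users} active users -> ~{required_mb:,.0f} MB application server memory")
```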
Determine the start and target configuration for each of the necessary software profiles.
Once these configurations have been determined, the information can be channeled into the
flowchart available in the installation manual to determine at which point during the
process the software should be installed.
This ensures that all systems throughout the BW environment are on supported and fully functional
software profiles. Early planning enables the non-BW software installs to be scheduled at the most
suitable point in the BW installation timetable.
A.5.28 Order Hardware
Purpose
The purpose of this task is to place the initial hardware order with the appropriate vendor,
determine the expected delivery date of the hardware, and to determine the expected lead-time
needed to procure the productive system hardware. Also, order any additional dependent
hardware required for the development.
Procedure
On the basis of the internal project Service Level Agreements, and subject to the external
hardware partner Service Level Agreements, place the purchase order for the initial BW
development system hardware.
Inform the project steering committee
Where required by the internal service level description, the other project leaders and the project
steering committee should be kept informed of the status of the BW system installation.
A.5.29 Order Software
Purpose
The purpose of this task is to place the initial software order with the appropriate vendor,
determine the expected delivery date of the software, and to determine the expected lead-time
needed to procure the productive system software. Also, order any additional dependent software
required for the development.
Procedure
On the basis of the internal project Service Level Agreements, and subject to the external
software partner Service Level Agreements, place the purchase order for the initial BW
development system software.
Inform the project steering committee
Where required by the internal service level description, the other project leaders and the project
steering committee should be kept informed of the status of the BW system installation.
Decide on a network provider to establish the physical network connection to the BW system
environment
Configure your online connection to the BW system environment
Check the necessary security aspects for the BW system environment
Implement network connection access
Determine test team, user and support personnel remote access requirements.
Determine security requirements.
Determine the necessary security for the testing effort. This may require the installation of security
software which was not needed in the development environment, as well as the introduction of
security profile management. Determine who will have access to the staging environment and the
responsibility for maintaining the security of the staging environment. Members of the test teams
may also require education in the use of the security system.
Obtain formal written approval of the staging environment requirements.
Determine the start and target configuration for each of the necessary software profiles.
Once these configurations have been determined, the information can be channeled into the
flowchart available in the installation manual to determine at which point during the
process the software should be installed.
This ensures that all systems throughout the BW environment are on supported and fully functional
software profiles. Early planning enables the non-BW software installs to be scheduled at the most
suitable point in the BW installation timetable.
A.5.36 Order Hardware
Purpose
The purpose of this task is to place the initial hardware order with the appropriate vendor,
determine the expected delivery date of the hardware, and to determine the expected lead-time
needed to procure the productive system hardware. Also, order any additional dependent
hardware required for the development.
Procedure
On the basis of the internal project Service Level Agreements, and subject to the external
hardware partner Service Level Agreements, place the purchase order for the initial BW
development system hardware.
Inform the project steering committee
Where required by the internal service level description, the other project leaders and the project
steering committee should be kept informed of the status of the BW system installation.
A.5.37 Order Software
Purpose
The purpose of this task is to place the initial software order with the appropriate vendor,
determine the expected delivery date of the software, and to determine the expected lead-time
needed to procure the productive system software. Also, order any additional dependent software
required for the development.
Procedure
On the basis of the internal project Service Level Agreements, and subject to the external
software partner Service Level Agreements, place the purchase order for the initial BW
development system software.
Inform the project steering committee
Where required by the internal service level description, the other project leaders and the project
steering committee should be kept informed of the status of the BW system installation.
Decide on a network provider to establish the physical network connection to the BW system
environment
Configure your online connection to the BW system environment
Check the necessary security aspects for the BW system environment
Implement network connection access
At this time, the Security Administrator should know the BW Authorization Concept. Initially,
we recommend creating a new profile, modeled after the profiles in the Development
environment.
Please refer to the "SAP BW Authorization Made Easy" guidebook, Chapter 7: Special Cases,
Setting Up Authorization Profiles. Assign the activity group to all non-super users in the QAS
system and update their user master records as described in Chapter 8: Assigning Activity
Groups and Authorization Profiles to Users.
Create Operating System and Database Level Access for Project Team Members
Analyze the types of reports, interfaces, data feeds, and customizing that is projected.
Determine from this the reports and interfaces requiring operating system and database
access, and at what level. Provide security to ensure the integrity of the BW technical
environment and data.
Provide and Ensure Security for the BW Operating System and Database Users
Access to these users must only be given to key members of the technical project team.
Change the password, and provide security for the following users:
SAP users: SAP* and DDIC
Operating system users: <SID>adm, ora<SID>, pctemu
Database users: sapr3
Determine a Procedure for Requesting Operating System and Database Access for the
Development System
Access to the operating system and database in the project must only be permitted after
running the checks in step one above. Changes to access rights of project members must be
documented.
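A minimal self-check, assuming a hypothetical inventory of the accounts listed above, would flag any account still on a vendor-delivered default password:

```python
# Hypothetical audit: every special account must have had its default password
# changed. The account names mirror the users listed above; the
# "default_password_changed" flags would come from the real systems.
accounts = {
    "SAP*":     {"layer": "SAP",      "default_password_changed": True},
    "DDIC":     {"layer": "SAP",      "default_password_changed": True},
    "<SID>adm": {"layer": "OS",       "default_password_changed": True},
    "ora<SID>": {"layer": "OS",       "default_password_changed": False},
    "sapr3":    {"layer": "database", "default_password_changed": True},
}

at_risk = [name for name, a in accounts.items() if not a["default_password_changed"]]
if at_risk:
    print("accounts still on default passwords:", ", ".join(at_risk))
```

A checklist of this shape also doubles as the documentation of access-right changes that the procedure above requires.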
such as ink jet printers, may be adequate for administrative needs. There may also be special
printer requirements such as for special stationery, remote printing or special fonts.
Confirm and refine client machine requirements. Confirm and refine the processor, memory and
storage requirements for the client machines.
Confirm and refine application server requirements. Confirm and refine the processor, memory
and storage requirements for all of the different types of application servers.
Confirm and refine database server requirements. Confirm and refine the processor, memory and
storage requirements for the database servers.
Refine production operations and management roles and responsibilities.
Confirm and refine configuration management/change control requirements.
Consider the difficulty of managing the configurations of production hardware, software and
network devices. Large or distributed production environments complicate the task of managing
configurations and may require more sophisticated tools and/or processes. Remote configuration
management allows the system administrator to update desktop configurations and software
electronically from a central or remote location and to accelerate the testing and deployment of
updated components.
Determine the start and target configuration for each of the necessary software profiles
Once these tasks have been established, the information can be channeled into the flowchart
available in the installation manual, to determine at which point during the process the
software should be installed.
All systems throughout the BW environment are then on supported and fully functional software
profiles. Early planning enables the non-BW software installs to be scheduled at the most
suitable points in the BW installation timetable.
A.5.48 Order Hardware
Purpose
The purpose of this task is to place the initial hardware order with the appropriate vendor,
determine the expected delivery date of the hardware, and to determine the expected lead-time
needed to procure the productive system hardware. Also, order any additional dependent
hardware required for the development.
Procedure
On the basis of the internal project Service Level Agreements, and subject to the external
hardware partner Service Level Agreements, place the purchase order for the initial BW
development system hardware.
Inform the project steering committee
Where required by the internal service level description, the other project leaders and the project
steering committee should be kept informed of the status of the BW system installation.
A.5.49 Order Software
Purpose
The purpose of this task is to place the initial software order with the appropriate vendor,
determine the expected delivery date of the software, and to determine the expected lead-time
needed to procure the productive system software. Also, order any additional dependent software
required for the development.
Procedure
On the basis of the internal project Service Level Agreements, and subject to the external
software partner Service Level Agreements, place the purchase order for the initial BW
development system software.
Inform the project steering committee
Where required by the internal service level description, the other project leaders and the project
steering committee should be kept informed of the status of the BW system installation.
Decide on a network provider to establish the physical network connection to the BW system
environment
Configure your online connection to the BW system environment
Check the necessary security aspects for the BW system environment
Implement network connection access
Provide and ensure proper security for the BW SAP operating system and database users
Change the password and provide security for the following users:
BW SAP users: SAP* and DDIC
BW Operating system users: <SID>adm, ora<SID>, pctemu
BW Database users
Determine an ongoing procedure for requesting operating system and database access for
the development system
Additional access to the operating system and database in production should be restricted to
exceptional cases. Every change to project members' access rights must be documented.
SAP License
SAPOSCOL for AIX
Performance increase by Shared Memory Pools
Delete Temporary Files
Change Permissions of the Transport Directory
Install Additional SDKs
Install the BW Front End
Starting and Stopping the R/3 System
Logging on to the R/3 System
Install the Online Documentation
Post-Installation Steps Described in the Online Documentation
Installation Backup
SAProuter
Online Service System (OSS)
Import Hot Packages after the installation
Preparation
Review OSS notes.
Create an installation directory.
Manual checks
Installation
Run startup
Monitoring the installation
Final Steps
Please refer to "R/3 add-on-installation and delta upgrade guide" for detailed steps.
Procedure
To set up the link from BW to Source System:
Go To Source System Screen -> Create R/3 Source System manually or automatically
Specify the Target Source (Source System)
Execute IMG: takes the user to the source system IMG
To set up the link from Source System to BW
IMG at Source System -> Cross Application Components -> ALE Basic Configuration -> Set
Up Logical System ->
Maintain Logical System: Create BW Logical System
Allocate Logical System to the Client: Set Up the Source System
Transaction Code: SM59 ->
R/3 Connection
RFC Destination: Identify BW System
Target Machine: Identify application server
Test Connection
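The SM59 destination created above must capture a small, fixed set of parameters. As a rough illustration only, the sketch below collects those parameters in a dictionary and checks them for completeness; the key names are invented for the example and are not SAP field names, though connection type 3 does denote an R/3 connection in SM59.

```python
def validate_rfc_destination(dest: dict) -> list:
    """Return a list of problems with an R/3-connection destination definition.

    The required keys mirror the SM59 steps above (destination name,
    connection type, target application server, system number, client);
    the key names themselves are illustrative only.
    """
    problems = []
    for key in ("rfc_destination", "connection_type", "target_host",
                "system_number", "client"):
        if not dest.get(key):
            problems.append(f"missing {key}")
    # In SM59, connection type 3 denotes an R/3 connection.
    if dest.get("connection_type") not in (None, "3"):
        problems.append("connection_type should be '3' for an R/3 connection")
    return problems
```

A check of this kind can be run against a documented connection inventory before the Test Connection step, so that incomplete definitions are caught on paper rather than at runtime.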
The most important aspect of the scoring process is that it is applied consistently throughout
the evaluation. Validate the weights and the scoring system.
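Consistency of this kind can also be enforced mechanically. The sketch below (criteria and weights invented for illustration) validates that the weights sum to one and applies the same weighted-sum formula to every candidate being scored:

```python
def weighted_score(scores: dict, weights: dict) -> float:
    """Apply one consistent weighted-sum formula to a candidate's raw scores."""
    # Every criterion must be both scored and weighted - no silent omissions.
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same criteria")
    total_weight = sum(weights.values())
    if abs(total_weight - 1.0) > 1e-9:
        raise ValueError(f"weights must sum to 1.0, got {total_weight}")
    return sum(scores[c] * weights[c] for c in scores)
```

Scoring every candidate through the same function, with the same validated weights, is precisely the consistency the evaluation process requires.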
The Cutover Plan schedule and timing must cover the procedures for closing legacy systems
and entering data in the new system. Consider operations in other time zones; the
production system may need to start earlier than 8 AM Monday morning.
The plan must include:
Data conversion estimates of the types of data, and how long the process can take
Timing of when the conversions are performed
Team leaders for the cutover
Roles and responsibilities of the core project team, power users, users, and so on
Team assignments and working hours
Company management involvement and decision-making designates
Procedures for shutting down legacy systems
Reconciliation procedures to ensure business transactions are cut off in the legacy
systems
Source Management Database (SMDB) Setup to handle Problem Reports and Product
Lines & Billing
Security
Periodic Processing
Tech Support
Environment Management
CAB Process
Reconciliation procedures to ensure data is converted to the new system
The written Cutover Plan must be reviewed and approved by the Project Manager, core
project team leaders, and key company senior management. The plan must be presented in
summary form to the Steering Committee.
The Cutover Plan can provide for an initial start up of the new system before most users sign
on. For example, a Power User can enter the first transactions to ensure that everything is
operating as designed.
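A written plan with required contents of this kind lends itself to a mechanical completeness check before it goes to the Project Manager for review. The sketch below is illustrative; the section names paraphrase the list above rather than quoting any real template.

```python
# Required Cutover Plan sections, paraphrased from the methodology's list.
REQUIRED_SECTIONS = [
    "data conversion estimates",
    "conversion timing",
    "team leaders",
    "roles and responsibilities",
    "team assignments and working hours",
    "management involvement",
    "legacy shutdown procedures",
    "legacy reconciliation procedures",
    "new-system reconciliation procedures",
]

def missing_sections(plan: dict) -> list:
    """Return required cutover-plan sections that are absent or empty."""
    return [s for s in REQUIRED_SECTIONS if not plan.get(s)]
```

Running such a check before each review cycle gives the core project team a quick signal that the plan is structurally complete, even though the content of each section still needs human approval.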
A.6 Project Preparation Checkpoint
Purpose
To confirm the results of the Project Preparation Stage of the DW project with management.
Additionally, in this Phase, the CPDEP Phase 1 task of Identify and Assess Opportunities is
completed, and a recommendation and Class 1 Estimate provided for the DRB/GRT.
Overview
Throughout all of the stages of the DW life cycle, there is a need to monitor and report project
status, to identify and address scope issues and to conduct appropriate Quality Management
Reviews.
However, at specific points in the life cycle, additional tasks will have to be completed. One of
these points is the completion of the Project Preparation Stage and these additional tasks are
described in the Project Preparation Checkpoint Phase.
The primary objective of this phase is to check progress against the Project Charter and to
update the plan to reflect the start of a new stage.
The Project Charter is a living document that will be continually updated throughout the life cycle
of the project. Amendments will be made during each stage as changes are identified and the
results of quality reviews are included. At the end of each stage, a major revision of the Project
Charter occurs and this is the main focus of the checkpoints.
Revisions to the plan fall into three broad categories:
Minor changes to static information;
Major revisions to existing sections; and
The addition of new information not previously recorded.
In the Project Preparation Checkpoint, many of the items fall into the first category. Items such as
project background, change and issue resolution procedures and project organization will
generally change little on transition from Project Preparation to the Business Blueprint.
Revisions may be needed to the project work plan and the detailed work plans; the risk
assessment needs to be reviewed to reflect changed risks in construction; and, because the
baseline for the project will change from the Project Preparation Stage to the Business
Blueprint Stage deliverables, estimates should be reviewed to reflect this.
[Process flow diagram: Project Preparation Checkpoint]
Input: Data warehousing project abstract
F1 Confirm Completeness of the Project Preparation Stage -> confirmed Project Preparation Stage deliverables; open issues
F2 Review Issues -> issue resolutions
F3 Update Project Plans (input: project plans) -> updated project plans
F4 Update Quality Plan (input: Quality Plan) -> updated Quality Plan
F5 Obtain Approval of Project Preparation Stage Deliverables (input: Project Preparation Stage deliverables) -> formal approval point
The estimates should include the amount of time and effort required from the users.
Any changes to the project work plan necessitated by resource or other constraints should be
reflected accordingly.
Business Blueprint
The Business Blueprint Stage derives a design for the DW system including the data
transformation system, the data warehouse database (InfoCubes), and the presentation system.
In design, an integrated model is developed to depict "how" the requirements are to be
implemented in the new system from a business and technical perspective.
During this stage, you also:
Refine the original project goals and objectives
Refine the overall project schedule and implementation sequence
Key Deliverables:
1) Business Analytical Process Definition
a) Business requirements both general and subject area specific
b) Detailed presentation requirements for each subject area
c) History Considerations
d) Load timing
e) Users of the DW
f) Reporting
i) Online
ii) Ad-hoc
iii) Operational reporting requirements
2) Data definition and design (CDM, LDM, MDM)
3) High-level Design
4) Data Conversion Specification
5) Data Transformation System design
a) InfoObject Characteristic, including master data, texts, and hierarchies
b) InfoSource
c) InfoCube
d) DataSource metadata
e) InfoObject key figure
6) Presentation System design
a) Restricted key figure
b) Variable specification
c) Query specification
d) Structure specification
e) Calculated key figure
f) Worksheet specification
g) (others as identified for 3.0B/3.1c)
7) Security design
8) Testing approach and plan
9) User documentation (cross life-cycle)
10) Ongoing support documentation
a) Process external interfaces to SAP
b) Process SAP interfaces
11) Training (cross life-cycle)
12) Technology design
a) Review of overall technology design
b) Sizing
Analysis and Decision Process Definition
An understanding of the analysis and decision processes within the scope of the planned DW
system is obtained. Based on an analysis of the organization's business events, the analysis and
decision processes including the performance measurement processes, quantitative analysis
processes and ad-hoc analytical processes are identified. The data requirements and
presentation characteristics of these processes are determined to form a basis for defining the
data requirements for the data warehouse and the end-user data access requirements. The data
access requirements provide the key inputs to the development of the DW presentation system.
Analytical Processing Data Definition
The requirements for the data warehouse are defined in the form of a Conceptual Data Model. To
support the analysis and decision processes, data definition on a DW project focuses on
determining the information needs of the various user views including performance measurement
and quantitative analysis. Unstructured data entities, such as multimedia data, required to support
business processes are also identified, if applicable.
The source data and the derivation or transformation rules are also determined. The data
transformation requirements provide the key inputs to the development of the data transformation
system.
Data Transformation System Design
The data transformation (DT) requirements defined during the Project Preparation Stage are
translated into a set of technical specifications for the DT system. DT system design must be
completed within a relevant overall BW architecture.
Data transformation is the process of transforming source data to target DW data. The key
components of a DT system include:
Data extracting;
Data cleansing/scrubbing (Transfer rules/routines);
Data transforming (InfoSource integration);
Data transfer;
Data refreshing, update and maintenance; and
InfoCube monitoring.
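The components listed above are essentially stages of a pipeline. The minimal sketch below illustrates the shape of that pipeline; the stage names follow the list, but the record format and field names are invented for the example and do not reflect BW's actual transfer-rule machinery.

```python
def extract(source_rows):
    """Data extraction: pull raw records from a source system (here, a list)."""
    return list(source_rows)

def cleanse(rows):
    """Cleansing/scrubbing (transfer-rule stage): drop keyless rows, trim text."""
    return [
        {k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
        for r in rows
        if r.get("material") is not None
    ]

def transform(rows):
    """Transformation (InfoSource integration): derive a target key figure."""
    for r in rows:
        r["revenue"] = r["qty"] * r["price"]
    return rows

def load(rows, cube):
    """Transfer/update: append transformed rows into the target store."""
    cube.extend(rows)
    return len(rows)
```

Refresh and monitoring then amount to re-running this pipeline on a schedule and tracking the row counts and rejects each stage produces.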
Presentation System Design
The data access and presentation requirements are translated into a set of technical
specifications for the DW presentation system. The presentation system design should address
issues relating to the three major layers of a presentation system architecture:
Data access layer is concerned with accessing the required data for the analytical
applications addressing issues such as data volume needed for the planned analysis,
level of data aggregation, format of data, data source and its media;
Data analysis layer is concerned with the core analysis tasks such as performance
measurement, data mining, statistical analysis and decision analysis; and
Data presentation layer is concerned with the media used in presenting the analysis
results to the end-users such as visualization, audio presentation, graphical presentation
or other forms of presentation media.
Data Design
The data requirements defined during the Analytical Processing Data Definition Phase, in the
form of a Conceptual Data Model (CDM) with the associated performance measurement,
quantitative measurement and unstructured data entities, are translated into a DW database
design consisting of a Logical Data Model (LDM) and a Multi-Dimensional Data Model (MDM).
Cross-Life Cycle Phases
During the Business Blueprint Stage, several Cross-Life Cycle Phases may have started
including User Documentation, Training, and Acceptance Testing. These phases are briefly
described in the paragraphs below.
Other Cross-Life Cycle Phases, all started during the Project Preparation Stage and continuing
beyond the Business Blueprint Stage, include:
Change Integration;
Prototyping; and
Technology Planning, Support and Cutover.
Depending upon the specific project requirements, each of these Phases may have specific tasks
and steps to be completed during the Business Blueprint Stage.
User Documentation
Identifies and analyses current documentation and defines the purpose, audience, content and
media of any new procedures.
Training
Identifies the personnel to be trained, defines the training scope and strategy, creates the courses
and course materials, completes any pilot courses as appropriate and commences the training
program.
Acceptance Testing
Defines the acceptance test criteria against which the eventual system will be assessed. The test
plans needed to conduct and document the results of the acceptance test are defined and
generally refined during the Business Blueprint and Realization Stages and the final test is
conducted in the Final Preparation Stage.
Project Management
Although project management activities are ongoing throughout the project life cycle, in the
Business Blueprint Checkpoint Phase a formal confirmation of progress against the Project
Charter is completed and plans are developed for the Realization and Final Preparation Stages.
A.7 Analysis and Decision Process Definition
Purpose
To obtain an understanding of the organization's analytical processes within the DW project scope
to provide a basis for determining the data requirements for the DW and the end-user DW access
requirements.
Overview
This task obtains an understanding of the business analysis and decision processes within the
scope of the project to provide a basis for:
Determining the data (content) requirements for the DW, i.e., the data which end-users
require in conducting the analytical processes. These data requirements are defined and
modeled in the Analytical Processing Data Definition Phase which is completed
concurrently with this phase; and
Determining the end-user data access requirements in conducting these analytical
processes. These requirements are the primary drivers for the design and development
of the presentation architecture for the proposed DW system.
Based on the business drivers identified for the project, an understanding of the organization's
business processes and events is obtained to provide a foundation for identifying the analysis
and decision processes. Business event modeling is discussed as a technique to be used in the
analysis of an organization's business processes and events.
The analysis and decision processes generally fall into two main categories:
Performance measurement processes; and
Quantitative analysis processes (e.g., data mining applications).
The characteristics of these processes in terms of their data and data access requirements are
determined. In addition, if possible, typical ad hoc analysis processes and their characteristics
should also be determined.
[Process flow diagram: Analysis and Decision Process Definition]
G1 Analyze Business Processes and Events -> business processes; business events
G2 Identify Performance Measurement Processes -> performance measurement processes and data presentation characteristics
G3 Identify Quantitative Analysis Processes -> quantitative analysis processes and data presentation characteristics
G4 Identify Ad hoc Analysis Processes -> ad hoc analysis processes and data presentation characteristics
G5 Document Analysis and Decision Processes -> analysis and decision process descriptions
G6 Confirm Related Process Models -> confirmed analysis and decision process descriptions
G7 Submit Process Definition Deliverable -> process definition deliverable
Scope of Process Definition
On a DW project, the scope of process definition is typically limited to a high-level understanding
and description of the analytical processes and user access requirements for the purpose of
developing the DW data transformation and presentation systems.
If some or all of these analytical processing applications are also going to be developed, they
should be considered as separate custom or package software development projects.
The scope of the DW project may be expanded to include the development of these analytical
applications or a separate custom or package software development project may need to be
defined.
Critical Concepts of Event-Driven Business Modeling
Event-driven business modeling is the concept of building systems that fully support a business
by capturing all relevant information contained in "business events". The contention is that by
focusing on business events, a comprehensive view of an organization centered around business
processes can be developed as opposed to the more traditional and often limited view that
concentrates on the functions or information processes of an organization.
The critical concepts of event-driven business modeling include:
Information at the higher level is often required on an as-needed basis (as opposed to the often
scheduled reporting of operational performance information); and
Phases of a decision process. A decision process can be described as consisting of
three phases (Herbert Simon, The New Science of Management Decisions, revised
edition, Prentice-Hall, 1977):
Intelligence - searching the environment for conditions that call for a decision. During this
phase, management seeks to gather data to:
perceive future trends having an impact on the organization before they actually
occur, or
more clearly define the problem and provide some input to the solution process;
Design - finding possible courses of action. During this phase, management seeks to
invent, develop and analyze possible courses of action. It involves manipulation of the
data obtained to develop various solutions to the problem, and
Choice - choosing among courses of action. During this phase, management seeks to
select the best among the alternatives developed during design using techniques such as
sensitivity analysis.
Management decision support information needs may be identified by focusing on the information
management needs in each of these decision making phases.
Examples of Quantitative Analysis
Quantitative analysis may be completed in many forms or shapes to support an organization's
business analytical or decision making processes. Some examples include:
A custom quantitative analysis model for a large downstream retail chain, including a
causal-based sales forecasting model that leveraged daily point-of-sale data from
approximately 400 stores to track promotional effectiveness;
Estimating inventory shrinkage at the store level for a convenience store using a sample-based statistical model;
Combining business rules with statistical sampling expertise to improve the market for
service stations.
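A causal sales-forecasting model of the kind described in the first example can be illustrated very simply: estimate the uplift that promotions produce and add it to the baseline. The data format and numbers below are invented; a production model would use regression over many causal factors rather than this two-group comparison.

```python
def promo_lift(daily):
    """Estimate promotional effectiveness as the mean sales uplift on promo days.

    `daily` is a list of (units_sold, on_promo) pairs; the format is invented.
    """
    promo = [u for u, p in daily if p]
    base = [u for u, p in daily if not p]
    if not promo or not base:
        raise ValueError("need both promotion and baseline days")
    return sum(promo) / len(promo) - sum(base) / len(base)

def forecast(daily, on_promo: bool) -> float:
    """Forecast next-day sales: baseline mean, plus promo lift if applicable."""
    base = [u for u, p in daily if not p]
    mean_base = sum(base) / len(base)
    return mean_base + (promo_lift(daily) if on_promo else 0.0)
```

Even this toy version shows why daily point-of-sale data matters: without day-level promotion flags, the lift cannot be separated from the baseline at all.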
Data Mining/Data Profiling
Data mining/data profiling (DM/DP) is the analysis of data, typically using quantitative analysis
techniques, to uncover valuable business information to support an organization's analytical or
decision making processes. The key elements in planning for a DM/DP effort consist of:
The business objectives to be supported; and
The quantitative analysis technique(s) to be employed including the associated input and
output data.
DM/DP addresses five kinds of analysis:
Classification (or pattern recognition, i.e., rules), such as:
Classifying customers into those with "high profit potential" versus those with "low profit
potential" or those "satisfied" versus those "unsatisfied" to:
Target special promotions,
Differentiate level of service provided, or
Improve customer satisfaction;
Classifying locations/channels into those with "high profit potential" versus those with "low
profit potential" to:
Target direct expansion to new locations with highest potential, or
Target incentives to best channels,
Prediction/forecasting such as:
DM/DP techniques include: statistical analysis (e.g. regression, optimization, linear programming,
time series analysis), neural networks and expert systems, fuzzy logic, intelligent agents,
multidimensional analysis, data visualization and decision trees.
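The customer-classification example above can be sketched as a simple decision rule. The thresholds and attributes below are invented for illustration; in a real DM/DP effort such rules would be induced from historical data using the techniques just listed (e.g., decision trees or regression) rather than hand-written.

```python
def classify_customer(annual_revenue: float, retention_years: float) -> str:
    """Toy rule classifying customers into profit-potential segments.

    The thresholds are invented; a fitted decision tree or regression
    model would replace this rule in practice.
    """
    if annual_revenue >= 10_000 and retention_years >= 2:
        return "high profit potential"
    return "low profit potential"

def target_promotion(customers: dict) -> list:
    """Select customers for a special promotion based on the classification."""
    return [cid for cid, attrs in customers.items()
            if classify_customer(*attrs) == "high profit potential"]
```

The classification itself is only half the value; the business action it drives (targeting promotions, differentiating service levels) is what connects the DM/DP effort back to the business objectives.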
Scope of Analytical Process Definition
The main purpose in defining the analytical processes in this phase is to provide a basis for
determining the data requirements for the DW and the data access requirements for the
presentation systems. Actually developing the application systems supporting these analytical
processes is typically outside the scope of a DW project. Thus, the documentation of the
analytical processes as discussed in this task is not at the same detailed level as would be
required for the actual project implementation. However, depending on the specific project
objectives, the scope of the DW project may be expanded or a separate project may be initiated
to encompass the development of selected performance measurement or quantitative analysis
processes.
Understanding business processes/events helps in identifying analytical information needs from an
enterprise perspective. In particular, the characteristics of business processes/events often provide
useful inputs in determining performance measures, quantitative analysis or other analytical processing
information needs, which are the foundation for multidimensional data analysis as discussed in this
phase and in the Data Design phase.
Business events should be identified at the level at which management wishes to plan, control
and evaluate to accomplish specific business objectives; e.g., the focus of an acquisition/
payment process is on purchasing only what is needed and can be paid for, receiving only what is
ordered, paying for only that which is received and making sure that the resources acquired are
available when needed. Thus, while an acquisition/payment process may involve a wide variety of
goods and services (e.g., human resources, financial resources, supplies or inventories), the
basic types of business events may include:
Request the goods or services;
Select a supplier;
Order the goods or services;
Receive the goods or services;
Inspect the goods or services; and
Pay for the goods or services.
As needed, the business events may be further decomposed to lower level events to meet the
needs of a specific analysis or objective.
In determining business events, it is important to distinguish between business events and
information processes. The latter are the activities that record business event data, maintain the
reference data about business events and report the information to authorized users. Thus, the
relationship between the two is that - business events trigger information processes.
The differences between the two concepts can be illustrated using the following examples:
Business Event -> Information Process
Deliver a product to a customer -> Record data about the delivery
Receive a payment from a customer for goods and services provided -> Record data about customer payment
Professional Publishing, 1993]) may be used as a framework for business event analysis. REAL
is an acronym for:
Resources - Organizational resources used by the business event either directly or
indirectly. If it is difficult to identify a resource for a business event, the event is probably
unnecessary or adds little value. Resources may be physical (e.g., buildings, equipment,
supplies, inventories) or conceptual (e.g., employee skills, patents, financial resources);
Event - The business event being modeled;
Agents - People or machines that execute the business event (i.e., play roles or perform
responsibilities with respect to the business event). Agents could be internal (e.g.,
engineers, cashiers, production workers) or external (e.g., customers, suppliers). Roles
may sometimes be performed by machines (e.g., ATM machines, robots); and
Locations - Places where the business event took place.
In REAL modeling, "time" is an attribute of a business event. Alternatively, a "time" dimension may
be explicitly modeled.
Using the REAL business event modeling formalism, business events and the associated
resources, agents and locations are represented in rectangular boxes with lines connecting the
event box with the associated resource, agent and location boxes. Resources and locations are
typically placed on the left side of events and agents are placed on the right side.
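The REAL structure can also be sketched as a small data model. The entity names below (a delivery event with invented resources, agents and locations) are assumptions for the example; note that, per REAL convention, "time" is carried as an attribute of the event itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BusinessEvent:
    """A REAL business event: Resources, Event, Agents, Locations."""
    name: str
    time: str                                            # time as an event attribute
    resources: List[str] = field(default_factory=list)   # physical or conceptual resources used
    agents: List[str] = field(default_factory=list)      # internal/external people or machines
    locations: List[str] = field(default_factory=list)   # where the event took place

# Invented example instance for a delivery event.
delivery = BusinessEvent(
    name="Deliver product to customer",
    time="2002-03-01T09:30",
    resources=["finished goods inventory", "delivery truck"],
    agents=["driver (internal)", "customer (external)"],
    locations=["customer site"],
)
```

Capturing events in this uniform shape is what lets the later data models support many user views of the same underlying business activity.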
Analyzing business events provides a process perspective in determining the information content
of the data models. By capturing the essential characteristics of an organization's business
events, the information needs (or views) of the various information users can be supported. Thus,
analyzing business events facilitates the integration and sharing of data within the organization.
A.7.6 Discuss and confirm the business event analysis results with
management.
Discuss the business events and their characteristics with knowledgeable users, in a structured
walk-through format, as appropriate. Resolve any questions or issues and make any changes as
necessary.
The scope impact on the current DW project must be assessed and discussed with management.
It may be appropriate to prepare presentations for management on the concepts and benefits of a
new performance measurement system. Senior management commitment and support is critical
for the success of a performance measurement project.
Discuss with management and determine whether it is appropriate to expand the scope of the
current DW project or to initiate a separate performance measurement project for the
development of a new performance measurement system.
If the scope of the DW project is to change:
Formally document the amended scope;
Determine the resource requirements;
Prepare an amended work plan and timing and cost estimates; and
Obtain formal written approval of the changes before proceeding.
Any new information created as a result of the analysis processes (e.g., resulting
performance measures);
Whether the analyses are currently performed or are planned in anticipation of the
availability of the DW data; and
Identify the relevant quantitative measurement (QM) data either as DW input to or as DW source
data from the quantitative processes. Define the QM data describing:
The business objectives supported by the QM data;
The quantitative analysis process that uses or creates the data;
The quantitative analysis techniques used in deriving the QM data; and
The interpretation of the resultant QM values focusing on their business implications.
A.8
Purpose
To determine the data requirements for the proposed DW.
Overview
This phase determines the requirements for the DW database being developed. While the DW
system (and its database) may be developed on an incremental basis (e.g., on a subject area by
subject area basis), the development should be guided by an enterprise Conceptual Data Model
(CDM). Otherwise, the various DW databases developed over time may become "islands of
information" with data that may be inconsistent or redundant. An enterprise CDM also ensures
that the DW databases are consistent with the transaction processing databases which are the
major data sources for the DW databases.
As illustrated in Exhibit E-1, an organization's enterprise data model environment can be viewed
as consisting of three data model types:
The Enterprise CDM;
The Analytical Processing/OLAP Data Model in the form of performance measurement
(PM) entities, quantitative measurement (QM) entities and unstructured data entities; and
The Transaction Processing/OLTP CDM.
[Task relationship diagram. Inputs: Process abstract; Data abstract.
H1 Develop Conceptual Data Model -> Conceptual Data Model
H2 Identify Performance Measurement Entities -> Performance measurement entities
H3 Identify Quantitative Measurement Entities -> Quantitative measurement entities
H4 Identify Unstructured Data Entities -> Unstructured data entities
H5 Identify SAP Sources -> SAP Data Mapping document
H6 Assess Required non-SAP Data Sources -> Non-SAP Data Mapping document
H7 Identify Master Data Requirements -> Master Data requirements
H8 Determine Data Transformation Requirements -> Data transformation requirements
H9 Validate Process/Data Definition -> Validated data definitions
H10 Submit Data Definition Deliverable -> Data definition deliverable]
The data outputs of QM entities are also of various types and some examples include:
Estimated probabilities of default for individual bank loans;
Sales forecasts for individual products in specific markets;
Vehicle routing schedules that minimize total transportation costs; and
Targeted lists of prospects for direct marketing campaigns.
The definition of the QM entities should include information such as:
The knowledge worker(s) or analyst(s) responsible for the entity;
The model performance measures (e.g., statistical confidence measures);
The model parameters (e.g., regression coefficients); and
The quantitative techniques used (e.g., neural networks, optimization algorithms).
Thus, the scope of QM data is broad and may overlap that of other entities. In particular, the data
inputs and outputs of the QM entities are often attributes of other entities in the CDM and may
also be classified as performance measures.
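The definitional information listed above can be captured as a simple metadata record. The following is a minimal sketch, assuming hypothetical field names and example values that are not part of the methodology:

```python
from dataclasses import dataclass, field

@dataclass
class QMEntityDefinition:
    """Metadata describing a quantitative measurement (QM) entity."""
    name: str                  # e.g. "Loan default probability"
    responsible_analyst: str   # knowledge worker/analyst responsible for the entity
    technique: str             # quantitative technique used, e.g. "logistic regression"
    model_parameters: dict = field(default_factory=dict)     # e.g. regression coefficients
    performance_measures: dict = field(default_factory=dict)  # e.g. statistical confidence measures

# Hypothetical example corresponding to the bank-loan output type above.
loan_default = QMEntityDefinition(
    name="Loan default probability",
    responsible_analyst="Credit risk group",
    technique="logistic regression",
    model_parameters={"intercept": -2.1, "debt_ratio_coef": 3.4},
    performance_measures={"auc": 0.81},
)
```

A record of this shape makes the overlap with other CDM entities explicit: the entity's inputs and outputs can be cross-referenced as attributes elsewhere in the model.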
With the advances of various multimedia enabling technologies such as data transmission (e.g.,
fiber optics or cable), data compression techniques, special-purpose processors (e.g., video
servers) and improved disk or optical storage capacities, there is an increasing need for an
organization to address unstructured or multimedia data issues in managing its data resource.
Unstructured or multimedia data management issues may fall into one of the following three basic
categories:
Extracting data from unstructured data sources (e.g., extracting specific information such
as new product announcements from news releases);
Efficiently storing the unstructured data (e.g., as Binary Large Objects or BLOBs); and
Delivering data to users in a multimedia format (e.g., a voice presentation).
This task focuses on determining the unstructured or multimedia data requirements in terms of
the data sources, the extraction requirements and the output requirements. Data storage issues
are addressed in Data Design as appropriate.
Determine whether any source data context information is available for verifying the accuracy of
the data extraction processing such as:
Data-provider supplied tags and other uniformities in the data;
Formatting;
Logical structure of the document (e.g., abstract versus text body, titles and headings);
Numbering of items or cross-references;
Arithmetic relationships; and
Fields in semi-structured data.
Determine the input media format such as:
On-line service suppliers (e.g., Dialog, Bloomberg);
World Wide Web (WWW);
Industry databases;
CD-ROMs;
Tapes; or
Diskettes.
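One of the context checks listed above, arithmetic relationships, can be mechanized. The sketch below is illustrative only; the field names are assumptions, and the real checks depend on the data provider's format:

```python
def check_arithmetic_relationship(record, item_fields, total_field, tolerance=0.01):
    """Verify that extracted line items sum to the extracted total,
    flagging records where the data extraction may have gone wrong."""
    items_sum = sum(record[f] for f in item_fields)
    return abs(items_sum - record[total_field]) <= tolerance

# Hypothetical extracted record: quarterly figures plus a yearly total.
rec = {"q1": 10.0, "q2": 12.5, "q3": 9.5, "year_total": 32.0}
ok = check_arithmetic_relationship(rec, ["q1", "q2", "q3"], "year_total")
```

Similar checks can be written for numbering of items and cross-references; records that fail are candidates for a garbled extraction rather than bad source data.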
The following exhibit illustrates the levels of effort for data transformation based on a paradigm of
internal/external and structured/unstructured dimensions.
Identify potential SAP R/3 source system data at the functional module level, which could
satisfy unique BW analysis and reporting requirements
Review potential R/3 data sources with Power User, Data Architect/Modeler, and BW
Analysis/Extraction Developer team members
Identify the GOLDEN Source of R/3 system data
Capture Data Mapping and Transformation Rules (i.e., Source to Target Mapping)
Review all documentation for new BW queries and reports (i.e. report mockups and query
samples)
Collect data samples from Source System, one sample for each data frequency (i.e., daily,
weekly, monthly, etc.)
The data contained in the various sources may vary by frequency.
Review Source data collections for data availability and determine GOLDEN Source of data
Update data mapping document with actual file or database names.
Study current data collection field formats, data types, etc., and determine reformatting, type
conversion needs, and update data mapping document
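The reformatting and type-conversion needs recorded in the data mapping document can be expressed as one rule per source field. A minimal sketch, in which the field names and rules are assumptions for illustration:

```python
from datetime import datetime

# One conversion rule per source field, as captured in the data mapping document.
CONVERSIONS = {
    "order_date": lambda v: datetime.strptime(v, "%d.%m.%Y").strftime("%Y%m%d"),
    "amount": lambda v: round(float(v.replace(",", ".")), 2),  # comma decimal to float
    "plant": lambda v: v.strip().upper().zfill(4),             # pad to a 4-character key
}

def apply_conversions(row):
    """Apply the mapped conversion for each field; pass unmapped fields through."""
    return {f: CONVERSIONS.get(f, lambda v: v)(v) for f, v in row.items()}

converted = apply_conversions({"order_date": "31.12.2001", "amount": "1234,50", "plant": "12"})
```

Keeping the rules in a table like this mirrors the data mapping document itself: each row of the document corresponds to one entry.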
Identify different methods of moving data from the source systems to the BW system, and
update data mapping document
Define the extract processing frequency and mechanism, including whether full updates or delta updates (data changes only) will be performed, whether before/after images will be compared, and whether time-stamped data or log/audit files will be used; update data mapping document
Determine filtration needs and update data mapping document
Identify summarization requirements and update data mapping document
Review all data models and field layouts for each SAP R/3 reporting environment
Review all reporting documentation for each SAP R/3 data source
Identify the following BW components to satisfy End User requirements:
Characteristics
Key figures
Master Data
InfoObjects
Result
To clearly identify and map applicable SAP R/3 source system data that will satisfy the BW query
and reporting needs.
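The full-versus-delta decision described above can be illustrated with a timestamp-based delta sketch. This is a simplification for illustration only; in BW, delta handling is normally provided by the extractor/DataSource rather than written by hand:

```python
def delta_extract(source_rows, last_run_timestamp):
    """Select only the rows changed since the previous extract run,
    using a change timestamp carried by the source system."""
    return [r for r in source_rows if r["changed_at"] > last_run_timestamp]

# Hypothetical source rows with ISO timestamps (lexicographic order == time order).
rows = [
    {"id": 1, "changed_at": "2002-03-01T10:00"},
    {"id": 2, "changed_at": "2002-03-05T08:30"},
]
delta = delta_extract(rows, last_run_timestamp="2002-03-02T00:00")
```

The same shape applies to before/after image comparison: instead of a timestamp filter, each row is compared against its stored prior image and emitted only when it differs.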
Result
To clearly identify data from Non-SAP sources necessary to satisfy end user requirements.
A.8.22 Develop Plan for Capturing Non-SAP Source Data into Flat Files
Purpose
The purpose of this task is to develop a plan for capturing Non-SAP source Data into Flat Files.
Consider utilization of 3rd party Extract and Transform tools (see ETL Strategy Document).
Procedure
Result
Flat files will be generated to load into BW
Collect data samples from Source System, one sample for each data frequency (i.e., daily,
weekly, monthly, etc.)
The data contained in the various sources may vary by frequency.
Review Source data collections for data availability and determine GOLDEN Source of data
Update data mapping document with actual file or database names.
Study current data collection field formats, data types, etc., and determine reformatting, type
conversion needs, and update data mapping document
Identify different methods of moving data from the source systems to the BW system, and
update data mapping document
Define the extract processing frequency and mechanism, including whether full updates or delta updates (data changes only) will be performed, whether before/after images will be compared, and whether time-stamped data or log/audit files will be used; update data mapping document
Determine filtration needs and update data mapping document
Identify summarization requirements and update data mapping document
Review all data models and field layouts for each Non-SAP reporting environment
Review all reporting documentation for each Non-SAP reporting environment
Identify the following BW components to satisfy End User requirements:
Characteristics
Key figures
Master Data
InfoObjects
Result
To clearly identify and map applicable Non-SAP source system data that will satisfy the BW query
and reporting needs.
Result
A clearly defined Master Data Requirements Document
The master data source fields and the master data merging strategy should be clearly defined.
Master data can be sourced from multiple R/3 and Non-R/3 systems.
The rules for mapping the source data to the target analytical data (e.g., calculation or
aggregation); and
The strategies for resolving conflicts between different data sources regarding data definitions
or values.
Result
The exact flow of key figures, characteristics, master data and hierarchies from the source
system to an InfoObject in a report is clearly defined. All aspects of this data flow for end
users should be reviewed and clearly documented in the Data Conversion Document.
Review all key figures, characteristics and hierarchies that can benefit from adding aggregate
tables
Review all combinations of key figures, characteristics and hierarchies that can benefit from
adding aggregate tables
Result
The exact number of aggregate tables to be created in the production environment should be
defined and documented in the Data Conversion Document.
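The effect of an aggregate table can be pictured as pre-grouping a key figure by a subset of characteristics. The sketch below is illustrative only; in BW the aggregate is defined declaratively, not hand-coded:

```python
from collections import defaultdict

def build_aggregate(fact_rows, characteristics, key_figure):
    """Pre-summarize a key figure by the chosen characteristics, so that
    queries at that level avoid scanning the detailed fact rows."""
    agg = defaultdict(float)
    for row in fact_rows:
        key = tuple(row[c] for c in characteristics)
        agg[key] += row[key_figure]
    return dict(agg)

# Hypothetical fact rows: region and material characteristics, sales key figure.
facts = [
    {"region": "EU", "material": "M1", "sales": 100.0},
    {"region": "EU", "material": "M2", "sales": 50.0},
    {"region": "US", "material": "M1", "sales": 70.0},
]
by_region = build_aggregate(facts, ["region"], "sales")
```

A candidate combination of characteristics is worth an aggregate table when queries frequently group at exactly that level.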
Identify how the pre-staging process can be accomplished. Some pre-staging methods are:
Load the data into flat files and use utilities to cleanse the data; or
Store the data in a database and cleanse it using programs.
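The first pre-staging method (flat files plus a cleansing utility) might look like this in outline. The column layout, delimiter, and cleansing rules are assumptions for illustration:

```python
import csv
import io

def cleanse_flat_file(raw_text):
    """Read a delimited flat file, reject rows with a missing key,
    and normalize whitespace before staging the data for the BW load."""
    reader = csv.DictReader(io.StringIO(raw_text), delimiter=";")
    clean, rejected = [], []
    for row in reader:
        if not row["customer_id"].strip():
            rejected.append(row)  # route to an error file for review
            continue
        clean.append({k: v.strip() for k, v in row.items()})
    return clean, rejected

raw = "customer_id;name\n1001; Smith \n;Unknown\n"
clean, rejected = cleanse_flat_file(raw)
```

Rejected rows should be retained and reported rather than silently dropped, so the source data provider can be given feedback on data quality.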
Review the following documents for granularity requirements for each InfoObject:
End User Requirements Document
Current Data Warehouse and Information Access Environment Document
SAP Data Mapping Document
Non-SAP Data Mapping Document
Master Data Requirements Document
Identify granularity rules for each InfoObject in the BW
Some data warehouses summarize data from monthly to quarterly granularity for older years. For example, years 1 through 2 hold monthly granularity data, while data for years 2 through 5 is lightly summarized and stored at the quarterly level.
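The monthly-to-quarterly granularity rule can be sketched as follows (illustrative only; the cutoff year and row layout are assumptions):

```python
def summarize_older_years(monthly_rows, cutoff_year):
    """Keep monthly granularity for recent years; roll older years
    up to quarterly granularity to reduce stored data volume."""
    out = {}
    for row in monthly_rows:
        year, month, value = row["year"], row["month"], row["value"]
        if year >= cutoff_year:
            key = (year, "M", month)                 # monthly granularity
        else:
            key = (year, "Q", (month - 1) // 3 + 1)  # quarterly granularity
        out[key] = out.get(key, 0.0) + value
    return out

rows = [
    {"year": 2001, "month": 1, "value": 10.0},
    {"year": 2001, "month": 2, "value": 5.0},
    {"year": 2002, "month": 1, "value": 7.0},
]
summary = summarize_older_years(rows, cutoff_year=2002)
```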
A.8.32 Estimate Volume
Purpose
The purpose of this task is to identify the size and growth estimates of the target production
database.
Procedure
The exact filtering rules for each InfoObject are clearly defined. All filtering rules for this
data should be reviewed and clearly documented in the Data Conversion Document.
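The size and growth estimate called for in this task can be approximated with simple compounding arithmetic. All figures below are hypothetical inputs, not methodology values:

```python
def estimate_volume_gb(initial_rows, bytes_per_row, annual_growth_rate,
                       years, index_overhead=0.5):
    """Estimate the production database size after N years: initial rows
    compounded by the annual growth rate, times the average row width,
    plus an assumed index/overhead factor."""
    rows = initial_rows * (1 + annual_growth_rate) ** years
    raw_bytes = rows * bytes_per_row
    return raw_bytes * (1 + index_overhead) / 1024 ** 3

# Hypothetical inputs: 10M initial rows, 200 bytes/row, 20% growth, 3 years.
size_gb = estimate_volume_gb(
    initial_rows=10_000_000, bytes_per_row=200,
    annual_growth_rate=0.2, years=3,
)
```

The index/overhead factor varies widely by DBMS and aggregate strategy; it should be treated as a planning assumption and revisited once real load volumes are observed.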
A.8.40 Partition the development of the CDM (only for large, complex data
models).
For large, complex data models, the process of building the CDM may be easier if separate, smaller CDM components are built. Each component CDM is built from a copy of the ERM.
The identification of data elements is achieved by reviewing the processes. Each process is assigned to one and only one component CDM. The number of processes determines the number of component CDMs that need to be created. The processes can be partitioned based on their logical relationships, such as common activities and transaction types of common sub-systems. This partitioning process is not necessary for simpler data models, as all entities can be accommodated on one CDM.
Where necessary, complete the partitioning of the CDM.
A.9
Purpose
To confirm the results of the Business Blueprint Definition stages (Analysis and Decision Process
Definition, Analytical Processing Data Definition, and the Technology Definition) with
management. Additionally, in this Phase, the CPDEP Phase 2 task of Generate and Select
Alternative(s) is completed, and a recommendation and Class 2 Estimate provided for the
DRB/GRT.
Overview
Throughout all of the stages of the DW life cycle, there is a need to monitor and report project
status, to identify and address scope issues and to conduct appropriate Quality Management
Reviews.
The primary objective of this phase is to check progress against the Project Charter and to
update the plan with any required modifications.
Revisions to the plan fall into three broad categories:
Minor changes to static information;
Major revisions to existing sections; and
The addition of new information not previously recorded.
In the Design Checkpoint, many of the items fall into the first category. Items such as project
background, change and issue resolution procedures, project organization and much of the
contractual information will generally change little.
Business Blueprint Definition Checkpoint Task Relationships
[Task relationship diagram. Inputs: Process definition deliverable; Data definition deliverable; Open Issues; Project plans.
I1 Confirm Completeness of the Business Blueprint Definition -> Confirmed Business Blueprint Definition Stage Deliverables
I2 Review Issues -> Issue resolutions
I3 Update Project and Quality Plans -> Updated project and quality plans]
Possible options include reviewing the project and strictly monitoring any deviations from the project work plan in order to identify possible symptoms of a larger problem, or changing the project scope or project team composition to try to resolve the issues.
In this latter instance, assess the impact of making changes to the project plan on the associated
strategy documents developed in this stage. Update the implementation strategy, data conversion
strategy and appropriate testing strategies as necessary.
[Task relationship diagram.
J1 Develop High-level Data Transformation System Design -> High-level Data Transformation System Design; Management Overview Diagram (updated)
J2 Develop Data Extract System Interface Design -> System Diagram; Interface Definitions; Custom Extract Program Logic Specs; Interface Access and Security Definitions
J3 Develop Data Transformation System InfoSource Design Specifications -> InfoSource Specifications; Transfer Routine Specifications; Unit Test Plans (Initial)
J4 Develop Data Conversion Design Specifications -> Data Conversion Design Specifications; Data Conversion Program Specifications; Data Conversion Unit Test Plans
J5 Submit Data Transformation System Design Deliverable -> Data Transformation System Design Deliverable]
[Figure: Data Conversion Blueprint and Realization, showing the BW data flow architecture. Source systems (SAP R/3, non-R/3 and legacy systems) supply data through extract interfaces (tRFC, BAPI/3rd party, interface environment protocols) into Transfer Structures (DataSources). Transfer Rules/Routines map the data into Communication Structures (InfoSources), and Update Rules/Routines load it into InfoCubes via the PSA, populating InfoObjects. The OLAP Engine serves queries, reports and HTML output to users through the BEx Analyzer and BEx Browser; extraction and loading are controlled by the Scheduler/Monitor (InfoPackage).]
Result
Utilize the existing sources for better performance.
A.10.10
Determine the audit and monitoring requirements of the data
transformation system.
Determine the audit and monitoring requirements of the DT system.
The requirements to be considered for the three DT system components include:
Data extract audit requirements:
audit and control points based on the extract system processing model,
high-level audit and control requirements and generalized format,
mapping selected data elements to audit/control requirements/formats,
requirement rules for audit processing including selection/creation/conversions to
common extract output requirements/format,
extract system conversion requirements for audit/control reports including control
totals (e.g., record totals, value totals, hash totals, file control record comparisons),
integrity errors, cleansing errors, range checking, data change reporting,
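The control totals named above (record totals, value totals, hash totals) can be computed mechanically for each extract file; the sketch below uses assumed field names for illustration:

```python
def compute_control_totals(extract_rows, value_field, hash_field):
    """Compute audit control totals for an extract file: the record count,
    the sum of a value field, and a hash total over a key field (a sum with
    no business meaning that nonetheless detects dropped or altered records)."""
    return {
        "record_count": len(extract_rows),
        "value_total": sum(r[value_field] for r in extract_rows),
        "hash_total": sum(int(r[hash_field]) for r in extract_rows),
    }

# Hypothetical extract rows: document number (key) and amount (value).
rows = [
    {"doc_no": "1001", "amount": 250.0},
    {"doc_no": "1002", "amount": 100.0},
]
totals = compute_control_totals(rows, "amount", "doc_no")
```

Totals computed at the extract point are carried in a control record and recomputed at each subsequent audit/control point; a mismatch localizes the processing step where data was lost or changed.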
Determine whether any encryption or other protection is required for sensitive or confidential data
which is contained within the transformation process. If required, define the selected approach.
Develop an approach for providing feedback to the source data provider concerning the usability
and quality of the source data. Some of the alternatives include:
Correcting the source data directly in the source data system;
Correcting the source data through normal operational transactions (e.g., using a
reversing/correcting transaction pair); or
Not correcting the source data, but instead maintaining a cleansed and integrated
operational data store.
Correcting source data is often a sensitive organizational issue. Different approaches may be
appropriate for different source data systems.
A.10.11
A.10.12
Purpose
To identify and specify the interface requirements for data extraction from the source systems,
both existing and proposed.
Overview
This task develops the specification for data extraction from the source feeder systems. These
requirements will need to include a provision for any ongoing balancing procedures that may be
required.
A.10.13
Refine the interface requirements for data extraction from the
source systems.
Review the Management Overview Diagram(s) or other Abstract diagrams/documentation to
confirm which interfaces are required. Include also the refresh/update interface requirements.
Obtain and review any requirement or design specifications that have been prepared as outputs
from the Prototyping Phase.
Identify the data elements that will be extracted.
Define the file layouts or extract structures for each interface required as input to and output from
the new system.
The aim should be for input files to provide data to the data extract system in a form that can be
processed through the normal input programs and validation routines in the same way as any
other input data.
A.10.14
When designing interfaces, it is imperative that a careful review of each source system is
completed (i.e., the system that will provide the interface input). The review should address:
Level of detail - whether the system is capable of generating the necessary information at
the appropriate level of detail or whether some additional interim processing will be
needed (e.g., does a payroll system provide pay information at a departmental total level;
at an individual employee pay level; or at an individual employee pay level sub-divided by
job);
Information availability - whether all of the data required for input is able to be generated
and in the appropriate format (e.g., for an interface to a general ledger system, does the
source system provide records or data for each type of financial transaction completed or
are some transaction types ignored?);
Internal integrity - whether the source system balances within itself (e.g., for an inventory
system: opening balances + receipts - issues +/- adjustments = closing balance or with
an accounts payable system for each check run: opening balance value - invoices paid +
credit notes adjusted +/- adjustments = closing balance value) and reports are available
from the system with this type of information to assist with the ongoing balancing of the
new systems, both for reconciliation purposes and for error correction if the systems do
not balance;
System maintenance - whether the source system is properly maintained to ensure data
integrity (e.g., DBMS integrity utilities are regularly run to ensure that any corrupted data
is corrected);
Validation/error checking - whether the input validation and error checking for the source
system itself is sufficient to prevent inaccurate or incomplete data from entering the
source system and thus the interface; and
Currency/timing - whether the source system is kept current and up-to-date and that the
interface data is available with the desired timing on a consistent basis both during the
financial year and at year-end.
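The internal integrity rule quoted above for an inventory system (opening balance + receipts - issues +/- adjustments = closing balance) can be checked mechanically. A minimal sketch, with a tolerance parameter as an assumption:

```python
def inventory_balances(opening, receipts, issues, adjustments, closing,
                       tolerance=0.005):
    """Check a source system's internal integrity:
    opening + receipts - issues +/- adjustments should equal closing."""
    expected = opening + receipts - issues + adjustments
    return abs(expected - closing) <= tolerance

# Hypothetical balances: 500 + 120 - 80 - 10 = 530, so the system balances.
ok = inventory_balances(opening=500.0, receipts=120.0, issues=80.0,
                        adjustments=-10.0, closing=530.0)
```

The accounts payable check-run rule in the same bullet follows the identical pattern with different terms, so one parameterized check can serve several source systems.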
Review each source system. Where current systems are either old or poorly documented, this
may require considerable effort or an alternative approach (e.g., use of software reengineering
tools to re-document the system).
Similar considerations are required for interfaces that are output files from the newly developed
system and the integrity of the systems which will receive any interface output files.
A.10.15
A.10.16
Design and document each interface including the access and security definitions and Unit and
Component Test Plans. Also include the design for the data refresh/update interface.
Determine the volumes of records for each interface and also the timing and availability of the
data (e.g., daily, weekly or monthly). Also determine whether there are variations in volumes
caused by such items as holidays, peaks (e.g., summer), month-end or year-end.
The volumes may also impact the timing of the interfaces (e.g., if the interface is only
required weekly or monthly but this would cause interim storage or processing difficulties, it may
then be necessary to run the interface on a daily basis to improve efficiency).
Define the ongoing interface system balancing:
Identify potential changes in the organization's methods of operation that are likely to
occur as a consequence of implementing a new system and the interface balancing
impact of these changes;
Ensure that the ongoing integrity of the system is maintained by establishing and
documenting the procedures for reconciliation and balancing of the interface data.
Consider:
interface sub-system balancing (e.g., the integrity of the source system before the
interface data is presented),
internal data completeness of the new system application itself (e.g., total opening
balance plus receipts minus issues plus/minus transfers plus/minus adjustments
equals closing balance), and
physical record numbers through such means as run activity control numbers for both
the source and receiver systems; and
In addition to the technical design, define the clerical procedures to ensure that the
system is continually reconciled. This should include the responsibilities and tasks that
must be completed:
daily,
weekly,
monthly, and
year-end.
Define the interface program/procedure specifications. Where special programs need to be
written to provide interfaces or where existing sub-systems need to be modified to produce
interface data with the correct degree of integrity and at the appropriate level of detail, prepare
program design specifications.
For BW, interface designs can follow one of the following approaches, and can depend primarily
on the nature of data transformation requirements and source data analysis:
Utilize standard Business Content extractors: In this case, no extra design work is
necessary apart from the previously mentioned interface system balancing concerns and
reconciliation procedures. The extractors should already have been installed previously
during the Technology Planning, Support, and Cutover phase.
Extend Business Content extractors: In cases where the standard content extractors do
not provide all of the necessary data elements to be brought into the BW, develop
specifications that define which new data fields to add to the extractors. Follow the
necessary steps as defined in THE SERVICE ARM OF THE FIRM BW Cook Book or
documentation to add the proper fields and refresh the meta data in BW.
Generic Data Extractor: Using the source system analysis from the Data Definition and the
high level designs, develop specifications for building DB views or ABAP/4 Query
specifications as sources for the Generic Data Extractor.
ETL tool: If none of the above meets the project requirements for source data extraction,
and an ETL tool has been selected and chosen, follow the specific tool guidelines and
approaches in developing or generating extraction programs.
Custom ABAP extraction: The following steps represent standard interface design
considerations from an R/3 system. Follow any standards in place for non-R/3 systems being
interfaced from:
1. Identify the data which has to be exchanged: The data of the business transaction and the
way the data has to be exchanged must be determined for each interface based on the
business objects. The structure and typing of the data could be pre-determined by R/3
interface definition or there are recommendations for their handling (see specific scenario
with its business objects in the "Interface Advisor").
If no predefined business scenario was used, a custom data structure can be defined and
filled from specifically identified data sources in SAP (outbound) or for SAP (inbound). The
use of standard formats and structures is recommended (e.g., the SAP IDoc container
structure). In certain cases it may be beneficial to use a newly defined data structure.
Standard structuring becomes more important when many different systems interact with
each other. There are certain vendors for conversion software offering various interfacing
techniques and arbitrary formats for data exchange with external systems.
2. Determine the roles of the systems and how the data handling has to be done: There are two
basic types of requests passing through an interface:
(1) requests for processing the passed data in a remote system and
(2) requests for getting information for analysis purposes.
Both types of request could be addressed by one single interface. The data transfer for
requests processing data in a distributed transaction has to keep the data consistent. Several
processing issues have to be considered on this subject.
For data integrity reasons, specific parts of the data form logical units of work (LUWs):
creating the transfer data and maintaining the application data in the sending system;
sending the data and setting the transmission status; updating the application data and
processing status in the receiving system; and sending back the acknowledgement and
setting the transmission status.
The "lead" system maintaining the business objects data and triggering the data exchange
has to be determined.
Depending on the business needs the serialization of the transferred data packets and their
processing has to be implemented. This could drastically increase the complexity of an
interface considering the implementation logic and error handling.
The received data has to be processed only once in the receiving system. This is especially important where duplicate transmissions can occur (e.g., after an error and retry).
A.10.17
Complete a structured walk-through of the extract system
interface design.
Discuss and confirm each interface's (input and output) detailed design by completing a
structured walk-through of each design.
Submit and discuss each interface design with management. Resolve any questions and issues.
This step may involve more than one meeting.
Update the Management Overview Diagram and other documentation as appropriate as a result
of the various interface system design specification preparations.
Obtain formal written approval of the design of each different interface.
A.10.18
Develop Data Transformation System InfoSource Design
Specifications
Purpose
To complete and assemble the data transformation system InfoSource design specifications.
Overview
For the custom developed data transformation components, this task brings the results of the
previous tasks together into a final InfoSource specification package for each procedure,
elementary process and common logic module. The InfoSource specification package will be
used in the Realization Stage to develop the BW components for the application.
The format and content of the InfoSource specifications will vary depending on the development
language and/or approach. The InfoSource distribution guidelines and the interaction layer
architecture defined in this phase are both used to determine the most appropriate development
approach, before developing InfoSource specifications.
To prepare an InfoSource specification package, prepare Edit/Validation Specifications and
identify associated error processing, identify Table Definitions for codes/decodes, develop detail
logic requirements and then assemble all additional relevant documentation into the package.
A.10.19
The Communication Structure defines the format of the InfoSource. Within the overall
transformation architecture, Communication Structures represent the end-point of transformed,
integrated, subject area-based data. They contain mapped fields from source system
DataSources (see below) as well as derived fields as identified in the Data Definition.
A.10.20
The Transfer Structure represents the format of incoming data from source system interfaces. It
defines the layout of DataSources, from which InfoSources can be mapped. A DataSource can
map to one or more InfoSources, and an InfoSource can combine one or more DataSources to
integrate the required data.
A.10.21 Prepare Edit/Validation Specifications for Local Transfer Routines.
Data integrity must be maintained by preventing invalid data from entering the system. The user
must be notified when entered information is invalid, told why it is invalid, and allowed to
re-enter corrected data.
Each data element has a characteristic format that must be enforced when it enters the system.
Typically, data element level edits can be completed without reference to any other stored
information. Such edits include checks for numeric data, null entries, date formats, valid dates,
syntax edits, range checks and required fields.
Data must also be compared and cross-validated against other data for illogical combinations of
data, reasonableness checks and data comparisons.
Document the edit/validation rules so that they can be considered for subsequent implementation
as part of a stored procedure(s).
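As a sketch of how such edit/validation rules might be documented in executable form (the field names, formats and thresholds below are hypothetical illustrations, not part of the methodology):

```python
from datetime import date

# Field-level edits: each check needs only the value itself.
def validate_fields(record):
    errors = []
    if not record.get("customer_id"):                          # required-field check
        errors.append("customer_id is required")
    amount = str(record.get("amount", ""))
    if not amount.lstrip("-").replace(".", "", 1).isdigit():   # numeric check
        errors.append("amount must be numeric")
    return errors

# Cross-field edits: values are validated against each other.
def validate_cross(record):
    errors = []
    start, end = record.get("start_date"), record.get("end_date")
    if start and end and end < start:                          # illogical combination
        errors.append("end_date precedes start_date")
    try:
        if float(record.get("amount", 0)) > 1_000_000:         # reasonableness check
            errors.append("amount fails reasonableness check")
    except (TypeError, ValueError):
        pass  # non-numeric amounts are already caught by the field-level edit
    return errors

record = {"customer_id": "", "amount": "12x",
          "start_date": date(2002, 5, 1), "end_date": date(2002, 4, 1)}
errors = validate_fields(record) + validate_cross(record)
```

Documenting the rules this way makes them directly reusable when they are later implemented as stored procedures or transfer routines.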
A.10.22
Table Definitions relate to code/decode definitions. They are small utility data files for look-up or
data validation purposes. Each must be developed in terms of structure and, if reasonably small,
data content. Decisions on whether the table should be physically maintained as part of an
InfoSource, as part of the Data Dictionary, in a separate file or in a database will have an impact
on the tasks completed in both this phase and the Data Design Phase.
During this step, identify all known tables and gather documentation regarding valid values for
codes/decodes. This will help in preparation of test data, detail test steps and data conversion
activities.
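A minimal illustration of such a code/decode table used for look-up and validation (the country table follows the example used later in this document; the helper functions are hypothetical):

```python
# Hypothetical code/decode table: the code value is the primary key,
# the code name is the decode.
COUNTRY_CODES = {"US": "United States", "DE": "Germany", "JP": "Japan"}

def decode(table, code):
    """Return the decode for a code value, or raise on an invalid code."""
    if code not in table:
        raise ValueError(f"invalid code: {code!r}")
    return table[code]

def is_valid_code(table, code):
    """Data-validation use of the same table."""
    return code in table
```

Gathering the valid values in this form also yields ready-made input for test data and data conversion checks.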
A.10.23
Specify the associated action (e.g., Help message, error or warning message) for each edit and
validation.
A.10.24 Prepare initial test plans for each InfoSource and component grouping.
Develop test objectives, test conditions and general expected results at the InfoSource level. This
provides valuable input for the BW developer. The Unit Test Plan further describes nuances of the
logic that the BW developer must understand and will often validate the other parts of the
InfoSource specification.
A.10.25 Conduct a structured walk-through of the InfoSource design specifications.
Review and cross-reference the InfoSource design specifications against the application level
documentation as confirmed in the Analysis Checkpoint Phase.
Conduct a structured walk-through, within the project team, to confirm the completeness and
consistency of each design specification and to ensure compliance with both quality and
development standards.
A.10.26
Any shortcomings in the design should be resolved through the formal issue resolution process
and changes made, where appropriate.
A.10.27
Purpose
To determine what data needs to be created and/or converted and to plan how this will occur.
Overview
Other than scope, no set of issues has a more profound impact on project planning than the data
conversion strategy.
The current and proposed data elements are analyzed to develop a systematic approach for
capturing currently available information and creating any new data for the proposed system.
During this task, a data conversion plan with resource estimates and data conversion
requirements is prepared.
A.10.28
The data in the existing systems, whether manual or computer-based, is reviewed to determine
its condition. Check:
How often and in what ways is the existing data used or managed;
Completeness and accuracy of data. Computer programs may have to be written to
check the existing system for such items as blank fields, missing or inconsistent data;
Whether there is any non-current data in the system;
Whether the system has been balanced and reconciled regularly;
Whether the system has its own internal integrity (e.g., opening balances + receipts - issues +/- adjustments = closing balances);
If the system has sufficient input validation and error checking to prevent any invalid data
from entering the system; and
If DBMS software is used, whether the housekeeping routines have been run regularly to
check the integrity of the DBMS.
Depending upon the condition of each source system, a detailed review of some existing
applications may be required to determine the integrity of existing data. If some of these systems
are either old or poorly structured/documented, this may require considerable effort or a different
approach to extract and review the present data, e.g., the use of software reengineering tools to
document old programs or extract data.
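The internal-integrity rule mentioned above can be expressed as a small reconciliation routine (a sketch only; the figures are invented):

```python
def balances_reconcile(opening, receipts, issues, adjustments, closing, tolerance=0.0):
    """Check the internal-integrity rule:
    opening balances + receipts - issues +/- adjustments = closing balances."""
    expected = opening + receipts - issues + adjustments
    return abs(expected - closing) <= tolerance

# A system that reconciles, and one that does not:
ok = balances_reconcile(opening=1000, receipts=250, issues=100, adjustments=-50, closing=1100)
bad = balances_reconcile(opening=1000, receipts=250, issues=100, adjustments=-50, closing=1150)
```

Running such a check across the existing files quickly flags the periods or accounts that need detailed review before conversion.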
A.10.29
Develop a Data Conversion Map by assigning the old data elements in the existing system to the
data elements in the new system to decide which data is:
To be transferred;
To be translated/converted;
Redundant;
New and has to be created and by what means; and
To be verified/balanced/reconciled.
This is a very important part of determining what needs to be completed in the data conversion
process. Careful mapping should provide more accurate information on the time and resources
required for data conversion.
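A Data Conversion Map of this kind can be sketched as a classified list of element mappings (the element names and dispositions below are hypothetical):

```python
# Hypothetical Data Conversion Map: each new-system element is classified
# with its disposition and, where applicable, its old-system source.
conversion_map = [
    {"new": "PARTNER_ID", "old": "CUST_NO",   "disposition": "transfer"},
    {"new": "COUNTRY",    "old": "CTRY_CD",   "disposition": "translate"},
    {"new": "CREATED_ON", "old": None,        "disposition": "create"},
    {"new": None,         "old": "FILLER_01", "disposition": "redundant"},
]

def elements_by_disposition(cmap, disposition):
    """Filter the map, e.g., to list all elements that must be newly created."""
    return [e for e in cmap if e["disposition"] == disposition]

to_create = elements_by_disposition(conversion_map, "create")
```

Summarizing the map by disposition gives a direct basis for the effort and resource estimates mentioned above.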
A.10.30
Determine the volumes of each file to be converted. Initial estimates may need to be adjusted
following purging/cleansing.
Determine volumes of data that need to be created. Initial estimates may need to be further subdivided, depending upon the various means of creation determined in subsequent steps.
If opening balances are to be created in the new system, determine the volumes of data for
creation and conversion for these types of records.
If tables have to be created or converted, derive their volumes.
A.10.31 Determine the source data for conversion and the conversion method.
Once the required conversion data has been identified, the source for each data element must be
identified. These data elements may be currently maintained on an existing computerized system
or on paper documents or for some new systems, it may all have to be created.
Once the source and the format are determined, the method of conversion may be decided.
Select from the alternative methods to create the new master file data and table files including:
Keying the data;
Service bureau (keying);
Computer software bulk or mass maintenance routines;
Custom programs;
Utilities;
File conversion software;
Scanners to scan data; and
External databases (buy data).
Generally, a combination of these different alternatives is used to create the data. Note the
source of the input for each data field in the new system.
Select the most appropriate method to create the new system's opening balances and history
files where needed.
The source of each data element and the conversion method should be noted on the Data Conversion Map.
A.10.32 Define required selection and purge criteria, creation and translation rules.
The data requirements must be reviewed to determine if:
Any limiting selection criteria are necessary (i.e., only source data after a certain date will
be converted);
Any purge criteria are necessary. The purge criteria are basically the opposite of the
selection criteria. Selection criteria specify what to include; purge criteria specify what to
exclude; and
Any translation of data formats is required.
It should be noted that selection, purge and translation rules may not be all encompassing. In
most instances, human intervention and decision making will be required to decide if some
information should be converted or not converted and exactly how it should be translated or filled
out.
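The three rule types can be sketched together as follows (the cut-off date, purge status and translation table are all hypothetical examples):

```python
from datetime import date

CUTOFF = date(2000, 1, 1)                          # hypothetical selection criterion
STATUS_DECODES = {"A": "ACTIVE", "I": "INACTIVE"}  # hypothetical translation rule

def selected(record):
    """Selection criterion: only source data after a certain date is converted."""
    return record["last_activity"] >= CUTOFF

def purged(record):
    """Purge criterion: the opposite of selection -- what to exclude."""
    return record["status"] == "X"   # e.g., logically deleted records

def convert(records):
    out = []
    for r in records:
        if not selected(r) or purged(r):
            continue
        # Translation: old one-character status codes to the new format.
        out.append(dict(r, status=STATUS_DECODES.get(r["status"], r["status"])))
    return out

source = [
    {"id": 1, "last_activity": date(2001, 3, 1), "status": "A"},
    {"id": 2, "last_activity": date(1999, 6, 1), "status": "A"},  # fails selection
    {"id": 3, "last_activity": date(2001, 3, 1), "status": "X"},  # purged
]
converted = convert(source)
```

Records that fall outside these automated rules are exactly the cases requiring the human intervention noted above.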
A.10.33 Review data conversion requirements to determine error identification and cleansing strategies.
The data conversion requirements are reviewed to determine edit and validation requirements
and to determine error handling procedures. The Edit/Validation Specifications defined for the
application system being developed should assist in determining the error identification
requirements.
This step may result in the identification of additional purge criteria that should be added to the
purge criteria specified in the previous step. Identify data cleansing strategies and document the
data cleansing procedures. In addition, it is important to ensure that:
The approach to data clean-up has been thoroughly planned to ensure that
dependencies between data items are maintained;
Data items to be amended and enhanced have been identified;
Criteria have been agreed for determining the data conversion rules;
An auditable trail of the changes applied is produced for management review and sign-off
by the data owner or nominated deputies;
All changes applied are reversible or can be backed out using back-up copies of the data;
and
Changes are never applied directly to production data.
If during the conversion process it is discovered that the existing system contains anomalies such
as incorrect information or items not capable of being reconciled, these may need to be written
off. If this has an impact on the organization's financial statements and/or formal management
approval is required, agreement must be reached on the responsibility for, and the methods used
to make, these adjustments.
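A sketch of a cleansing step that honors the controls listed above: changes are applied to a working copy rather than production data, an auditable trail of every change is kept for sign-off, and the original remains available for back-out (the data values are invented):

```python
import copy

def cleanse(records, fixes):
    """Apply data fixes to a working copy (never to production data),
    recording an auditable trail of every change for management review."""
    working = copy.deepcopy(records)     # the original stays untouched for back-out
    trail = []
    for key, field, new_value in fixes:
        old_value = working[key][field]
        working[key][field] = new_value
        trail.append({"record": key, "field": field,
                      "before": old_value, "after": new_value})
    return working, trail

source = {"C001": {"city": "N.Y."}}
cleaned, audit = cleanse(source, [("C001", "city", "New York")])
```

The `before`/`after` trail is what the data owner or nominated deputy reviews and signs off.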
A.10.34
Specify the tests that will be needed to prove that converted data is sufficiently "clean" to be used
and that data inaccuracies have not been introduced during the conversion process.
A.10.35 Develop data conversion acceptance criteria and acceptance test strategy.
Derive the conditions that will determine that the data conversion has been successfully
completed and documented.
After the data conversion acceptance criteria are defined and agreed, develop a strategy for
conducting the acceptance testing of the data conversion process.
The definition of the criteria and the strategy may need to include internal and external audit
resources as well as the definition for responsibility for formal written sign-off of completion of the
testing process.
The criteria must be clear, explicit, measurable and verifiable.
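As an illustration, acceptance criteria expressed as clear, measurable and verifiable checks might look like this (the record counts and error-rate threshold are invented examples):

```python
def acceptance_check(source_count, target_count, error_count, max_error_rate=0.001):
    """Measurable, verifiable acceptance criteria (illustrative thresholds):
    record counts reconcile, and the error rate stays below an agreed limit."""
    results = {
        "counts_reconcile": source_count == target_count + error_count,
        "error_rate_ok": error_count / source_count <= max_error_rate,
    }
    results["accepted"] = all(v for k, v in results.items() if k != "accepted")
    return results

outcome = acceptance_check(source_count=100_000, target_count=99_950, error_count=50)
```

Each criterion produces a yes/no result, which makes the formal written sign-off unambiguous.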
A.10.36
A.10.37 Define staff resources and project management required for conversion.
Depending upon the size and nature of the data conversion effort required, a separate project
team may be needed to complete the various tasks.
If a separate project team is to be created, derive the type, quantity and level of personnel
resources required. Consider the mix of skills needed including:
Detailed knowledge of the existing systems and data;
Clerical effort for manual reconciliations;
Programming effort to cleanse, purge and list data and to create conversion programs;
Systems programming skills to create the data conversion system;
Project management skills to manage the data conversion project team; and
Audit skills to approve reconciliations and/or write-offs.
Determine the availability and scheduling for all resources.
The entire data conversion process should be treated as if it were a separate systems
development project. Thus, the normal project management controls should be in place,
including:
Adherence to a structured systems development methodology;
Use of project management tools and techniques to define and manage resources,
outputs, the critical path, costs and time;
Project risk assessment; and
Definition of the responsibility for approval and the means for any write-offs or adjustments
discovered as part of the data conversion process.
Define and agree the project management structure required.
A.10.38 Assemble the data conversion strategy and prepare a data conversion work plan.
Assemble the various inputs from the work steps into a data conversion strategy document.
Prepare a detailed data conversion work plan that includes:
Project management strategy, responsibilities and organization structure;
Staff resources required;
What data will be converted;
The source of each data element;
The cleanliness of the data and the strategy for cleansing;
The conversion, selection and purging rules;
The acceptance criteria and acceptance testing strategy;
The data conversion reconciliation procedures and documentation filing requirements;
The different conversion means;
Hardware and technology component resources required;
When the data conversion will begin;
How long the data conversion is estimated to take; and
Whether mock conversion will be employed for testing purposes.
Hundreds of detailed steps need to be accomplished in a precise sequence during a data
conversion process and the migration of the application to production status. To ensure that all
factors are considered, a controlling schedule/checklist must be designed and prepared. The use
of critical path techniques and software is most appropriate in this area.
A.10.39 Submit and discuss the data conversion plan with management.
Submit the Data Conversion Plan to management and resolve any questions and issues. If
necessary, prepare an action list of items to modify. This step may require more than one
meeting.
A.10.40
Obtain formal written approval of the data conversion plan including agreement of the
responsibility for the sign-off of the completed data conversion reconciliations.
A.10.41
Purpose
To assemble the various DT system design deliverables.
Overview
When approved by management, the design specification assembled in this task will be refined in
the Realization Stage and be developed as InfoSources.
The DT system design deliverable is discussed with management and approved. In some
circumstances, this task may be delayed and completed as part of the Business Blueprint
Checkpoint Phase.
A.10.42
Review all of the documentation for each design specification and ensure that all of the relevant
work products are present. Ensure the consistency and completeness between individual
InfoSource specification packages.
Review the Management Overview Diagram or other relevant Abstract documents to ensure that
any changes resulting from the completion of the DT System Design have been correctly
included.
A.10.43
Conduct a structured walk-through, within the project team, to confirm the completeness of each
design specification and to ensure compliance with both quality and development standards.
A.10.44
Any problems in the design should be resolved through the formal issue resolution process. All
open DT system design issues should be resolved before the completion of this phase.
A.10.45 Submit and discuss the data transformation system design deliverable.
Discuss the DT system design deliverable with management and resolve any questions and
issues. If necessary, prepare an action list of items to modify. This step may require more than
one meeting.
A.10.46 Obtain formal written approval of the data transformation system design deliverable.
[Diagram: Presentation System Design process flow]
Inputs: Process Definition; Data Definition.
K1 Develop High-Level Presentation Design - output: High-Level Presentation Design.
K2 Create Authorization Detailed Design (input: Chevron Security Policy) - output: Authorization Design.
K3 Develop Presentation System Interface Design - outputs: Query specifications; Workbook specifications; Structure specifications; Restricted Key Figure specifications; Calculated Key Figure specifications; Variable specifications; Unit Test Plan (Initial).
K4 Submit Presentation System Design Deliverable - output: Presentation Design Deliverable.
Central office;
Regional office;
Branch office; or
Stand alone office.
The system diagram should highlight the interfaces between the various technology components
of the respective presentation system environments addressing issues such as:
Organizational structure;
Performance measurement;
Decision support;
Business processes;
Technology infrastructure; and
Geographical distribution of user groups.
Within this broader organizational framework, the presentation system architecture requirements
identified in Phase G are analyzed and consolidated into a consistent overall presentation
system. The key user requirements driving the design of the presentation system include items
such as:
Based on the identified presentation system architecture requirements, determine whether one or
more analysis tools would be required.
A.11.6 Discuss and obtain formal written approval for the high-level presentation system design.
Ask Questions
What is the security policy in your organization?
What level of security does your data require?
How much risk can you permit in each application area? The less risk the organization
assumes, the greater the resources needed to implement BW Security.
Communicate your Security Implementation Concept
Conduct a workshop with the authorization administrators and application areas (data
owners) to review the BW Authorization Concept, and how it impacts data owners.
Explain the implementation framework. Then interview each application area.
Can users in the department execute a selected set of transactions (DO SOME), and
have they access to data across the organization (SEE ALL)?
Can users in the department execute a selected set of transactions (DO SOME), and
have they access to only specific data across the organization (SEE SOME)?
Review the enterprise-wide job role matrix before you build the activity groups.
Determine the security options (authorization object options) for each job role. You can do
this by creating a sample activity group for a job role and generating its authorization
profile. Discuss the security options (for example, setting up authorization groups, and so
on) with each application area (data owner), based on the security options you have
concerning the automatically generated and, therefore, necessary authorization
objects. Application data owners must be aware of the potential risks (such as, excessive
accesses) that result from their decisions.
A.11.10
Purpose
The purpose of this task is to obtain information about the types of data, and who owns the data.
To set up access to the data, you must specify which data a user uses, and the type of access the
user must have. For example, a user can be allowed to change certain data, and to display other
data. Access to remaining data must be denied to the user.
Review
Review the enterprise-wide job role matrix before you build the activity groups. Determine
the security options (authorization object options) for each job role. You can do this by
creating a sample activity group for a job role and generating its authorization profile.
A.11.11
Purpose
The purpose of this task is to identify the needs of users to access information, such as business
data not directly related to job functions. Other types of access include access to SAPoffice and
online documentation.
Determine the need for access to BW System services, such as printing, faxing, and SAPoffice
for each user.
Procedure
Identify the access rights or service use users need for their job functions. Such access rights
can include SAPoffice, online documentation, reporting, Online Service System access, and
basis service permissions such as printing and faxing.
A.11.12
Purpose
To design the presentation system interface both with the users and the other components of the
BW system.
Overview
This task develops an interface design for the BW presentation system in terms of both the user
access interface and the interface between the presentation system and the other components of the
BW system.
A.11.13
Purpose
The purpose of this task is to define required presentation/layout preferences or layout sets for
BW standard reporting. Identify the processes and functions where layout sets must be used.
Determine how documents must be formatted to reflect the corporate standards.
Procedure
The procedure describes how to create a Layout Set definition. Obtain the information needed to
develop standard company reports and forms.
A.11.14
Purpose
The purpose of this activity is to determine uses of SAP pre-configured Reports.
Procedure
A.11.15
In addition to the preparation of standard query definition templates, the following should also be
performed:
A.11.16
A.11.17 Conduct a structured walk-through of the Business Explorer Object specifications.
Review and cross-reference the program design specifications.
Conduct a structured walk-through, within the project team, to confirm the completeness and
consistency of each design specification and to ensure compliance with both quality and
development standards.
A.11.18
Any shortcomings in the design should be resolved through the formal issue resolution process
and changes made, where appropriate.
A.11.19
Purpose
To assemble the various presentation system design deliverables.
Overview
When approved by management, the design specification assembled in this task will be refined
and coded in the Realization Stage.
The presentation system design deliverable is discussed with management and approved. In
some circumstances, this task may be delayed and completed as part of the Business Blueprint
Checkpoint Phase.
A.11.20
Review all of the documentation for each design specification and ensure that all of the relevant
work products are present. Ensure the consistency and completeness between individual
program specification packages.
Review the Management Overview Diagram or other relevant Abstract documents to ensure that
any changes resulting from the completion of the presentation system design have been correctly
included.
A.11.21
Conduct a structured walk-through, within the project team, to confirm the completeness of each
design specification and to ensure compliance with both quality and development standards.
A.11.22
Any problems in the design should be resolved through the formal issue resolution process. All
open system design issues should be resolved before the completion of this phase.
A.11.23 Submit and discuss the presentation system design deliverable.
Discuss the presentation system design deliverable with management and resolve any questions
and issues. If necessary, prepare an action list of items to modify. This step may require more
than one meeting.
A.11.24 Obtain formal written approval of the presentation system design deliverable.
[Diagram: Data Design process flow]
Inputs: Conceptual Data Model; High-level Data Transformation; Data Transformation System Design; Master Data requirements.
L1 Develop Logical Data Model - output: Logical Data Model.
L2 Develop Multidimensional Data Model - output: Multidimensional Data Model.
L3 InfoCube Design - outputs: InfoCubes; Aggregates.
L4 Submit Data Design Deliverable - output: Data Design deliverable.
Refining the definitions of attributes, including defining any new or overlooked business
entities and attributes, placing each entity into third normal form, determining the primary
and foreign keys and finalizing the definition of the attributes.
New entities and attributes may be identified during the Data Transformation System
Design and Presentation System Design Phases; e.g., some new attributes may have
been created to manage and/or control application processes or to reduce redundant
processing (e.g., storing totals to eliminate frequent recalculation of those totals). Some
entities or attributes may simply have been overlooked in developing the CDM.
Each entity is placed into third normal form. When an entity has several candidate keys, a
primary key is selected. Foreign keys are also determined for implementing the identified
entity relationships.
The definitions of each entity and attribute are reviewed to ensure that all appropriate
information to be recorded in the Data Dictionary is complete and up-to-date;
The entity relationships defined in the CDM are reviewed and refined as necessary from a
database design perspective considering such design issues as:
refining entity relationships based on design criteria (e.g., resolving many-to-many
relationships),
refining subtyping hierarchy structures,
determining entity relationship cardinalities, and
defining referential integrity requirements; and
Estimating the data volumes for use in determining data storage requirements.
The data volumes for the implementation of the database environment are estimated considering:
volumes of all data for conversion,
historical data requirements,
The data volume estimates need to be reviewed and revised as necessary as the LDM evolves or
as more knowledge is gained about the application environment.
Determine whether the code set is extensible (i.e., whether the code values are static or
dynamic); and
Determine the ownership of the code sets (i.e., the user groups responsible for defining
and maintaining the code sets).
A code set is defined as a code table entity consisting of at least two attributes - the data code
value and the data code name (e.g., the Country Code Table consists of COUNTRY CODE and
COUNTRY NAME). The primary key for a code table is the attribute representing the data code
value.
A code table may be associated with one or more entities to characterize the occurrences of the
related entity. Thus, the data code attribute (e.g., COUNTRY CODE) in the related entity is the
foreign key for accessing the Country Code Table entity.
Review the attributes of each entity to determine whether any attribute or group of attributes
occurs multiple times for each occurrence of the primary key of the entity.
Remove any repeating attributes from the entity and create a new entity for the repeating
attributes. This new entity should have a one-to-many relationship back to the original entity.
Typically, this new entity will be an associative or characteristic entity and inherit the primary key
of the original entity. Once all repeating groups of attributes have been eliminated, all entities will
be in first normal form;
Review each attribute of each entity to ensure that it is determined by the entire primary key of
the entity rather than only part of the primary key.
Remove any attributes which are dependent on only part of the key (i.e., the attributes which
describe a fact about a part of the entity's key). Create a new entity for these attributes. Typically,
this new entity will be an associative or characteristic entity and will include part of the primary
key of the original entity.
Add a one-to-many relationship between the new entity and the original entity with the "many" on
the original entity end. The CDM is in second normal form once all partial dependencies have
been removed from the model; and
Review each attribute of each entity in the model to ensure that it is dependent on only the
primary key of the entity and no other attribute of the entity (i.e., there is no transitive
dependency).
Remove any attributes within the entity that describe a fact about another attribute that is not part
of the key. Create a new entity for these attributes which has the non-key attribute they describe
as the key for the new entity.
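A worked illustration of the first-normal-form step described above: the repeating group is split into a child entity that inherits the parent key (the order/line-item data is a hypothetical example):

```python
# Unnormalized "order" entity with a repeating group of line items (1NF violation).
order = {
    "order_id": 42,
    "customer": "ACME",
    "items": [("P1", 2), ("P2", 5)],   # repeating attributes
}

def normalize(orders):
    """Split the repeating group into a child entity that inherits the
    parent's primary key, putting the model into first normal form."""
    order_rows, item_rows = [], []
    for o in orders:
        order_rows.append({"order_id": o["order_id"], "customer": o["customer"]})
        for product_id, qty in o["items"]:
            # Child entity with composite primary key (order_id, product_id)
            # and a one-to-many relationship back to the original entity.
            item_rows.append({"order_id": o["order_id"],
                              "product_id": product_id, "qty": qty})
    return order_rows, item_rows

orders_1nf, items_1nf = normalize([order])
```

The second- and third-normal-form steps follow the same pattern: attributes dependent on only part of the key, or on a non-key attribute, are moved into entities of their own.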
The nature of each entity access required by each event (i.e., add, update, delete or
read-only);
The response time requirement or allowable duration for each event; and
The expected volumes of data to be stored for each entity.
Estimate the data volumes for the production environment including as appropriate:
The conversion data, historical data and interface data requirements as determined in the
previous steps;
Code tables;
Transaction master data;
Back-up data;
Work space tables; and
Index tables.
Estimate the data volumes in terms of:
At the beginning of production;
At the end of first year of production; and
The projected annual growth rates of the data volumes.
Multimedia data is space intensive (e.g., audio or video data). In estimating the data volumes for
multimedia, the data storage configurations and the capability of the DBMS supporting the multimedia
data must be considered as appropriate. For example, multimedia may be stored directly in a database as
Binary Large Objects (BLOBs) or long fields or in a separate mass storage device.
Document any assumptions used in determining the various data volume estimates and the data
volume growth rates.
Discuss and confirm with management and appropriate user groups the data volume estimates
including any assumptions used in making those estimates.
Revise the estimates and/or assumptions as appropriate.
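The three-point estimate above (beginning of production, end of year one, projected annual growth) can be sketched as a small projection; all figures here are invented:

```python
def project_volumes(initial_gb, first_year_growth_gb, annual_growth_rate, years):
    """Project storage volumes: start of production, end of year one,
    then compound annual growth (an illustrative model only)."""
    volumes = [initial_gb, initial_gb + first_year_growth_gb]
    for _ in range(years - 1):
        volumes.append(round(volumes[-1] * (1 + annual_growth_rate), 1))
    return volumes

# e.g., 100 GB at go-live, 40 GB of first-year growth, then 25% per year:
projection = project_volumes(initial_gb=100, first_year_growth_gb=40,
                             annual_growth_rate=0.25, years=3)
```

Recording the assumed growth rate alongside the projection keeps the estimate revisable as knowledge of the application environment grows.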
A.12.10
Purpose
To model the business analytical processing information requirements in the form of a
multidimensional data design.
Overview
A multidimensional data model organizes data to support analytical and decision making processes in a
manner similar to how the business people think of their data. It makes it easier and more efficient for the
end-users to access the critical business information. The multidimensional data model is flexible and
more easily manages reorganizations, new products and different types and levels of analysis.
Multidimensional data modeling organizes the data into two types of entities - facts and dimensions. Fact
entities contain key numerical measurements of business performance, and dimension entities contain
descriptive information about those performance indicators. The facts are WHAT the analyst measures;
the dimensions are HOW the analyst wishes to query and view the information.
Multidimensional data modeling is a useful technique for modeling an organization's analytical
processing information requirements. While the specific information that a user needs in business
or decision analysis may be ad hoc in nature and difficult to predefine, the dimensions of analysis
are generally rather stable. These dimensions define the various data views or perspectives of
interest to users in their business analysis or decision making activities such as:
Whom does the organization serve (i.e., the Customer dimension)?
How does the organization serve the customers (i.e., the Distribution Channels
dimension)?
Where does the organization serve the customers (i.e., the Markets dimension)?
What does the organization provide to its customers (i.e., the Products/Services
dimension)?
How is the organization performing through time (i.e., the Timing of Business
Event/Activity dimension)?
The information about these dimensions or their combinations which is of interest to a business
analyst is referred to as facts. As illustrated in the exhibit, an analyst may be interested in
knowing about the facts of sales and marketing, profitability, quality, risk, productivity or
compliance relating to the dimensions of customer, service, market, delivery channel, product,
organization or time. For example, an analyst may be interested in the total dollar sales amount
(a sales fact), for a particular product (product dimension), during a particular month (time period
dimension) in all markets (market dimension).
In particular, the "time" dimension is commonly found in multidimensional modeling as it captures
historical data required to support many types of analytical applications such as trend analysis.
Data may be analyzed at various aggregation levels (e.g., at the national, division or region
levels) or may be "drilled down" to examine the underlying detail data.
The primary inputs to multidimensional data modeling in this task come from various sources
including:
The Conceptual Data Model, including the performance measurement entities,
quantitative measurement entities and unstructured data entities, developed in the
Analytical Processing Data Definition phase; and
The understanding of the organization's business events and processes obtained in the
Analysis and Decision Process Definition phase.
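The sales example above (total dollar sales for a product, in a month, across all markets) can be sketched against a minimal fact table (the data values are invented):

```python
# Minimal star-schema sketch: a fact table keyed by dimension values.
sales_facts = [
    {"product": "P1", "month": "2002-05", "market": "US", "amount": 100.0},
    {"product": "P1", "month": "2002-05", "market": "DE", "amount": 250.0},
    {"product": "P2", "month": "2002-05", "market": "US", "amount": 400.0},
]

def total_sales(facts, **dims):
    """Sum the sales fact, constrained by any combination of dimensions.
    Leaving a dimension out of `dims` aggregates across all its values."""
    return sum(f["amount"] for f in facts
               if all(f[d] == v for d, v in dims.items()))

# Total sales for product P1 in May 2002 across all markets:
p1_may = total_sales(sales_facts, product="P1", month="2002-05")
```

Dropping or adding a dimension constraint corresponds to rolling up or drilling down through the aggregation levels described above.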
A.12.11
Identify the analytical dimensions and facts by examining the business analytical information
needs:
Examine current data usage (i.e., bottom-up driven) such as:
report parameters (e.g., the selection criteria may directly suggest dimensions),
contents of the standard or selected ad hoc management reports (i.e., examining the
facts of interest to determine the corresponding dimensions), or
record keys (e.g., record keys such as Customer-ID or Product-ID may suggest
dimensions);
Examine the analysis and decision processes identified in the Analysis and Decision
Process Definition phase, specifically focusing on the performance measurement
processes and quantitative measurement processes;
Review the essential data characteristics of the organization's business events or
processes identified in the Analysis and Decision Process Definition phase. Determine
the major data views or perspectives of interest to users such as:
any events of a business process by time, agents, resources or locations,
any period of time by events, agents, resources or locations,
resources by type of event, time, agents or location,
agents by type of event, time, resources or locations, and
location by type of event, time, agents or resources;
Review the entity types defined in the ERM or CDM (e.g., a kernel entity may be a
candidate analysis dimension). The following exhibit depicts the identification of potential
dimensions on a CDM; and
Identify the attributes for each dimension describing the aspects of the dimension which
are of interest to the business analysts. These attributes may also be used as constraints
in selecting specific "facts" of interest in analytical processing; e.g., a sales trend analysis
may focus only on selected regions by specifying selected values of the
attribute Region of the MARKET dimension.
These attributes will become either true characteristics of an InfoCube dimension or
navigational attributes of a dimensional characteristic.
Develop a logical data model of the relevant dimensions and facts within the scope of the project
as illustrated in the sample model.
Dimensions and facts are interdependent. On the one hand, dimensions may be determined
based on the identified facts; on the other hand, knowing the dimensions, facts may be
determined. Thus, identifying dimensions and facts is completed concurrently as an integral step.
A.12.12
A commonly used attribute in defining dimension tables is the "level" attribute, which
describes the various aggregation (summary or roll-up) levels of atomic-level facts relevant
to the related dimension; e.g., in the MARKET Dimension Table example, the MARKET
LEVEL attribute indicates that facts relevant to the "market dimension" may be retrieved
at the "national", "division", "regional" or "market" (atomic) levels. These are known as
internal hierarchies in BW;
External hierarchies may also be defined for dimension keys or attributes of a dimension,
defining additional ways a given characteristic's values can be aggregated; and
A sequence or order attribute may be defined to facilitate consistent and intelligent sorting
or ordering of the dimension table when the sort requirements are not based on numeric
data; e.g., a "MONTH ORDER" attribute may be defined to facilitate displaying data in
month name order.
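The "level" and sort-order attributes described above can be illustrated with a small sketch. All table contents and column names below are assumptions made for illustration, not taken from the methodology:

```python
# Hypothetical MARKET dimension rows: the LEVEL attribute records the
# aggregation level at which each member's facts can be retrieved.
market_dim = [
    {"market_key": 1, "name": "USA",     "level": "national"},
    {"market_key": 2, "name": "Midwest", "level": "division"},
    {"market_key": 3, "name": "Chicago", "level": "market"},
]

# Hypothetical MONTH dimension rows with a MONTH ORDER attribute, so that
# sorting is not forced into alphabetic order on the month name.
month_dim = [
    {"month_key": 3, "name": "March",    "month_order": 3},
    {"month_key": 1, "name": "January",  "month_order": 1},
    {"month_key": 2, "name": "February", "month_order": 2},
]

# Sorting by the order attribute yields calendar order, not alphabetic order:
by_order = sorted(month_dim, key=lambda row: row["month_order"])
print([row["name"] for row in by_order])  # ['January', 'February', 'March']
```

Without the order attribute, a sort on the name column would return February, January, March, which is why such attributes are worth defining wherever sort requirements are not based on numeric data.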
Effective-date and/or currency indicator attributes will be required if source system data
changes need to be stored in the dimension tables; e.g., the following exhibit reflects the fact that
a customer, whose name was Mary Smith up until December 1994, has been named Mary
Jones since then. Mary Jones is still her current name. Query processing will have to take this
into consideration when returning the current information. These attributes determine whether
time-dependent characteristics are to be included in InfoCubes;
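The effective-dated lookup described for the Mary Smith/Mary Jones case can be sketched as follows. The row layout and function name are illustrative assumptions; only the name-change dates come from the example above:

```python
from datetime import date

# Hypothetical time-dependent CUSTOMER dimension: each row carries a
# validity interval so that historical name changes are preserved.
customer_dim = [
    {"customer_id": 42, "name": "Mary Smith",
     "valid_from": date(1990, 1, 1), "valid_to": date(1994, 12, 31)},
    {"customer_id": 42, "name": "Mary Jones",
     "valid_from": date(1995, 1, 1), "valid_to": date(9999, 12, 31)},
]

def name_as_of(rows, customer_id, as_of):
    """Return the customer name valid on the given date, or None."""
    for row in rows:
        if (row["customer_id"] == customer_id
                and row["valid_from"] <= as_of <= row["valid_to"]):
            return row["name"]
    return None

print(name_as_of(customer_dim, 42, date(1994, 6, 1)))  # Mary Smith
print(name_as_of(customer_dim, 42, date(2000, 1, 1)))  # Mary Jones
```

A query for the current name simply passes today's date; a historical report passes the reporting date, which is how query processing "takes this into consideration".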
Default or unknown values may be allowed in a dimension table especially when the
corresponding source data for the attribute is not mandatory; e.g., GENDER is often not a
mandatory attribute and thus should allow for an unknown attribute value;
Denormalized dimension tables may be defined to enhance their understandability. In
general, business information users may find it easier to visualize data consolidated in a
single chunk than to perceive data broken down into many smaller tables.
Denormalization may also simplify end-user access by providing short, medium, or long
text descriptions instead of just using code values.
A dimension table may also be denormalized for technical design considerations such as
improving performance by reducing the number of joins required.
Other concepts to be considered in defining dimension tables include:
A TIME PERIOD dimension table is almost always required to facilitate the historical or
trending aspect of a DW; and
The more dimension tables there are, the greater the number of rows required in the fact tables.
A.12.13
Perform calculations involving specific "cells" of the fact tables to meet the information
needs of a particular business or decision analysis situation.
A.12.14
Cross-check the dimension and fact tables to ensure their consistency. Revise or refine the tables
as needed.
A.12.15
Develop a physical database structure supporting the dimension and fact tables defined in the
previous steps considering factors such as:
Denormalization requirements;
Design parameters of BW InfoCubes and InfoObjects; and
Performance considerations.
The next exhibit provides an example of a physical data model in the form of a star schema
linking a fact table to the related dimension tables.
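The star-schema linkage in the exhibit can be sketched as follows: the fact table holds foreign keys into the dimension tables plus the numeric facts, and a query joins them. All table contents and names are illustrative assumptions:

```python
# Hypothetical dimension tables, keyed by surrogate key.
product_dim = {1: {"name": "Widget"}, 2: {"name": "Gadget"}}
time_dim    = {1: {"month": "2024-01"}, 2: {"month": "2024-02"}}

# Hypothetical fact table: foreign keys into the dimensions plus the fact.
fact_table = [
    {"product_key": 1, "time_key": 1, "sales_amount": 2000.0},
    {"product_key": 2, "time_key": 1, "sales_amount": 500.0},
    {"product_key": 1, "time_key": 2, "sales_amount": 900.0},
]

def query(facts, product_name):
    """Join the fact table to its dimensions and filter on a dimension attribute."""
    return [
        (time_dim[f["time_key"]]["month"], f["sales_amount"])
        for f in facts
        if product_dim[f["product_key"]]["name"] == product_name
    ]

print(query(fact_table, "Widget"))  # [('2024-01', 2000.0), ('2024-02', 900.0)]
```

Note how the filter is expressed on a dimension attribute (the product name) while the measure comes from the fact table; this separation is the essence of the star schema.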
A.12.16
Complete a structured walk-through of the multidimensional data model.
Complete a structured walk-through of the multidimensional data model to confirm its
completeness and consistency. Resolve any questions or issues and make any changes as
necessary. Obtain formal written approval.
A.12.17
InfoCube Design
A.12.18
Purpose
The purpose of this activity is to identify and document key figure (facts) requirements for each
functional area.
Procedure
Review the following documents for technical and functional requirements for each Key Figure:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document the following:
InfoSource enhancement requirements for each Key Figure
Aggregate requirements for each Key Figure
Data extract schedule requirements for each Key Figure
Data volume requirements and infrastructure implications for each Key Figure
Transfer Routine user exit requirements for these key figures in BW
Conversion Routine user exit requirements for these key figures in BW
Update Rules requirements for each Key Figure
VBA routines required in the Excel Workbooks for each key figure
Key Figures included in the target InfoCubes, templates, channels and Excel workbooks
Currency or unit of measure requirements for these key figures
Cross-modular integration and cleansing requirements for each cross-modular key figure
Presentation characteristics of each key figure (summarized with hierarchy, shown in
thousands, etc.)
Navigational characteristics of each key figure
Delta versus full update for each key figure
Key figures used for calculated key figures
Key figures used for restricted key figures
Result
The exact flow of key figures from the source system to an InfoObject in a report is clearly
defined. All aspects of the transport and presentation of the key figures for end users should be
reviewed and clearly documented.
A.12.19
Purpose
The purpose of this activity is to identify and document characteristic requirements for each
functional area.
Procedure
Review the following documents for technical and functional requirements for each
Characteristic:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document the following:
InfoSource enhancement requirements for each Characteristic
Aggregate requirements for each Characteristic
Data extract schedule requirements for each Characteristic
Data volume requirements and infrastructure implications for each Characteristic
Transfer Routine user exit requirements for these Characteristics in BW
Conversion Routine user exit requirements for these Characteristics in BW
Update Rules requirements for each Characteristic
VBA routines required in the Excel Workbooks for each Characteristic
Characteristics included in the target InfoCubes, templates, channels and Excel workbooks
Currency or unit of measure requirements for these characteristics
Cross-modular integration and cleansing requirements for each cross-modular
characteristic
Presentation characteristics of each characteristic (e.g., using a hierarchy)
Navigational characteristics of each characteristic
Delta versus full update for each characteristic
Language requirements for each characteristic
Result
The exact flow of characteristics from the source system to an InfoObject in a report is clearly
defined. All aspects of the transport and presentation of the characteristics for end users should
be reviewed and clearly documented.
A.12.20
Purpose
The purpose of this activity is to identify and document dimension requirements for each
functional area.
Procedure
Review the following documents for technical and functional requirements for each Dimension:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document the following:
InfoSource enhancement requirements for each Dimension
Aggregate requirements for each Dimension
Data extract schedule requirements for each Dimension
Transfer Routine user exit requirements for these Dimensions in BW
Conversion Routine user exit requirements for these Dimensions in BW
Update Rules requirements for each Dimension in BW
VBA routines required in the Excel Workbooks for each Dimension
Dimensions included in the target InfoCubes, templates, channels and Excel workbooks
Currency or unit of measure requirements for these dimensions
Cross-modular integration and cleansing requirements for each cross-modular dimension
Presentation characteristics of each dimension (e.g., using a hierarchy)
A.12.21
Identify Compound Characteristics Required for Each Subject
Area
Purpose
The purpose of this activity is to identify and document compound characteristic requirements for
each functional area.
Procedure
Review the following documents for technical and functional requirements for each Compound
Characteristic:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document the following:
InfoSource enhancement requirements for each Compound Characteristic
Aggregate requirements for each Compound Characteristic
Data extract schedule requirements for each Compound Characteristic
Data volume requirements and infrastructure implications for each Compound
Characteristic
Transfer Routine user exit requirements for these Compound Characteristics in BW
Conversion Routine user exit requirements for these Compound Characteristics in BW
Update Rules requirements for each Compound Characteristic
VBA routines required in the Excel Workbooks for each Compound Characteristic
Compound Characteristics included in the target InfoCubes, templates, channels and Excel
workbooks
Currency or unit of measure requirements for these compound characteristics
Cross-modular integration and cleansing requirements for each cross-modular compound
characteristic
Presentation characteristics of each compound characteristic
Navigational characteristics of each compound characteristic
Delta versus full update for each compound characteristic
Language requirements for each compound characteristic
Result
The exact flow of Compound characteristics from the source system to an InfoObject in a report is
clearly defined. All aspects of the transport and presentation of the Compound characteristics for
end users should be reviewed and clearly documented.
A.12.22
Identify Calculated Key Figures Required for Each Subject
Area
Purpose
The purpose of this activity is to identify and document the calculated key figure requirements for
each functional area.
Procedure
Review the following documents for technical and functional requirements for each Calculated
Key Figure:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document the following:
InfoSource enhancement requirements for each calculated Key Figure
Data extract schedule requirements for each calculated Key Figure
Infrastructure implications for each calculated Key Figure
VBA routines required in the Excel Workbooks for each calculated key figure
Currency or unit of measure requirements for these calculated key figures
Cross-modular integration and cleansing requirements for each cross-modular calculated
key figure
Presentation characteristics of each calculated key figure (summarized with hierarchy,
shown in thousands, etc.)
Navigational characteristics of each calculated key figure
Delta versus full update extract of data impact for each calculated key figure
Result
The exact definition of each calculated key figure is clearly defined. All aspects of the presentation
of the calculated key figures for end users should be reviewed and clearly documented.
A.12.23
Purpose
The purpose of this activity is to identify and document hierarchy requirements for each
functional area.
Procedure
Review the following documents for technical and functional requirements for each Hierarchy:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document the following:
InfoSource enhancement requirements for each Hierarchy
Aggregate requirements for each Hierarchy
Data extract schedule requirements for each Hierarchy
Data volume requirements and infrastructure implications for each Hierarchy
Transfer Routine user exit requirements for these Hierarchies in BW
Result
The exact definitions of the hierarchies are clearly defined. All aspects of the presentation of the
hierarchies for end users should be reviewed and clearly documented.
A.12.24
Purpose
The purpose of this activity is to identify and document aggregate requirements for each
functional area.
Procedure
Review the following documents for technical and functional requirements for each Aggregate:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document the following:
Data extract schedule requirements for each Aggregate
Data volume requirements and infrastructure implications for each Aggregate
Aggregates included in the target InfoCubes and Excel workbooks
Delta versus full update implications for each Aggregate
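The role an aggregate plays can be sketched briefly: it pre-summarizes atomic facts at a coarser level so that queries read fewer rows. The table contents, field names and function below are illustrative assumptions only:

```python
from collections import defaultdict

# Hypothetical atomic facts at the region/month level.
atomic_facts = [
    {"region": "North", "month": "2024-01", "sales": 1200.0},
    {"region": "North", "month": "2024-01", "sales": 800.0},
    {"region": "South", "month": "2024-01", "sales": 500.0},
]

def build_aggregate(facts, group_by):
    """Roll atomic facts up to the given grouping characteristics."""
    totals = defaultdict(float)
    for f in facts:
        totals[tuple(f[g] for g in group_by)] += f["sales"]
    return dict(totals)

# An aggregate by region alone holds one row per region instead of one
# row per atomic fact:
aggregate = build_aggregate(atomic_facts, ["region"])
print(aggregate)  # {('North',): 2000.0, ('South',): 500.0}
```

This is also why the delta-versus-full-update question in the checklist matters: a delta load must be rolled into the existing aggregate totals, whereas a full update rebuilds them from scratch.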
Result
The identification and documentation of the exact aggregates required.
A.12.25
Purpose
The purpose of this activity is to identify and document business rules by field for each functional
area.
Procedure
Review the following documents for technical and functional requirements for each Business
Rule:
Information Access Environment Document
Data Definition
Multi-Dimensional Data Model
High-level Design
Identify, review and document business rules for the following:
For each Characteristic
Result
The identification and documentation of the exact business rule required for all objects, processes
and reports in BW.
A.12.26
Purpose
The purpose of this task is to identify data and retention requirements for BW InfoSource and
InfoCube Objects. Identify the processes and functions where InfoSources and InfoCubes are
updated and/or archived. Determine how long data should be stored on-line and how long data
should be archived according to defined corporate standards.
Procedure
What data is needed to ensure successful external and internal auditing in the applications?
For example:
Double archiving
Archives kept externally
Are there inconsistencies, gaps, or overlaps for the applications?
Are owners for the standard procedure appointed for the archiving processes in the
application?
Is the division of responsibilities defined between the application and Basis?
Is there a company procedure depicting the division of responsibilities?
Is the BW application team responsible for customizing, scheduling, performing, and
validating archiving?
Does the IT department provide the system support, and is it responsible for retaining
data to meet auditing requirements?
Results
Create an archiving manual, based on the information gathered, to record standard procedures
and schedules, and what to do in exceptional circumstances.
A.12.27
Purpose
To assemble the various data design deliverables.
Overview
When approved by management, the data design specification assembled in this task will be
refined in the Realization Stage and developed as InfoCubes and ODS objects.
The data design deliverable is discussed with management and approved. In some
circumstances, this task may be delayed and completed as part of the Business Blueprint
Checkpoint Phase.
A.12.28
Review all of the documentation for each design specification and ensure that all of the relevant
work products are present. Ensure the consistency and completeness between individual
specification packages.
A.12.29
Conduct a structured walk-through, within the project team, to confirm the completeness of each
design specification and to ensure compliance with both quality and development standards.
A.12.30
Any problems in the design should be resolved through the formal issue resolution process. All
open data design issues should be resolved before the completion of this phase.
A.12.31
Discuss the data design deliverable with management and resolve any questions and issues. If
necessary, prepare an action list of items to modify. This step may require more than one
meeting.
A.12.32
[Exhibit: Roles and Responsibilities - task flow for the User Documentation phase]
M2 Establish Paper-based Documentation Standards
- User Documentation Strategy
- User Documentation workplan
- Documentation Audience target
- Documentation Distribution list
- Documentation Content
- Documentation standards
- Documentation delivery mechanism
- Documentation responsibilities
M3 Prepare Paper-based Document Introduction
- Documentation introduction
- Process Definition Deliverable
- Application Design Deliverable
- Management Overview Diagram
M4 Develop Paper-based User Documentation
M5 Prepare Paper-based Appendices
- Document Appendices (Security Profiles, Processing Schedules)
M6 Submit Paper-based User Documentation
- Paper-based User Documentation Deliverable
- Management Overview Diagram
M7 Develop On-line Documents Sub-System
- On-line User Documentation
- Documentation Content
A.13.10
Determine responsibilities.
A.13.11
Discuss and obtain formal written approval of the paper-based
documentation standards.
A.13.12
Purpose
To provide an introduction for each paper-based document.
Overview
Each document should have an introductory chapter or module that describes the format and
organization of the volume, the audience for whom it is intended and instructions for its use.
A.13.13
Using the definition of the types of documentation to be developed, produced earlier in this
phase, prepare an introduction for each document. This should describe:
What aspect of the system the document addresses (e.g., user guides, system
operations manuals);
Which users the particular document addresses (e.g., casual users, full-time users);
The intended use of the document; and
Related documents.
A.13.14
Prepare a narrative that describes the format of the document. This should describe the
organization and medium of the document and include sample forms, a catalogue and description
of icons (symbols) used and a list and definition of any reserved words.
This task should also indicate how the materials are to be used, e.g., should the user read it, use
it in concert with the application or use it as a training reference.
A.13.15
A.13.16
Purpose
To create the main section of each document to be produced.
Overview
The main section of each document is written during this task. Any appendices or supporting
documentation to accompany the main body is created in the next task.
A.13.17
Determine what user documentation is required. Key considerations are who will use the
information and how much detail is required.
A.13.18
Develop a structure to incorporate all user procedures identified into groups of logically related
processes. Each logical grouping of user procedures is defined in the table of contents for the
documents.
A.13.19
The User Documentation Overview provides a visual representation of the system and the
procedures needed to support the system. It shows the user's interactions with the system and
defines the scope of the system to the user. All user procedures identified are placed into groups
of logically related processes.
Define a numbering scheme for the user procedures and assign a number to each user
procedure.
A.13.20
Prepare the management summary which is a general description of the system. It should
incorporate an updated Management Overview Diagram or other Abstracts and support the User
Documentation Overview by explaining the system in terms of:
Business needs and objectives that the system is designed to meet;
Scope of the system in terms of the processes incorporated to meet the objectives;
What the system does;
Primary users of the system;
Key features of the system; and
Major inputs and outputs of the system.
A.13.21
Prepare the User Procedures which form the main body of the document. They describe the
manual activities to be executed in sequence and their main purpose is to inform users how to
complete the different tasks that compose the various processes in the system and identify:
The starting and ending points of the procedure;
The steps to be completed;
The sequence of the steps; and
The person(s) responsible for completing the steps.
The main components of each User Procedure are:
Responsibility - Who completes the action;
Procedure steps - What action to take presented in a logical time sequence and any
dependencies on other User Procedures; and
Supporting documents - Which things (e.g., reports, input forms or control logs) are
related to and needed to complete the procedure step.
Each procedure step should begin with a strong action verb telling the person responsible what to
do. The procedure steps are described in order of occurrence and sequentially numbered.
A.13.22
Where a set of procedure steps assigned to one person is long, complex or detailed, it may be
easier to document those steps as a Detail Task Description, which details the work steps for a
single individual. Because Detail Task Descriptions are written at a more detailed level than the
User Procedure, they avoid overloading the User Procedure with too much detail.
It describes the sequence of work steps in a similar manner to the User Procedure with action
verbs and work steps in a logical time sequence, sequentially numbered.
Prepare any Detail Task Descriptions, as necessary.
A.13.23
Assemble samples of each report for inclusion with the User Procedures. These provide a visual
representation of the information produced by the system.
Cross-reference the Report Layouts to the procedures they relate to for ease of use. The Report
Layouts can be either a mock-up of the report or a copy of an actual report from the system. The
means used to create or initiate the reports is also included.
A.13.24
Document the System Responsibilities which define the roles of the users as they relate to the
system. They define the title of the user and the types of procedures for which the user is
responsible.
A.13.25
Discuss and confirm the content of the draft User Documentation.
A.13.26
Purpose
To create any supporting paper-based appendices.
Overview
To make it easier for users to navigate through a particular document, information that is useful
but not appropriate to place in the body of the document should be included as an appendix.
Alternatively, the appendices may be prepared as separate documents and not be included as
part of the User Procedures manual as the distribution may be different.
A.13.27
Prepare glossary.
Develop a glossary of key terms in simple and clear language and terminology with which the
users are familiar. Also include definitions of key data elements and their relationship to other
data elements used in the system.
A.13.28
A.13.29
Document the processing schedules which include details of the frequency with which each part
of the system is run such as a monthly closing schedule.
The schedule should include references to the User Procedures covered by the schedule, the
names of the User Procedures, when they are executed, in what order they are executed and any
cut-off dates that must be met.
A.13.30
A.13.31
Discuss and confirm the draft paper-based appendices
content.
A.13.32
Purpose
To provide a management checkpoint at which the paper-based user documents can be reviewed
and verified.
Overview
The change control procedures and responsibility for ongoing maintenance of the paper-based
documents are defined.
All of the paper-based documents developed during this phase are consolidated into a user
documentation deliverable. This deliverable is reviewed for content and clarity and presented to
management for approval.
A.13.33
Define change control steps necessary to update the user documentation with amendments
derived from system testing.
Define ongoing maintenance responsibilities and steps to ensure that the user documentation is
kept current once the system has commenced live production.
A.13.34
Consolidate the individual parts of the paper-based User Documentation prepared during this
phase.
Review on the basis of content and clarity of presentation.
A.13.35
Discuss the documents with management and resolve any questions and issues.
If necessary, prepare a list of items to be modified. In practice, this step may involve more than
one meeting to resolve outstanding issues.
A.13.36
Obtain formal written approval of the completed documents
and arrange distribution.
Obtain formal written approval.
Complete the reproduction and distribution of the completed documents. Include the control
logging of the distribution for subsequent update purposes.
A.13.37
Purpose
To develop the on-line portion of the user documentation.
Overview
This task outlines the activities that support the development of on-line documents either as a
supplement to or replacement of paper-based documentation.
Very often a project will include the development of on-line documents for all or part of the User
Documentation. These generally include two main types of on-line document:
Help - where the User Documentation is programmed as an integrated part of the new
system's functionality; and
Information - where the User Documentation is in a similar format to the paper-based
procedures but is stored and presented in an electronic fashion.
Depending upon the specific project, this can be a long, complex task. The development of Help
systems must be integrated with the development of the system itself. On some occasions,
because of the scale of the project and the different skills mix required, a separate sub-group or
project team may need to be formed to complete the Help system development.
A.13.38
As mentioned above, many operating systems provide facilities for the development of on-line
documentation. If the operating system or development tools do not provide for this, select an
appropriate on-line documentation development tool at this time.
This may require a formal tool requirements definition to be prepared and a package software
evaluation and selection to be completed.
A.13.39
Define or confirm use of Screen/Window Layout and text
composition standards.
Define on-line document standards that should include:
Screen/window types (e.g., warning, Help, glossary);
Screen/window size, layout and colors;
Criteria for creation (when should a screen/window be created);
Content (e.g., what is included/excluded); and
Navigation guidelines (e.g., function key usage).
Where on-line document standards are already defined and being used, they should be adhered
to. The on-line document screen/window standards should also conform to application
screen/window standards for the project (where applicable). Before commencing text
composition, standards must also be established including:
Verb tense;
First, second or third person usage; and
Style.
A.13.40
Using the selected development tool, design, write and create the Help text. The Help text design
will need to map exactly to all of the system's functionality including:
A.13.41
There is always difficulty in loading and testing the Help text during a project, as it is rare that the
final system is completed and time is available for separate addition of the Help system.
Ideally, the Help text should form part of the Unit and Component Testing of each program in the
system that contains Help text. If this is not possible, the Help text must be added to these fully
tested single programs and then re-tested to ensure that the Help text functions as designed and
the program continues to function as specified after the addition of the Help text.
This will necessitate careful consultation with the application development group to minimize
duplication in the testing process. Load and test the Help text at the program level.
A.13.42
Ensure that all of the unit/component tested text is included within the system testing process.
The Help system should form an integral part of the overall System Integration and Testing and
not be treated separately.
Change control steps necessary to update the Help text with amendments derived from system
testing need to be created.
A.13.43
Develop ongoing maintenance procedures for the Help
system.
Define the tasks necessary to keep the Help system synchronized with changed functionality
whenever subsequent changes are made to the system after it has commenced live processing.
The responsibilities and resources necessary to complete this maintenance should be determined
and agreed.
A.13.44
Where the User Documentation is presented as on-line information, i.e., an electronic form of the
paper-based documents, the development process should largely follow the steps described
above for the development of the Help system.
A.13.45
Discuss the on-line documentation content with management and resolve any questions and
issues.
If necessary, prepare a list of items to be modified. In practice, this step may involve more than
one meeting to resolve outstanding issues.
A.13.46
Obtain formal written approval of the completed on-line
documents sub-system.
Obtain formal written approval of the completed on-line documents sub-system. This should also
include definition and approval of the responsibilities for:
Ongoing maintenance/updates;
Distribution of changes and additional access to the sub-system; and
Training linkages.
A.14 Training
Purpose
The objectives of this phase are:
To provide management with an overview of the new system; and
To teach users how to complete their work procedures with the new system.
Overview
It is assumed that all of the Training Phase tasks and steps will be completed in conjunction with
the organization's Training or Human Resource departments. This should ensure that there is full
participation and involvement, which should result in a seamless hand over of all items.
This phase is aimed primarily at the training needs of the user community and facilitates the
introduction of the DW system to its users to ensure that there is both comprehension of the
capabilities of the system and an understanding of how to use it.
Training should be tailored to management and groups of users based on their position in the
organization and their expected use of the system. For each of these groups, course materials
are developed with the approach and content of each course module being determined by its
learning objective.
Instructors are trained to ensure that they are able to convey the course content using the
prescribed approach. If the course is to be repeated several times, then a pilot training session
can be run to validate the training. The pilot training session is evaluated and the training
approach and materials modified, if necessary, before the main training sessions are scheduled
and conducted.
The training needs assessment occurs throughout the DW project and training needs are defined
and actioned progressively throughout the project. Although the emphasis in this phase is on user
training, the same tasks can be used to assess other training needs such as the needs of the
development project team and to schedule/develop appropriate training. For instance, it may be
necessary to have team members attend specific tool training in the Prototyping Phase before the
Realization Stage activities.
[Figure: Training Phase process flow]
Steps: N1 Identify Personnel to be Trained; N2 Determine Training Scope and Strategy;
N3 Develop Training Course Materials; N4 Develop and Conduct Instructor Training (optional);
N5 Conduct Pilot Training Sessions (optional); N6 Conduct Training Sessions.
Inputs: Personnel details; Training needs.
Deliverables: Personnel requiring training; Training Scope and Strategy; Training course
materials (Course Module, Learning Objectives, Training Methods, Training Techniques, Course
Plan/Schedules, Instructor Manual, Participant Workbook); Instructor Guide; Instructor Training
Materials; Revised Training Materials and Approach; Trained Personnel.
A.14.10
The instruction strategy defines the means of delivery of the different courses. Consider various
options such as:
Self-taught courses, e.g., interactive computer-based training (CBT);
Classroom sessions taught by a team of instructors;
Classroom sessions taught by trained staff members who become the trainers of other
staff members;
One-on-one training for some senior managers;
Use of supplier or external third-party training; and
Use of discovery learning (simulation of actual tasks).
A.14.11
If standards do not exist, they should be created for such items as:
Numbering and naming conventions;
Style (language and text format);
Organization issues;
Format, stationery and binding;
Page size;
Responsibility for ongoing storage, maintenance, publishing and distribution; and
Production responsibilities.
Standards are required for:
Presentation and preparation software (e.g., word processing, graphics, desktop
publishing);
Student workbooks - style and content;
Overheads and 35 mm slides;
Instructor guides - style and content (optional); and
CBT authoring software.
A.14.12
Construct a comprehensive training development work plan, based on the training strategy.
A.14.13
Discuss and obtain formal written approval of the training scope and strategy.
A.14.14
Purpose
To prepare course materials for each course module based on the specific method of delivery.
Overview
The content of each course module is determined by considering the course learning objectives
and the level of target audience(s). Training approaches and supplemental training techniques
are selected for each course module. The course modules are sequenced and course materials
are developed.
A.14.15
Determine learning objectives for each course and course module.
For each course, decide what should be accomplished.
The learning objectives describe the purpose of the course module. They emerge from the
training needs analysis and by assessing how the users will execute the different tasks to
maintain the system.
Objectives are written from the learner's perspective.
A.14.16
A training course should employ both discovery learning and information giving as required by the
system being implemented and by the specific course module objectives.
Discovery learning (exchanging and doing) involves a high degree of trainee participation. This
approach uses hands-on problem solving or interactive techniques to add reinforcement and
understanding to the learning experience.
Information giving (telling and showing) involves verbal or visual presentation of material to the
trainee. Examples include lectures, reading materials and demonstrations.
Consider and determine the different delivery approaches.
A.14.17
For each course, divide the training course into logical, manageable modules, each representing
one training session. Describe the content and flow for each module.
A.14.18
Each module's content should be able to be presented in no more than one-and-a-half hours.
There should be no more than two procedural tasks included in a course module. Use key words
to identify major concepts, techniques or information that must be included.
A.14.19
A.14.20
Using the training standards, create the student workbooks, which should provide structure for
the course materials as well as materials that can be taken away for later reference. Consider
including:
Parts of the user procedures;
Copies of slides, overheads and charts;
Outlines of videotape materials;
Exercises/case studies;
Examples and tips, tricks and traps;
Relevant readings;
Pages for note taking; and
Guides or schedules for individual and group activities.
A.14.21
Dependent upon the different delivery approach, create or acquire the training materials.
A.14.22
Where the training is to use a subset of the production computer application system, a separate
technology environment may need to be acquired and installed.
Define the necessary computer environment and configuration and where appropriate, acquire
and install all of the various components. This may also require a separate installation (on a
smaller scale) of the production application.
Define the responsibilities for the management and maintenance of the training system.
A.14.23
Practice exercises may need to use the computer system. Define and create these exercises and
case studies.
When using the automated system there is additional work to prepare for computerized exercises
that may include:
Setting up user IDs and passwords;
Setting up test libraries and test data; and
Loading and testing the exercises and case studies.
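The environment preparation steps above can be sketched as a small script. This is a hypothetical illustration only: the user-ID prefix, naming scheme and data-set layout are invented for the example and are not prescribed by the methodology.

```python
# Hypothetical sketch: generating training logins and isolated exercise
# data sets for computerized training sessions. Names are illustrative.

def make_training_users(count, prefix="TRAIN"):
    """Return one user ID per course seat, e.g. TRAIN01 .. TRAIN20."""
    return [f"{prefix}{n:02d}" for n in range(1, count + 1)]

def make_seed_rows(users):
    """Pair each training user with its own test-data partition, so
    exercises run by one student cannot disturb another's data."""
    return [{"user": u, "data_set": f"EXERCISE_{u}"} for u in users]

users = make_training_users(3)
rows = make_seed_rows(users)
print(users)    # ['TRAIN01', 'TRAIN02', 'TRAIN03']
print(rows[0])  # {'user': 'TRAIN01', 'data_set': 'EXERCISE_TRAIN01'}
```

In practice the same idea would be applied through the training system's own administration tools; the point is simply that IDs, passwords and test libraries are prepared per student before the session.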
A.14.24
Discuss and obtain formal written approval of training course materials.
Submit topics/modules to designers and functional experts for review. Make corrections and
additions where required.
Discuss and obtain formal written approval of the materials.
A.14.25
Purpose
To ensure that the training instructors are knowledgeable and are able to convey the course
content using prescribed methods.
Overview
Instructor materials are prepared and instructor training is completed. The mix and experience of
instructors will determine the depth and type of instructor training required before teaching
commences.
A.14.26
Determine the instructors and define their individual training needs.
Determine who will fill the training instructor roles. Determine each instructor's individual training
needs. Consider:
Orientation for those who are technically competent and are experienced in using the
educational method chosen; and
Instructor training for those who are technically competent but not experienced in either
teaching or using the educational method chosen.
A.14.27
A.14.28
Prepare the audio-visual or other materials needed to conduct the instructor training sessions.
A.14.29
Complete appropriate individual training for each instructor to ensure that they have the
appropriate teaching skills before they begin to provide tuition.
For the course content training, prepare the schedule and arrange for:
A location for the course;
Audio-visual, computer or other equipment availability;
Course material preparation and distribution; and
Establish test training technology environment (e.g., components such as LAN, database
and application). Allow adequate time to "fine tune" the environment.
A.14.30
A.14.31
Discuss and obtain formal written approval of instructor training materials and course.
A.14.32
Purpose
To test the training to ensure that all training courses are understandable, relevant and well-received by students.
Overview
Pilot training session(s) are scheduled and conducted to enable the course designers and
instructors to evaluate the course content, approach and techniques. Modification of the training
approach and materials may be required as a result of feedback from the pilot training session(s).
A.14.33
Define which students will attend the pilot course and which instructors will teach.
Prepare schedule and arrange for:
A location for the course;
Audio-visual, computer or other equipment availability;
Course material preparation and distribution; and
Establish test training environment (LAN, database and application). Allow adequate time
to "fine tune" the environment.
Observe student reaction to the course and request student evaluation(s) of the course
presentation, materials and exercises.
A.14.34
Evaluate the training session in relation to objectives, presentation, content, materials, exercises
and test results based on:
Student reaction;
Student evaluation;
Instructor evaluation; and
Observer evaluation.
Modify training approach/materials as required based on the evaluation.
A.14.35
Discuss and obtain formal written approval of final training materials and approach.
Present the revised materials for review and obtain formal written approval.
A.14.36
Provide copies of all training materials developed for the project and any other developmental
data, documents or programs.
Ensure that responsibility for ongoing maintenance and updating of the materials has been
assigned to the appropriate personnel.
A.14.37
Purpose
To train the management and users on how to use the new system in an efficient manner.
Overview
The training session(s) are scheduled, students notified and the courses are conducted. If a
course is to be offered more than once, initial sessions should be evaluated and appropriate
modifications included in subsequent sessions. If many sessions are to be offered, it may be
appropriate to conduct a pilot session for the specific purpose of evaluation and modification.
Ongoing course maintenance and subsequent training responsibilities are established.
Depending upon the specific project, completion of the training to begin the implementation of the
new DW system may take some considerable time, particularly if there is a wide or dispersed
audience for the training (e.g., if there is a staged or phased implementation). In these
circumstances, this step may need to be supplemented with additional work tasks, additional
planning and resources and increased project management responsibilities.
A.14.38
A.14.39
Notify the personnel who will attend each training session of the location, date, time and any
required preparation.
Any required pre-course materials should be distributed to the students.
If a Help Desk support function is to be provided to support the new system, early training of the
Help Desk personnel may be required before the bulk of the user training occurs.
A.14.40
Complete the schedule of training sessions. This should not be limited to the sessions completed
at the commencement of the new system but should address other issues such as those related
to a staged or phased implementation and include refresher courses as well as addressing staff
transfers, new hires and other ongoing training needs.
A.14.41
Observe student reaction to the courses and request student and instructor evaluation(s) on the
course presentation and materials. If necessary, modify the training courses based upon the
evaluations. Course evaluations should be included in all course presentations.
A.14.42
Define and approve responsibilities for ongoing maintenance and presentation of training courses and materials.
As changes are made to the system, the training courses and materials need to be maintained to
keep them in step with the system and presented to the users during the life of the system.
[Figure: Acceptance Phase process flow]
Steps: O1 Agree Acceptance Criteria and Test Strategy; O2 Develop Acceptance Test Cases;
O3 Conduct Acceptance Test.
Inputs: Staff responsibilities; Test Cases; Prototype; Business Scenarios; System Tested
Application.
Deliverables: Acceptance criteria; Acceptance test strategy; Approved Acceptance Criteria;
Acceptance Test Plan; Acceptance Test Data; Acceptance Sign-off.
When this process is complete, the acceptance test strategy is agreed and signed-off by the
project team manager and the user team representative.
Review, agree and obtain formal written approval from management.
A.15.10
Review the acceptance test cases developed by the system users. If they have not been
developed at this point, assist them in building appropriate test cases.
Provide the system users with copies of the generic test forms (Test Plan, Test Data Specification,
etc.) and the appropriate phases of the methodology to give them a framework for developing
acceptance test cases.
Suggest the use of Business Scenarios developed jointly with the users in the Prototyping Phase
as useful starting points.
Complete a structured walk-through of the acceptance test script and test cases with the
Acceptance Test Team. Make any adjustments as necessary.
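The generic test forms mentioned above can be pictured as a simple structured record. This is a sketch only: the field names are assumptions, not the actual layout of the methodology's Test Plan or Test Data Specification forms.

```python
# An illustrative acceptance test case record; the fields shown are
# assumed for the sketch, not taken from the methodology's forms.

test_case = {
    "id": "AT-014",
    "business_scenario": "Month-end sales reporting",   # from Prototyping Phase
    "preconditions": ["Prior month loads complete"],
    "steps": ["Run sales query for the period",
              "Compare totals with the source-system report"],
    "expected_result": "Totals match the source system",
    "verified_by": None,  # completed at sign-off
}

# A structured walk-through might simply confirm each required field
# is present before the case is accepted into the test plan:
required = ["id", "business_scenario", "steps", "expected_result"]
print(all(field in test_case for field in required))  # True
```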
A.15.11
Discuss and confirm the acceptance criteria and the overall approach to be used in conducting
the acceptance tests with management including the responsibilities for verifying the test results.
Develop an acceptance test work plan with the system users to prioritize and sequence the
acceptance test cases and to organize available resources for conducting the tests.
If necessary, prepare a list of items to be modified. In practice, this step may involve more than
one meeting to resolve outstanding issues.
A.15.12
Obtain formal approval of the acceptance criteria and approach.
Obtain formal written approval of the acceptance criteria, the approach to be used, the
acceptance test cases and the responsibilities for management and execution of the acceptance
test.
A.15.13
Purpose
To confirm that the completed system meets the agreed requirements.
Overview
The acceptance test(s) is completed and results verified. Based upon successful completion of
the test, the operational system is approved as ready for production.
A.15.14
Depending on the degree of involvement in the acceptance test process, complete either of the
following:
Provide support to both the users of the system and the staff operating the system during
the acceptance tests; or
Direct the acceptance test on behalf of the users with users and operations staff (if
possible, to avoid potential conflicts of interest, this should not involve the same
personnel that have been responsible for system testing).
The acceptance test may have to be completed more than once, depending upon the means of
system implementation. If there is a staged or phased implementation or parallel running of the
old and new systems, each acceptance test must be documented and checked. At the completion
of all tests, the individual results may need to be aggregated.
A.15.15
The results of the acceptance tests must be fully documented and checked against the test plan
to confirm that all of the acceptance criteria have been successfully met.
In the event of the acceptance criteria not being met, the standard issue resolution procedures
should be invoked. The reasons for non-acceptance should be documented and the issues
investigated further.
The outcome of this process will either be that the problem is successfully resolved, in which case
approval can take place or that a modification to the acceptance criteria or system is negotiated
between the relevant groups and the tests re-executed.
A.15.16
Obtain formal written acceptance sign-off of the operational system.
Confirm that the DW system meets the user acceptance criteria and obtain a formal written
approval and acceptance sign-off from all parties concerned, especially the identified system
owner.
Archive the acceptance test results with the project working papers, along with the formal sign-off.
Cross-reference to the Project Charter.
[Figure: Business Blueprint Checkpoint process flow]
Steps: P1 Confirm Completeness of the Business Blueprint Stage; P2 Review Issues; P3 Update
Project Plans; P4 Update Quality Plan; P5 Obtain Approval of Business Blueprint Stage
Deliverables.
Inputs: Project plans; Quality Plan.
Deliverables: Confirmed Business Blueprint Stage Deliverables; Open Issues; Issue resolutions;
Updated project plans; Updated Quality Plan; Business Blueprint Stage Deliverables; Formal
Approval Point.
A.16.6 Review Issues
Purpose
To identify all outstanding issues and either resolve or agree future actions.
Overview
All issues must be reviewed and appropriate action agreed/taken.
A.16.10
Develop plans for the Realization Stage considering the understanding, findings and issues
relating to the DW project gathered during the Business Blueprint Stage.
A.16.11
Ensure that all plans are included in the project work plan that combines all tasks, dependencies
and milestones for the remaining stages of the project.
A.16.12
Review the original estimates for the Realization Stage which were made based on the best
knowledge at the time. Experience gained during execution of the Business Blueprint Stage may
indicate variances from these estimates that will need to be reflected in the estimates for the next
stages.
Consider experiences on similar projects and on any prior construction work that might already
have been undertaken on this project to date and use to validate the estimates.
A.16.13
Develop detailed work plans for the Realization Stage. Use the plans currently contained in the
project work plan and the revised estimates, taking into account the available staff and skills.
Identify the tasks needed to complete these stages. The standard tasks may, however, need
some tailoring for the particular project. For each task estimate:
Number of project staff;
Amount of time each person is allocated to a task;
Duration and scheduled completion of each task;
Cost of staff time and expenses; and
Cost of system and administrative support staff time.
The estimates should include the amount of time and effort required from the users.
Reflect any changes to the project work plan necessitated by resource or other constraints.
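The per-task estimating factors listed above can be combined in a simple calculation. The figures and the blended-rate approach below are invented for illustration; real estimates would use the project's own rates and allocations.

```python
# Illustrative per-task estimate combining the factors listed above:
# number of staff, time allocated, duration-driven staff cost and
# expenses. All numbers and the rate model are assumptions.

def task_estimate(staff, hours_each, rate_per_hour, expenses=0.0):
    """Task cost = total staff effort at a blended hourly rate,
    plus direct expenses."""
    effort_hours = staff * hours_each
    return {"effort_hours": effort_hours,
            "cost": effort_hours * rate_per_hour + expenses}

est = task_estimate(staff=3, hours_each=40, rate_per_hour=100.0,
                    expenses=1500.0)
print(est)  # {'effort_hours': 120, 'cost': 13500.0}
```

Summing such task estimates, including the user effort the text calls for, yields the stage-level cost and timing plans discussed in the next task.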
A.16.14
Develop final cost and timing plans, based on the detailed work plans. For each subsequent
phase, prepare a high-level estimate including:
Staffing resources by skill level;
Amount of time estimated to complete a phase;
Scheduling;
Cost of staff time; and
Cost of system and administrative support staff time.
Some of the data needed to prepare an estimate may not be available. For these situations,
determine and document the assumptions used to prepare the estimates.
Compare the final cost and timing plans with the originally agreed figures, review any
discrepancies and take appropriate action. This may include:
A negotiated reduction in scope based on changed costs;
Re-phasing of the work to ensure that time scales are met;
Revisiting the original scope to see if scope has changed over the lifetime of the project;
Agreeing a revised cost of the project;
Increasing the staff complement in an attempt to reduce time scales at increased cost; or
Reducing the staff complement to reduce cost but increase time scales.
Formal written approval should be sought for any actions to be taken and plans revised
accordingly.
A.16.15
Purpose
To ensure that all aspects of planning and quality have been properly addressed.
Overview
To this point, the Business Blueprint Checkpoint Phase has largely been concerned with work
planning. Many issues are addressed in the Project Charter that go hand-in-hand with the work
plans. These are reviewed and appropriate amendments made to reflect the end of this stage and
the start of the next.
Those sections of the Project Charter that need to be updated for Construction are reviewed and
amended or created, as appropriate. In particular, the Project Risk Assessment needs to be
revisited.
A.16.16
Review the risk assessment as a result of the Business Blueprint Stage experience and the
updated project plans for the Realization Stage. Consider:
Has the anticipated project commitment been realized in practice?
Have the anticipated commitment and quality of third parties been realized in practice?
How has the scope changed and why did this occur?
Have the volumes and anticipated performance levels changed significantly?
Has the user base changed significantly?
Are the resources needed still available?
Is the experience and knowledge of available resources as anticipated?
Has information about the business and the technology been learned that was unknown
before?
Has the anticipated processing changed significantly and does this affect previous
assumptions?
Have budget and time considerations changed (e.g., by project overrun)?
To what extent have the needs of the business changed during the project to date?
Have other projects started that have made demands on the resources required for this
project?
Have advances in the marketplace (e.g., new versions of software) significantly affected
the project?
A.16.17
Identify where the risk ratings described in the risk assessment are higher than expected or are
greater than originally anticipated. Obtain agreement with management on the appropriate
actions to take.
Possible options may include reviewing the project and strictly monitoring any deviations from the
project work plan to be able to identify possible symptoms of a larger problem or may involve
changing the project scope or project team composition to try to resolve the issues.
In this latter instance, assess the impact of making changes to the project plan on the associated
strategy documents developed in this stage. Update the implementation strategy, data conversion
strategy and appropriate testing strategies as necessary.
A.16.18
Review the Project Risk Assessment completed earlier to identify any changes from the initial
assessment. For identified threats, document:
The likelihood of impact on the project;
The severity of the risk;
How the occurrence of the risk will be recognized; and
How the situation will be addressed (e.g., avoidance, control, assumption or transfer).
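The documentation items above amount to one row in a risk register, which can be sketched as a small record. The field names and the likelihood-times-severity scoring scale are assumptions for the example, not part of the methodology.

```python
# A minimal sketch of a risk-register entry covering the four items
# above; field names and the 1..5 scales are assumed, not prescribed.
from dataclasses import dataclass

@dataclass
class RiskEntry:
    threat: str
    likelihood: int        # assumed scale: 1 (rare) .. 5 (almost certain)
    severity: int          # assumed scale: 1 (minor) .. 5 (critical)
    detection_signal: str  # how occurrence of the risk will be recognized
    response: str          # avoidance, control, assumption or transfer

    def exposure(self):
        """One common (assumed) rating: likelihood x severity."""
        return self.likelihood * self.severity

r = RiskEntry("Key ETL developer leaves the project", 2, 4,
              "resignation notice or missed build milestones", "control")
print(r.exposure())  # 8
```

Re-scoring such entries at each checkpoint makes changes from the initial assessment visible, which is the comparison the next task asks for.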
A.16.19
Many areas within the Project Charter may not have been completed in full until this point since
they primarily affect the Realization Stage. Ensure that all these areas have been properly
addressed before progressing to the Realization Stage.
Revise the Project Charter as appropriate.
A.16.20
Approval for the revised Project Charter should be sought and the plan should be reviewed for
completeness and correctness including:
Any issues should be discussed between project management and the independent
assessor;
Corrective actions to be taken should be agreed by all parties and documented. Such
actions should have clear time scales and responsibilities assigned; and
Further reviews should be scheduled to ensure corrective action is undertaken.
A.16.21
Purpose
To seek final review and formal written approval of the Business Blueprint Stage deliverables. At
this point, the deliverables and a Class 3 estimate are provided for approval to the GRT/DRB as
per CPDEP.
Overview
Throughout the Business Blueprint Stage, formal written approval should be sought for
deliverables on an ongoing basis. At the close of the stage, a final review of deliverables is
undertaken and formal written approval obtained for all of the Business Blueprint Stage
deliverables.
A.16.22
Discuss the Business Blueprint Stage deliverables with management and resolve any questions
and issues.
Responsibility and time scales for formal sign-off should have been defined in the Project Charter
and project contract.
Many of the deliverables should have already been subject to review and formal written sign-off
by user representatives. Review the work deliverables and address any outstanding sign-off
issues.
A.16.23
It is extremely important that the design of the DW system and its surrounding components be
approved and "frozen". However, throughout the Realization and Final Preparation Stages,
changes may need to be made to the design.
Ensure that project management and senior management agree on a process to manage these
proposed design changes. This could include formal change requests with justification and cost
estimates.
A.16.24
Obtain formal written approval of the Business Blueprint Stage deliverables.
Obtain formal written approval of the Business Blueprint Stage deliverables as defined in the
Project Charter and the project contract. Complete the necessary approval procedures to
complete CPDEP Phase 3 Develop Preferred Alternative(s).
*** FORMAL APPROVAL POINT ***
Realization Stage
The purpose of this stage is to construct system components based on the Business Blueprint.
The objectives are final construction and testing of the system.
Key Deliverables:
1) Program code, constructed BW components, and documentation
a) Data extract programs: Extractors, Custom ABAP, or ETL-generated components
b) Data transformation and DB components: InfoSources, InfoCubes, Transfer Rules,
Update Rules
c) Presentation components: Queries/Workbooks, Web components, common query
objects
2) Unit and Integrated Test strategy
3) Unit to System Test migration plan
4) Converted data (if applicable)
5) Security testing
6) Stress testing with high data volumes
7) Sizing
Custom Extract System Construction, BW Administrators Workbench Construction, and
BW Explorer Construction
These three phases develop detailed specifications from the design documentation and then
construct, test and document the individual components.
Testing in these phases consists of:
Unit testing;
String testing; and
Preparing the Unit to System Test Migration Plan.
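The unit-testing activity listed above can be illustrated with a minimal, hypothetical example. The transformation function and its values are invented for the sketch; real BW transfer rules are built in the Administrator Workbench rather than hand-coded like this.

```python
# A minimal unit-test sketch for a single transformation component.
# The conversion rule and its fixed rate are made-up examples.
import unittest

def to_reporting_currency(amount, rate):
    """Example transformation rule: convert a source amount at a
    given exchange rate, rounded to two decimal places."""
    if rate <= 0:
        raise ValueError("rate must be positive")
    return round(amount * rate, 2)

class TransferRuleUnitTest(unittest.TestCase):
    def test_converts_at_rate(self):
        self.assertEqual(to_reporting_currency(100.0, 1.1), 110.0)

    def test_rejects_bad_rate(self):
        # Unit tests should also cover the error path.
        with self.assertRaises(ValueError):
            to_reporting_currency(100.0, 0)
```

Run with `python -m unittest` against the module. String testing then exercises several such tested units chained together, before the Unit to System Test Migration Plan moves them into the system test environment.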
Types of Testing
The purpose of software testing is to identify and correct errors before implementation. In the
methodology, a comprehensive set of testing tasks are described that span the Construction and
Final Preparation Stages of the development life cycle. The main testing activities are:
Unit Testing;
String Testing;
System Testing;
Integration Testing;
Stress Testing;
Back-up/Recovery Testing; and
User Acceptance Testing.
Certain of the types of testing listed are described as discrete phases of work within the
methodology. Others are combined with tasks with which they are tightly coupled (e.g.,
component development and unit testing) and still others are defined as tasks or even steps
within a phase (e.g., stress testing is considered to be a form of system testing).
All of the types of testing are important but they are not all at the same level of decomposition
from the perspective of project planning. For example, stress testing may be referenced implicitly
within system testing but, if it is a critical part of a project, it may be elevated to a separate set of
tasks on a project plan. The decision on the relative importance of the different types of testing
will depend on the specific project.
[Figure: Construction Phase process flow]
Steps: Q1 Prepare Program Structure Charts; Q3 Complete Structured Walk-through of Program
Specifications; Q4 Develop Executable Program Code; Q5 Complete Structured Walk-through of
Program Code; Q6 Develop/Execute Unit Test; Q7 Execute String Testing; Q8 Develop Unit to
System Test Migration Plan; Q9 Construct Data Conversion System; Q10 Execute Data
Conversion.
Inputs: Finalized Program specifications; Detail logic charts; Business Scenarios.
Deliverables: Program specifications; Program documentation; Approved program specifications;
Program code; Approved program code; Unit test objectives, conditions, plans, data and
environment; String test strategy, plan and data; Test problem reports and test problem report
log; Revised program documentation; Unit to system test migration plan; Constructed Data
Conversion System; Converted Data.
Major function (e.g., UPDATE, SORT, PROCESS, OPEN, CLOSE). All process related
specifications should depict the possible program pathing decisions, (i.e., the start of
each program path);
Called program interfaces (i.e., execution of subroutines). The inter-module interface
indicates the data to be passed and/or the parameters;
Subordinate functions (e.g., data set input and output, data manipulation at the segment
or record level, data initialization and error processing); and
Execution sequence of the modules (e.g., the definition of all control levels). The control
levels should depict all program linkages, specify the lower level modules to be invoked
and specify under which conditions this takes place.
This final factoring of the Program Structure Chart takes the input and output modules to their
external entry and exit points, respectively and decomposes the central transform into modules of
manageable size while maintaining high cohesion and low coupling within the modules. In
general, in a Program Structure Chart:
Input modules should break down into input and transform modules;
Output modules should break down into output and transform modules; and
Transform modules should break down only into smaller transform modules.
Cohesion is the measure of the relationships among code structures contained in the
same module.
Where a module is limited to dealing with one data entity, the degree of cohesion is enhanced.
The intent is for each module to have high cohesion and for groups of modules to be loosely coupled.
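The input/transform/output factoring described above can be sketched in a few lines. This is an illustrative example with invented data: each function handles one concern (high cohesion) and the functions communicate only through passed values (loose coupling).

```python
# Sketch of input / transform / output factoring. Each module deals
# with one concern; data flows only through parameters and returns.

def read_records(lines):
    """Input module: parse raw comma-separated lines into pairs."""
    return [tuple(line.split(",")) for line in lines]

def apply_transform(records):
    """Central transform: convert amounts to floats and keep only
    positive values."""
    parsed = [(rid, float(amount)) for rid, amount in records]
    return [(rid, amount) for rid, amount in parsed if amount > 0]

def write_report(records):
    """Output module: format the transformed records for display."""
    return [f"{rid}: {amount:.2f}" for rid, amount in records]

lines = ["A1,100.5", "A2,-3", "A3,7"]
print(write_report(apply_transform(read_records(lines))))
# ['A1: 100.50', 'A3: 7.00']
```

Because the transform knows nothing about file formats or report layout, each module can be replaced or tested in isolation, which is exactly what low coupling buys on a structure chart.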
A.17.10
Purpose
To prepare program specifications in the appropriate format for all programs that are to use a
program generator.
Overview
The objectives of this task are to:
Employ the program generator in an efficient and effective manner;
Select the functions of a program that will be produced via custom coding and the
functions that will be generated;
Modify the program specifications for submission to the program generator;
Generate the program code;
Prepare the source code from the generator and the custom-produced code for unit
testing;
Provide a set of program documentation in accordance with defined standards; and
Use the available facilities of the generator to capture, store and document program
specification contents.
The steps necessary to define modified program specifications in a format and/or syntax
compatible with the requirements of a chosen code generator are defined here.
Program generators are intended to reduce the effort necessary to produce structured programs
and do so by standardizing and automating the production of processing modules such as
mainline and paragraph control, transaction processing, report formatting, and edits and
validations.
The work steps are based upon the use of a modified program specification that consists of three
categories of components - input specifications, processing specifications and output
specifications. Each component will vary depending on the requirements of the specific generator
as some generators eliminate the need to encode logic using a precise syntax, whereas others
require it. Typically, each component will consist of:
Input record definitions
Output record definitions
Processing specifications:
Structure Charts,
Program Logic Specifications,
Table Definitions,
Edit/Validation Specifications,
Process Descriptions,
System Messages, and
Detail Logic Charts.
References to Detail Logic Charts are added to the design as part of finalizing the program
specifications, as they will be used to develop customized program code rather than being
generated.
The input, output and processing specifications may have been captured during the Analysis
and/or Business Blueprint Stages if an automated tool has been used on the project and:
Program generation function is an integral part of the automated tool used; or
An automated interface exists between the automated tool used and a program
generator.
Regardless of the project situation (i.e., whether an automated tool is used), the steps for
program generation are still applicable, although some of these steps may be completed at
different times in the project depending upon the type of automated tools being used.
A.17.11
Specify program generator produced documentation to be
employed.
Indicate which items on the program documentation checklist are to be machine generated and
which are to be manually produced. Ensure that templates provided with the program generator
to optimize the use of the product are reviewed and used wherever possible.
A.17.12
Not all modules may be appropriate for generation. If a program is not going to be generated, it
should be designed and documented in accordance with one of the alternative paths through
this phase.
Before ruling out a program generator for specific programs, consideration should be given to
using the program generator to produce skeleton programs to which custom code is added. The
benefit of skeleton code produced in this fashion is that the final program will conform to the
structure of programs wholly produced by the program generator. This should ease future
maintenance of the system.
All modules to be generated should have a final processing specification defined that is
appropriate to the generator.
A.17.13
Finalize the content, format and usage of input and output screens/windows/reports.
Capture the screen/window and report definitions within the program generator. Ensure that the
input/output data items are consistent with the Data Dictionary descriptions, edit/validation
specifications and database contents.
Finalize record definitions, including external data tables. Review the data attribute definitions
contained in the Data Dictionary for completeness and accuracy. Analyze primary and secondary
access keys and examine access paths for logical segments and records to minimize path length
where practical.
Capture record definitions within the program generator. Convert data attribute definitions to the
data format rules associated with the program generator. Enter record definitions to the program
generator.
A.17.14
Finalize Edit/Validation Specifications and define System Messages for all programs to be
generated. Identify potential areas for reuse of common processing.
Finalize processing controls. Include input controls (e.g., log-on verification, access restrictions,
check digits, batch control totals), processing controls (e.g., data file control records, run-to-run
control totals), as well as output controls (e.g., end-of-report control totals).
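The controls listed above can be illustrated with a short, hypothetical sketch. The batch control total and the Luhn check-digit scheme shown here are standard, well-known techniques and stand in for whatever specific controls the project actually defines:

```python
# Illustrative sketch (not from the methodology itself): two common
# processing controls -- a batch control total and a Luhn check digit.

def batch_control(amounts):
    """Input control: record count and hash total for a batch of amounts."""
    return {"count": len(amounts), "total": sum(amounts)}

def luhn_check_digit(payload):
    """Check digit for a numeric key, using the standard Luhn scheme."""
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10
```

Run-to-run control totals work the same way: the totals computed at the end of one run are compared against those recomputed at the start of the next.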
A.17.15
Complete Structured Walk-through of Program and/or
Component Specifications
Purpose
To ensure the completeness and quality of the program or component specifications to facilitate
the coding, testing and maintenance activities.
Overview
The structured walk-through process provides a formal procedure for quality assurance of the
program specifications to enable the timely discovery of design errors before beginning the
program development tasks.
For each program, a program specification is assembled, reviewed, cross-referenced against the
Program Control Checklist, subjected to a structured walk-through and finally approved.
A.17.16
Ensure that the program specifications deliverable is
complete.
Review and cross-reference the program specifications against the Program Control Checklist.
Other necessary documentation not produced by the development tool may need to be
developed or modified from documentation developed during the Business Blueprint Stage.
A.17.17
Conduct structured walk-through of the program
specifications.
Conduct a structured walk-through of the program specifications. This process enables the
project team to identify and resolve any discrepancies, omissions and logic errors. Adherence to
standards and conventions as well as structured design techniques should be monitored.
Changes to program specifications should be documented and approved using the System
Change Request form.
Document and resolve issues, as required. The issue resolution process provides a
formal reporting mechanism for documenting and resolving major decisions/problems.
A.17.18
Obtain team leader or equivalent approval of the program
specifications.
A.17.19
Purpose
To develop and compile functionally correct program code that is easy to maintain by ensuring
that it conforms to the approved program coding standards, naming conventions and program
specifications.
Overview
All programs are coded from the final program specifications using established language-specific
program coding standards and conventions. Coding proceeds in a "bottom-up" fashion, where
reusable modules and programs are the first to be created so they may be prepared for general
access in a protected library.
Code all shared, common, support and/or utility programs and modules. These are modules and
programs that will be used in or called by more than one program and should be coded once for
the entire project and then shared through a protected library catalogue. In a client/server
environment, they would include the stored procedures and the API library(ies) accessed by
remote procedure calls (RPCs).
A.17.20
Code all input/output program modules. Because all program specifications to this point have
addressed logical data entities, the program modules to execute input/output should address
physical data records as defined in the Data Design Phase. There should be reusable code
in these modules as well, because the structure of the stored data:
Should be stable;
Will be reflected in the structure of the program; and
Will be needed wherever the same data is accessed.
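A minimal, hypothetical sketch of such a reusable input/output module is shown below. The fixed-width record layout is invented for illustration and stands in for the physical records defined in the Data Design Phase; every program that touches this data would reuse the same pack/unpack code:

```python
# Hypothetical sketch: one reusable input/output module that maps the
# physical record layout (fixed-width fields, as defined in data design)
# to the logical entity, so all programs share the same access code.

RECORD_LAYOUT = [("account", 0, 8), ("amount", 8, 16)]  # field, start, end

def unpack_record(raw):
    """Decode one physical fixed-width record into a logical entity."""
    return {name: raw[start:end].strip() for name, start, end in RECORD_LAYOUT}

def pack_record(entity):
    """Encode a logical entity back into the physical layout."""
    return entity["account"].ljust(8) + entity["amount"].rjust(8)
```

If the stored structure changes, only this module changes; the transform modules that use the logical entity are unaffected.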
A.17.21
A.17.22
Execute the program generation software. The result will either be program code (from a
parametric code generator, e.g., TELON) or a ready-to-run program (using 4GLs such as
PowerBuilder).
A.17.23
Resolve all syntax and diagnostic program errors. This is an iterative step. If syntax errors are
found, they need to be corrected and the program code generated again until no errors are found.
A.17.24
A.17.25
Compile the program code and command language code and identify and correct any syntax and
diagnostic errors. The program code is then recompiled until no syntax or diagnostic errors are
found.
A.17.26
Ensure that all of the program documentation is complete using the Program Control Checklist.
A.17.27
Purpose
To minimize the maintenance effort and to ensure program quality through the timely discovery of
program coding errors and/or omissions, deviations from prescribed specifications and the proper
adherence to established program coding standards and naming conventions.
Overview
Each program is submitted to a structured walk-through and approval process to ensure that only
standardized code is produced.
Program specification changes are documented and approved and the program code is
submitted to the project team leader for formal written approval.
All program code should comply with the program specifications. Adherence to approved program
coding standards and conventions, as well as structured programming techniques, should be
stressed for custom-coded modules. Changes to program specifications must be documented
and approved through the System Change Request process.
Note: The number of programs that are actually walked-through will depend on the size of the
system, the complexity of the processing and the skill and experience of the programmer(s).
A.17.28
Review and cross-reference the documentation against the Program Control Checklist. Add any
missing items.
A.17.29
Document any required changes to the program
specifications.
Document any changes to program specifications that may be required to achieve the desired
system processing. Any modification of program specifications can potentially impact other
programs that access the same data. It is essential that all program specification change requests
be adequately researched to determine any possible side effects. All program specification
change requests must be fully documented and approved through the System Change Request
process.
A.17.30
Conduct a structured walk-through of the clean-compiled
program code.
Conduct a structured walk-through of the clean-compiled program code. Correct any errors and
omissions identified and recompile the program code. All programs should comply with the set of
program specifications produced during program design.
A.17.31
Submit program code and System Change Requests to the
project team leader and obtain formal written approval.
A.17.32
Purpose
To complete unit testing of the program code.
Overview
A Unit Test Plan is developed for each program. The plans are reviewed and approved by the
team leader before execution. All conditions, steps to be executed to test each condition, required
unit test data and expected results of each test step are defined. The expected results are
derived from the program specifications to ensure that the program performs as specified.
The structured walk-through process is used to ensure that a complete set of conditions and
steps has been defined. All test steps and/or data files required to complete the unit tests are
created in this task.
The test steps and data files are made up of multiple transactions with variations of data, both
valid and invalid. Examples of invalid data are included to test edit/validation and error
management routines. Hard copies of unit test transactions are attached to each Unit Test Plan. A
test data generator can be used to prepare the unit test data. The recommended approach is to
develop ONE common set of test data to be used in all of the various test steps. Changes to the
command language instructions may be necessary to execute specific modules at the appropriate
times. To some extent, this will depend on whether the changes made are modifications to
existing programs or the addition/deletion of programs. If necessary, command language
instructions are prepared for each program. Special utility steps to facilitate review and
verification of unit test results may be included as needed.
All unit tests are conducted until each test has been successfully completed. Actual results are
reconciled with the expected results and all program logic errors are corrected. Final test results
are attached to the Unit Test Plan for each test condition/step. Successfully conducted Unit Test
Plans are reviewed by the team leader. If a top-down unit test strategy has been adopted, lower-level modules will be added to the current set of test modules and tested with the appropriate test
data or scripts. In a bottom-up unit test strategy, the successfully tested program will be moved
into the system test library.
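The relationship between test conditions, steps and expected results described above can be sketched as data plus a small reconciliation routine. The validation program and the plan's contents below are hypothetical examples, not part of the methodology:

```python
# Hypothetical sketch: a Unit Test Plan held as data, with each step's
# expected result derived from the program specification, and a simple
# reconciliation of actual results against expected results.

def validate_amount(value):
    """Program under test: accept positive numeric amounts only."""
    try:
        return float(value) > 0
    except ValueError:
        return False

unit_test_plan = [
    {"condition": "valid amount",    "input": "100.00", "expected": True},
    {"condition": "negative amount", "input": "-5",     "expected": False},
    {"condition": "non-numeric",     "input": "abc",    "expected": False},
]

def run_plan(plan, program):
    """Reconcile actual results with expected results for each step."""
    return [{"condition": step["condition"],
             "passed": program(step["input"]) == step["expected"]}
            for step in plan]

results = run_plan(unit_test_plan, validate_amount)
```

Note that the expected results come from the specification, not from running the code; a plan built from the source code would only confirm what the program does, not what it should do.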
A.17.33
Where appropriate, multiple Test Data Specifications should be defined for each test condition. All
processes need to be tested by use of "cascades" of Test Data Specifications containing at least
three transactions to determine the integrity of program logic, e.g., testing for invalid data entry
should include Test Data Specifications for different invalid values and for various combinations of
invalid data;
Define individual unit test steps for each test condition. Expand the test conditions to form
the test steps. The test steps represent the different program paths that may be executed
to accomplish the same overall test condition.
Begin at the module level and define unit test steps for simple test conditions. Progress to more
complex conditions. Add test steps for successive modules until all components of each program
path (series of modules) have been tested. Unit test steps are required to create the appropriate
settings for the test condition to be effectively tested. They will occasionally require data to be
available that is referenced rather than directly used by the test condition.
Review the Test Data Specifications to ensure that all of the data required by the unit test steps
for a specific unit test condition is available;
Define the expected results of each test step. Include:
output to be generated (screen, data file and report),
unique program paths to be executed,
status of interim data element values,
status of applicable data file values upon completion, and
status of control totals upon completion; and
Conduct a structured walk-through of the Unit Test Plans by comparing them with the
program specifications (not the source code). A correct Unit Test Plan is one that
addresses all processing contained within the program specification.
Wherever possible, include a member of the System Test Team in the Unit Test Plan walk-through. This involves the System Test Team early in the development process and allows them to confirm that the program is being adequately tested as a unit before the start of System Testing. Update Unit Test Plans as needed with changes resulting from the structured walk-through.
Obtain project team leader formal written approval of Unit Test Plans.
A.17.34
Generate the unit test steps and data files. Test steps and data files may be created
directly from the Unit Test Plan if the test conditions and steps are defined clearly.
Physical data files containing the records required to meet the unit test steps are created.
In general, valid data should be tested first, followed by exception conditions. For on-line
processing, unit test data is a predefined series of screens/windows. Test data transactions
should be sufficient in detail and number to satisfy all test objectives, conditions and steps;
Generate module interface test data, where applicable.
Module interface test data is required to test a "called" module before integrating it with the
"calling" module in a bottom-up testing strategy. When testing called modules this way, it may be
necessary to generate separate interface data files. When a top-down strategy is employed,
interface data files will be created by the higher-level module;
Ensure that a hard copy of the unit test data is produced and attached to each applicable
Unit Test Plan.
The documentation required consists of screen/window printouts or data file dumps of the actual
data used;
Add test data to a common pool or build a test database to improve the testing process
and ease the burden of producing test data. This step should be executed or supervised
by the Database Administrator to ensure that database integrity is maintained; and
During the course of testing, additional test data and conditions may be required. To
facilitate this update process, the means to control the maintenance and distribution of
the common test data set must be established and agreed.
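A test data generator along the lines recommended above, producing one common pool covering valid transactions and several invalid variations, can be sketched as follows. The transaction fields and the specific invalid values are invented for illustration:

```python
# Hypothetical sketch of a test data generator: one common pool of
# transactions covering valid data and several invalid variations, to
# exercise edit/validation and error-management routines.
import itertools

def generate_test_pool():
    valid = [("1000", "ACC1"), ("250", "ACC2")]
    invalid_amounts = ["-1", "abc", ""]
    invalid_accounts = ["", "????"]
    pool = [{"amount": a, "account": acc, "valid": True} for a, acc in valid]
    # Combine each invalid amount with an otherwise valid account.
    for a, acc in itertools.product(invalid_amounts, ["ACC1"]):
        pool.append({"amount": a, "account": acc, "valid": False})
    # Combine each invalid account with an otherwise valid amount.
    for acc in invalid_accounts:
        pool.append({"amount": "100", "account": acc, "valid": False})
    return pool
```

Tagging each transaction with its expected validity makes the same pool usable across all test steps, as the single common data set recommended above.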
A.17.35
A.17.36
The definition and development of a testing environment should have been initiated in the
Technology Planning, Support and Cutover. At that time, hardware, software and operational
requirements (e.g., configuration management) should have been defined and implemented for
the environment.
Load source programs and copybooks into the library and include record and/or segment
descriptions and common processing routines. This will ensure that multiple programs are
referencing the same items;
Restrict access to unit test libraries and test data to the responsible developer and/or
project team leader. Ensure adequate back-up and restore procedures are in place; and
Tools exist for the generation of test data, the repetition of regression tests and the
emulation of concurrent on-line sessions. If not already specified, determine the type and
specific instance of tools to be employed as well as the procedures related to the use of
automated test tools.
Some projects may require the construction of a test program driver to control the testing of a
variety of transactions through a series of linked programs.
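Such a test program driver can be sketched as below; the two linked program steps are hypothetical placeholders for real programs in the chain:

```python
# Hypothetical sketch of a test program driver: it feeds a series of
# transactions through a chain of linked programs and logs each result,
# so that a test run can be repeated and its output reviewed.

def edit_step(txn):
    """Placeholder for the first linked program (edit/validation)."""
    txn["edited"] = True
    return txn

def post_step(txn):
    """Placeholder for the second linked program (posting)."""
    txn["posted"] = txn.get("edited", False)
    return txn

def driver(transactions, programs):
    """Run each transaction through the linked programs in order."""
    log = []
    for txn in transactions:
        for program in programs:
            txn = program(txn)
        log.append(txn)
    return log

log = driver([{"id": 1}, {"id": 2}], [edit_step, post_step])
```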
A.17.37
A.17.38
Purpose
To identify errors in linked units of a system that exist as a result of the integration of the units and
not within the units.
Overview
String testing is an extension of unit testing that tests the linkages between structurally connected
units. It is completed at several levels, which include testing:
Linked modules of a program;
Linked programs of a functional sub-system; and
Linked sub-systems of a complete system.
Unit testing must be complete for each module of a program before the program level string
testing can be completed; program level testing must be complete before sub-system level testing
can be completed; and so on.
String testing checks for compliance with the program specification and the intended design
structure. As several programmers may be involved in the construction of various units of the
string, testing can be completed by a dedicated tester. Once each component of a program has
been fully unit tested, string testing attempts to identify errors caused by linking the units and in
the interfaces themselves.
A.17.39
Identify all internal interface standards and check that they have been adhered to rigorously. Any
interface not within standards should be documented as a failed test and be returned for
redevelopment.
Standards should exist that determine the rules for interfaces between the components of a
functional group. These standards should include:
Those that are implicit in database and calling sequence;
The order of objects by type within a call;
Format of the calling element;
Validation, and whether it occurs in the calling or the called module;
Indirect reference calls;
Rules for call by value and by name; and
Call syntax.
A.17.40
Specify test conditions to be developed to satisfy the test objectives and list the job steps to be
followed in creating the test conditions. Document the test data needed to exercise the test
conditions and indicate where "stub" programs are to be used to test the string.
Stub programs would be appropriate in testing client/server applications where a workstation
program may access a local database to provide input values to verify the functionality of the
program instead of having to access a database server across the network. Testing on the
appropriate technology platforms will be completed during.
A.17.41
Execute and log the string tests using one or a combination of:
Top-down testing:
design, develop and test stubs that simulate the functions of the lower level
units/programs,
test the top or driver unit/program using stubs in the place of lower level
units/programs,
add each lower level unit/program re-testing after each one is included, and
regression test to check that new units/programs have not had an adverse effect on
previously tested units/programs;
Bottom-up testing:
design, develop and test a program driver that simulates the driver aspects of the test
group,
test the lower level unit/program using the simulated driver modules, and
add the driver module and re-test the complete group; and
Group testing:
link the units of the string together,
test the integration of the group by exercising all input and output parameters, all
modules and calls and program options and special utilities, and
test the minimum functionality in each test so that errors identified can be retraced.
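The top-down approach with stubs can be sketched as below. The unit names and values are hypothetical; the point is that the driver unit is exercised first against a stub, and the identical test is then re-run as a regression check once the real unit replaces it:

```python
# Hypothetical sketch of top-down string testing: the driver unit is
# tested first with a stub standing in for the lower-level unit, then
# the real unit replaces the stub and the same test is re-run.

def lower_unit_stub(x):
    """Stub: returns a fixed, known value in place of the real unit."""
    return 42

def lower_unit_real(x):
    """The real lower-level unit, added once the driver is proven."""
    return x * 2 + 2

def top_unit(x, lower):
    """Driver unit under test; 'lower' is injected so a stub can be used."""
    return lower(x) + 1

stub_result = top_unit(20, lower_unit_stub)   # exercise the linkage via the stub
real_result = top_unit(20, lower_unit_real)   # regression test with the real unit
```

If both runs produce the same result, the linkage itself is sound and any later discrepancy can be attributed to the newly added unit rather than the interface.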
Each test when executed is fully documented so that the test result can be duplicated. This
includes back-ups of all affected files so that they can be returned to their original condition
before the test. Documentation includes:
Execution date and time;
Completion date and time;
System conditions at the time of testing such as system usage volume, database load
and concurrent processing and system software and utility versions;
The test case executed and the data used; and
Test results (e.g., output generated, data passed/received, errors logged, database
updates).
A.17.42
Where errors are identified within the strings, these errors are documented and returned with the
test log and any Test Problem Reports to the programmer for correction. After correction, the
program re-enters the testing sequence. Errors that require changes to the original specification
or design should proceed through the issue resolution process.
A.17.43
After all components have been tested at one level, repeat the string testing procedure for the
next level of integration.
A.17.44
Submit program documentation with string test results to the
team leader and obtain formal written approval.
A.17.45
A.17.46
Purpose
To execute a quantitative and qualitative check on the unit test deliverables, to ensure
completeness and compatibility with standards and to develop a plan for the controlled transition
from a unit to a system test environment.
Overview
This task provides a formal program hand over checkpoint. The program specifications and the
unit and string test results are reviewed to ensure that all of the necessary documentation has
been developed and that the contents of the documents meet the predetermined standards for
coding practices and quality control.
The migration plan for the transfer of unit tested programs and documentation to a system test
environment is reviewed to confirm the acceptance of both the direct migration path and any
contingency plans and, where necessary, updated.
As a final step, a migration plan to the system test environment is completed to prepare for the
System and Integration Testing Phase.
A.17.47
Review the program specifications for all programs and confirm that all of the supporting
documentation has been provided and is complete. Use the Program Control Checklist as a
means of checking that all documentation has been completed and approved. Check for such
items as Program Logic Specifications, Edit/Validation Specifications, Report Definitions, System
Messages, Program Structure Charts and program code.
A.17.48
Review all programs, as appropriate and confirm consistency of style and programming practices
as well as compliance with standards for coding and quality control.
A.17.49
Review the Unit Test Plans and the expected results to confirm that all of the tests have been
executed successfully and verified. Confirm that all string tests have been successfully executed.
Ensure that all of the documentation needed to re-create the tests has been retained in a secure
form. Check specifically for Unit Test Plans and expected results, unit test data and system
change requests. Ensure that test utilities have been removed.
A.17.50
Confirm that all of the necessary program documentation has been provided including any job
control statements and library management documentation. Review for completeness and
accuracy.
A.17.51
This step is generally completed by the System Test Team in conjunction with the project team as
it is the System Test Team that will ultimately be responsible for the migration process.
Develop a unit to system test migration plan that includes:
Confirming that the system test environment is properly established;
Recording the software configuration components of the unit test environment;
Migrating the unit tested programs to the system test environment;
Recovering the transferred programs to a stable base in the event of problems during the
migration process;
Testing the results of migration programs before the start of the system tests to ensure
that the migrated data is correct;
Establishing a software configuration system test baseline; and
Confirming that the responsibilities of the System Test Team and the Project Team for
system and integration testing are clearly defined and understood.
A.17.52
Follow the procedures outlined in the previous step and record any problems using the issue
resolution process. Review issues that arise with management and determine if the fall back
procedures need to be invoked or if the System and Integration Testing Phase can commence.
A.17.53
Obtain formal written sign-off from the System Test Manager
that the migration has been successful.
A.17.54
Purpose
To build the programs and other means needed to convert data to the required format to initialize
the BW system.
Overview
The data conversion system design is reviewed and, depending upon which technique or mixture
of techniques is being used to create the master file and opening balance data, the conversion
programs or methods are constructed. If programs need to be written, they are written and tested.
The degree of effort required for these programs should not be underestimated as the conversion
sub-system may contain numerous, very complex programs.
The data conversion task interfaces with BW Administrators Workbench construction tasks.
Conversion input data must be closely aligned with the format required of historical data sources
identified and constructed in those tasks.
A.17.55
Depending on which methods of conversion are being used, the different detailed work tasks will
vary. The various work tasks are allocated to the project team members together with the
responsibilities for management of the process.
A.17.56
Key the data to be created in-house or through an external
service bureau.
If data is to be keyed using the new system's standard input routines, ensure that any batch input
needs are met, data input operators have the appropriate training and the necessary hardware is
available and functioning correctly.
If an external service bureau is to create the data, the contractual issues, detailed instructions,
media, delivery and receipt of data, error checking and timing should be finalized.
A.17.57
If program routines or utilities exist that enable mass creation and replication of data, training
must be given in the use of these features.
A detailed structure and sequence of the creation process must be prepared together with the
correct clerical procedures to control the creation process.
The bulk or mass maintenance features should be tested to ensure an understanding of their
functionality and to determine when and what hardware capacity will be required. This is
particularly significant if there is a large amount of data to be created using this means.
A.17.58
If custom programs are to be written, then the following steps should be completed:
Design the data conversion programs using the programming techniques described in
earlier tasks in this phase;
Code the data conversion programs using the structured coding techniques described in
earlier tasks in this phase. Alternatively, these programs may be created using a program
generator or file utility; and
Test the data conversion programs using the techniques outlined in the earlier tasks in
this phase. The conversion programs should be tested rigorously, particularly any
programs that are involved with reconciliation (e.g., control portions of conversion
programs or old versus new post-conversion comparison programs).
Note: This testing is to determine the proper functioning of the programs and not for validating
source data. Testing of the source data (i.e., cleansing), is completed when the program is
executed during the next task. In addition, test data must reflect erroneous source data to enable
verification of these processes in the data conversion programs. Where practical, mock
conversions can be utilized as a part of conversion systems testing. A mock conversion is a full
execution of the conversion system using real data to ensure reliable production execution of the
conversion.
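A conversion program with the reconciliation control described above might look like the following hypothetical sketch; the record formats and field names are invented for illustration:

```python
# Hypothetical sketch of a data conversion program with a reconciliation
# control: erroneous source records are rejected for reporting, and
# record counts plus amount hash totals are compared old versus new.

def convert(old_records):
    """Map old-format records (tuples) to new-format dictionaries,
    rejecting erroneous source rows so they can be reported."""
    converted, rejects = [], []
    for acct, amount in old_records:
        try:
            converted.append({"account": acct.upper(), "amount": float(amount)})
        except ValueError:
            rejects.append((acct, amount))
    return converted, rejects

def reconcile(source, converted, rejects):
    """Control check: every source record is either converted or rejected,
    and the converted amount total matches the clean source total."""
    counts_ok = len(source) == len(converted) + len(rejects)
    clean = [(a, amt) for a, amt in source if (a, amt) not in rejects]
    totals_ok = sum(r["amount"] for r in converted) == sum(float(amt) for _, amt in clean)
    return counts_ok and totals_ok
```

Feeding deliberately erroneous source rows through such a program, as the Note above recommends, verifies the rejection path as well as the happy path.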
A.17.59
Use utilities.
If utilities are to be used for some components of the data conversion process, then the tasks
should be completed as for the custom programs.
A.17.60
If data is to be created on PCs and then uploaded to another machine, the tasks should be
completed following the same tasks/steps as for the custom programs.
Hardware utilization and capacity needs should be determined and the facility tested to ensure
that the transfer process functions correctly and at the planned speed.
A.17.61
If third-party file conversion software or packages are to be used for components of the data
conversion process, this software will need to be installed and tested (using the standard baseline
testing tasks and steps) to ensure that it functions correctly.
Hardware utilization and capacity needs should be determined for usage.
A.17.62
Scan data.
If data is to be input through scanning devices, the scanners should be tested to determine:
Level of accuracy;
Speed of input;
Hardware requirements and utilization; and
Data correction tasks.
A detailed structure of work tasks, data preparation and sequence of the creation process must
be prepared together with the correct clerical procedures to control the creation process. The
routines for checking the scanned data for completeness and accuracy must be established
together with any error correction routines.
A.17.63
If data is to be acquired externally, the purchasing considerations should be addressed and the
data tested to determine:
Security needs (e.g., firewalls if the Internet is being used);
Accuracy and completeness of data and any error correction routines;
Method of receipt;
Input means; and
Hardware requirements and utilization.
A detailed structure and sequence of the creation process tasks must be prepared.
A.17.64
Document all of the data conversion procedures. The documentation of the data conversion
procedures should follow the documentation rules described in the User Documentation Phase,
but it generally need not be as comprehensive as for the new application. However, particular
attention should be paid to documenting the reconciliation and balancing procedures.
A.17.65
Determine training requirements for the data conversion
project team.
Determine if there is a need for additional training materials or training courses to be developed to
educate the data conversion project team members. If required, refer to the Training Phase for
more details.
A.17.66
All parts of the data conversion system should undergo unit, string and system testing in the
same comprehensive manner as in a normal systems implementation. The use of "real" data can
prove highly beneficial in this step.
The testing to be completed must include all of the different methods to be used in the conversion
process.
System testing of program jobs is needed in cases where several data conversion programs work
together to convert data. If each data conversion program works independently, then system
testing may not be necessary.
The entire data conversion process should be tested, including the verification procedures,
following the techniques using the System and Integration Testing Phase. If there are places
where multiple programs must work together (i.e., call one another or work on the conversion of
the same data in a multi-step process), System Test Plans should be developed and tests
executed for the integrated testing of those multiple programs. The system testing of the
conversion process should include a final cycle that tests the entire process and may include one
or more mock conversion executions.
A.17.67
Finalize the conversion schedule and resources and submit
and discuss with management.
The Data Conversion Plan should be updated to reflect any new information uncovered in this
task and formal written approval sought and gained. The scheduling of the execution of the data
conversion should be coordinated with the implementation schedule of the main system.
Submit the updated plan and the task deliverables to management and resolve any questions
and issues. If necessary, prepare an action list of items to modify. This step may require more
than one meeting.
A.17.68
A.17.69
Purpose
To convert data from the current system to the new system format or, if there is no existing
system, create the new data to initialize the DW system.
Overview
The data conversion may need to be completed more than once if there is parallel running or a
phased or staged implementation. This will impact upon the steps in this task.
A.17.70
Prepare for data conversion execution. Before the execution of any conversion process ensure
that:
All required personnel and hardware resources are available;
Existing data files and data entry documents are frozen; and
The conversion process has sole access to the information during execution.
Generally, the database or master files need to be created as input to the system testing process.
To enable early commencement of data creation because of the volume of records or the
extended duration necessary, the data creation of the master files in a skeletal format may begin
well before the system acceptance testing and commencement of live processing. This impact
must be assessed as part of the preparation process.
A.17.71
A.17.72
Create additional data as necessary and execute the data conversion programs and procedures
according to the Data Conversion Plan. The entire data conversion execution may be a multi-step
operation completed over a length of time (i.e., a week may be scheduled for entry of data,
execution of specific programs may be done on different days or execution of a specific program
may have to wait for error-free reconciliation results from previously run programs).
Complete the acceptance testing according to the defined data conversion acceptance criteria.
Prepare documentation and audit trails of the actual conversion process and ensure that the
conversion results are properly checked by the appropriate resource (e.g., internal audit).
The creation or storage of the appropriate historical data is included as part of this conversion
process.
The steps will also vary if there is parallel running or a staged or phased implementation.
A.17.73
Reconcile the results of the conversion execution with the original source data. Enter details on a
data conversion reconciliation worksheet. Ensure that the reconciliation process is clearly
documented and supporting information for each reconciliation item is prepared and maintained.
As mentioned above, reconciliation of the results of some conversion processes may be
completed before execution of other processes.
Appropriate formal written approvals must be obtained that the reconciliation has been properly
completed, particularly if the conversion demands that some write-offs or other adjustments occur
that impact upon an organization's financial results or balance sheet.
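A data conversion reconciliation worksheet of the kind described above can be sketched as a comparison of record counts and control totals between the source data and the converted data. The amount field and the worksheet layout are illustrative assumptions.

```python
# Illustrative reconciliation worksheet: compare record counts and control
# totals of the source data against the converted data. Field names are
# assumptions for the sketch.

def reconcile(source_rows, converted_rows, amount_field="amount"):
    worksheet = {
        "source_count": len(source_rows),
        "converted_count": len(converted_rows),
        "source_total": sum(r[amount_field] for r in source_rows),
        "converted_total": sum(r[amount_field] for r in converted_rows),
    }
    worksheet["count_difference"] = (worksheet["converted_count"]
                                     - worksheet["source_count"])
    # Any difference (e.g. an agreed write-off) must be documented and approved.
    worksheet["total_difference"] = round(
        worksheet["converted_total"] - worksheet["source_total"], 2)
    worksheet["balanced"] = (worksheet["count_difference"] == 0
                             and worksheet["total_difference"] == 0)
    return worksheet

src = [{"amount": 100.00}, {"amount": 250.50}]
conv = [{"amount": 100.00}, {"amount": 250.00}]
print(reconcile(src, conv))  # the total_difference of -0.50 must be explained
```

Each non-zero difference on the worksheet would carry supporting documentation and, where write-offs are involved, formal approval.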
A.17.74
After the conversion and reconciliation of data has taken place, archive the old system data in
case the system support team subsequently needs to refer to the original data to resolve any
audit requirements and any problem areas. Agree on the length of time with management for
which the data should be retained.
If parallel running is to take place, ensure that the old system data is maintained along with the
new data to allow periodic comparisons between the old and new systems.
If a staged or phased implementation is to occur, progressive reconciliations should be completed
together with an overall reconciliation when all areas are implemented.
A.17.75
Sign-off on converted data and file conversion reconciliation
worksheets.
Obtain formal written agreement and sign-off from the appropriate authority that all the
appropriate data has been successfully converted and reconciled and that the data is in the
correct state to commence use of the new system.
Maintain copies of the signed-off and authorized balancing procedures, documents and
supporting detail and ensure that these working papers for the actual system balancing are
archived in a secure location, as they may need to be referred to at a later date for a variety of
reasons such as the annual audit.
[Process flow diagram: Data Transformation System Design]
Inputs: Installed BW Extraction Programs; Data Design
R1 Prepare Source Systems
R2 Change/Update Meta Data
R3 Create InfoCubes
R4 Maintain Communication Structure
R5 Maintain Transfer Rules
R6 Create/Maintain Update Rules
R7 Schedule InfoPackages
Deliverables include: Prepared source systems; Maintained Data Source Meta Data; InfoObject Characteristics; InfoObject Key Figures; InfoCubes; Aggregates; InfoSource Communication Structure; Transfer rules; Transfer structure; Persistent Staging Area (PSA); Data sources; Update rules; InfoPackages/Groups
Review the document "OLTP Configuration Steps" for technical and functional requirements
Identify which standard InfoSources need to be configured for your implementation
Complete the necessary configuration steps for data migration
Tools
Use table ROIS to view the standard InfoSources
Programs RMCSBIWC and RKEBIW01
Once the configuration steps are executed and the Meta Data has been refreshed, the self-defined
information structure will appear as an InfoSource to the BW server. To migrate data, the
InfoSource's communication structure and transfer rules should be maintained.
Once the Meta Data refresh is executed, the Operating Concern data will appear as an
InfoSource in the BW.
Various formats, delivered as ASCII flat files, are available for external systems. The scheduler
uses these formats when an external data request is made. In the global InfoSource it is
important to determine the data type (Meta Data or data).
The Business Warehouse can process the following formats:
Fixed data record length
Variable data record length
CSV format (comma-separated values, an MS Excel format)
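Reading the two most common of these formats can be sketched as follows; the field positions, names and sample records are illustrative assumptions, not a BW file layout.

```python
# Sketch of reading two external-file formats BW accepts: fixed data record
# length and CSV. Field positions and names are invented for the example.
import csv, io

# Fixed record length: each field occupies a fixed column range.
FIXED_LAYOUT = [("material", 0, 8), ("plant", 8, 12), ("quantity", 12, 18)]

def parse_fixed(line):
    return {name: line[start:end].strip() for name, start, end in FIXED_LAYOUT}

# CSV (comma-separated values), e.g. a file saved from MS Excel.
def parse_csv(text):
    return list(csv.DictReader(io.StringIO(text)))

print(parse_fixed("MAT00001PL01000042"))
print(parse_csv("material,plant,quantity\nMAT00001,PL01,42\n"))
```

For variable record length the same idea applies, except that each record carries its own field delimiters rather than fixed column positions.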
A.18.9
Create Characteristics
Purpose
Characteristics are the BW's dimension elements. Characteristics can be updated via the meta
data refresh or can be maintained manually in the InfoObject catalog. Example characteristics are
InfoObjects like customer, material, company code, and cost center.
When a meta data refresh is executed, characteristics are created automatically for each
InfoObject contained in the BW's InfoSources. Often additional characteristics are needed to
meet reporting requirements. These types of characteristics are often derived using data
transformation techniques.
Procedure
A.18.10
Purpose
Key figures are the BW's fact table elements. Key figures are often referred to as "facts". Key
figures can be updated via the meta data refresh or can be maintained manually in the
InfoObject catalog. Key figures are quantities and values, for example invoiced sales, cost, or
discounts.
When a meta data refresh is executed, key figures are created automatically for each
InfoObject contained in the BW's InfoSources. Often additional key figures are needed to
meet reporting requirements. These types of key figures are often derived using data
transformation techniques.
Procedure
A.18.11
Create InfoObjects
Purpose
InfoObjects in the BW can be updated via the meta data refresh or created manually. The
purpose of this task is to identify additional InfoObjects needed in the BW that are not updated via
the meta data refresh. These types of InfoObjects are often derived or updated from an external
file.
Procedure
A.18.12
Create InfoCubes
Purpose
The purpose of this activity is to create the multi-dimensional data containers that are based on
the result of a multidimensional data model (star schema) to be used for queries and evaluations.
Prerequisites
The Star Schema (data model) that represents the end users' query and evaluation
requirements.
Result
A number of relational tables connected by SID tables will be created. This includes a fact
table that connects up to 16 dimension tables via dimension keys. The system requires a
minimum of a Time, a Unit and a default Data Packet dimension table, plus one or more
dimension tables that consist of the characteristics used for queries and evaluations.
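The resulting structures can be sketched relationally; the following uses SQLite purely for illustration, with invented table and column names (not the names BW generates), and omits the SID tables for brevity.

```python
# Minimal relational sketch of a star schema: a fact table linked to
# dimension tables via dimension keys. All names are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_time     (dim_id INTEGER PRIMARY KEY, fiscal_period TEXT);
CREATE TABLE dim_unit     (dim_id INTEGER PRIMARY KEY, currency TEXT);
CREATE TABLE dim_customer (dim_id INTEGER PRIMARY KEY, customer TEXT, region TEXT);
CREATE TABLE fact_sales (
    time_id     INTEGER REFERENCES dim_time(dim_id),
    unit_id     INTEGER REFERENCES dim_unit(dim_id),
    customer_id INTEGER REFERENCES dim_customer(dim_id),
    revenue     REAL
);
INSERT INTO dim_time     VALUES (1, '2002/01');
INSERT INTO dim_unit     VALUES (1, 'USD');
INSERT INTO dim_customer VALUES (1, 'C100', 'NORTH');
INSERT INTO fact_sales   VALUES (1, 1, 1, 5000.0);
""")
# A query joins the fact table with a dimension, much as an OLAP query would.
row = con.execute("""
    SELECT d.region, SUM(f.revenue)
    FROM fact_sales f JOIN dim_customer d ON f.customer_id = d.dim_id
    GROUP BY d.region
""").fetchone()
print(row)  # ('NORTH', 5000.0)
```

In BW itself these tables are generated automatically when the InfoCube is activated; the sketch only shows the shape of what is generated.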
A.18.13
Select Characteristics
Purpose
The purpose of this activity is to select and assign characteristics to a dimension table according
to the predefined star schema in creating an InfoCube.
Procedure
A.18.14
Purpose
The purpose of this activity is to select Key Figures and assign them to the Fact Table according
to a predefined star schema.
Procedure
Review Key Figures defined in Fact Table of the star schema (data model)
Verify the Key Figures are defined in the communication structures of the target InfoSources
Refer to "Maintain InfoCubes" section of BW Online documentation for detailed processes
A.18.15
Purpose
The purpose of this activity is to select and assign Time Dimension characteristics and to ensure
that the system-assigned Unit and Data Packet Dimensions are correctly defined. These are
mandatory fixed dimension tables of an InfoCube.
Review the star schema (data model) time characteristics for time dimension
Refer to the online documentation for menu path to define time dimension
Result
A.18.16
Define InfoObjects
Purpose
To define the Characteristics, Key Figures and Unit characteristics that are used in creating an
InfoCube.
Procedure
Click "InfoObject" Icon from the menu tool bar or choose "Edit InfoObject" drop down menu
Select type of InfoObject: Characteristics, Key Figure, Unit or Time Characteristics.
Note: Customer-defined Time Characteristics are not a currently supported function.
A.18.17
Purpose
The communication structure resides in the Business Warehouse and represents the structure
of the InfoSource. It contains all of the InfoObjects belonging to the InfoSource of the Business
Warehouse.
The communication structure is filled from the transfer structure, either directly, with fixed values
or by a local conversion routine; a routine always refers to just one InfoObject of the transfer
structure. The set of InfoObjects in the communication structure can differ from that in the
transfer structure. The intention is that the transfer structure brings in all the
OLTP-defined fields, while the communication structure definition controls which fields will
update the InfoCube.
The InfoSource Communication Structure should define the integrated Subject Area view of data
being brought into BW.
InfoObjects available for defining the Communication Structure are initially provided by the fields
of attached Data Sources, but can be manually added as well.
Trigger
Before you can maintain the communication structure for an InfoSource, the source
system must be assigned. This is done when the meta data refresh occurs for R/3.
Input
Determine if additional InfoObjects are needed.
Determine if the additional InfoObjects will be derived.
Create the communication structure using the BW's Administration Workbench.
A.18.18
Purpose
The transfer structure can be found in the source system and is mirrored in the Business
Warehouse. It is the structure, in which data is transported from the source system into the
Business Warehouse. When the transfer structure arrives in the BW the data is in its "raw" format.
Transfer Rules can then be maintained which define how the data should be transformed or
staged. These transfer rules either directly map the InfoObjects in the transfer structure to the
InfoObjects in the communication structure or provide additional rules for data transformation.
Prerequisites
Before you can maintain the transfer structure you must assign a source system to the
InfoSource and have created a communication structure. This is done when the Meta Data
refresh is executed.
Determine how the data should be transformed.
Document the business rules behind the data transformation.
A transfer structure always refers to one source system and one communication structure. It is
the "Translation" of the extract structure into the Business Warehouse. For each InfoSource,
whether it is SAP delivered or custom, transfer rules need to be generated.
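The three kinds of transfer rule (direct mapping, fixed value, conversion routine) can be sketched conceptually as follows. In BW these rules are maintained in the Administrator Workbench, not written as code, and all field names and routine logic here are illustrative assumptions.

```python
# Conceptual sketch of transfer rules: each communication-structure InfoObject
# is filled by a direct mapping, a fixed value, or a routine applied to one
# transfer-structure field. All names and rules are invented for the sketch.

def routine_normalize_material(value):
    # Example routine: pad the raw OLTP material number to a uniform length.
    return value.strip().zfill(10)

TRANSFER_RULES = {
    "0MATERIAL": ("routine", "MATNR", routine_normalize_material),
    "0PLANT":    ("direct",  "WERKS", None),
    "0RECTYPE":  ("fixed",   None,    "A"),  # fixed value
}

def apply_transfer_rules(raw_record):
    out = {}
    for target, (kind, source_field, arg) in TRANSFER_RULES.items():
        if kind == "direct":
            out[target] = raw_record[source_field]
        elif kind == "routine":
            out[target] = arg(raw_record[source_field])
        else:  # fixed value
            out[target] = arg
    return out

print(apply_transfer_rules({"MATNR": "4711 ", "WERKS": "0001"}))
```

The point of the sketch is the shape of the mapping: raw extract fields in, communication-structure InfoObjects out, with every rule anchored to at most one source field.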
A.18.19
Purpose
The Staging Engine is employed to implement data mapping and transformation. Transactional
data, master data and hierarchies all can be "staged" within the BW's staging engine.
When staging master data it is important to implement a strategy that clearly outlines the
business rules utilized when the data is being transformed. This is especially important if the
master data is fed from multiple systems.
Procedure
A.18.20
Define InfoSources
Purpose
The BW recognizes the business role played by data and applies this knowledge to data
extraction, to mapping and to the structures used to store the data in the warehouse. Since
multiple applications will provide data to the BW it is important to define the data content for each
InfoSource that will be utilized.
A.18.21
Purpose
In addition to the extraction of transactional data, the BW can extract master data attributes from
a source system. These attributes will provide the BW with additional navigation attributes when
executing queries. For example, if the InfoObject for customer is used within a query it is possible
to summarize data by the customer attribute region.
It is also important to define how often the master data extraction should be conducted. Master
data attributes are often changed in the source system and capturing these changes in the BW is
essential.
Procedure
A.18.22
Purpose
After the data sources have been defined it is important to identify which transfer technique will
be utilized. The different techniques are:
Flat File Upload
Third-Party
BAPI Interfaces
R/3 ALE & TRFC APIs
Steps
A.18.23
Purpose
After the flow of the data has been configured, it is important to test the results of the extractions.
This can be done after the data sources have been defined, the InfoSources are configured and
the update to the InfoCube is created.
After creating the data extracts ensure that the data values are correct.
A.18.24
Purpose
BW ensures the consistency of the load process. The transmission of the extract data is
performed in an asynchronous fashion, giving the administrator the flexibility to run the tasks for
building the extract, transferring it to BW and updating the InfoCubes at the same time or
independently.
Once the source and target structures and mapping rules are set up, the administrator uses the
Scheduler, the Data Load Monitor and the Data Access Monitor to control and supervise the
ongoing operation of BW. The Scheduler maintains extract and load jobs, which typically run at
recurrent time intervals (e.g. once daily).
The frequency of the data extracts for transactional data, master data and hierarchies should
coincide with the business's need for data. In addition, it is important to define what data should
be extracted from the source. For example you may want to extract data for Company Code 0001
and Company Code 0002 at different times even though the same data container is being
updated.
Procedure
For each data container identify how often the data should be updated
Identify the InfoSources for each data container
Identify the source systems for each of the InfoSources
Identify the content of the data extract
Schedule the data extraction for each of the source systems
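The procedure above amounts to a scheduling worksheet: one extract request per data container, frequency and selection. The sketch below illustrates the Company Code 0001/0002 example from the text; all names and values are invented.

```python
# Illustrative scheduling worksheet: the same data container (InfoCube) is
# updated by two extract requests with different selections and frequencies.
# All object names are assumptions, not real BW objects.

SCHEDULE = [
    {"infocube": "ZSALES_CUBE", "infosource": "ZSALES_IS",
     "source_system": "R3_PROD", "frequency": "daily",
     "selection": {"COMP_CODE": "0001"}},
    {"infocube": "ZSALES_CUBE", "infosource": "ZSALES_IS",
     "source_system": "R3_PROD", "frequency": "weekly",
     "selection": {"COMP_CODE": "0002"}},
]

def extracts_due(frequency):
    """Return the extract requests that run at the given frequency."""
    return [job for job in SCHEDULE if job["frequency"] == frequency]

for job in extracts_due("daily"):
    print(job["infosource"], job["selection"])
```

In BW the actual scheduling is done through InfoPackages in the Scheduler; the worksheet simply captures the decisions that feed it.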
A.18.25
The Staging Engine is employed to implement data mapping and transformation. Transactional
data, master data and hierarchies all can be "staged" within the BW's staging engine.
When staging transactional data it is important to implement a strategy that clearly outlines the
business rules utilized when the data is being transformed. This is especially important if the
transactional data is fed from multiple systems.
Transactional data can be transformed in the following ways:
Clean/scrub the data
Derive additional business statistics for key figures
Derive additional characteristics for dimension elements
Derive characteristics based on transactional data values
Procedure
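The staging transformations listed above can be sketched on a single transactional record: scrub a value, derive a characteristic from a transactional value, and derive an additional key figure. Thresholds and field names are illustrative assumptions.

```python
# Sketch of staging transformations for transactional data. The field names
# and the order-size threshold are invented for the example.

def stage_transaction(rec):
    staged = dict(rec)
    # Clean/scrub: normalize the unit of measure.
    staged["unit"] = staged["unit"].strip().upper()
    # Derive a characteristic from a transactional data value.
    qty = staged["quantity"]
    staged["order_size"] = "LARGE" if qty >= 100 else "SMALL"
    # Derive an additional key figure (business statistic).
    staged["net_value"] = round(staged["quantity"] * staged["price"], 2)
    return staged

print(stage_transaction({"quantity": 120, "price": 9.99, "unit": " ea "}))
```

Each derivation in the sketch corresponds to a documented business rule, which is what makes multi-system feeds reconcilable later.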
A.18.26
Purpose
The purpose of this activity is to define the data mapping, data condensing and data
transformation logic from InfoSources to an InfoCube.
Prerequisites
Input
Result
Tips
Successful maintenance of the communication structures of the target InfoSources must be
completed prior to creating InfoCube update rules.
A.18.27
Purpose
The purpose of this activity is to define the mapping rules from InfoSources to InfoCube for key
figures, characteristics and the time characteristics.
Procedure
System-generated or customized ABAP modules/routines will be created based on the update
rules that you specified.
Meta data upload to the InfoCube is ready to proceed.
A.18.28
Purpose
The purpose of this activity is to develop more complicated update rules via the creation of an
ABAP routine.
Procedure
Result
ABAP Routines to create customized update rules using SAP update-routine template will be
created.
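In BW such routines are written in ABAP within the SAP update-routine template; as a conceptual stand-in, the sketch below shows the idea of computing a key figure for the InfoCube from communication-structure fields. The field names and the margin formula are invented for the illustration.

```python
# Conceptual stand-in for an update-rule routine: compute a target key figure
# from communication-structure fields. Names and formula are assumptions.

def update_routine_margin(comm_record):
    """Update rule for a derived key figure: margin = revenue - discounts - cost."""
    return comm_record["revenue"] - comm_record["discounts"] - comm_record["cost"]

record = {"revenue": 1000.0, "discounts": 50.0, "cost": 700.0}
print(update_routine_margin(record))  # 250.0
```

The routine runs once per record during the update to the InfoCube, after the transfer rules have produced the communication structure.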
A.18.29
Schedule InfoPackages
Purpose
The purpose of this activity is to create InfoPackages to request data (transaction data, master
data, attributes or hierarchies) from a source system according to the selection parameters.
Prerequisites
Input
Result
A new InfoPackage or InfoPackage variant will be created and scheduled, either immediately
or through batch process.
IDOC will be created.
Outcome
InfoCubes connected to the InfoSources and the defined Source systems will be updated
according to the selection parameters.
A.18.30
Select InfoCubes
Purpose
The purpose of this activity is to create or maintain an InfoPackage that prepares for the data
transport request from an R/3 source system.
Procedure
Review the Data Flow document for the involved source system and InfoSources
Determine the selection parameters for each InfoObject that data upload is requested for
The data upload will only process data based on the criteria you specified.
If you are requesting data upload for a hierarchy, then determine the target hierarchy within the
InfoSource
Determine if all InfoCubes relevant to this InfoSource should be updated or only the selected
InfoCubes
Determine if the presence of Master data element is essential prior to transaction data upload
Refer to the BW online documentation on Maintain InfoPackages for step-by-step
procedures
Result
InfoPackage or Variant is defined and ready to be scheduled for data upload
A.18.31
Purpose
The purpose of this activity is to specifically address the InfoPackage definition process for data
sources from a non-R/3 system.
Procedure
Review High-level Transformation Design document for source system and InfoSources
involved
Obtain the source system file name and location
Determine the data separator format for CSV (Excel file) by opening the CSV file with
Notepad
Determine if control file is present for other external data files
Determine if all InfoCubes relevant to this InfoSource will be updated or only selectively
updated
If Master upload is involved determine whether the presence of Master data is essential
Refer to the BW online documentation on Maintain InfoPackages for step-by-step instructions
on maintaining InfoPackages
Result
InfoPackage or InfoPackage variant will be created and ready to schedule for data upload
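The "determine the data separator" step from the procedure above can also be automated instead of inspecting the file by hand: Python's `csv.Sniffer` deduces the delimiter from a sample of the file. The sample content is an illustrative assumption.

```python
# Detect the data separator of an external CSV file from a small sample,
# instead of opening it in Notepad. The sample line is invented.
import csv

sample = "MATERIAL;PLANT;QUANTITY\n4711;0001;42\n"
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t")
print(dialect.delimiter)  # ';'
```

The detected dialect can then be fed straight into `csv.reader` when validating the file before the upload is scheduled.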
Tools
A.18.32
Purpose
The purpose of this activity is to schedule a pre-defined InfoPackage for immediate or periodic
batch run to populate the data contents of the impacted InfoCubes.
Procedure
Result
A.18.33
Purpose
The purpose of this task is to develop batch transformation rules and routines, via the BW
workbench tool using the formula editor or ABAP routine and the InfoPackage batch Job
scheduling functions. Make sure to review all of the existing batch interfaces in the R/3 system,
before you start development. In addition, you should make sure documentation has been
created for all of your development activities. During development, you should consider Job
processing performance issues. Be aware that the process of creating and later testing of the
interface programs will very likely consume a significant amount of time.
In addition, it is important to consider what sequence the batch InfoPackages and interface
programs will be running.
Procedure
Of the two basic methods of data transformation and transfer with R/3 and non-R/3
systems, namely file access and program-to-program communication, both are suited to batch
interfaces and the transformation rules discussed in this section. However, because of the
asynchronous nature of data processing in batch interfaces and the development effort needed
for program-to-program communication, an online data transfer is probably not the first choice
for this kind of transformation rule interface.
Batch interfaces should be integrated into R/3 on the application server level NOT via front-ends.
An integration on the database level could also be considered, but is highly dependent on the
database system and the techniques provided by the database vendor (e.g. Embedded SQL,
ODBC, etc.). Thus the focus in this task will be on SAP-adopted techniques.
The R/3 and BW systems offer different methods for transformation and transfer of data into the
BW System from other SAP Systems and non-SAP Systems in a batch Job processing mode.
These methods are "Batch Input", "Call Transaction", "Direct Input", BAPI and IDOC interfaces:
Batch Input (BI)
Call Transaction (CT)
Direct Input (DI)
BAPI
IDOC
BAPIs are normally used for online interfaces but could be used also for batch Job processes.
IDOC interfaces can be configured to send and receive IDOCs in bulks, i.e. collecting IDOCs
before sending and processing, thus forming a batch interface.
Steps
Call Transaction
In the CT processing method, a program uses the ABAP CALL TRANSACTION USING
statement to post the transformed and transferred data. Batch-input data does not have to be
deposited in a Job session for later processing. Instead, the entire batch-input process takes
place inline, while the calling program is running.
Direct Input
This technique is an enhancement of the batch-input procedures. In contrast to batch input,
this technique does not create Job sessions nor does it process screens. It stores the data
read from a flat file directly into the SAP application tables. To do this the SAP system calls a
number of function modules provided by the applications out of several reports. These
function modules are carrying out the necessary business transformation rule checks. In case
of errors DI provides a restart mechanism. However, to be able to activate the restart
mechanism, DI programs must be executed in the background.
Result
Batch interface InfoPackages, Transformation BW Programs and Procedures
A.18.34
Purpose
The purpose of this task is to set up procedures for testing the InfoPackage Jobs and InfoSource
data extraction programs (i.e., R/3 system extraction programs or Non-R/3 extraction programs
from ETI or Informatica, etc.). User departments must be involved in testing to ensure data
integrity, accuracy, and completeness. Use a separate client in your QA environment for testing
your data extraction Jobs and programs.
Procedure
To perform test runs on a select number of data records in the R/3 and/or Legacy Source
Systems to allow for testing of all data values
The test plan must include a description of the test processes and of the data consistency
checks in the R/3 System and Legacy Source System
After data extraction, transformation and transfer process have completed, you should test all
related business processes, and check BW master data values for accuracy
The test plan must ensure that data is correct and imported into the BW System
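The data-consistency checks named in the test plan can be sketched as a comparison of what the source system holds against what arrived in BW. Field names and sample data are illustrative assumptions.

```python
# Sketch of a data-consistency check between source system and BW after an
# extraction test run. The key field and records are invented.

def consistency_report(source_rows, bw_rows, key="customer"):
    source_keys = {r[key] for r in source_rows}
    bw_keys = {r[key] for r in bw_rows}
    return {
        "missing_in_bw": sorted(source_keys - bw_keys),     # completeness
        "unexpected_in_bw": sorted(bw_keys - source_keys),  # accuracy
        "counts_match": len(source_rows) == len(bw_rows),
    }

src = [{"customer": "C100"}, {"customer": "C200"}]
bw = [{"customer": "C100"}]
print(consistency_report(src, bw))
```

A non-empty `missing_in_bw` list would become a test problem report for the extraction job, to be resolved before the load is accepted.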
A.18.35
Purpose
The purpose of this task is to develop data transfer procedures, which may involve exporting the
transport settings in the R/3 Repository and QA environment to the production environment.
Successful import of customizing settings and R/3 Repository objects is a prerequisite of the data
transformation and transfer process. The R/3 and BW systems provide tools for data
transportation.
Another purpose of this task is to identify all data transfer requirements, and to develop a
conceptual design. Data from existing legacy systems may also be required to be transferred into
the BW system, and you need to identify both manual and automated transfer procedures.
Determine if BW InfoPackage Job requires InfoSource data to be transferred from R/3 system
or Non-R/3 Legacy system
This data will eventually be loaded to a corresponding BW InfoCube.
A.18.36
Purpose
The purpose of this activity is to monitor and review the result of the InfoPackage execution, take
corrective action based on the status and information provided by the log file.
Prerequisites
Input
[Process flow diagram: Authorization Design / Presentation System Design]
S1 Implement Authorization Concept -> Profiles; Authorization Objects; Updated User Master
S2 Validate Authorization Concept -> Validated authorization profiles and objects
S3 Construct Business Explorer Presentation System -> Query; Workbook; Structure; Restricted Key Figures; Calculated Key Figures; Variables
Provide basic information to identify the activity group. This task is mandatory.
Select the reports and transactions from the R/3 Company Menu to include in the activity
group. This task is not mandatory, but is recommended (95% of the time).
Choose the tasks to include in the activity group. This task is not mandatory, but is
recommended if you use SAP Workflow.
In conjunction with application area representatives, perform security unit testing for each
sample user created. The enterprise-wide Job Role Matrix can be the basis for this test plan.
Each Role (activity group and generated authorization profiles) must be tested before you go live.
Test Roles in user training. If a user authorization is missing, your training schedule can be
delayed.
Work with people responsible for each application area to create a list of users in each
department.
Group employees of the same department into user groups.
Create a list (spreadsheet) of job descriptions, and related activity groups and profiles for
each user.
A.19.10
Purpose
The purpose of this task is for users to perform testing to ensure they can perform the necessary
activities and transactions in the system with the security concept fully in place.
A.19.11
Purpose
The purpose of this task is to refine user authorizations. Refine user authorizations, for example,
when new employees join the organization, employees change job responsibilities, and so on. Do
this task with the task of validating user masters for job functions.
A.19.12
Create Reports
The purpose of this task is to identify and develop company specific or user favorite reports.
Activities
A.19.13
Purpose
The purpose of this activity is to create the base line Business Explorer objects for query analysis
and evaluation.
This involves creation of the following components:
Query
Workbook
Structures
Restricted Key Figures
Calculated Key Figures
Variables
A.19.14
Define Company, Channel, and User (Enterprise InfoCatalog,
User Channel InfoCatalog Levels)
Purpose
The purpose of this activity is to provide an organized structure to group query workbooks for
display and management, which serves as a framework for authorization to subscribe to and
publish reports.
Prerequisites
Input
Result
Tips
The enterprise and user channel InfoCatalogs offer great flexibility in how a company may
organize how queries are accessed and by whom. A well-structured set of InfoCatalogs makes
query distribution and authorization easier. It is worth investing time up front to determine a
strategy.
Typically, the Channel InfoCatalog reflects the roles and functions of the query users.
A.19.15
Purpose
The purpose of this activity is to organize the pre-defined queries (reports) into Channel
InfoCatalog and assign users to subscribe to the appropriate Channel or Sub-Channels.
Prerequisites
Upon completion of Enterprise and Channel InfoCatalog structure definition that is specifically
designed for your company.
Input
Result
A.19.16
Purpose
The purpose of this task is to configure the BW Business Explorer browser defaults.
The BW Business Explorer is the OLAP front-end PC software component, which allows end
users to view and analyze decision support data within the BW warehouse.
Prerequisites
Purchase of BW Software
Defined Data Access Paths by User Group and Organization
Result
BW Business Explorer Browser default settings configured for each desktop identified (i.e.,
End-User, Project Team Members).
A.19.17
Assign Authorizations
Purpose
The purpose of this activity is to assign user authorizations for working with queries.
Input
Result
The appropriate user profile will be assigned to each user id of the BW system.
A.19.18
Purpose
The purpose of this activity is to execute the query from the Business Explorer Analyzer and
Browser to ensure the delivery of expected results.
Prerequisites
Input
Result
Query display and navigation with slice and dice as well as all other functionality.
[Process flow diagram: System Test]
Tasks:
T1 Prepare System Test Strategy
T2 Develop System Test Plan
T3 Conduct Structured Walk-through of System Test Plan
T4 Prepare System Test Environment(s)
T5 Execute System Test
T6 Execute Integration Test
T7 Execute High-Volume Testing
T8 Conduct System Tuning
Deliverables include: new or updated System Test Strategy; System Test Work Plan; System Test team responsibilities; Critical Success Factors; Performance Measures; Business scenarios; Acceptance test plan; Management Overview Diagram; High-level architecture diagram; system test objectives; system test data specifications; system test plan; System Test Environment; integration test plan and data; high-volume test strategy, plan and data; CSFs/Performance Measures; acceptance criteria; system tuning approach; revised System Test Plan and Work Plan; updated system and user documents; System Change Requests; complete system test plans; test problem reports and report log; updated system test plans; ongoing system tuning process
Security testing - to demonstrate the system's ability to prevent or detect and isolate
improper access and unplanned activity.
Depending upon the project scope, the system testing may also include the testing of disaster
recovery procedures to determine the system's ability to switch over to a parallel machine or to
execute a recovery after hardware failure, as per the contingency plan. Generally, this is
completed as a separate sub-project by a separate Disaster Contingency Planning project team.
Additional testing may be completed at the system level to address:
Integration testing - to test that business events and their responses can be traced and
reconciled across application boundaries and across technology platforms, from their
initiation into the business until their exit from the business; and
High-volume testing - to test that the system meets the required level of performance with
respect to such items as throughput, delay or simultaneous transactions in a simulated
production environment.
The need for both integration testing and high-volume testing as well as a recommended
approach to executing the tests should be included in the System Test Strategy.
A.20.10
Ensure that all members of the System Test Team have received the orientation materials and
have attended the prerequisite training and that roles and responsibilities are clearly defined.
A.20.11
Purpose
To provide instructions for the execution of system tests, to define the scope of the tests and to
ensure that all required testing will be completed.
Overview
A System Test Plan is created for a comprehensive test of the developed system. Individual test
conditions are developed to satisfy each test objective as specified in the System Test Strategy.
Test steps and associated test data sets are put together and defined as a system. Before
execution, the System Test Plans are reviewed and approved.
A.20.12
Each objective encompasses the processing activities within one or more program specifications
that are invoked to execute a system process. The objectives must also include the interfaces to
other automated systems, because some projects have a large number of these interfaces to
test, each with its own complexity.
For each test objective, define the test conditions that should be tested. When all of an objective's
test conditions are executed properly, the test objective should be satisfied. The types of test
conditions that should be considered include:
Valid data conditions;
Invalid data conditions;
Pathing logic determination conditions;
Exception path execution conditions;
Output conditions;
Control total accumulation conditions;
Program interface conditions;
System interfaces;
System, power and communications failures; and
Back-up or alternate configurations and systems.
A.20.13
Identify all instances of the systems interface to other systems within its environment. Sources for
identifying external interfaces include:
The Management Overview Diagram (defined in the Process Abstract), the Source
System Abstract and the Data Conversion Abstract indicate the scope of the system and
show the system's boundaries and interactions with other systems;
The intersection of entities used by individual systems data models when overlaid on the
enterprise data model;
The external call sequences documented in the program design specifications; and
The local and remote networks documented in the system distribution strategy.
External interfaces are those instances where the system:
Receives data from another system (e.g., source data systems);
Sends data to another system (e.g., BW); or
Uses shared resources (e.g., network).
A.20.14
Develop a set of test cases that will execute all of the system's functionality. Test cases should
include conditions such as:
Errors in passed data as a result of residues (e.g., uncleared buffers or registers), file
incompatibility, connection failures, invalid timing and sequencing;
Interface errors as a result of passing corrupt or incorrect data with insufficient tolerance
to receipt of bad data and/or insufficient checks on sent data;
Access and security controls;
Data corruption, destroyed values and residues remaining within a common database as
a result of interface via the database;
Unacceptable levels of performance degradation of any system using shared resources
as a result of resource sharing;
Fall-back procedures, such as the use of magnetic media in place of communications, or
dial-up communications lines that typically run at a slower speed than leased lines;
Recovery of data corrupted during transfer or as a result of a system interface;
Recovery of transferred data and re-establishing connections after system failure; and
Re-run of interfaces caused by other system failures.
A.20.15
For each test condition, define sets of data required to execute the test conditions. Identify:
Data record(s) and/or screen/window(s) to be constructed to test the condition;
Job(s) (e.g., program processing sequence) and required command language sets;
Key data element value(s); and
Unique sets of data element value(s) that were already defined for unit testing and can be
used again and that include:
ranges below, above and at value limits,
valid or invalid values,
invalid types,
invalid value formats, and
invalid combinations of values.
Where appropriate, multiple test data sets should be defined for each test condition. All processes
need to be tested using "cascades" of test data sets containing at least three transactions
to determine the integrity of program pathing logic; e.g., testing for invalid data entry
should include test data sets for different invalid values and for various combinations of invalid
data.
Cycles of test data should be created so that the tests replicate normal use of the system or
logical flows, e.g., master file data can be created, manipulated and deleted in successive test
jobs.
Test data generation tools may also be used in this step to prepare some of the test data set
information.
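The value-limit and invalid-value guidance above can be sketched as a small test data generator. This is an illustrative sketch only, not part of the methodology; the function name and category labels are assumptions.

```python
def boundary_test_values(lower, upper):
    """Generate test values at, below and above the value limits, plus
    representative invalid entries, for one numeric data element."""
    valid = [lower, upper, (lower + upper) // 2]   # at the limits and mid-range
    out_of_range = [lower - 1, upper + 1]          # just below and above the limits
    invalid_types = ["ABC", None]                  # wrong type entirely
    invalid_formats = ["12,34", " 7 "]             # value present, format wrong
    return {
        "valid": valid,
        "invalid_range": out_of_range,
        "invalid_type": invalid_types,
        "invalid_format": invalid_formats,
    }

# Example: an amount field limited to the range 0..9999
data_sets = boundary_test_values(0, 9999)
```

In practice such generated values would be combined into the multiple test data sets per condition described above.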
A.20.16
Specify the expected status and format of the data, screens/windows and/or reports for every
major automated processing point during the execution of programs for a test condition. For
instance, a test of the interface between two programs should include expected results that show:
The status and format of any passed data just before program control transfer by the first
program;
Expected results as determined by the specifications that have been developed during
the phases preceding testing;
The status and format of any passed data upon transfer of program control to the second
program;
Any reports created (error, activity listings or control).
A.20.17
For each test condition, define the test steps necessary to execute the test condition and verify
the expected results. These test steps become a script for each test condition of the manual test
steps required to execute the program(s) and process the test data set.
The test steps should be defined by referencing the detailed sections of the user procedures
required to execute the test condition and should include (in the appropriate sequence) any test
steps that are specific to the testing (e.g., printing of test data sets for verification, invocation of an
on-line debugger to verify status). The test steps should include:
Input data preparation and entry instructions;
Input balancing procedures;
Expected results verification instructions;
Output selection and verification instructions;
Verification enabling instructions (e.g., setting interest break-points for CICS programs);
and
Output data balancing/reconciliation and report balancing/reconciliation procedures.
Ensure that a final set of conditions are defined with all break-points and other temporary items
removed from the application.
A.20.18
Each test cycle is a consecutive set of ordered test conditions and associated test data sets that
will test a logical and complete portion of the system. The test cycles are defined such that
successive test cycles are built by combining the test condition strings of previous test cycles.
By building subsequent test cycles from previous test cycles, each successive test cycle:
Tests a larger or more complex portion of the system;
Allows data processed in one cycle to be input for further processing, e.g., master file
data creation, modification and reporting; and
Creates a discrete unit of work (i.e., inclusion of test conditions from early test cycles into
later test cycles will obviate the need to re-execute the complete test cycles when
problems arise in the later test cycle).
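The cumulative build-up of test cycles described above can be sketched as follows; the condition names are hypothetical.

```python
def build_cumulative_cycles(condition_strings):
    """Combine the condition strings of previous cycles into each successive
    cycle, so later cycles re-execute all earlier conditions."""
    cycles = []
    accumulated = []
    for string in condition_strings:
        accumulated = accumulated + string  # previous cycles plus the new string
        cycles.append(list(accumulated))
    return cycles

# Three condition strings: master data creation, modification and reporting
cycles = build_cumulative_cycles([
    ["create_master"],
    ["modify_master"],
    ["report_master"],
])
# The final cycle contains all three conditions, so data processed in one
# cycle is available as input to the next.
```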
A.20.19
Identify the number of test cycles required for each major process in the system.
Identify the test conditions and associated test data sets that will be used for each test cycle.
Define the order in which the test conditions will be executed. The sequence of test conditions
within a string should follow a logical progression. When sequencing test condition strings,
consider:
Valid data is the simplest test condition and should be entered first;
The complexity of test conditions (e.g., less complex test conditions should go before
more complex test conditions);
Documented procedural flow of work;
Logical processing order (e.g., changes or additions executed after deletions);
Reusability of test data sets such that the successful completion of one string will
properly initialize the database for a subsequent string;
Criticality of functions (e.g., the more critical functions and the more frequently used
functions need to be tested more thoroughly), including fall back procedures, especially
for time critical or operational dependent procedures for a business; and
Association of functions (e.g., for an interface function, it is not necessary to include
conditions that do not affect the transferred data in either of the interfaced processes).
A.20.20
Identify the test cycles required to test compliance with the appropriate Access Control Matrices.
Identify the test cycles required to test compliance with all of the defined security controls from
the processing controls and security strategy.
A.20.21
Review the acceptance test plan and confirm that the types of tests included in the Acceptance
Test Plan are also included in a System Test Plan cycle and ensure that the acceptance criteria
can be met.
The acceptance test should be a confidence test for the system users to demonstrate the
effective working of the system. It should not result in Test Problem Reports as any potential
problems should be identified and resolved as part of the System Test.
A.20.22
Purpose
To ensure that the System Test Plan, when executed, will adequately test the system.
Overview
The structured walk-through process helps to ensure that a complete set of conditions and steps
has been defined.
A.20.23
Conduct a structured walk-through of the System Test Plan with the System Test Team.
During the walk-through, the System Test Plan is compared to the business events and Business
Scenarios, not to the source code. A correct System Test Plan is one that addresses all the
defined events and scenarios.
A.20.24
Update the System Test Plan with any additional test conditions and related documentation
identified from the structured walk-through.
Ensure the System Test Plan adequately addresses not only typical business scenarios but also
invalid and atypical scenarios to fully test issues such as:
No records;
System security;
Back-up and recovery;
Contingency planning;
Interfaces;
Performance measures; and
System upgrades.
A.20.25
Discuss the contents of the System Test Plan with the project team, management and the
business users who are part of the System Test Team and obtain formal written approval of the
contents.
A.20.26
Revise the system test work plan following the structured walk-throughs to reflect any changes
caused through:
The number and complexity of test conditions;
Level of effort required in preparation of test data sets;
The complexity of procedures required to verify expected results; and
The number and complexity of test cycles.
A.20.27
Submit System Test Plan to management and obtain formal
written approval.
A.20.28
Purpose
To set up all supporting environments, including data file initialization and job execution streams
in preparation for the execution of the System Test Plans.
Overview
The preparation of the system test environment includes:
The creation of the system test environments (e.g., functional, integration, high-volume
and acceptance test environments);
Loading utility software such as performance monitors or background load simulators;
The initialization of all required test data sets;
The modification of all command language sets (e.g., shell scripts) and utilities such as
copies, back-ups and restores for compatibility with the system test environment; and
Preparation of programs for execution.
A.20.29
Confirm that all physical resources identified in the System Test Strategy are available and ready
for use.
A.20.30
Define and generate all required sign-on identification codes and security control tables to:
Enable access to the system test environment by system test personnel; and
Prevent access to the system test environment by non-system test personnel.
A.20.31
Where applicable, initialize data files and databases with any control, header and/or trailer
records required. Utility programs may be required for this step. This should be completed after
test conditions that are related to null data sets are executed.
Create input transaction data sets, if any, required for each test cycle.
For batch update processes and on-line processes, prepare and/or generate input transaction
data.
A.20.32
Alter command language sets and utilities as required for the
system test environment.
Alter command language sets and utilities required for the system test environment when that
environment cannot achieve 100% emulation of the production environment.
For instance, in a UNIX environment, test data sets may require different names than the
production data sets, and any utilities used in the test streams may need to be set up differently.
A.20.33
Identify and create required diagnostic command language
steps.
Identify and create required diagnostic command language steps. Command language steps may
need to be added to command language sets to enable the verification of test results. These may
include:
Data file copy/store steps;
Code these steps with comments, merge them into the command language set and review the
set for errors.
A.20.34
The IDs should enable precise identification of the results of each test condition, test cycle and
test step execution. Inappropriate identification will result in confusion regarding the source of the
final results and will make verification of expected results difficult, if not impossible. Follow
installation standards, if applicable.
A.20.35
Move the source code modules to program library catalogues controlled by the System Test
Team.
Compile and link all catalogued modules according to instructions provided at handover.
Complete all program migration or movement instructions signifying that programs are ready to
be tested.
Move all data necessary to conduct the test to the test execution catalogue including command
languages, copybooks and utilities.
A.20.36
Confirm from the work plan and the check lists that all items
have been completed.
A.20.37
Purpose
To test the functionality of the developed system according to the designed System Test Plan.
Overview
All test cycles are executed until successfully completed.
Successful completion is annotated on the hard copy output of the test. System test results are
reconciled with expected results and all errors are documented and corrected. Final system test
results are cross-referenced by page and/or line number to the appropriate System Test Plan
step.
Successfully executed System Test Plans are reviewed before acceptance testing.
The Test Problem Report and System Change Request processes are used throughout the
system test.
A.20.38
Execute the test cycles by following the test steps associated with each test condition in the
cycle's test condition string. The logging of each test cycle should include the recording of:
Execution or submission date and time;
Completion date and time;
System test results; and
Final test cycle sign-off.
User personnel should be heavily involved in the execution of the test cycles. This will serve as
preliminary verification of the effectiveness of the user procedures, provide input to the training
needs analysis and should prepare the user for implementation and system acceptance.
A.20.39
At each processing point in the System Test Plan for which expected results have been specified,
the tester should visually compare all parts of the system test results with the expected results or
through the use of file compare utilities. A hard copy of the system test results should be
produced and attached to the expected results.
The person verifying the system test results should initial the system test log as evidence of
completing the review and should cross-reference on the log the filing location of the system test
results.
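Verification against expected results via a file compare utility might look like the following sketch, using Python's standard difflib; the field values are illustrative.

```python
import difflib

def compare_results(actual_lines, expected_lines):
    """Compare system test results with expected results and return the
    discrepancies, in the spirit of a file compare utility."""
    diff = list(difflib.unified_diff(expected_lines, actual_lines,
                                     fromfile="expected", tofile="actual",
                                     lineterm=""))
    return diff  # an empty list means the results match the expected results

expected = ["balance=100", "records=3"]
actual = ["balance=100", "records=4"]
discrepancies = compare_results(actual, expected)
# Non-empty: the record count differs and should be raised as a discrepancy.
```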
A.20.40
Document any discrepancies between the system test results and the expected results and
highlight them on the hard copy of the system test results. Attach any comments that may be
helpful in identifying the cause of the discrepancy.
The error documentation should then be passed to the appropriate personnel for error correction.
The System Test Team leader should determine if the execution of the test is to be suspended
because of the encountered errors. Errors that are of sufficient magnitude will affect all
subsequent test steps, so continuing the system testing in such an instance may be meaningless.
A.20.41
Correct errors and log error corrections. Include a description of the resolution, identification of
the error correction and date of correction.
The corrected program(s) should then follow the defined procedures for re-submission to system
testing including notifying the System Test Team when the correction has been completed and
notifying the documentation team of the program change(s). Depending on the type of error,
regression testing may be necessary to verify that the fix to the problem has not adversely
affected other successfully tested components of the system.
A.20.42
A.20.43
Annotate all of the documentation associated with the completed and approved test cycles and
file as part of the system test documentation.
This should include the System Test Plan, system test results, any error corrections and system
test logs.
A.20.44
Submit system test results to management and obtain formal
written approval.
A.20.45
Purpose
To ensure the delivery of complete DW system functionality and system performance, across
application boundaries and across technology platforms.
Overview
Integration testing will:
Test that business events supported by multiple applications adhere to the performance
criteria and functional specifications agreed with the business users;
Ensure external interfaces of the DW system have been adequately tested; and
Ensure the co-existence of applications as they work in conjunction with each other does
not have adverse effects upon the operating environment.
Integration testing focuses on testing the "full life cycle" business event and verifies an event
through its entire life cycle, from its initiation through to the termination of that event across all
systems affected by it.
The System Test Team must conduct an integration test that provides a level of assurance that
the systems implemented are reliable and meet business requirements without excessive
preparation and testing of functions that have not changed.
The integration test focuses on those actions that minimize exposure to failure, minimize cost and
maximize coverage of business requirements and conditions through testing:
All processes that have been modified or created as new across application boundaries
(including data conversion for each application), e.g., for a new general ledger
implementation, testing that a journal entry is created in the source system and added to a
sub-ledger, which creates a journal entry to the corporate general ledger, which in turn adds
the journal entry and outputs an extract file to a financial data warehouse;
All system procedures and reconciliation processes related to the conditions being
tested;
Control balance processing (control totals between applications) to ensure processing
through existing application systems is accurate and meets functional requirements
defined by the business;
By use of a standard "base case" of test data and conditions to allow each application
system to ensure the application accurately processes data; and
Any transaction that crosses three or more application boundaries. Transactions crossing
fewer than three application boundaries may still be tested during integration testing if
considered necessary to mitigate an unacceptable level of risk.
The integration test work plans contain contingency to allow for problem resolution, unplanned
conditions and technical breakdowns. However, integration test will not test processing within an
application system that has not been modified as a result of the application development effort.
Integration testing will also not be a complete re-execution of the application system test.
A.20.46
Prepare an integration test strategy to ensure that the delivery of complete business functionality
can be achieved. The integration test strategy structures and formalizes the approach to
conducting the integration test activities.
Prepare a brief strategy document containing the following sections:
Purpose;
Scope and approach;
Timing;
Team roles and responsibilities;
Planning technique;
Execution technique;
Key assumptions;
Risk assessment; and
Data generation.
The integration tests must also address test conformance to standards. Interface standards must
exist for transfer of information between systems. Each external interface must follow established
standards. Review the interfaces identified in the System Test and reject any programs that do
not follow documented standards. Standards should exist for explicit interfaces and interfaces
implicit in the database.
A.20.47
Develop integration test plans. The integration test plan is at a higher functional level than a
system test plan.
Condition coverage includes both valid and invalid cases. Conditions focus on the business event
being performed, often involving multiple types of data at a time.
Each test step walks-through the execution and checkout process for each individual application
system involved with a business event. Create expected results for each test step, documenting
the outputs of each test step at an integration level (i.e., expected results should not duplicate the
system test results but should focus on results that demonstrate successful/unsuccessful
integration of applications and/or systems).
Utilize user management as the key resources for the planning activities and validate the
integration test plans by conducting structured walk-throughs including sign-offs with the
appropriate user management and project management from the application teams.
A.20.48
Create a schedule to address the appropriate sequencing of the integration tests contained in the
integration test plan and document in an integration test work plan.
When creating the integration test work plan consider:
The scope and approach and timing sections of the integration test strategy, to identify
potential timing constraints and/or windows of opportunity for conducting the tests;
Sequencing of and dependencies between integration tests;
Resource availability for both conducting the tests and researching and correcting Test
Problem Reports; and
The impact of the tests on other systems.
A.20.49
Test data must be generated that allows specific conditions to be confirmed and that can be
used to conduct high-volume testing of the applications.
Generate test data for the integration tests. Production data can also be converted for this same
purpose. Each set of test data will contain both valid and invalid test data.
The integration test will ultimately utilize production data to exercise the integration test. The data
will be formulated in the "originating system" and sent via interfaces to each of the target
applications. Testing will be conducted in test cycles defined to test specific business functionality.
Test cycles primarily involve three categories of test data - base case, low-volume and
high-volume data. Each category of data must be able to be successfully processed for each event
before the next category of data may be processed. As a new category is processed, the prior
categories are also executed in combination with the new category to build up a progressively
larger test data set.
A.20.50
Confirm that the appropriate test environment has been created as part of the Technology
Planning, Support and Cutover Phase.
Integration tests should be conducted in a technical environment that mirrors the production
environment. Using a production grade environment allows the technical performance of the
integrated system to be monitored and tuned, as necessary, to meet the system's performance
and cost requirements. Representative operating cost estimates can be derived from the
high-volume integration test runs.
Key interface systems that currently exist in production may not have a test environment available
for use. Identify these applications as soon as possible and make arrangements for environments
to be made available or additionally recognize opportunities to satisfy the integration testing
needs.
A.20.51
Complete the integration test. The execution of the integration test requires a coordinated effort
between the application teams. Experience has shown that the use of a common work area for
the Test Team is essential for effective communications.
Each application team is responsible for performing the necessary "due diligence" to ensure the
accuracy of their application as it processes integration data.
Confirmation of the test execution includes ensuring that:
The data that was processed can be viewed and manipulated successfully using on-line
screens/windows;
The data can be successfully processed by subsequent batch programs; and
The data is able to be printed via existing report programs.
Use a three-step process (base case data, then low-volume data, followed by high-volume data)
to qualify the applications being tested. Each application must verify control balances before
initiating the next application's processing. Otherwise, all the teams waste time processing
incomplete or erroneous data.
The results of the execution should be documented by each application team in the same way
system test documentation is completed.
The successful execution of an integration test includes:
Coordination of back-ups and data synchronization points to ensure that all databases
are the current version. Strict program version control and change management must be
in effect;
Execute in test cycles starting with base case data. The base case data will be used
repeatedly until it successfully completes processing for all applications. All anticipated
business functionality must be included in the base case data set. At the end of base
case data testing, the focus turns from one of functional verification to one of application
integration;
If errors or program problems are identified, the integration test activities in this area
cease until the program is repaired, unit tested and system tested. Any problems are the
responsibility of the System Test Team to resolve and migrate the corrected program;
Once application functionality is stabilized (the base case data successfully executes),
execute low-volume data test cycles including base case data. Now that functional
capacity is established, begin growing the amount of records and throughput processed
by the applications. Iterations of this test cycle are executed until processing successfully
completes for all application systems. The System Test Team must have confidence that
the applications are functionally stable and have the operational capacity to process high
volumes of data successfully; and
Execute high-volume test cycles including the base case and low-volume test data.
Volumes executed should be greater than the expected production work load. Recognize
that additional time will be spent completing execution, back-ups, restores and other
load-impacted tasks. Complete performance tuning and re-execute the test cycles until
satisfactory performance is delivered by the application systems. The project teams can
derive representative operating cost estimates from the high-volume integration test runs.
A.20.52
Document each test when executed so that the test results can be verified and duplicated.
Documentation should include:
Execution date and time;
Completion date and time;
System conditions at the time of testing;
The test case executed and the data used; and
Test results (e.g., output generated, data passed/received, errors logged, database
updates).
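The documentation fields listed above can be captured in a simple structure; this is a minimal sketch, with all names and values hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class TestRunRecord:
    """Minimal record of one executed test, capturing the fields listed above."""
    test_case: str                      # the test case executed
    data_used: str                      # the data used
    execution_time: str                 # execution date and time
    completion_time: str = ""           # completion date and time
    system_conditions: str = ""         # system conditions at the time of testing
    results: list = field(default_factory=list)  # output, errors logged, updates

record = TestRunRecord(
    test_case="GL interface, cycle 1",
    data_used="base case set A",
    execution_time="2002-06-01 08:00",
)
record.completion_time = "2002-06-01 09:30"
record.results.append("control totals reconciled")
```

Keeping each run as a record of this shape makes it straightforward to verify and duplicate results later.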
A.20.53
Where errors are identified in the integration tested programs, these errors are documented and
returned with the test log to the project manager for implementation of debugging procedures.
Errors that require changes to the original specification or design should proceed through the
issue resolution process.
All of the documentation associated with the completed and approved test cycle is annotated and
filed as part of the integration test documentation. This should include the Test Plan, test results,
any error corrections and test logs.
A.20.54
Submit integration test results to management and obtain formal
written approval.
A.20.55
Purpose
To confirm that the system operates within acceptable performance guidelines, when processing
the agreed upper limit loadings in a production or simulated production environment.
Overview
High-volume testing confirms that the system meets the required level of performance with
respect to throughput, delay and simultaneous transaction processing while operating at the
planned maximum load in a production environment.
Scripts are defined to outline how the tests will be conducted and to cross-reference the expected
results. After execution of the high-volume tests, the results are analyzed and the system tuned
as needed.
Note: High-volume tests should be run in a production environment with other systems running or
simulated to provide an operational load on the machine and network environments.
A.20.56
Define test cycles that establish system performance
boundaries.
This step requires that concurrent execution of similar and dissimilar programs be planned. The
intent is to incrementally increase the load on the system and by so doing establish the point at
which it fails to complete processing within the desired response/turnaround time. Sensitivity
analysis of these results is used to identify which transactions or combination of transactions
affect the system performance most.
When planning high-volume tests and high-volume test cycles, take into consideration:
High-volume tests are often not concerned with the correctness of processing, i.e., expected
results are often physical and not logical. Also, high-volume test cycles may not follow a
specific transaction all of the way through its life cycle if the transaction is only of interest
to the high-volume test at certain points;
As high-volume tests are particularly interested in CPU and network loading, control over
other concurrent use of the system is important. High-volume tests are often conducted
at the same time as data conversion and the competition for system resources should be
planned for; and
High-volume tests can result in as many database changes as program logic changes. All
data model documentation must be updated with these changes.
A.20.57
The high-volume test process ensures programs and components operate effectively under
greater than normal volume levels. Testing using high-volumes of records also reveals program
inefficiencies and tuning opportunities that may be leveraged to increase performance.
Determine which system components should be high-volume tested. Identify such items as:
Key business functionality;
Specific types of reporting or retrieval/scanning of data for reporting;
All interface points to or from external systems;
Internal system linkages between sub-systems;
Inquiries that must process a large input or scan large amounts of data;
Processing complexity that causes slow response times; and
The identified transactions should represent the majority of the BW system workload, including
extract and transformation jobs. Review the list with project management and the construction
managers for accuracy and completeness.
A.20.58
Determine the expected production volumes for each group of items identified. Consider:
Month-end, period-end and year-end;
Seasonality, holidays or other volume-changing variables;
Daily use curves (e.g., does the majority of activity occur in the morning or in two hours in
the afternoon);
Load variability: whether the volume of records processed is dynamic (increases or
decreases over time) or static (remains constant over time);
Whether processing results in inserts, updates or deletes of records; and
Expected record retention levels.
Once peak volume levels have been defined for each item to be tested, assess the load
variability.
If the load is static, increase the peak number by 20% (e.g., if the volume for a program is always
10,000 records, test the program with 12,000 records). The static load test confirms the program
can perform at expected production levels as long as the load is static.
Programs with dynamic loads incur greater risk of failure and therefore must be tested with higher
volume levels than the expected production loads. If the load is dynamic, multiply the peak
volume by three (e.g., if the volume for a program can be up to 10,000 records, test
the program with 30,000 records).
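The two rules of thumb above (static load: peak plus 20%; dynamic load: three times peak) can be expressed as a small helper. This is a sketch of the calculation only:

```python
def high_volume_test_target(peak_volume: int, dynamic_load: bool) -> int:
    """Record count to use in a high-volume test: peak + 20% for static
    loads, three times peak for dynamic loads (per the rules above)."""
    if dynamic_load:
        return peak_volume * 3
    return int(peak_volume * 1.2)

# Static load of 10,000 records -> test with 12,000
static_target = high_volume_test_target(10_000, dynamic_load=False)
# Dynamic load of up to 10,000 records -> test with 30,000
dynamic_target = high_volume_test_target(10_000, dynamic_load=True)
```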
A.20.59
Batch high-volume testing uses a variety of methods for synthesizing test data.
The primary purpose of the test data is to provide a mechanism for verifying that the program will
perform to expectations during production level (or greater) volume loads. Test data for a program
should exercise a proportionate subset of the overall functionality. Create an initial subset of data
(10 to 30 records) that reflects a majority of system functions, containing both valid and invalid
conditions, that is replicated to the target transaction volume.
System test data may be used to provide the initial "base" of data. Replication or cloning methods
for the data may employ a variety of techniques to create test data including manual input,
keystroke macros, utility jobs, special tools, conversion programs, actual production data or
special programs built to create data. Where possible, chaining the output from one process to
another provides the most effective test.
The overhead of synchronizing the efforts of two processes is generally worth the investment.
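A minimal sketch of the replication approach described above, assuming records are simple dictionaries and a surrogate key (an assumption, not from the methodology) keeps the clones unique:

```python
import itertools

def replicate_base_data(base_records, target_volume):
    """Clone a small base set (10-30 records covering valid and invalid
    conditions) up to the target transaction volume."""
    cycled = itertools.cycle(base_records)
    out = []
    for i in range(target_volume):
        rec = dict(next(cycled))     # shallow copy of the base record
        rec["record_id"] = i + 1     # unique surrogate key (assumption)
        out.append(rec)
    return out

base = [{"doc_type": "valid_order"}, {"doc_type": "missing_customer"}]
test_data = replicate_base_data(base, 30_000)
```

In practice the base set would come from system test data, and the cloning could equally be done by utility jobs or conversion programs, as the text notes.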
A.20.60
Before executing the high-volume test, ensure appropriate coordination has occurred with the IT
technical support organization. Performance measurement tools and expected measured outputs
should be identified before the test is conducted. Usually a two month lead time is recommended
for beginning and conducting a dialogue with the technical organization to enable the most
effective use of the test execution time.
A.20.61
Submit high-volume test results to management and obtain
formal written agreement.
A.20.62
Purpose
To ensure that the system that has been constructed operates efficiently on the target hardware
and software platforms.
Overview
The new system, once installed and tested, may need to be tuned to ensure that it runs
at an agreed level of performance. Optimum performance can often only be achieved once the
system has been built and has been tested in the production system environment.
This task may only be partially completed before live processing commences. The steps defined
should also be included as part of ongoing system maintenance.
Note: This area is highly technical in that it requires deep expertise in all of the different components of
the BW environment to complete the system tuning. The steps in this task are thus only an indicative
outline of the work to be completed and will need to be supplemented with the detailed tasks and steps
required for each environment.
A.20.63
Identify if any performance-related criteria have been specified in the following documents:
CSFs/Events list;
Contract;
Agreed system acceptance criteria; or
System Test Strategy.
A.20.64
Review the results of the system, integration and high-volume tests against the expected
performance levels and identify any performance levels not achieved.
A.20.65
Determine the possibility of tuning the system to address the problems. In the operations area,
consider alternatives such as:
Hardware upgrades (e.g., more processing power, more memory, more disk drives, more
communications bandwidth, etc.);
Software upgrades (e.g., different or additional utilities such as for sorts or back-ups);
Operational tuning (e.g., revision of job schedules to reduce the possibility of system
response degradation due to competition for resources with other systems, improved job
submission systems);
Network configuration tuning (e.g., upgrades to communications protocols, faster line
speeds, front-end processor upgrades, reduced encryption and increased compression);
and
General tuning (e.g., use of disk caching or other technologies, file placement, moving
critical programs directly into resident memory).
A.20.66
Determine the prospect of re-designing the programs to improve the performance of the system
for critical transactions. Program requirements tend to change more readily than data structures,
so modifying the programs to improve performance may provide long-term benefits.
Identify any changes in the daily, weekly or period-end processing streams which may improve
overall performance or capacity utilization.
A.20.67
Determine the potential for tuning and denormalizing the database(s). Consider that modifying the
database to improve program performance may only provide short-term benefits and in the longer
term only complicate the maintenance of the system. Options include:
Tune for non-index accesses;
Validate clustering and sequences;
Add indices; and
Allow redundant data.
A.20.68
Create a record to define the performance level of the system before any tuning.
Identify, from the system tuning plan, tuning that may potentially improve the performance of the
software and therefore lead to improvements in the overall system performance.
Conduct the appropriate system tuning techniques and compare the performance results with
those already obtained. Compare the test performance results with the required results and if the
results are still unsatisfactory continue with system tuning. Decide if the results are significant
enough to warrant making the change permanent and, if so, update all relevant documentation to
reflect the change.
Repeat any of the system tests considered appropriate to ensure that making changes in one part
of the system does not have an adverse effect in other areas.
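The record-baseline, tune and re-measure loop described above can be sketched as follows; the `measure` callable and the tuning actions are placeholders for environment-specific procedures, not real tooling:

```python
def tuning_cycle(measure, tuning_actions, required_seconds):
    """Record a baseline, apply candidate tunings one at a time, keep only
    changes that improve on the baseline, and stop once the required
    performance level is reached."""
    baseline = measure()                 # performance level before any tuning
    kept = []
    for action in tuning_actions:
        action.apply()
        result = measure()
        if result < baseline:
            kept.append(action)          # significant enough to keep
            baseline = result
        else:
            action.revert()              # undo changes that did not help
        if baseline <= required_seconds:
            break
    return baseline, kept

class NoopAction:                        # stand-in for a real tuning change
    def apply(self): pass
    def revert(self): pass

timings = iter([12.0, 9.5, 10.8, 7.9])  # simulated measurements (seconds)
best, kept = tuning_cycle(lambda: next(timings),
                          [NoopAction(), NoopAction(), NoopAction()],
                          required_seconds=8.0)
```

Reverting unhelpful changes mirrors the text's requirement to only make a change permanent when the results warrant it, and to re-test so one change does not degrade other areas.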
A.20.69
Ensure that any changes made to data models, networks or other items as part of the system
tuning process, are reflected in the maintained system documentation and operations
instructions.
A.20.70
Ensure that the responsibilities for all areas of systems performance are clearly defined.
Ensure that performance and capacity measurement data is collected and regularly reviewed.
Ensure that housekeeping routines are included as part of the operations instructions to maintain
performance levels, e.g., regular running of database file reorganization utilities.
[Process flow diagram: Realization Stage checkpoint]
Tasks: U1 Confirm Completeness of the Realization Stage; U2 Review Issues; U3 Update Project
Plans; U4 Update Quality Plan; U5 Obtain Approval of Realization Stage Deliverables.
Inputs and deliverables shown: Confirmed Realization Stage Deliverables; Open Issues; Issue
resolutions; Project plans; Updated project plans; Quality Plan; Updated Quality Plan;
Realization Stage Deliverables (Formal Approval Point).
A.21.10
Data conversion may already have been started at this point, e.g., data cleansing and purging
may have commenced or external information may already have been obtained. Check that any
data conversion tasks that were planned for completion by this point have been completed. If
there are any discrepancies, assess the impact on the overall data conversion estimates and
schedule.
Deliverables include:
Converted data;
Created data;
Purged data;
Reconciliation results; and
Approach used and program documentation for any clean-up, creation or conversion
programs.
A.21.11
Confirm formal written sign-off on transition support and
migration deliverables.
Examine the Project Charter and the work plan for transition support and migration for a definition
of the deliverables to be produced and the appropriate approval authority. Check these
deliverables have been completed and that formal written approval has been obtained.
Confirm that all tasks in the Technology Planning, Support and Cutover Phase related to the
implementation of the production technology environment have been completed and ensure that
the final Task M5 - Support Technology Environments is included in the Implementation Stage
Work Plan.
The impact of any discrepancies should be assessed and appropriate issues raised.
A.21.12
User acceptance testing is primarily a client responsibility. Within the methodology, formal
acceptance testing takes place during the Implementation Stage. At the Construction Checkpoint
Phase, the user acceptance test plans are checked for completeness against the defined
acceptance criteria to ensure that the testing includes all of the acceptance criteria.
If there are any discrepancies, issues should be raised and action taken accordingly.
A.21.13
Check progress of any acceptance test work already
undertaken.
In certain projects, acceptance testing, although formally scheduled for completion during
implementation, may be started during the Realization Stage. Although formal completion is not
expected until later, progress should be checked at this time as any problems with acceptance
testing may have an impact on the overall schedule and issues should be addressed as early as
possible.
Some of the acceptance tests may take place in an environment other than the planned
production environment, e.g., it may be necessary to establish a controlled network with a limited
number of machines of specified power to undertake some of these performance tests. The
Methodology assumes acceptance testing is performed in implementation in the full production
environment. If this is not the case, ensure that this environment is checked at the appropriate
point.
Raise any issues as appropriate and take corresponding action.
A.21.14
Review Issues
Purpose
To identify all outstanding issues and either resolve or agree future actions.
Overview
All issues must be reviewed and appropriate action agreed/taken.
A.21.15
Review all open issues contained in the Issue Resolution Log. Identify possible resolutions and
initiate appropriate corrective action.
For any issues that will not be resolved immediately, an approach describing how these will be
resolved should be developed and agreed.
Ideally, no issues should remain outstanding at the end of the Realization Stage. In practice, not
all issues will need to be fully resolved before progressing to the next stage. The decision
on the importance of resolving an issue should be made by the project manager in conjunction
with senior management.
Assess the impact of any outstanding issues and reflect in the plans and risk assessment for the
next stage.
A.21.16
Document on the Issue Resolution Form or cross-reference to the Issue Resolution Form, an
agreed approach to resolving all outstanding issues.
Ensure that procedures for investigating outstanding issues are fully understood by both
developers and system users. The cost of investigating outstanding issues may need to be split
between both parties, e.g., if after investigating an issue, it turns out to be an expansion of scope
or an error in information provided by the user, the cost of the investigation as well as the
resolution of the error should be borne largely by the system users.
A.21.17
Purpose
To review the approach for the implementation of the system and to confirm the approach for the
warranty and maintenance period.
Overview
At the Business Blueprint Checkpoint, an approach for the Construction and Implementation
Stages was developed and an approach to the ongoing support of the implemented system was
prepared.
The estimates for the remainder of the project are reviewed using the actual results from
construction, together with any implementation and parallel phase work already completed.
Fully detailed work plans are produced which address the overall project and resources available.
Full costs and time estimates are derived.
Both the detailed implementation plans and the approach to warranty and maintenance are
reviewed including the impact of any ongoing parallel phases. These changes are then reflected
in the overall work plan.
A.21.18
At the end of design, an implementation plan was developed. This should be reviewed at the
completion of the Realization Stage and adjusted accordingly. Considerations include:
Changes to sub-system boundaries made during construction;
Changes to implementation mechanisms made during construction;
Work brought forward or postponed to earlier/later phases;
Changes in internal or external dependencies;
Staffing requirements and availability; and
Changes in milestones or financial constraints.
Revise the implementation plan accordingly.
A.21.19
Validate decisions based on sequencing of deliverables with
other Realization Stage deliverables.
Review complementary Realization Stage plans for other items from the Cross-Life Cycle Phases
which may still be in progress including:
User Documentation;
Training; and
Acceptance Testing.
Consider the impact of these plans on the approaches envisioned for implementation and
ongoing support. Correct any inconsistencies by revising the appropriate plans.
A.21.20
Ensure that all plans are included in the project work plan which combines all tasks,
dependencies and milestones for the remaining stage of the project.
A.21.21
Estimates for implementation were made in the Business Blueprint Checkpoint Phase based on
the best knowledge at that time. Experience gained during the Realization Stage may indicate
variances from these original estimates and the implementation estimates should be revised
accordingly.
Many of the parallel phase activities should be almost complete and should provide information
about appropriate estimates for the remainder of this work. Estimates should be revised
accordingly.
A.21.22
Produce detailed work plans for the Final Preparation Stage
and Go Live and Support Stage.
Based on the project work plan and the revised estimates, develop detailed work plans for the
remaining work to be undertaken to complete the system implementation and to support the
system.
Reflect any changes to the overall work plan necessitated by resource or other constraints.
A.21.23
Develop final cost and timing plans for implementation and support. Compare these with the
originally agreed figures. Review any discrepancies and take appropriate action including:
Re-phasing of the implementation to ensure that time scales are met;
Revisiting the original scope to see if scope has changed over the lifetime of the project;
Agreeing a revised cost for the project;
Increasing the staff complement in an attempt to reduce time scales at increased cost; or
Reducing the staff complement to reduce cost but increase time scales.
Formal written approval should be sought for any actions to be taken and plans revised
accordingly.
A.21.24
Purpose
To ensure that all aspects of planning and quality have been properly addressed.
Overview
To this point, the Construction Checkpoint Phase has largely been concerned with work planning.
Many issues are addressed in the Project Charter that go hand-in-hand with the work plans.
These are reviewed and appropriate amendments made to reflect the end of this stage and the
start of the next.
Those sections of the Project Charter that need to be updated for Construction are reviewed and
amended or created, as appropriate. In particular, the Project Risk Assessment needs to be
revisited.
A.21.25
Review the risk assessment as a result of the Realization Stage experience and the updated
project plans for the Final Preparation Stage. Consider:
How has the anticipated project commitment been in practice?
How has the anticipated commitment and quality of third parties been in practice?
How has the scope changed and why did this occur?
Have the volumes and anticipated performance levels changed significantly?
Has the user base changed significantly?
Are the resources needed still available?
Is the experience and knowledge of available resources as anticipated?
Has information about the business and the technology been learned that was unknown
before?
Has the anticipated processing changed significantly and does this affect previous
assumptions?
Have budget and time considerations changed (e.g., by project overrun)?
To what extent have the needs of the business changed during the project to date?
Have other projects started that have made demands on the resources required for this
project?
A.21.26
Identify where the risk ratings described in the risk assessment are higher than expected or
greater than originally assessed. Obtain agreement with management on the appropriate actions to take.
Possible options may include reviewing the project and strictly monitoring any deviations from the
project work plan to be able to identify possible symptoms of a larger problem or may involve
changing the project scope or project team composition to try to resolve the issues.
In this latter instance, assess the impact of making changes to the project plan on the associated
strategy documents developed in this stage. Update the implementation strategy, data conversion
strategy and appropriate testing strategies as necessary.
A.21.27
A.21.28
Many areas within the Project Charter may not have been completed in full until this point since
they primarily affect the Implementation Stage. Ensure that all these areas have been properly
addressed before progressing to the Implementation Stage.
Revise the Project Charter as appropriate.
A.21.29
Approval for the revised Project Charter should be sought and the plan should be reviewed for
completeness and correctness including:
Any issues should be discussed between project management and the independent
assessor;
Corrective actions to be taken should be agreed by all parties and documented. Such
actions should have clear time scales and responsibilities assigned; and
Further reviews should be scheduled to ensure corrective action is undertaken.
A.21.30
Purpose
To seek final review and formal written approval of the Realization Stage deliverables.
Overview
Throughout the Realization Stage, formal written approval should be sought for deliverables on
an ongoing basis. At the close of this stage, a final review of deliverables is undertaken and
formal written approval obtained for all of the Realization Stage deliverables.
A.21.31
Discuss the Realization Stage deliverables with management and resolve any questions and
issues.
Responsibility and time scales for formal sign-off should have been defined in the Project Charter
and the project contract.
Many of the deliverables should have already been subject to review and formal written sign-off
by user representatives. Review the work deliverables and address any outstanding sign-off
issues.
A.21.32
Obtain formal written approval of the Realization Stage
deliverables.
Obtain formal written approval of the Realization Stage deliverables as defined in the Project
Charter and the project contract.
*** FORMAL APPROVAL POINT ***
[Process flow diagram: system implementation]
Tasks: V1 Prepare System Operations Instructions; V2 Establish Production Operations
Environment; V3 Implement and Transfer Tested System.
Inputs and deliverables shown: Project standards; Revised cutover plan; Production job streams;
Live processing checklist; System operation instructions; Data conversion reconciliation
procedures; User documentation; Year-end processing manual; System transfer; System
conversion reconciliation; Ongoing support responsibilities.
A.22.10
Obtain formal written approval that the production
environment creation meets installation standards.
A.22.11
Purpose
To transfer the new system to both the user and computer operations areas in a controlled
manner.
Overview
The implemented system, including all documentation, is transferred to the appropriate personnel
responsible for its ongoing management, maintenance and support.
A.22.12
Purpose
The purpose of this task is to transport customizing settings and BW Repository objects from the
QA environment to the production environment. Successful import of customizing settings and
BW Repository objects is the main objective of this task. The BW system provides tools for
transportation of customizing settings and BW Repository objects.
Procedure
The customizing settings created in the development system, as well as development objects, are
transported. In this and subsequent activities, observe the transport time windows in the project.
Before exporting, check dependencies and links to other development objects;
After exporting, review the BW export logs;
If required, import the enhancement into the PRD system;
Check the import in the PRD system using the BW import log;
Test the enhancement with reference to a data model in the PRD system; and
Log the results.
A.22.13
Execute programs to finalize data conversion to the
production BW environment.
Execute programs to convert and load both master and transaction data into the production BW
system.
Depending on the means of live processing commencement, ensure that the data conversion is
reconciled and formal written approval is obtained as being completed.
A.22.14
Initialize both the system and user security profiles. Provide all personnel that need access to the
system their assigned security level. This process may need to be supervised by or completed in
conjunction with, internal or external audit personnel.
A.22.15
Conduct system transfer according to installation change
control procedures.
Formally transfer the new system and all of its supporting documentation including:
Program listings (including conversion programs, where relevant);
Program and system documentation;
Data conversion;
User documentation; and
A.22.16
Once the new system has been initialized and is ready to commence production, it is imperative
that the source and Business Information Warehouse systems are reconciled and balanced.
Depending on the means of live production commencement (e.g., a staged implementation or
parallel processing), this reconciliation process may have to be completed on more than one
occasion.
In addition, in some cases this reconciliation of balances may occur after processing has
commenced. The reconciliation must be:
Properly documented - suggest putting in a separately bound volume;
Formally approved with written approval obtained from the appropriate person(s)
responsible; and
Available for subsequent post-reconciliation checks (e.g., by internal/external audit).
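The balance reconciliation between the source system and BW can be sketched as a comparison of control totals per key figure; the key-figure names and tolerance here are illustrative assumptions:

```python
def reconcile(source_totals, bw_totals, tolerance=0.0):
    """Compare control totals per key figure between the source system and
    the BW system; return discrepancies for the reconciliation document."""
    issues = []
    for key, src in source_totals.items():
        bw = bw_totals.get(key)
        if bw is None:
            issues.append((key, src, None))          # missing in BW
        elif abs(src - bw) > tolerance:
            issues.append((key, src, bw))            # out of balance
    return issues

source = {"record_count": 125_000, "amount_total": 9_876_543.21}
bw     = {"record_count": 125_000, "amount_total": 9_876_543.21}
discrepancies = reconcile(source, bw)   # empty list means balanced
```

An empty discrepancy list would support the formal written approval the text requires; any discrepancies belong in the separately bound reconciliation document.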
A.22.17
Establish the Help Desk support function. Ensure that the Help Desk staff are properly trained
and all users are aware of the Help Desk availability.
Ensure that the users and computer operations are aware of their support responsibilities.
Confirm any continuing maintenance role that may be fulfilled by the development team for a
specified time period.
A.22.18
If a service level agreement is to be put in place for the new systems implementation, finalize the
details at this time.
A.22.19
The various items that were defined during design and implementation but are not delivered
during the initial implementation of the system must be addressed.
Prepare detailed plans for implementing these items as part of the
ongoing maintenance and development of the system. These may include:
The introduction of additional system features that were not used during the
commencement of the system;
Subsequent organizational structure changes;
Subsequent changes to the existing workflows; and
Development and implementation of additional sub-systems.
A.22.20
Prepare year-end processing plan and instruction manual
(optional).
Year-end processing is generally a significant event because of such items as year-end
accounting pressures, changes to or purging of master file data, production of year-end reporting,
transfer of history data, loading of data for the new year and sub-system year-end reconciliations.
Depending upon the project scope, prepare an initial plan for the first year-end processing of the
new system.
A.22.21
If the implementation of the new system has also included changes or additions to the
organization's Disaster Contingency/Recovery Plans, ensure that all of the appropriate items and
services are in place, have been fully tested and that the plan has been formally approved.
A.22.22
Commence use of the new system using the agreed definition
of "live processing."
Commence live production of the new system using the definition of live processing derived for
the project. It is mandatory that the project team continue its involvement past "day one" and
address such areas in the live production environment as:
The first period-end and month-end processing;
The reconciliation of sub-systems and interfaces;
The production of the first set of cyclical reporting;
Monitoring of service levels and performance;
Any quarterly, half-yearly or year-end processing; or
Completion of parallel processing.
Where a staged implementation (e.g., by location or by division) is to occur, live processing may
be commenced on more than one occasion and this should be addressed in the different work
plans.
A.22.23
Obtain a formal written system transfer sign-off that the system was successfully transferred.
Complete the necessary approval procedures to complete CPDEP Phase 4 Execute.
[Process flow diagram: production support]
Tasks: W1 Provide Production Support; W2 Validate Live Business Process Results; W3 Optimize
System Use.
Inputs and deliverables shown: Ongoing support responsibilities; Production support;
Reconciliation procedures; Validated system; Tuned system; CSFs; Service Level Agreement;
BW Statistics; CCMS Statistics.
Develop, document and distribute procedures for handling internal problems both during the
period of going live and for the long-term.
Define the roles and responsibilities of project team members during the immediate period of
going live. End users may require increased support during the early production period.
Supply end users with the names of support personnel.
Document and distribute procedures for directing problems to SAP.
Verify that all SAP contact lists contain the correct names. Also, verify that end users have
internal contacts through whom they can channel issues to SAP.
If possible, use the testing plans designed for Acceptance Testing. Otherwise, design plans to monitor
and verify the transactions processed. Execute the plan and prepare a written report on the
issues that arise during production.
It is important to document and resolve all issues. This gives users confidence that no matter how
small or insignificant the problem may be, it is still important to resolve it.
A.23.6
Resolve Issues
Purpose
The purpose of this task is to resolve issues and ensure that corrective action is taken. Examples
of some corrective actions include:
Changes in Business Processes
Corrective Code Changes in BW System
Configuration Changes
Additional User Training
Any code or configuration changes must first be tested and verified in your development or QA
systems and moved to your production system only after testing.
Procedure
A.23.10
Purpose
The purpose of this task is to optimize system performance for the most frequent critical
transactions.
Procedure
The starting point for analysis can be system performance. The following BW standard system
utilities can be used to monitor system performance:
SAP SQL trace
SAP workload analysis (statistical records)
SAP technical applications monitors
SAP monitors for applications server, database, and operating system
SAP BW/CCMS
Additionally, the BW Statistics InfoCube provides information on query execution and data
loading. This data can be used to fine-tune, create, or drop InfoCube aggregates.
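As an illustration of how BW Statistics data might drive aggregate decisions, the sketch below ranks frequently run, slow queries; the thresholds and dictionary fields are assumptions for the example, not BW internals:

```python
def aggregate_candidates(query_stats, min_runs=10, min_avg_seconds=30.0):
    """Pick frequently run, slow queries whose InfoCubes are candidates
    for new aggregates. Each stat is a dict with 'query', 'runs',
    'total_seconds' and 'infocube' (illustrative field names)."""
    out = []
    for s in query_stats:
        avg = s["total_seconds"] / s["runs"]
        if s["runs"] >= min_runs and avg >= min_avg_seconds:
            out.append((s["infocube"], s["query"], round(avg, 1)))
    return sorted(out, key=lambda t: -t[2])   # slowest first

stats = [
    {"query": "SALES_BY_REGION", "runs": 120, "total_seconds": 5400,
     "infocube": "SALES_C01"},
    {"query": "ADHOC_DETAIL", "runs": 3, "total_seconds": 600,
     "infocube": "SALES_C01"},
]
candidates = aggregate_candidates(stats)
```

The same logic, applied to real BW Statistics content, supports the decision to fine-tune, create, or drop InfoCube aggregates.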
[Process flow diagram: post-implementation review]
Tasks: X1 Evaluate System's Effectiveness; X2 Evaluate System's Compliance with
Requirements; X3 Determine Potential Changes and Enhancements; X4 Submit Post-
Implementation Review Deliverable.
Inputs and deliverables shown: Quality Plan; Help Desk Procedures; Performance Measures;
Ongoing System Tuning Process; System Requirements; Naming and Coding Standards;
Operational Statistics; Functional Requirements; Project Work Plan; Processing Schedules;
Service Level Agreement; Cost/Benefit Analysis; Operating Costs; System Documentation;
Approved Project Scope and Infrastructure; Requirements Compliance Findings; List of Potential
Changes and Enhancements; Cost Summary Matrix; Post-Implementation Review Deliverable.
Determine whether supplementary procedures are used to support the organization of
information; and
Determine whether the installed system features are being fully used.
A.24.10
Evaluate the effectiveness of operating costs as related to the
system.
Analyze and evaluate the operating costs as related to the system in a production environment.
Compare the actual costs against the expected costs. Check that the cost basis is consistent
between the original estimates and the current figures; for instance, changes may have occurred
in the hardware configuration, systems software, or networks in use.
Compare the system costs with other application costs to determine if the costs are appropriate
and equitable.
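The actual-versus-expected comparison above can be expressed as a simple variance check. The cost categories and the 10% tolerance below are assumptions for the sketch, not figures from the methodology:

```python
# Illustrative sketch: comparing actual operating costs against the original
# estimates and flagging line items whose variance exceeds a tolerance.
def cost_variances(estimated, actual, tolerance=0.10):
    """Return {item: (variance_fraction, flagged)} for each cost line item."""
    report = {}
    for item, est in estimated.items():
        act = actual.get(item, 0.0)
        variance = (act - est) / est       # positive means over budget
        report[item] = (round(variance, 3), abs(variance) > tolerance)
    return report
```

Flagged items are the ones to investigate for changes in the underlying cost basis, such as a revised hardware configuration.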
A.24.11
Discuss the findings and consider alternative
recommendations to address any anomalies discovered.
A.24.12
Purpose
To identify areas of the system that do not meet the stated requirements.
Overview
The implemented system is reviewed to determine if the requirements have been satisfied. The
requirements of the implemented system are measured against the original requirements from
the Business Blueprint Stage, the amendments to those requirements as specified in
scope/contract amendments and accepted issue resolutions and any subsequent changes to the
system since commencement of live processing.
A.24.13
Determine if the processing capabilities of the implemented
system comply with the original requirements.
Review the original or modified requirements for the system. Consider specifically the data
handling capabilities and functionality of the system. Identify:
The individual processes and their estimated volumes;
The availability of the specified processes and functionality in the implemented system;
and
Whether all processes are being used.
A.24.14
Determine if the processing schedules comply with the
original requirements.
Compare the original scheduling and frequency requirements with the actual operational statistics
maintained within computer operations, adjusted where necessary by any changes, e.g., changes
in the technology environment.
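The schedule comparison can be sketched as a frequency check against operations statistics. Job names and frequencies below are invented for illustration:

```python
# Hedged sketch: checking actual batch run frequencies recorded in
# operations statistics against the originally required schedule.
def schedule_compliance(required, observed):
    """required/observed: {job: runs_per_week}.
    Returns {job: (required_runs, actual_runs)} for non-compliant jobs."""
    return {job: (req, observed.get(job, 0))
            for job, req in required.items()
            if observed.get(job, 0) != req}
```

Deviations found this way should then be reconciled against approved changes, e.g., a deliberate move from daily to twice-daily loads.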
A.24.15
Determine if the response times comply with the original
requirements.
Review the on-line performance statistics for on-line response times, volumes and downtime.
Compare the original response-time requirements with the actual statistics. Compare the
estimated volumes of data against the data volumes currently being processed by the system.
When completing this task, review the present hardware configuration, network configuration,
capacity utilization, and systems software modules to establish whether any changes have
occurred in the interim period that may have affected the performance of the system, e.g., other
new systems have been added or volumes have increased in other existing systems.
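A minimal sketch of the response-time comparison, assuming a percentile-based requirement; the 95th-percentile measure and 3-second target are assumptions for the example, not values from the methodology:

```python
# Illustrative check of measured on-line response times against an
# assumed response-time requirement.
def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def meets_requirement(samples, target_seconds=3.0, pct=95):
    """True when the chosen percentile is within the target."""
    return percentile(samples, pct) <= target_seconds
```

Comparing a percentile rather than the mean keeps a few slow outliers from masking (or exaggerating) general compliance.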
A.24.16
Determine compliance with standards intended to prevent
information inaccuracies.
Determine compliance with installation standards for:
Programming and design standards;
Testing standards;
Numbering and naming conventions;
Maintenance;
Service levels;
Development methodology usage;
Operations housekeeping;
A.24.17
A.24.18
A.24.19
If service-level agreements exist between the users and computer operations, review the planned
and actual levels of service. If any external services are used (e.g., network management), review
their compliance with the appropriate contracts.
A.24.20
A.24.21
Discuss the findings and consider alternative
recommendations to address any anomalies discovered.
A.24.22
Purpose
To recommend change or enhancements to the system based upon the review findings.
Overview
Potential areas for improvement are identified and a determination is made regarding whether
changes or enhancements should be made to the system.
A.24.23
A.24.24
Determine whether there are unused system features that the users should adopt, or whether
existing features could be used in a different fashion.
Determine if the reporting, screen/window inquiry, interfaces or data input can be improved.
A.24.25
A.24.26
A.24.27
A.24.28
A.24.29
Evaluate potential user and computer operations
documentation improvements.
Potential documentation improvements may include:
Improving training and education;
Changing the level of procedure detail;
Changing the scope of procedural overviews;
Improving the level of indexing and/or cross-referencing;
Improving the format and/or organization of the documentation; or
Changing the style or target level of the textual or on-line material.
A.24.30
Determine if the benefits derived from a change in processing mode justify additional operational
and/or systems development expenditures. Benefit and expenditure assessment should include:
Present operating costs;
Projected operating costs under changed system processing mode;
Tangible benefits such as reduced staff costs;
Intangible benefits of proposed improvements;
Development costs for proposed improvements; and
Implementation costs and time required for proposed improvements.
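The benefit and expenditure items above can be combined into a simple payback calculation. All figures in the example are hypothetical:

```python
# Minimal sketch of the benefit/expenditure assessment: simple payback
# period for a proposed improvement (intangible benefits excluded, since
# they resist direct quantification).
def payback_years(development_cost, implementation_cost,
                  current_annual_cost, projected_annual_cost,
                  annual_tangible_benefit):
    """Years until one-time costs are recovered by annual savings."""
    one_time = development_cost + implementation_cost
    annual_saving = (current_annual_cost - projected_annual_cost
                     + annual_tangible_benefit)
    if annual_saving <= 0:
        return float('inf')   # the change never pays for itself
    return one_time / annual_saving
```

A short payback period supports recommending the change; intangible benefits can then tip borderline cases.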
A.24.31
Discuss the findings and consider alternative
recommendations for changes and enhancements.
A.24.32
Purpose
To formally prepare and present the Post-Implementation Review findings and recommendations
to management.
Overview
The outputs from the individual tasks are consolidated into a post-implementation review
deliverable. This document is reviewed for content and clarity and presented to management.
Traditionally, results of reviews such as this have been presented in text-only reports. Using
graphical techniques may dramatically improve and simplify the review outputs. For example:
A simple graph of response times may quickly and effectively describe and dispel issues
related to response times;
The "barometer" technique may be appropriate in some circumstances; or
Overhead slides and graphics can be created to show summary results to support the
detailed report.
A.24.33
A.24.34
Submit and discuss post-implementation review deliverable
with management.
Discuss the post-implementation review with management and resolve any questions and issues.
If necessary, prepare an action list of items to modify in the post-implementation review
deliverable. In practice, this step may involve more than one meeting to resolve the outstanding
issues.
A.24.35
Obtain formal written approval of post-implementation review
deliverable and formally close the project.
Obtain formal written approval of the post-implementation review deliverable. Carry out the
necessary approval procedures to close CPDEP Phase 5, Operate and Evaluate.
Complete all of the tasks necessary to formally close the project as stated in the Project Charter
and the project proposal.