IMPLEMENTATION PHASE

Systems Implementation is the fourth of five phases in the systems development life cycle (SDLC)

Includes application development, testing, documentation, training, data conversion, system changeover, and post-implementation evaluation of the results

Introduction

The system design specification serves as a blueprint for constructing the new system

The initial task is application development

Before a changeover can occur, the system must be tested and documented carefully, users must be trained, and existing data must be converted

A formal evaluation of the results takes place as part of a final report to management

Overview of Application Development

Application development

Objective is to translate the logical design into program and code modules that will function properly

Creation of the System Design

The tasks involved in system design produced an overall design and a plan for physical implementation

Overview of Application Development

Application Development Steps

Module

After the design is created, coding can begin

Overview of Application Development

Project Management

Even a modest-sized project might have hundreds or even thousands of modules

Important to set realistic schedules, meet project deadlines, control costs, and maintain quality

Should use project management tools and techniques

Structured Application Development

Top-down approach

Modular design

Must proceed carefully, with constant input from programmers and IT management to achieve a sound, well-integrated structure

Must ensure that integration capability is built into each design and thoroughly tested

Testing the System

After coding, a programmer must test each program to make sure that it functions correctly

Syntax errors

Desk checking

Logic errors

Structured walkthrough, or code review

Design walkthrough

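To make the distinction concrete, here is a minimal Python sketch (the invoice function and its discount rule are invented for illustration, not part of the slides): a syntax error would keep the code from running at all, while the logic error below runs quietly and is caught only by desk checking the output against hand-traced results.

```python
# Hypothetical example: computing an invoice total with a volume discount.
# A syntax error (e.g., a missing colon after "def") would stop the program
# from running; the bug below is a logic error, so the code runs but
# produces the wrong answer.

def invoice_total(unit_price, quantity):
    """Return the invoice total; orders of 100 units or more get a 10% discount."""
    total = unit_price * quantity
    if quantity > 100:          # logic error: the business rule says >= 100
        total = total * 0.90
    return total

# Desk checking: trace the code by hand with sample values.
# Expected: 100 units at $5.00 -> 500.00 * 0.90 = 450.00
# Actual:   invoice_total(5.00, 100) returns 500.0, exposing the faulty comparison.
print(invoice_total(5.00, 100))
```
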
Testing the System

Unit Testing

Test data

Programmers must test programs that interact with other programs and files individually

Regardless of who creates the test plan, the project manager or a designated analyst also reviews the final test results

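A minimal sketch of a unit test with prepared test data, assuming a hypothetical payroll routine (the function and values are not from the slides): the program is exercised individually, with normal and unusual inputs, before it is combined with other programs.

```python
# Unit-test sketch using Python's standard unittest module.
# The payroll function and its test data are hypothetical examples.
import unittest

def gross_pay(hours, rate):
    """Hourly pay with time-and-a-half for hours beyond 40."""
    if hours <= 40:
        return hours * rate
    return 40 * rate + (hours - 40) * rate * 1.5

class GrossPayUnitTest(unittest.TestCase):
    def test_regular_hours(self):
        self.assertAlmostEqual(gross_pay(40, 10.0), 400.0)

    def test_overtime_hours(self):
        self.assertAlmostEqual(gross_pay(45, 10.0), 475.0)

    def test_boundary_input(self):
        # Test data should include unusual values as well as typical ones.
        self.assertEqual(gross_pay(0, 10.0), 0)

if __name__ == "__main__":
    unittest.main()
```
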
Testing the System

Integration Testing

Integration testing, or link testing

Testing the programs independently does not guarantee that the data passed between them is correct

A testing sequence should not move to the integration stage unless it has performed properly in all unit tests

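A minimal sketch of a link (integration) test, assuming two hypothetical modules where one produces a record and the other consumes it; the test checks the data passed between them, which the unit tests alone cannot guarantee.

```python
# Integration (link) test sketch: the order-entry and billing modules are
# hypothetical; the test verifies the record handed from one to the other.
import unittest

def build_order(customer_id, items):
    """Order-entry module: produces a record consumed by billing."""
    return {"customer_id": customer_id,
            "total": sum(price for _, price in items)}

def create_invoice(order):
    """Billing module: expects the record format produced by build_order."""
    return f"Invoice for customer {order['customer_id']}: ${order['total']:.2f}"

class OrderBillingLinkTest(unittest.TestCase):
    def test_order_record_flows_into_billing(self):
        order = build_order(1001, [("widget", 19.99), ("gadget", 5.01)])
        self.assertEqual(create_invoice(order),
                         "Invoice for customer 1001: $25.00")

if __name__ == "__main__":
    unittest.main()
```
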
Testing the System

System Testing

Major objectives:
• Perform a final test of all programs
• Verify that the system will handle all input data properly, both valid and invalid
• Ensure that the IT staff has the documentation and instructions needed to operate the system properly and that backup and restart capabilities of the system are adequate

Testing the System

System Testing

Major objectives:
• Demonstrate that users can interact with the system successfully
• Verify that all system components are integrated properly and that actual processing situations will be handled correctly
• Confirm that the information system can handle predicted volumes of data in a timely and efficient manner

Documentation

Program Documentation

System Documentation

Operations Documentation

User Documentation

Online documentation

Management Approval

After system testing is complete, you present the results to management

If system testing produced no technical, economic, or operational problems, management determines a schedule for system installation and evaluation

System Installation and Evaluation

Remaining steps in systems implementation:

Prepare a separate operational and test environment

Provide training for users, managers, and IT staff

Perform data conversion and system changeover

Carry out post-implementation evaluation of the system

Present a final report to management

Operational and Test Environments

The environment for the actual system operation is called the operational environment or production environment

The environment that analysts and programmers use to develop and maintain programs is called the test environment

A separate test environment is necessary to maintain system security and integrity and protect the operational environment

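One way to picture the separation, offered only as a sketch (the environment variable, connection strings, and settings below are assumptions, not from the slides): configuration keyed to an environment setting keeps development and test work away from the production database.

```python
# Sketch of keeping test and operational (production) settings apart.
# The APP_ENV variable name and connection strings are hypothetical.
import os

CONFIG = {
    "test": {
        "database_url": "postgresql://localhost/payroll_test",
        "allow_schema_changes": True,    # analysts/programmers may alter structures
    },
    "production": {
        "database_url": "postgresql://db.example.com/payroll",
        "allow_schema_changes": False,   # protects the operational environment
    },
}

def load_config():
    """Select settings based on APP_ENV; default to the test environment."""
    env = os.environ.get("APP_ENV", "test")
    if env not in CONFIG:
        raise ValueError(f"Unknown environment: {env}")
    return CONFIG[env]

if __name__ == "__main__":
    settings = load_config()
    print(f"Connecting to {settings['database_url']}")
```
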
Training

Training Plan

The first step is to identify who should receive training and what training is needed

The three main groups for training are users, managers, and IT staff

You must determine how the company will provide training

Training

Outside Training Resources

Many training consultants, institutes, and firms are available that provide either standardized or customized training packages

You can contact a training provider and obtain references from clients

Training

In-House Training

The IT staff and user departments often share responsibility

When developing a training program, you should keep the following guidelines in mind:
• Train people in groups, with separate training programs for distinct groups
• Select the most effective place to conduct the training
• Prepare effective training materials, including interactive tutorials

Training

In-House Training

When developing a training program, you should keep the following guidelines in mind:
• Rely on previous trainees
• Train-the-trainer strategy

When training is complete, many organizations conduct a full-scale test, or simulation

Data Conversion

Data Conversion Strategies

The old system might be capable of exporting data in an acceptable format for the new system or in a standard format such as ASCII or ODBC

If a standard format is not available, you must develop a program to extract the data and convert it

Often requires additional data items, which might require manual entry

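A sketch of such an extraction-and-conversion program, assuming a hypothetical legacy CSV (ASCII) export and an invented column layout for the new system; the file names and fields are illustrations only.

```python
# Data conversion sketch: read a legacy ASCII (CSV) export and reshape it
# for the new system. File names and column layouts are hypothetical.
import csv

def convert_customers(legacy_path, converted_path):
    """Extract records from the old export and write them in the new layout."""
    with open(legacy_path, newline="") as src, \
         open(converted_path, "w", newline="") as dst:
        reader = csv.DictReader(src)                      # old layout: CUST_NO, NAME, BAL
        writer = csv.DictWriter(dst, fieldnames=["customer_id", "name",
                                                 "balance", "status"])
        writer.writeheader()
        for row in reader:
            writer.writerow({
                "customer_id": row["CUST_NO"].strip(),
                "name": row["NAME"].title(),
                "balance": f"{float(row['BAL']):.2f}",
                # New data item not present in the old system; a default is
                # supplied here, but some values may need manual entry later.
                "status": "ACTIVE",
            })

if __name__ == "__main__":
    convert_customers("legacy_customers.csv", "customers_new.csv")
```
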
Data Conversion

Data Conversion Security and Controls

You must ensure that all system control measures are in place and operational to protect data from unauthorized access and to help prevent erroneous input

Some errors will occur

It is essential that the new system be loaded with accurate, error-free data

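A small sketch of one such input control, continuing the hypothetical conversion above: each converted record is validated before loading, and rejected records are set aside for manual review rather than loaded into the new system.

```python
# Input-control sketch for the hypothetical conversion: validate each record
# before loading and set aside anything erroneous for manual review.
def validate_record(row):
    """Return a list of problems found in one converted record (empty if clean)."""
    problems = []
    if not row.get("customer_id", "").isdigit():
        problems.append("customer_id must be numeric")
    try:
        if float(row.get("balance", "")) < 0:
            problems.append("balance cannot be negative")
    except ValueError:
        problems.append("balance is not a valid amount")
    return problems

def split_records(rows):
    """Separate clean records from rejected ones."""
    accepted, rejected = [], []
    for row in rows:
        problems = validate_record(row)
        (rejected if problems else accepted).append((row, problems))
    return accepted, rejected

if __name__ == "__main__":
    sample = [{"customer_id": "1001", "balance": "25.00"},
              {"customer_id": "A-17", "balance": "-5"}]
    accepted, rejected = split_records(sample)
    print(f"{len(accepted)} accepted, {len(rejected)} rejected for review")
```
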
System Changeover

Direct Cutover

Involves more risk than other changeover methods

Companies often choose the direct cutover method for implementing commercial software packages

Cyclical information systems usually are converted using the direct cutover method at the beginning of a quarter, calendar year, or fiscal year

System Changeover

Parallel Operation

Easier to verify that the new system is working properly under parallel operation than under direct cutover

Running both systems might place a burden on the operating environment and cause processing delays

Is not practical if the old and new systems are incompatible technically

Also is inappropriate when the two systems perform different functions

System Changeover

Pilot Operation

The group that uses the new system first is called the pilot site

The old system continues to operate for the entire organization

After the system proves successful at the pilot site, it is implemented in the rest of the organization, usually using the direct cutover method

Is a combination of parallel operation and direct cutover methods

System Changeover

Phased Operation

You give a part of the system to all users

The risk of errors or failures is limited to the implemented module only

Is less expensive than full parallel operation

Is not possible, however, if the system cannot be separated easily into logical modules or segments

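As a rough sketch (the module names and flags are invented, not from the slides), a phased changeover can be expressed in software as a set of module flags that route each subsystem to either the old or the new implementation, so risk stays limited to the modules already converted.

```python
# Phased-changeover sketch: hypothetical module flags record which parts of
# the system have been switched to the new implementation for all users.
PHASED_IN = {
    "order_entry": True,    # module already converted to the new system
    "billing": False,       # still handled by the old system
    "inventory": False,
}

def handle_request(module, request):
    """Route a request to the new or old implementation of one module."""
    if PHASED_IN.get(module, False):
        return f"new system handles {module}: {request}"
    return f"old system handles {module}: {request}"

if __name__ == "__main__":
    print(handle_request("order_entry", "add order 1001"))
    print(handle_request("billing", "print invoice 1001"))
```
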
Post-Implementation Tasks

Post-Implementation Evaluation

Includes feedback for the following areas:
• Accuracy, completeness, and timeliness of information system output
• User satisfaction
• System reliability and maintainability
• Adequacy of system controls and security measures
• Hardware efficiency and platform performance

Post-Implementation Tasks

Post-Implementation Evaluation

Includes feedback for the following areas:
• Effectiveness of database implementation
• Performance of the IT team
• Completeness and quality of documentation
• Quality and effectiveness of training
• Accuracy of cost-benefit estimates and development schedules

Post-Implementation Tasks

Post-Implementation Evaluation

When evaluating a system, you should:
• Interview members of management and key users
• Observe users and computer operations personnel actually working with the new information system
• Read all documentation and training materials
• Examine all source documents, output reports, and screen displays
• Use questionnaires to gather information and opinions from a large number of users
• Analyze maintenance and help desk logs

Post-Implementation Tasks

Post-Implementation Evaluation

Users can forget details of the developmental effort if too much time elapses

Pressure to finish the project sooner usually results in an earlier evaluation in order to allow the IT department to move on to other tasks

Ideally, conducting a post-implementation evaluation should be standard practice for all information systems projects

Post-Implementation Tasks

Final Report to Management

Your report should include the following:
• Final versions of all system documentation
• Planned modifications and enhancements to the system that have been identified
• Recap of all systems development costs and schedules
• A comparison of actual costs and schedules to the original estimates
• Post-implementation evaluation, if it has been performed

Marks the end of systems development work