
CS/SE 6354 Advanced Software Engineering (Summer 2007)

Ambulance Dispatch System


Test Plan

Submitted to:

Dr. Lawrence Chung


Associate Professor,
Department of Computer Science,
The University of Texas at Dallas,
Richardson, TX- 75080

Submitted by:

GoE (Gang of Eight)

1. Anand Rajeevalochana (axr067000)


2. Prashanth Guha (png051000)
3. Santosh Bheemarajaiah (sxb062000)
4. Santhosh Nagaraj (sxn061000)
5. Kiran Kumar Mohan (kkm062000)
6. Arvind Prabhakar (axp065000)
7. Nikhil Narayan (nxn063000)
8. Ramiah Natarajan (rxn059000)

Version: 1.0 Date: 07/18/2007

Version: 1.1 Date: 07/20/2007

Table of Contents
1 Introduction.................................................................................................................3
2 Relationship to other documents.................................................................................3
2.1 Relationship between Requirements Analysis Document and Test Plan.............3
2.2 Relationship between Software Architecture Document and Test Plan...............4
2.3 Relationship between Object Specification Document and Test Plan.................4
3 System Overview.........................................................................................................5
4 Features to be tested/not to be tested...........................................................................6
4.1 Features to be tested:...........................................................................................6
4.2 Features that cannot be tested..............................................................................7
5 Pass/Fail Criteria..........................................................................................................7
5.1 System level.........................................................................................................7
5.2 Integration testing................................................................................................7
5.3 Unit testing...........................................................................................................8
6 Approach......................................................................................................................8
6.1 Testing Levels......................................................................................................8
6.1.1 Unit Testing..................................................................................................8
6.1.2 System/Integration Testing..........................................................................8
6.1.3 Acceptance/Installation Testing:..................................................................9
6.2 Integration Testing Strategy.................................................................................9
6.3 UML class diagram for integration testing..........................................................9
7 Suspension and Resumption......................................................................................10
7.1 Suspension Criteria............................................................................................10
7.2 Resumption Requirements.................................................................................11
8 Testing materials (hardware/software requirements).................................................11
8.1 Hardware requirements......................................................................................11
8.2 Software Requirements......................................................................................11
8.3 Other Resources.................................................................................................11
9 Test Cases..................................................................................................................12
9.1 Unit Test Cases..................................................................................................12
9.2 Integration Test Cases........................................................................................15
9.3 Functional Test Cases........................................................................................17
10 Test Schedule.........................................................................................................20
10.1 Responsibilities..................................................................................................20
10.2 Staffing and training needs................................................................................20
10.3 Risks and contingencies.....................................................................................21
10.4 Testing schedule.................................................................................................21

1 Introduction
The purpose of this Test Plan document is to prescribe the scope, approach, resources,
and schedule of the testing activities for the Ambulance Dispatch System that is
characterized by a complementary conformance statement. The goal of the Test Plan is to
identify the items being tested, the features to be tested, the testing tasks to be performed,
the personnel responsible for each task, and the risks associated with the test plan.

2 Relationship to other documents

2.1 Relationship between Requirements Analysis Document and Test Plan

The functional requirements (Use Cases) recognized in the Requirements Analysis Document are mapped to the Functional Test Cases listed in the Test Cases section as follows:
Functional Requirement (Use Case Name)    Test Case ID
Incident Report                           ADS_FT_01
Collect Information                       ADS_FT_02
Incident Enquiry                          ADS_FT_03
Track Incident                            ADS_FT_04
Resolve Duplicates                        ADS_FT_05
Dispatch Ambulance                        ADS_FT_06
Update Ambulance Status                   ADS_FT_07
Update Ambulance Location                 ADS_FT_08
Manage Fleet                              ADS_FT_09
Manage Accounts                           ADS_FT_10
Login                                     ADS_FT_11

The following Non-Functional Requirements need to be tested using Performance Testing:

 Usability
 Reliability Requirements
 Performance Requirements
 Support Requirements
 Implementation Requirements
 Interface Requirements
 Packaging Requirements

2.2 Relationship between Software Architecture Document and Test Plan

The sub-systems recognized in the Software Architecture Document are mapped to the Integration Test Cases listed in the Test Cases section as follows:

Sub-Systems                          Test Case ID
UI Interface Subsystem               ADS_IT_07
User Management Subsystem            -
Incident Management Subsystem        ADS_IT_01, ADS_IT_02, ADS_IT_03, ADS_IT_04, ADS_IT_05, ADS_IT_07
Fleet Management Subsystem           ADS_IT_06
Admin Operations Subsystem           -
ADS Controller Subsystem             ADS_IT_01, ADS_IT_02, ADS_IT_03, ADS_IT_04, ADS_IT_05, ADS_IT_06, ADS_IT_07
Persistent Data Storage Subsystem    ADS_IT_02, ADS_IT_03, ADS_IT_04, ADS_IT_05, ADS_IT_06, ADS_IT_07

2.3 Relationship between Object Specification Document and Test Plan

The packages (classes) recognized in the Object Specification Document are mapped to the Unit Test Cases listed in the Test Cases section as follows:

Packages (Class)    Test Case ID
Incident            ADS_UT_01, ADS_UT_02, ADS_UT_03, ADS_UT_04, ADS_UT_05, ADS_UT_06, ADS_UT_07
Ambulances          ADS_UT_08
Dispatcher          ADS_UT_09, ADS_UT_10, ADS_UT_11
Supervisor          ADS_UT_12, ADS_UT_13, ADS_UT_14, ADS_UT_15

3 System Overview
This section, focusing on the structural aspects of testing, provides an overview of the system in terms of the components that are tested during the unit test. The items to be tested within the scope of this test plan are essentially the artifacts that need to be developed; they can be identified from the software application inventories as well as from other sources of documentation and information.

UML diagrams help in developing the test architecture that copes with all the structural aspects of testing. The test architecture contains test components and test contexts and defines how they are related to the system under test, the subsystem, or the component under test (i.e., the tested software).

The figure summarizes the structural aspects of testing.

 A test context represents a collection of test cases, associated with a test configuration that defines how the test cases are applied to the system under test (SUT).

 A test configuration may comprise a number of test components and describe how
they are associated with the tested component, the SUT.

 An arbiter evaluates the test results and assigns an overall verdict to a test case.

 Feasible verdicts for a test result are pass, inconclusive, fail and error; a minimal sketch of these verdicts and an arbiter is given after this list.
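The verdict set above maps naturally onto code. The following is a minimal, illustrative sketch (not part of the delivered system) of how an arbiter might fold the individual results of a test case into an overall verdict; the Verdict enum and Arbiter class names are assumptions made for this example only.

    import java.util.List;

    // Illustrative only: the feasible verdicts listed above, ordered from best to worst.
    enum Verdict { PASS, INCONCLUSIVE, FAIL, ERROR }

    // A hypothetical arbiter: the overall verdict of a test case is the worst
    // individual verdict observed (ERROR > FAIL > INCONCLUSIVE > PASS).
    class Arbiter {
        Verdict overallVerdict(List<Verdict> results) {
            Verdict worst = Verdict.PASS;
            for (Verdict v : results) {
                if (v.ordinal() > worst.ordinal()) {
                    worst = v;
                }
            }
            return worst;
        }
    }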

4 Features to be tested/not to be tested

4.1 Features to be tested:

 Whether the operator and other users of the system can log in successfully after being authenticated with their respective username and password.

 Once inside the system, can the operator log an incident successfully?

 Can the system detect duplicate incident reports?

 If two dispatchers simultaneously try to access an incident report, the system should not allow this to happen.

 The system should be tested for changing the details of the ambulance assigned to an incident; in case of an ambulance accident, the system users should be able to assign the next available ambulance to the original incident site.

 Is the report created by the operator available for viewing by the supervisor and dispatcher?

 Once a particular incident report is updated by the supervisor, are these changes reflected when the incident report is drawn up by the operator and dispatcher?

 Are the feeds from the ADS system reaching the ambulance system/crew
correctly?

 We also need to check that, once an incident is logged, the dispatcher dispatches an ambulance and the supervisor updates the status of the incident report in a timely fashion, and that the same report can then be drawn up again by the operator when a caller requests the status of the incident.

4.2 Features that cannot be tested

 If the system goes down, can it come back up within 10 minutes?

This feature cannot be tested as we cannot simulate an actual server crash and try
and recover it.

 How does the system performance vary during high loads?

This non-functional feature of performance under higher loads could not be tested due to the lack of time and resources needed to generate large volumes of incident report requests.

 Does the system perform optimally under low/normal load?

It is hard to determine in a short time what should equate to optimal performance, and it is also hard to say what constitutes low and normal load. Hence this feature cannot be tested.

 Is the system portable across different platforms?

This could not be tested, as the system could not be deployed across different platforms for testing due to lack of time.

5 Pass/Fail Criteria

5.1 System level

The test process will be complete (pass) if a person is able to report an incident, the report is processed by the dispatcher, and the dispatcher is able to locate an ambulance and dispatch it to the accident spot within 3 minutes. This is the pass criterion. A problem anywhere in this process is termed a “System failure” (fail criterion). Note, however, that a successful test is one which breaks the system; in other words, a test is successful if it can make the system fail.
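As a rough illustration of this criterion, the sketch below times the report-to-dispatch path in a JUnit test and fails if it takes longer than 3 minutes. The reportAndDispatch() helper is a placeholder for whatever end-to-end entry points the finished system exposes; it is an assumption made for illustration, not the actual ADS API.

    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class SystemLevelPassFailTest {

        // Placeholder for the end-to-end flow: the operator logs an incident and the
        // dispatcher locates and dispatches an ambulance. The real entry points may differ.
        private void reportAndDispatch() {
            // ... exercise the system under test here ...
        }

        @Test
        public void dispatchCompletesWithinThreeMinutes() {
            long start = System.currentTimeMillis();
            reportAndDispatch();
            long elapsedMs = System.currentTimeMillis() - start;
            assertTrue("Report-to-dispatch took longer than 3 minutes",
                       elapsedMs <= 3 * 60 * 1000);
        }
    }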

5.2 Integration testing


When testing the integration of the various sub-parts, the services offered by the sub-systems are tested. These services are invoked, and if a service results in completion of the expected tasks then it is a success. If a sub-system fails to perform a task which it is intended to do, then it is a failure. The notion of a successful test described in the previous section applies here too.

5.3 Unit testing


In unit testing, the pass and fail criteria for each sub-system depend on the functionality of that sub-system. If the preconditions and postconditions of a unit are satisfied, it is a (black-box) pass; otherwise it is a failure. White-box testing goes further into the execution paths of the program.
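A black-box unit test of this kind only checks that, given inputs satisfying the precondition, the postcondition holds. A minimal JUnit sketch is shown below; the Incident constructor and the exact signature of getIncidentDetails() are assumptions based on the unit test cases in the Test Cases section.

    import org.junit.Test;
    import static org.junit.Assert.assertNotNull;

    public class IncidentBlackBoxTest {

        @Test
        public void detailsOfALoggedIncidentCanBeRetrieved() {
            // Precondition: an incident with a known id exists in the system.
            Incident incident = new Incident();   // assumed constructor
            int incId = 42;                       // assumed id of an existing incident

            // Postcondition (cf. ADS_UT_02): the incident details are returned.
            assertNotNull(incident.getIncidentDetails(incId));
        }
    }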

6 Approach

6.1 Testing Levels


The testing for the Ambulance dispatch system project will consist of Unit, System/Integration (combined), Acceptance (out of scope) and Installation (out of scope) test levels. It is expected that the entire test team will be available for system/integration testing. However, with such a short timeline established, most of the unit testing will be done by the development team with the test team's knowledge. The only system test that will be performed is the functional/performance test.

6.1.1 Unit Testing


Unit testing will be done by the developer and approved by the development team lead.
Proof of unit testing (test case list, sample output, data printouts, and defect information)
is provided by the development team. This must be provided by the programmers to the
development team lead before unit testing will be accepted and passed on to the testing
team. All unit test information will also be provided to the test team.

6.1.2 System/Integration Testing


This will be performed by the test team, with input from the individual developers who were involved in developing the system. No specific test tools are available for this project. Programs will enter System/Integration test after all critical defects have been corrected during unit testing. At that point a program may have errors only at the interfacing level and should not have any errors in its internal functionality.

6.1.3 Acceptance/Installation Testing:


These tests will be performed by the clients/actual end users with the assistance of the testing team. The acceptance/installation tests will be done in parallel with the existing Ambulance dispatch system process for a particular time period, decided by the client (in our system, the Government), after completion of the System/Integration test process. These tests are out of scope for the course project, so they are not described in further detail here.

6.2 Integration Testing Strategy

Out of the three integration testing strategies, we have opted for the bottom-up integration testing strategy for the following reasons:
 The system we developed is an object-oriented system and has been decomposed accordingly.
 The Ambulance dispatch system also has to interact with real-world environment variables such as ambulances, drivers, traffic, etc.; it operates in real time and timing is crucial.
 Since we have planned to do the system testing along with the integration testing, bottom-up will help us move gradually from the sub-parts (bottom) to the entire system (up).
 Since bottom-up testing is more intuitive and natural, we follow it.
 Bottom-up testing does not need additional scaffolding such as stubs (simple test drivers suffice).
 It leads to good testing performance and is simple to perform.

6.3 UML class diagram for integration testing

Layer 0: Incident
Layer 1: ADS controller
Layer 2: Request handler
Layer 3: Database

First the database and the request handler interface are tested; then the ADS controller, request handler and database are integrated and tested. Finally, as a system, the Incident class, ADS controller, request handler and database are integrated and tested. Note that the classes' attributes and methods are not shown in detail, as this is a generic test plan.
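To make the bottom-up order concrete, the sketch below shows simple test drivers for the first two integration steps: the request handler is first exercised directly against the database, and the ADS controller is then exercised on top of both. The class names follow the layer diagram above, but the constructors used here are assumptions, and the assertions are placeholders for real queries.

    import org.junit.Test;
    import static org.junit.Assert.assertNotNull;

    public class BottomUpIntegrationDriver {

        // Layers 3 and 2: drive the request handler directly against the database.
        @Test
        public void requestHandlerCanReachDatabase() {
            Database db = new Database();                    // assumed constructor
            RequestHandler handler = new RequestHandler(db); // assumed constructor
            assertNotNull(handler);   // placeholder; a real test would issue a query and check the result
        }

        // Layers 3 to 1: add the ADS controller on top of the already-tested lower layers.
        @Test
        public void controllerWorksOverRequestHandlerAndDatabase() {
            Database db = new Database();
            RequestHandler handler = new RequestHandler(db);
            ADSController controller = new ADSController(handler); // assumed constructor
            assertNotNull(controller);
        }
    }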

7 Suspension and Resumption


Suspension criteria specify the conditions used to suspend all or a portion of the testing activities, while resumption criteria specify when testing can resume after it has been suspended.

7.1 Suspension Criteria

The ADS Controller class is a prerequisite to the operation of all of the Services Classes
and must operate successfully in order for testing of any of the Service Classes to
proceed. Should any of the ADS Controller class primitives not perform as predicted, then the tests for interconnectivity shall be suspended.

The Incident, Dispatcher and Operator classes will be thoroughly tested. Should the
testing within the Service Class not perform as predicted, then the testing shall be
suspended.

7.2 Resumption Requirements

Testing will resume from the beginning when the reasons for suspension of testing have
been determined, have been corrected, and new versions of the Application Entities have
been submitted to the testing group.

Testing within the Service Class will resume from the beginning of testing for that
Service Class when the reasons for suspension of testing have been determined and new
versions of the Application Entity in question have been submitted to the testing group.

8 Testing materials (hardware/software requirements)

8.1 Hardware requirements

 3 Pentium machines, each having 1GB RAM, 40GB HDD


 These machines need to be set up in a 600 sq ft room, and each system needs a UPS.

8.2 Software Requirements

 These test boxes run the Windows XP operating system. NetBeans was used to test the Java code.
 No special software was needed to test the ADS system.

8.3 Other Resources

4 members of the team were required as testers: 3 to test the ADS system and 1 to interact with the developers and review other team activities.

9 Test Cases

9.1 Unit Test Cases

Incident Class

Test case specification identifier ADS_UT_01


Test items assignAmbulances()
Input specifications Inc ID, Ambulance List
Output specifications Ambulances assigned correctly
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -
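A possible JUnit realisation of ADS_UT_01 is sketched below. The object design does not fix the parameter types of assignAmbulances(), so the incident id, the ambulance list and the check through getIncidentsAssigned() are assumptions made for illustration.

    import java.util.Arrays;
    import java.util.List;
    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class AssignAmbulancesTest {   // ADS_UT_01

        @Test
        public void ambulancesAreAssignedToTheIncident() {
            Incident incident = new Incident();                   // assumed constructor
            int incId = 1;                                        // assumed incident id
            List<Integer> ambulanceList = Arrays.asList(10, 11);  // assumed ambulance ids

            incident.assignAmbulances(incId, ambulanceList);

            // Expected output: the ambulances are assigned correctly to this incident,
            // so the incident should now appear among the assigned incidents.
            assertTrue(incident.getIncidentsAssigned().contains(incId));
        }
    }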

Test case specification identifier ADS_UT_02


Test items getIncidentDetails()
Input specifications IncId
Output specifications Incident details
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_03


Test items getIncidentsToBeAssigned()
Input specifications Nothing
Output specifications Incident IDs which need to be assigned
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_04


Test items getIncidentsAssigned()
Input specifications Nothing
Output specifications Incident IDs which have already been assigned
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_05
Test items isDuplicate()
Input specifications IncID
Output specifications -1 if not a duplicate; otherwise the incID of the incident that this ID duplicates
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -
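ADS_UT_05 can be scripted along the following lines; the Incident constructor, the integer return convention and the way a fresh (non-duplicate) incident id is obtained are assumptions.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class IsDuplicateTest {   // ADS_UT_05

        @Test
        public void freshIncidentIsNotReportedAsDuplicate() {
            Incident incident = new Incident();   // assumed constructor
            int incId = 7;                        // assumed id of a newly logged, unique incident

            // Expected output: -1 when the incident is not a duplicate,
            // otherwise the id of the incident it duplicates.
            assertEquals(-1, incident.isDuplicate(incId));
        }
    }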

Test case specification identifier ADS_UT_06


Test items markDuplicateIncident()
Input specifications IncID
Output specifications IncID marked as DUPLICATE
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_07


Test items getDuplicateIncidents()
Input specifications IDs
Output specifications IDs are filled with duplicate incident ids
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Ambulances Class

Test case specification identifier ADS_UT_08


Test items getNearestAmbulances()
Input specifications ID, Location, noOfAmbulances
Output specifications IDs are filled with ambulance IDs which are near
to the Location
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -
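For ADS_UT_08, the sketch below assumes that getNearestAmbulances() fills a caller-supplied list of ids with the ambulances closest to the given location, as the input/output specification above suggests; the Ambulances constructor, the String encoding of the location and the list-filling convention are assumptions.

    import java.util.ArrayList;
    import java.util.List;
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class NearestAmbulancesTest {   // ADS_UT_08

        @Test
        public void idsAreFilledWithNearbyAmbulances() {
            Ambulances ambulances = new Ambulances();       // assumed constructor
            List<Integer> ids = new ArrayList<Integer>();   // output parameter per the spec
            String location = "Main St & 5th Ave";          // assumed location encoding
            int noOfAmbulances = 2;

            ambulances.getNearestAmbulances(ids, location, noOfAmbulances);

            // Expected output: ids is filled with the ambulance IDs nearest to the location.
            assertEquals(noOfAmbulances, ids.size());
        }
    }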

Dispatcher Class

Test case specification identifier ADS_UT_09


Test items getIncidents()
Input specifications ID List

Output specifications IDList is filled with the Incident ids that need to be assigned.
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_10


Test items getAssignedIncidents()
Input specifications ID List
Output specifications IDList is filled with the Incident ids that are
completely assigned
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_11


Test items getDuplicatedIncidents()
Input specifications ID List
Output specifications IDList is filled with the Incident ids that are
marked DUPLICATE
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -
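The three Dispatcher cases all follow the same pattern: an ID list is passed in and filled by the call. A combined sketch for ADS_UT_09 through ADS_UT_11 is given below; the Dispatcher constructor and the Integer element type of the lists are assumptions.

    import java.util.ArrayList;
    import java.util.List;
    import org.junit.Test;
    import static org.junit.Assert.assertNotNull;

    public class DispatcherTest {   // ADS_UT_09 - ADS_UT_11

        private final Dispatcher dispatcher = new Dispatcher();   // assumed constructor

        @Test
        public void incidentsToBeAssignedAreListed() {         // ADS_UT_09
            List<Integer> idList = new ArrayList<Integer>();
            dispatcher.getIncidents(idList);                   // fills idList with incidents to be assigned
            assertNotNull(idList);                             // placeholder; a real test checks expected ids
        }

        @Test
        public void assignedIncidentsAreListed() {             // ADS_UT_10
            List<Integer> idList = new ArrayList<Integer>();
            dispatcher.getAssignedIncidents(idList);           // fills idList with completely assigned incidents
            assertNotNull(idList);
        }

        @Test
        public void duplicateIncidentsAreListed() {            // ADS_UT_11
            List<Integer> idList = new ArrayList<Integer>();
            dispatcher.getDuplicatedIncidents(idList);         // fills idList with incidents marked DUPLICATE
            assertNotNull(idList);
        }
    }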

Supervisor Class

Test case specification identifier ADS_UT_12


Test items addStation()
Input specifications stationId, streetNumber, streetName,
blockNumber, city, state, zip, phone
Output specifications Returns true if a new station was successfully
created. Otherwise it returns false
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_13


Test items getCrewList()
Input specifications -
Output specifications Returns a list of available crew which can be
assigned to an ambulance.
Environmental needs Windows XP, J2SE, JUnit

Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_14


Test items addAmbulance()
Input specifications stationId, crewId
Output specifications Returns true if a new ambulance was successfully
added to the fleet. Otherwise returns false.
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_15


Test items addCrew()
Input specifications personnel1, personnel2
Output specifications Returns true if a new crew was successfully added
to the fleet. Otherwise returns false.
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -
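The Supervisor operations return booleans, so these tests reduce to asserting the return value. A sketch covering ADS_UT_12, ADS_UT_14 and ADS_UT_15 follows; the Supervisor constructor, the parameter types and all argument values are made-up sample data.

    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class SupervisorTest {   // ADS_UT_12, ADS_UT_14, ADS_UT_15

        private final Supervisor supervisor = new Supervisor();   // assumed constructor

        @Test
        public void aNewStationCanBeAdded() {          // ADS_UT_12
            assertTrue(supervisor.addStation(301, "1200", "Coit Rd", "12",
                                             "Richardson", "TX", "75080", "972-555-0100"));
        }

        @Test
        public void aNewAmbulanceCanBeAdded() {        // ADS_UT_14
            assertTrue(supervisor.addAmbulance(301, 17));           // stationId, crewId
        }

        @Test
        public void aNewCrewCanBeAdded() {             // ADS_UT_15
            assertTrue(supervisor.addCrew("John Doe", "Jane Roe")); // personnel1, personnel2
        }
    }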

Test case specification identifier ADS_UT_16


Test items createCaller()
Input specifications id,fname,lname,addr,ph,email
Output specifications Returns true if a new caller was successfully
added. Otherwise returns false.
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_17


Test items findCallers()
Input specifications id,fname,lname,zipcode,ph,email
Output specifications Returns the callers that are in the database based
on parameters
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_18
Test items createIncident()
Input specifications id,desc,location,address,units,status,injured,callerid,todaydate
Output specifications Returns true if a new incident was successfully added.
Otherwise returns false.
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_19


Test items findIncidents()
Input specifications id, zipcode, incident_status1, incident_status2, incident_status3, incident_status4, dateoccured, callerId
Output specifications Returns the list of incidents found based on the
parameters
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_UT_20


Test items closeIncident()
Input specifications Id
Output specifications Marks the incident as CLOSED after the incident
is served.
Environmental needs Windows XP, J2SE, JUnit
Special procedural requirements Load- Normal
Intercase dependencies -
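ADS_UT_18 and ADS_UT_20 together cover the life cycle of an incident record and can be chained as in the sketch below. The table above does not name the owning class for these operations, so an Operator class is assumed here, as are the parameter types and the sample values.

    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class IncidentLifecycleTest {   // ADS_UT_18 and ADS_UT_20

        @Test
        public void anIncidentCanBeCreatedAndLaterClosed() {
            Operator operator = new Operator();   // owning class assumed; not named in the table above

            // ADS_UT_18: create a new incident with made-up sample data
            // (id, desc, location, address, units, status, injured, callerid, todaydate).
            boolean created = operator.createIncident(101, "Two-car collision", "Downtown",
                    "500 Main St", 1, "OPEN", 2, 55, "07/18/2007");
            assertTrue(created);

            // ADS_UT_20: once the incident has been served, mark it CLOSED.
            operator.closeIncident(101);
        }
    }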

9.2 Integration Test Cases

Test case specification identifier ADS_IT_01


Test items Incident Management Subsystem, ADS Controller
Subsystem
Input specifications Call received and ambulance dispatch
Output specifications 1) The caller calls the operator and reports the
incident.
2) The operator gets all the possible details like
location, kind of incident, name and phone number
of the caller, severity, number of people affected
etc and logs an incident into the Ambulance
Dispatch System.

3) The Dispatcher gets a message about the
incident and will dispatch an ambulance
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -
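An automated version of ADS_IT_01 might look like the sketch below, which drives the Incident Management and ADS Controller subsystems together. The facade methods logIncident() and getPendingIncidents() are invented for this illustration and are not part of the documented design.

    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class IncidentReportIntegrationTest {   // ADS_IT_01

        @Test
        public void loggedIncidentReachesTheDispatcher() {
            ADSController controller = new ADSController();   // assumed no-argument constructor

            // Step 2: the operator logs the details collected from the caller.
            int incId = controller.logIncident("Caller: J. Smith, 972-555-0199",
                                               "House fire, two people injured", "75080");

            // Step 3: the dispatcher should now see the incident and be able to dispatch an ambulance.
            assertTrue(controller.getPendingIncidents().contains(incId));
        }
    }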

Test case specification identifier ADS_IT_02


Test items Incident Management Subsystem, ADS Controller
Subsystem, Persistent Data Storage Subsystem
Input specifications Status update about incident
Output specifications 1) Crew updates status of incident
2) Crew updates information about the hospital
where the patients are admitted.
3) Ambulance GPS updates its current location
periodically to the ADS.
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_IT_03


Test items Incident Management Subsystem, ADS Controller
Subsystem, Persistent Data Storage Subsystem
Input specifications General enquiry about the incident
Output specifications 1) Caller calls up to enquire the status of the
incident reported.
2) Operator transfers the call to the officer.
3) Officer responds to the caller by extracting the
incident report from ADS.
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_IT_04


Test items Incident Management Subsystem, ADS Controller
Subsystem, Persistent Data Storage Subsystem
Input specifications Ambulance position update received
Output specifications The operator shall update the ambulance’s position
in the system
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_IT_05
Test items Incident Management Subsystem, ADS Controller
Subsystem, Persistent Data Storage Subsystem
Input specifications Incident status report received
Output specifications The operator shall input the status report based on
scenario.
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_IT_06


Test items Fleet Management Subsystem, ADS Controller
Subsystem, Persistent Data Storage Subsystem
Input specifications Update ambulance fleet information
Output specifications The supervisor shall update the system with
changes to the ambulance fleet.
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_IT_07


Test items UI Interface Subsystem , Incident Management
Subsystem, ADS Controller Subsystem, Persistent
Data Storage Subsystem
Input specifications Update map information
Output specifications The supervisor shall update the system with
changes to the city’s map.
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

9.3 Functional Test Cases

Test case specification identifier ADS_FT_01


Test items ADS, Incident Management
Input specifications Successful telephone call to Operator
Output specifications Operator creates Incident report successfully
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_02
Test items ADS, Incident Management, Persistent Data
Storage
Input specifications Caller has called about incident.
Output specifications Caller name, phone number, incident information
are available
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_03


Test items ADS, Incident Management, Persistent Data
Storage
Input specifications Caller provides information about previously
reported incident
Output specifications Caller gets the information he needs
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_04


Test items ADS, Incident Management, Persistent Data
Storage
Input specifications Incident ID is provided
Output specifications Incident information is displayed on screen
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_05


Test items ADS, Incident Management, Persistent Data
Storage
Input specifications ADS provides possible duplicates about incident to
the Dispatcher
Output specifications Dispatcher gets information if this is a new
incident or a duplicate one
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_06


Test items ADS, Incident Management, Persistent Data
Storage
Input specifications A list of incidents that need ambulance units is displayed.

Output specifications Ambulance sent to the incident location
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_07


Test items ADS, Incident Management, Persistent Data
Storage
Input specifications Ambulance ID is provided
Output specifications Status updated when the ambulance is being
served.
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_08


Test items ADS, Incident Management
Input specifications Ambulance GPS reports location
Output specifications Ambulance GPS is valid
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_09


Test items ADS, Incident Management, Persistent Data
Storage
Input specifications Demand for more ambulances
Output specifications Necessary updates are completed
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_10


Test items ADS, User Management, Persistent Data Storage
Input specifications New employee joined the company
Output specifications Successful addition
Environmental needs Windows XP, J2SE
Special procedural requirements Load- Normal
Intercase dependencies -

Test case specification identifier ADS_FT_11


Test items ADS, User Management, Persistent Data Storage
Input specifications Staff provides a valid user name and password
Output specifications If information correct, homepage is displayed
Environmental needs Windows XP, J2SE

Special procedural requirements Load- Normal
Intercase dependencies -

10 Test Schedule

10.1 Responsibilities
We as a team divided the responsibilities among ourselves. Each person performed the following tasks:

Name      Tasks/Testing                                      Status
Ramiah    System testing                                     Completed
Anand     Integration testing                                Completed
Kiran     Unit testing & Integration testing                 Completed
Nikhil    Unit testing (along with a few other developers)   Completed

10.2 Staffing and training needs

It is preferred that at least one tester be assigned to the project for the system/integration and unit testing phases. This requires assigning one person to the full project from its beginning to participate in reviews, development, etc., while three other members are assigned primarily to testing. If a separate test person is not available, the test lead will assume this role.
In order to provide complete and proper testing, the following areas need to be addressed in terms of training.
 The developers and tester(s) will need to be trained on the basic operations of the
Ambulance dispatch system. Prior to final acceptance of the project the operations
staff will also require complete training on the Ambulance dispatch process.
 The ambulance driver will need training in using a GPS system.
 The operator should be trained on how to use a PC and the other software related to the ambulance dispatch system.

 The operator should also be trained to type fast on the keyboard, as this is a time-critical system.

10.3 Risks and contingencies

There are numerous issues to be addressed before installing the ADS, as a defective system could lead to loss of lives and therefore carries many potential risks. The various risks should be considered and the system should be tested properly. The other items which should also be checked are as follows:
 The ambulances should be checked to ensure they are maintained properly; a poorly maintained ambulance directly impacts the entire system.
 The driver should be in the pink of health.
 The operator should have good eye-sight and capacity to act under emergency
situations and not to panic.
 The database should be functioning properly and should provide fast access.
 The computer system which the operator uses must be fast and efficient.
 The telephone line must be from the best connection provider.

10.4 Testing schedule

Test Date Tester


Unit testing part 1 July 17, 2007 Nikhil
Unit testing part 2 July 18, 2007 Kiran and Nikhil
Integration testing July 19, 2007 Anand and Ramiah
System testing July 19, 2007 Ramiah
