Final Project Deliverable Guide Date: 18/04/2019
“Race or Die”
DECLARATION
I certify that the project titled “Race or Die” is being carried out under my supervision by students of the university.
Salman Masih
Faculty of Computer Science
Department of Computer Science
University of Gujrat, Punjab, Pakistan.
Email: Salman.munawar@uog.edu.pk
Dated: 17-04-2018
STATEMENT OF SUBMISSION
This is to certify that Subhan Ali (Roll No. 15121519-082), Monam Khan (Roll No.
15121519-107), and Mohsin Ali (Roll No. 15121519-127) successfully completed the
final project titled Race or Die at the Department of Computer Science,
University of Gujrat, in fulfillment of the requirements of the degree of BS in
Computer Science.
______________________ _____________________
Project Supervisor Project Management Office
Faculty Of CS - UOG
_____________________________ ________________________
External Examiner Chairman
TABLE OF CONTENTS
CHAPTER 1: FINAL PROJECT PROPOSAL
1.1. INTRODUCTION
1.2. PROJECT TITLE
1.3. PROJECT OVERVIEW STATEMENT
1.4. PROJECT GOALS & OBJECTIVES
1.5. HIGH-LEVEL SYSTEM COMPONENTS
1.6. LIST OF OPTIONAL FUNCTIONAL UNITS
1.7. EXCLUSIONS
1.8. APPLICATION ARCHITECTURE
1.9. GANTT CHART
1.10. HARDWARE AND SOFTWARE SPECIFICATION
1.11. TOOLS AND TECHNOLOGIES USED WITH REASONING
CHAPTER 2: FIRST DELIVERABLE
2.1. INTRODUCTION
2.2. PROJECT/PRODUCT FEASIBILITY REPORT
2.2.1. Technical Feasibility
2.2.2. Operational Feasibility
2.2.3. Economic Feasibility
2.2.4. Schedule Feasibility
2.2.5. Specification Feasibility
2.2.6. Information Feasibility
2.2.7. Motivational Feasibility
2.2.8. Legal & Ethical Feasibility
2.3. PROJECT/PRODUCT SCOPE
2.4. PROJECT/PRODUCT COSTING
2.4.1. Project Cost Estimation by using COCOMO’81 (Constructive Cost Model)
2.5. TASK DEPENDENCY TABLE
2.6. CRITICAL PATH METHOD
2.7. GANTT CHART
2.10. TOOLS AND TECHNOLOGY WITH REASONING
2.11. VISION DOCUMENT
2.12. RISK LIST
2.13. PRODUCT FEATURES / PRODUCT DECOMPOSITION
CHAPTER 3: SECOND DELIVERABLE FOR OBJECT ORIENTED APPROACH
3.1. INTRODUCTION
3.1.1. Systems Specifications
3.1.2. Identifying External Entities
3.1.4. Capture "shall" Statements
3.1.5. Allocate Requirements
3.1.6. Prioritize Requirements
3.1.7. Requirements Traceability Matrix
3.2.1. Introduction
3.2.6. Capture "shall" Statements
3.2.7. Allocate Requirements
© Department of Computer Science.
CS - UOG - Project Coordination Office Version: 2.3
1.1. Introduction:
Our project is a car racing game. Given people’s enthusiasm for games and their
growing interest in gaming, with the number of gamers increasing day by day, we
are developing a server-based car racing game that will be much more challenging
than previous car games. In this game, four users play one match. During the
match, interruptions and obstacles will occur that force a user onto an alternate
route, which is opened by pressing a designated button. After a specific time,
every user can activate one of the given weapons to eliminate the other users.
The last one standing, or the first to cross the finish line, wins.
Project Members:
Name         Registration #   Email Address               Signature
Subhan Ali   15121519-082     Inoxentsubhan@gmail.com
Project Goal:
To create a challenging game that will be great entertainment for car racing game lovers.
Objectives:
1. Build a desktop app and game.
2. Connect it to a server (used for the matchmaking service).
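As an illustration of the matchmaking objective above, a server-side queue could group waiting players into four-player lobbies. This is a hedged sketch only; the class and method names are hypothetical and do not reflect the project’s actual server code:

```python
from collections import deque

MATCH_SIZE = 4  # four users play one match, per the project description

class Matchmaker:
    """Sketch of a server-side matchmaking queue (hypothetical)."""

    def __init__(self):
        self.waiting = deque()

    def join(self, player):
        """Add a player; return a full four-player lobby once one forms, else None."""
        self.waiting.append(player)
        if len(self.waiting) >= MATCH_SIZE:
            return [self.waiting.popleft() for _ in range(MATCH_SIZE)]
        return None

mm = Matchmaker()
for name in ["P1", "P2", "P3"]:
    mm.join(name)           # not enough players yet
print(mm.join("P4"))        # ['P1', 'P2', 'P3', 'P4']
```

A first-come, first-served queue keeps the sketch simple; a real service would also handle disconnects and skill-based grouping.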
Security:
The username and password used for login will be secured.
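One common way to secure stored login credentials is salted password hashing. The sketch below is illustrative only (the project’s actual mechanism is not specified here) and uses Python’s standard library:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; store (salt, digest), never the plain password."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Constant-time comparison against the stored digest."""
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("player-secret")
print(verify_password("player-secret", salt, digest))  # True
print(verify_password("wrong-guess", salt, digest))    # False
```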
Performance:
1.7. Exclusions:
Offline gameplay levels are out of scope for this game.
[Application architecture diagram: game client connected to the server and database]
2.1. Introduction
The first deliverable is all about planning and scheduling of the project. This
deliverable must contain the following artifacts:
a. Project Feasibility
b. Project Scope
c. Project Costing
d. Task Dependency Table
e. Critical Path Method Analysis (CPM Analysis)
f. Gantt Chart
g. Introduction to team members
h. Tasks and member assignment table
i. Tools and Technologies
j. Vision Document
k. Risk List
l. Product Features
The feasibility report covers the following dimensions:
a. Technical
b. Operational
c. Economic
d. Schedule
e. Specification
f. Information
g. Motivational
h. Legal and Ethical
transaction. The major limitation of the game is its dependence on the Internet:
without an Internet connection, the game will not work.
The COCOMO’81 equations are:
1. Effort = a (KLOC)^b
2. Development Time = c (Effort)^d
3. Average Staff Size = Effort / Development Time
4. Productivity = KLOC / Effort

LOC/FP for C# = 29
Lines of Code (LOC) = (LOC/FP) * FP
= 29 * 240.89
LOC = 6985.81 lines
KLOC = 6985.81 / 1000 = 6.985 ≈ 7

1. Semi-detached (a = 3.0, b = 1.12, c = 2.5, d = 0.35):
Effort: 3.0 * (7)^1.12 = 26.52 PM
Dev-Time: 2.5 * (26.52)^0.35 = 7.87 months
Average Staff Size: 26.52 / 7.87 = 3.37 persons
Productivity: 7 / 26.52 = 0.26 KLOC/PM
Slack Time
Table 3: Task Slack Time Table

Task No   Late Start   Early Start   LS-ES
A         0            0             0
B         15           15            0
C         33           33            0
D         54           54            0
E         70           54            16
F         100          100           0
G         128          128           0
H         128          128           0
I         154          154           0
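The slack values in Table 3 follow directly from LS − ES; a short Python sketch (illustrative only, with task times taken from the table) recomputes them and picks out the critical path:

```python
# Recompute Table 3: slack = Late Start - Early Start; zero-slack tasks
# form the critical path (values taken from the table above).
tasks = {
    "A": (0, 0),     "B": (15, 15),   "C": (33, 33),
    "D": (54, 54),   "E": (70, 54),   "F": (100, 100),
    "G": (128, 128), "H": (128, 128), "I": (154, 154),
}
slack = {t: ls - es for t, (ls, es) in tasks.items()}
critical_path = [t for t, s in slack.items() if s == 0]

print(slack["E"])       # 16 -- task E can slip 16 units without delaying the project
print(critical_path)    # ['A', 'B', 'C', 'D', 'F', 'G', 'H', 'I']
```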
2. GanttProject:
We used GanttProject to create the Gantt chart for our project.
3. Edraw Max:
We used this tool to create other diagrams, such as sequence, class, and domain
diagrams, for our project.
4. Language:
C# will be used.
Have you covered all kinds of constraints - for example political, economic,
and environmental?
Have all key features of the system been identified and defined?
Will the features solve the problems that are identified?
Are the features consistent with constraints that are identified?
3.1 Introduction:
Our project, Race or Die, is a server-based car racing game, introduced in detail
in Section 1.1. In each four-player match, obstacles force players onto alternate
routes opened by a button press, and after a set time each player can activate a
weapon against the others; the last player standing, or the first across the
finish line, wins.
Requirements elicitation
Requirements analysis and negotiation
Requirements specification
System modeling
Requirements validation
Requirements management
As described above, Race or Die is a server-based four-player racing game in
which obstacle-driven alternate routes and timed weapon activation decide the
match: the last player standing, or the first across the finish line, wins.
Existing System
Our game gives game lovers an opportunity to show their gaming skills. First, we
will provide our users with free Tokens to enter and explore the game. A Token
itself represents real money, which the user can request us to convert
(Token => Money). After the free tokens have been consumed, the user must buy new
tokens, which are incremented on winning. The 2CO or Google Wallet API is going
to be used in our project for transactions. An online server will also be used in
our project.
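The token economy described above can be sketched as a simple ledger. The names and values below are hypothetical, and the real payment calls (2CO / Google Wallet) are not modeled:

```python
FREE_TOKENS = 5    # hypothetical sign-up grant
TOKEN_VALUE = 10   # hypothetical real-money value per token

class PlayerAccount:
    def __init__(self, name):
        self.name = name
        self.tokens = FREE_TOKENS  # free tokens to enter and explore the game

    def enter_match(self):
        if self.tokens <= 0:
            raise RuntimeError("buy more tokens to play")
        self.tokens -= 1           # entering a match consumes a token

    def win_match(self, reward=2):
        self.tokens += reward      # tokens are incremented on winning

    def cash_out(self):
        amount = self.tokens * TOKEN_VALUE
        self.tokens = 0
        return amount              # would be paid out via the payment API

acct = PlayerAccount("Subhan")
acct.enter_match()
acct.win_match()
print(acct.tokens)      # 6
print(acct.cash_out())  # 60
```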
Organizational Chart
An organizational chart will be very supportive in getting a better overview of
the organization’s business areas and their decomposition into different departments.
Scope of the System
The domain of gaming is very big, but our game is restricted to desktop users
only; it can be played only on PCs.
Summary of Requirements: (Initial Requirements)
The requirements of our project include the system, desktop, time duration, cost,
budget, etc. We need to analyze them against the expectations of the project work.
Desktop:
The system is used as a desktop application by clients.
Time Duration: The required time duration for the project is 5 months, which
includes analyzing the project requirements.
Cost: A cost of 20,000 to 30,000 is required for the project.
Software: We use Unity3D and Microsoft Visual Studio.
3.2.1. Introduction
This section builds on the project introduction in Section 1.1: Race or Die is a
server-based four-player car racing game with obstacles, alternate routes opened
by a button press, and timed weapon activation; the last player standing, or the
first across the finish line, wins.
Brief description
Used to describe the overall intent of the use case. Typically, the brief description is
only a few paragraphs, but it can be longer or shorter as needed. It describes what is
considered the happy path—the functionality that occurs when the use case executes
without errors. It can include critical variations on the happy path, if needed.
Preconditions
Conditions that must be true before the use case can begin to execute. Note that this
means the author of the use case document does not need to check these conditions
during the basic flow, as they must be true for the basic flow to begin.
Basic flow
Used to capture the normal flow of execution through the use case. The basic flow is
often represented as a numbered list that describes the interaction between an actor
and the system. Decision points in the basic flow branch off to alternate flows. Use
case extension points and inclusions are typically documented in the basic flow.
Alternate flows
Used to capture variations to the basic flows, such as user decisions or error
conditions. There are typically multiple alternate flows in a single use case. Some
alternate flows rejoin the basic flow at a specified point, while others terminate the
use case.
Post conditions
Conditions that must be true for the use case to be considered complete. Post
conditions are typically used by testers to verify that the realization of the
use case is implemented correctly.
4.1. Introduction:
The third deliverable is all about software design. In the previous deliverable,
analysis of the system was completed, so we understand the current situation of
the problem domain. Now we are ready to strive for a solution to the problem
domain using an object-oriented approach. The following artifacts must be
included in the 3rd deliverable.
1. Domain Model
2. System Sequence Diagram
3. Sequence Diagram
4. Collaboration Diagram
5. Operation Contracts
6. Design Class Diagram
7. State Transition Diagram
8. Data Model
The UML system sequence diagram (SSD) illustrates events sequentially input from
an external source to the system. The SSD will define the system events and
operations. System sequence diagrams are a timeline drawing of an expanded use
case. Events are related by time with the top events occurring first. System events are
the important items. These are events that cause a system response.
A Sequence diagram depicts the sequence of actions that occur in a system. The
invocation of methods in each object, and the order in which the invocation occurs is
captured in a Sequence diagram. This makes the Sequence diagram a very useful tool
to easily represent the dynamic behavior of a system.
A Sequence diagram is two-dimensional in nature. On the horizontal axis, it shows
the life of the object that it represents, while on the vertical axis, it shows the
sequence of the creation or invocation of these objects.
Because it uses class name and object name references, the Sequence diagram is very
useful in elaborating and detailing the dynamic design and the sequence and origin of
invocation of objects. Hence, the Sequence diagram is one of the most widely used
dynamic diagrams in UML.
6.1. Introduction
1. Site maps
2. Storyboards
3. Navigational maps
4. Traceability Matrix
Feature: Register    Priority: Highest    DB Table ID: Null
Table 6.2: UC_Track_Selection
Feature: Register    Priority: High       DB Table ID: Null
Feature: Register    Priority: Lowest     DB Table ID: Null
Feature: Register    Priority: High       DB Table ID: Null
Feature: Register    Priority: Highest    DB Table ID: Null
7.1 Introduction:
Since the start of software engineering, different software testing schemes and
testing templates have been invented. These testing templates are used for both
UI testing and unit testing. Every credible software house develops its own
testing documentation for the purpose of building efficient systems.
To provide a common set of standardized documents, the IEEE developed the 829
Standard for Software Test Documentation for any type of software testing,
including User Acceptance Testing.
In our testing module we are going to test different features and modules of the
system using the documentation template supported by IEEE 829.
7.2.1. Purpose
To prescribe the scope, approach, resources, and schedule of the testing activities. To
identify the items being tested, the features to be tested, the testing tasks to be
performed, the personnel responsible for each task, and the risks associated with this
plan.
7.2.2. Outline
A test plan shall have the following structure:
7.2.2.2. Introduction
Software test planning is the core part of the testing phase, on which all the
other phases depend in one way or another.
a) Requirements specification
b) Design specification
c) Users guide
d) Operations guide
e) Installation guide
Reference any incident reports relating to the test items. Items that are to be
specifically excluded from testing may be identified.
7.2.2.6. Approach
We are going to use the following testing techniques to test our project.
Unit testing covers almost 90 percent of program testing, so we will use unit
tests extensively to test every component of our project.
Moreover, instrumentation tests are used to test the UI of the desktop app; they
account for the bulk of the remaining test coverage.
After doing all this, we will design and execute tests in Visual Studio and
Unity, and their reports and summaries will be written.
At the end, coverage of the project’s desktop tests will be generated in Visual
Studio and Unity, and the report will be attached.
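As an illustration of the unit-testing style described above (the project itself would write such tests in C# within Visual Studio / Unity; this Python sketch uses a hypothetical `race_winner` helper purely for demonstration):

```python
import unittest

def race_winner(finish_times):
    """Return the player with the smallest finish time (hypothetical helper)."""
    if not finish_times:
        raise ValueError("no players finished")
    return min(finish_times, key=finish_times.get)

class RaceWinnerTest(unittest.TestCase):
    def test_first_across_line_wins(self):
        # the first player to cross the finish line wins
        self.assertEqual(race_winner({"P1": 92.4, "P2": 88.1}), "P2")

    def test_empty_race_rejected(self):
        with self.assertRaises(ValueError):
            race_winner({})
```

Run with `python -m unittest`; the equivalent C# tests would use the test runner built into Visual Studio or Unity Test Framework.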
a. Test plan;
b. Test design specifications;
c. Test case specifications;
d. Test procedure specifications;
e. Test item transmittal reports;
f. Test logs;
g. Test incident reports;
h. Test summary reports.
Test input data and test output data should be identified as deliverables.
Test tools (e.g., module drivers and stubs) may also be included.
7.2.2.12. Responsibilities
The main person responsible for test execution and management is called the test
manager. In our documentation and project, the testing is performed by Subhan Ali.
The other two members are also part of the testing team in one way or another.
The following responsibility table depicts the jobs of different team members in
planning, designing, and executing tests.
Test Planning
Test Specification
Test Case Development
Test Writing
Test Execution
7.2.2.14. Schedule
Attached is the full schedule of development of the project. Testing is scheduled
to be performed in the first two to three weeks of April 2019.
(1) The system will be rejected by the end users if they find it less interactive
and difficult to use after a new feature is added.
(2) Proper documentation of the requirement specification, and subsequently the
system specification, is crucial. In the absence of these documents, our testing
effort can result in failure.
(3) All requirements and features should be provided and cross-checked before the
application is delivered to the client.
(4) If defects that do not fall into the above categories remain at the end of
the designated development period, then a review of each defect will be conducted
to determine which action should be taken.
7.2.2.16 Approvals
Test manager and other team members need to approve the overall test strategy.
7.3.1. Purpose
To prescribe the scope, approach, resources, and schedule of the testing
activities. To identify the items being tested, the features to be tested, the
testing tasks to be performed, the personnel responsible for each task, and the
risks associated with this plan.
7.3.2. Outline
A test plan shall have the following structure:
The sections shall be ordered in the specified sequence. Additional sections may be
included immediately prior to Approvals. If some or all of the content of a section is
in another document, then a reference to that material may be listed in place of the
corresponding content. The referenced material must be attached to the test plan or
available to users of the plan.
Details on the content of each section are contained in the following sub-clauses.
7.3.2.2. Introduction
Summarize the software items and software features to be tested. The need for each
item and its history may be included. References to the following documents, when
they exist, are required in the highest-level test plan:
a. Project authorization
b. Project plan
c. Quality assurance plan
d. Configuration management plan
e. Relevant policies
f. Relevant standards
In multilevel test plans, each lower-level plan must reference the next higher-level
plan.
programs must be transferred from tape to disk). Supply references to the following
test item documentation, if it exists:
a. Requirements specification
b. Design specification
c. Users guide
d. Operations guide
e. Installation guide
Reference any incident reports relating to the test items. Items that are to be
specifically excluded from testing may be identified.
7.3.2.6. Approach
Describe the overall approach to testing. For each major group of features or feature
combinations, specify the approach that will ensure that these feature groups are
adequately tested. Specify the major activities, techniques, and tools that are used to
test the designated groups of features.
The approach should be described in sufficient detail to permit identification of the
major testing tasks and estimation of the time required to do each one. Specify the
minimum degree of comprehensiveness desired. Identify the techniques that will be
used to judge the comprehensiveness of the testing effort (e.g., determining which
statements have been executed at least once).
Specify any additional completion criteria (e.g., error frequency). The techniques to
be used to trace requirements should be specified. Identify significant constraints on
testing such as test item availability, testing resource availability, and deadlines.
Test input data and test output data should be identified as deliverables. Test
tools (e.g., module drivers and stubs) may also be included.
7.3.2.12. Responsibilities
Identify the groups responsible for managing, designing, preparing, executing,
witnessing, checking, and resolving. In addition, identify the groups responsible for
providing the test items identified in 7.2.2.3 and the environmental needs identified in
7.3.2.11.
These groups may include the developers, testers, operations staff, user
representatives, technical support staff, data administration staff, and quality support
staff.
7.3.2.14. Schedule
Include test milestones identified in the software project schedule as well as all item
transmittal events. Define any additional test milestones needed. Estimate the time
required to do each testing task. Specify the schedule for each testing task and test
milestone. For each testing resource (i.e., facilities, tools, and staff), specify its
periods of use.
7.3.2.16. Approvals
Test manager and other team members need to approve the overall test strategy.
7.4.1. Purpose
To define a test case identified by a test design specification.
7.4.2. Outline
A test case specification shall have the following structure:
The sections shall be ordered in the specified sequence. Additional sections may be
included at the end. If some or all of the content of a section is in another document,
then a reference to that material may be listed in place of the corresponding content.
The referenced material must be attached to the test case specification or available to
users of the case specification. Since a test case may be referenced by several test
design specifications used by different groups over a long time period, enough
specific information must be included in the test case specification to permit reuse.
Details on the content of each section are contained in the following sub-clauses.
7.4.2.5.1. Hardware
Specify the characteristics and configurations of the hardware required to execute this
test case (e.g., a 132-character × 24-line CRT).
7.4.2.5.2. Software
Specify the system and application software required to execute this test case. This
may include system software such as operating systems, compilers, simulators, and
test tools. In addition, the test item may interact with application software.
7.4.2.5.3. Other
Specify any other requirements such as unique facility needs or specially trained
personnel.
7.5.1. Purpose
To specify the steps for executing a set of test cases or, more generally, the steps used
to analyze a software item in order to evaluate a set of features.
7.5.2 Outline
A test procedure specification shall have the following structure:
a. Test procedure specification identifier
b. Purpose
c. Special requirements
d. Procedure steps
7.5.2.2. Purpose
Describe the purpose of this procedure. If this procedure executes any test cases,
provide a reference for each of them. In addition, provide references to relevant
sections of the test item documentation (e.g., references to usage procedures).
7.5.2.4.1. Log
Describe any special methods or formats for logging the results of test execution, the
incidents observed, and any other events pertinent to the test (see Clauses 9 and 10).
7.5.2.4.2. Set up
Describe the sequence of actions necessary to prepare for execution of the procedure.
7.5.2.4.3. Start
Describe the actions necessary to begin execution of the procedure.
7.5.2.4.4. Proceed
Describe any actions necessary during execution of the procedure.
7.5.2.4.5. Measure
Describe how the test measurements will be made (e.g., describe how remote terminal
response time is to be measured using a network simulator).
7.5.2.4.7. Restart
Identify any procedural restart points and describe the actions necessary to restart the
procedure at each of these points.
7.5.2.4.8. Stop
Describe the actions necessary to bring execution to an orderly halt.
7.5.2.4.9. Wrap up
Describe the actions necessary to restore the environment.
7.5.2.4.10. Contingencies
Describe the actions necessary to deal with anomalous events that may occur during
execution.
7.6.1. Purpose
To identify the test items being transmitted for testing. It includes the person
responsible for each item, its physical location, and its status. Any variations from the
current item requirements and designs are noted in this report.
7.6.2. Outline
A test item transmittal report shall have the following structure:
The sections shall be ordered in the specified sequence. Additional sections may be
included just prior to Approvals. If some or all of the content of a section is in another
document, then a reference to that material may be listed in place of the
corresponding content. The referenced material must be attached to the test item
transmittal report or available to users of the transmittal report.
Details on the content of each section are contained in the following sub clauses.
7.6.2.3. Location
Identify the location of the transmitted items. Identify the media that contain the items
being transmitted. When appropriate, indicate how specific media are labeled or
identified.
7.6.2.4. Status
Describe the status of the test items being transmitted. Include deviations from the
item documentation, from previous transmittals of these items, and from the test plan.
List the incident reports that are expected to be resolved by the transmitted items.
Indicate if there are pending modifications to item documentation that may affect the
items listed in this transmittal report.
7.6.2.5. Approvals
Specify the names and titles of all persons who must approve this transmittal.
Provide space for the signatures and dates.
7.7.2. Outline
A test log shall have the following structure:
a. Test log identifier;
b. Description;
c. Activity and event entries.
The sections shall be ordered in the specified sequence. Additional sections may be
included at the end. If some or all of the content of a section is in another document,
then a reference to that material may be listed in place of the corresponding content.
The referenced material must be attached to the test log or available to users of the
log. Details on the content of each section are contained in the following sub clauses.
7.7.2.2. Description
Information that applies to all entries in the log except as specifically noted in a log
entry should be included here. The following information should be considered:
Identify the items being tested including their version/revision levels. For each
of these items, supply a reference to its transmittal report, if it exists.
Identify the attributes of the environments in which the testing is conducted.
Include facility identification, hardware being used (e.g., amount of memory
being used, CPU model number, and number and model of tape drives, and/or
mass storage devices), system software used, and resources available (e.g., the
amount of memory available).
Record the identifier of the test procedure being executed and supply a reference to its
specification. Record all personnel present during the execution including testers,
operators, and observers. Also indicate the function of each individual.
7.8.1. Purpose
To document any event that occurs during the testing process that requires
investigation.
7.8.2. Outline
A test incident report shall have the following structure:
The sections shall be ordered in the specified sequence. Additional sections may be
included at the end. If some or all of the content of a section is in another document,
then a reference to that material may be listed in place of the corresponding content.
The referenced material must be attached to the test incident report or available to
users of the incident report.
Details on the content of each section are contained in the following sub clauses.
7.8.2.2. Summary
Summarize the incident. Identify the test items involved, indicating their
version/revision level. References to the appropriate test procedure
specification, test case specification, and test log should be supplied.
Related activities and observations that may help to isolate and correct the cause of
the incident should be included (e.g., describe any test case executions that might
have a bearing on this particular incident and any variations from the published test
procedure).
7.8.2.4. Impact
If known, indicate what impact this incident will have on test plans, test design
specifications, test procedure specifications, or test case specifications.
Test Cases
Login Test
4. Select Gender
Game Test

Table 7.3: Race or Die test
Test Engineer: Monam Khan
Test Case ID: TC-3
Date: 30-11-18
Purpose: To check whether the system allows the user to play the game offline as
well as online.
Pre-Req: The user should log in successfully and should have Internet access for
getting data.
Test Data: 1. User authentication
2. Server’s service available
7.9.1. Purpose
To summarize the results of the designated testing activities and to provide
evaluations based on these results.
7.9.2. Outline
A test summary report shall have the following structure:
a. Test summary report identifier
b. Summary
c. Variances
d. Comprehensive assessment
e. Summary of results
f. Evaluation
g. Summary of activities
h. Approvals
The sections shall be ordered in the specified sequence. Additional sections may be
included just prior to Approvals. If some or all of the content of a section is in another
document, then a reference to that material may be listed in place of the
corresponding content. The referenced material must be attached to the test summary
report or available to users of the summary report.
Details on the content of each section are contained in the following sub clauses.
7.9.2.2. Summary
Summarize the evaluation of the test items. Identify the items tested, indicating their
version/revision level. Indicate the environment in which the testing activities took
place.
For each test item, supply references to the following documents if they exist: test
plan, test design specifications, test procedure specifications, test item transmittal
reports, test logs, and test incident reports.
7.9.2.3. Variances
Report any variances of the test items from their design specifications. Indicate any
variances from the test plan, test designs, or test procedures. Specify the reason for
each variance.
7.9.2.6. Evaluation
Provide an overall evaluation of each test item including its limitations. This
evaluation shall be based upon the test results and the item level pass/fail criteria. An
estimate of failure risk may be included.
7.9.2.8. Approvals
Test manager and other team members need to approve the overall test strategy.