
Basic forms of the testing process: Verification and Validation
Verification (as defined by IEEE/ANSI):

The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.

Verification is the process of evaluating, reviewing, inspecting and desk-checking work products such as requirement specifications, design specifications and code.

It can be applied to everything that can be reviewed in the early phases, to make sure that what comes out of each phase is what we expected to get.

It is 'human testing', as it involves looking at documents on paper.
Validation (as defined by IEEE/ANSI):

The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies the specified requirements.

It involves executing the actual software.

It is a computer-based testing process.

It usually exposes symptoms of errors.

Definition: Testing = Verification + Validation

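As a minimal illustration of this definition (the function and the test below are hypothetical, not taken from the text): the review comment stands in for verification, while the executed test stands in for validation.

# Hypothetical example: verification reviews the work product, validation executes it.

def total_price(unit_price: float, quantity: int) -> float:
    """Work product under test (hypothetical)."""
    return unit_price * quantity

# Validation: computer-based testing that executes the actual software and
# exposes symptoms of errors at run time.
def test_total_price():
    assert total_price(2.50, 4) == 10.0

# Verification: human review of the specification, design or code against a
# checklist (e.g. "does the signature match the requirement spec?") -- no
# execution of the software is involved.

if __name__ == "__main__":
    test_total_price()
    print("validation check passed")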
Verification and validation are complementary: the effectiveness of error detection suffers if one or the other is not done.

Each provides filters designed to catch different kinds of problems in the product.

Historically, testing has been largely validation-oriented.
Testing and the development cycle:

Fig. 2.1 shows a typical model for the development life cycle and the place of testing within it.

There are phases in the software life cycle, and for each development phase there is a corresponding testing phase.
Fig. 2.2 shows the Software Development Technologies Dotted U-Model, in which the integration of the development cycle and the testing cycle is shown.

There is a one-to-one correspondence between development and test phases in their respective cycles.

Each major deliverable produced by development is tested (verified or validated) by the testing organization.

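The actual pairing of phases is defined by the Dotted U-Model in Fig. 2.2. Purely as an illustration, and assuming a typical V-model-style pairing (the phase names below are assumptions, not taken from the figure), the one-to-one correspondence might be sketched as:

# Hypothetical one-to-one pairing of development deliverables to test phases
# (illustrative only; the actual pairing is defined by Fig. 2.2).
PHASE_PAIRING = {
    "requirement specification": "acceptance testing",
    "functional design specification": "system testing",
    "internal design specification": "integration testing",
    "code": "unit testing",
}

for deliverable, test_phase in PHASE_PAIRING.items():
    print(f"{deliverable} -> verified/validated by {test_phase}")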
[Fig. 2.2: The Software Development Technologies Dotted U-Model]
Effective testing and cost-effective testing:

Basic forms of testing are as follows:

1. Full testing: starts no later than the requirements phase and continues through acceptance testing.

2. Partial testing: begins any time after functional design has been completed, with less than optimum influence on requirements and functional design.

3. Endgame testing: highly validation-oriented, with no influence on requirements or functional design.
4. Audit-level testing: a bare-bones audit of plans, procedures and products for adequacy, correctness and compliance to standards.

Use full testing for critical software, or for any software that will have heavy and diverse usage by a large user population.

Use partial testing for small, non-critical software products with a small, captive user population.
Critical software (as defined by IEEE/ANSI): software whose failure could have an impact on safety, or could cause large financial or social losses.

To enable an effective test effort, we need a software development process that produces:

• Requirement specifications (required for full testing)
• Functional design specifications (required for full, partial and endgame testing)
• Internal design specifications (required for maximum effectiveness of full and partial testing)
Effective testing is testing that is successful at detecting errors, but it may not be cost-effective.

Do we know what testing is really costing us?

What percentage of our development resources does testing represent? (see the sketch below)

Are testing tools saving time?

Are they exploited to their fullest?

Is our testing cost-effective? (Fig. 2.3)

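A rough sketch of the "what is testing really costing us" question; all effort figures below are made-up placeholders, not data from the text.

# Illustrative only: rough share of development resources spent on testing.
# The effort figures are placeholders for whatever a project actually records.
effort_hours = {
    "requirements": 300,
    "design": 500,
    "coding": 900,
    "testing": 700,   # verification reviews + validation test execution
}

total = sum(effort_hours.values())
testing_share = effort_hours["testing"] / total * 100
print(f"Testing consumes {testing_share:.1f}% of development effort")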
The more effective the error detection, the greater the
savings in the development and maintenance costs over the
life of the product.
Basic verification methods

• Verification is a 'human' examination or review of the work product.
• There are various types of reviews: inspections, walkthroughs and technical reviews.
• Inspections are the most structured of these.

Inspection: key elements and phases

Objectives:
• To find defects and collect data;
• To communicate important work product information.
Elements:
• A planned, structured meeting requiring individual preparation by all participants;
• A team of 3-6 people, led by an impartial moderator;
• A presenter ('reader') who is someone other than the producer.

Input:
• Document to be inspected;
• Related source documents;
• General and 'tailored' checklists.

Output:
• Inspection summary/report;
• Data on error types.
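One possible way to picture the inspection output (a summary report plus data on error types) is as a small record structure; the field names below are assumptions for illustration only, not a prescribed format.

from collections import Counter
from dataclasses import dataclass, field

# Hypothetical shape of an inspection summary report; field names are illustrative.
@dataclass
class InspectionSummary:
    work_product: str
    defects_found: int = 0
    error_types: Counter = field(default_factory=Counter)  # e.g. "ambiguous requirement": 3

    def log_defect(self, error_type: str) -> None:
        self.defects_found += 1
        self.error_types[error_type] += 1

report = InspectionSummary("requirement specification v1.2")
report.log_defect("ambiguous requirement")
report.log_defect("missing error handling")
print(report)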
The presenter is neither the producer nor the author of the document being inspected.

Testers see the product differently than developers do: they attend to weak areas and ask questions like:

• How is this thing going to work?
• If we test this way, what is going to happen?

Thus, testers are part of the inspection and review process.
Walkthroughs:

Walkthroughs are a less formal kind of inspection, mainly because of the lack of preparation: participants simply come to the meeting. The presenter (usually the author of the product) prepares; there is no additional effort by the participants prior to the meeting.

Walkthroughs: key elements

Objective:
• To detect defects and to become familiar with the material.
Elements:
• A planned meeting where only the presenter must prepare;
• A team of 2-7 people, led by the producer/author;
• The presenter is usually the producer.

Input:
• Element under examination, objectives for the walkthrough, applicable standards.
Output:
• Report.

Walkthroughs can cover more material than inspections and reviews because the presenter is the producer; they therefore give a larger number of people an opportunity to become familiar with the material.

Occasionally, walkthroughs are used for communication rather than for discovering defects; the main goal is then to familiarize the participants with the product.
What (and how much) verification to do?

• If resource or schedule limitations preclude any of the verification activities, candidates for elimination should be considered in the reverse order of their occurrence.

• Requirements verification offers the biggest potential saving to software development efforts. Like all testing activities, verification of large work products will not be exhaustive and will usually involve risk assessments and trade-offs. Thus, a 'mix and match' of verification methods is used.
• The parts of the code that are critical will be inspected.

• Parts of the code that are not so important get a less formal review, a desk check, an informal walkthrough, or variations on these.

• There are always trade-offs, and this is where risk analysis comes into play (see the sketch below).

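The risk-based trade-off can be pictured as a simple mapping from criticality to review rigor; the risk levels and the default choice below are assumptions for illustration, not prescriptions from the text.

# Illustrative risk-based choice of verification method (levels are hypothetical).
REVIEW_BY_RISK = {
    "critical": "formal inspection",
    "important": "technical review",
    "routine": "walkthrough",
    "trivial": "desk check",
}

def pick_verification_method(risk_level: str) -> str:
    # Default to the most rigorous method if the risk level is unknown.
    return REVIEW_BY_RISK.get(risk_level, "formal inspection")

print(pick_verification_method("critical"))  # formal inspection
print(pick_verification_method("routine"))   # walkthrough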
Checklists: the verification tool

There are generic checklists that can be applied at a high level and maintained for each type of inspection.

Sample generic checklists exist for each type of inspection (an illustrative fragment follows below):

• Requirements verification checklists
• Functional design verification checklists
• Internal design verification checklists
• Generic code verification checklists
• Generic document verification checklists
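As an illustrative fragment only (the items below are assumptions, not taken from an actual checklist), a generic code verification checklist might be represented and walked as follows:

# Hypothetical fragment of a generic code verification checklist.
CODE_VERIFICATION_CHECKLIST = [
    "Are all variables initialized before use?",
    "Are error and boundary conditions handled?",
    "Does the code match the internal design specification?",
    "Are coding standards followed?",
]

# A reviewer walks the work product against each item, recording pass/fail.
results = {item: None for item in CODE_VERIFICATION_CHECKLIST}  # None = not yet checked
for item in CODE_VERIFICATION_CHECKLIST:
    print(f"[ ] {item}")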
Checklists are an important part of testware. They should be carefully developed, maintained and updated, and someone has to take responsibility for them.

Advantages:

• They are a vital tool for verification testing;
• They are an important training device;
• They ensure continuity of the verification effort across different projects;
• They provide a record of the organization's progress in verification.
