
SWE 434 - Software Testing & Validation

Verifications and Validations

Dr. M. Shamim Hossain


L. Manal Alonaizan
L. Arwa Alamoudi

Some of the material in these slides is derived from slides produced by Prof.
Some, Alan and Bob of U Ottawa, and Ian Sommerville (from instructor
resources). Thanks to them.
1
Outline

● Recap of the SQA definition and software quality.
● The three general principles of quality assurance.
● SQA, verification and validation.
● Static verification and dynamic validation.
● Verification and validation techniques.
● Important terminology.
● Test architecture.
● Test results, test scripts, and test script development.

2
What is Software Quality Assurance?
According to D. Galin,

Software quality assurance is:

“A systematic, planned set of actions necessary to provide adequate
confidence that the software development process or the maintenance
process of a software system product conforms to established
functional technical requirements, as well as with the managerial
requirements of keeping the schedule and operating within the
budgetary confines.”

3
What is Software Quality?
According to the IEEE,

Software quality is:

(1) The degree to which a system, component, or process meets
specified requirements.
(2) The degree to which a system, component, or process meets
customer or user needs or expectations.

4
What is Software Quality?
According to Pressman,

Software quality is:

“Conformance to explicitly stated functional and performance
requirements, explicitly documented development standards, and
implicit characteristics that are expected of all professionally
developed software.”

5
Three General Principles of QA

● Know what you are doing.
● Know what you should be doing.
● Know how to measure the difference.

6
Three General Principles of QA
● Know what you are doing
● Understand what is being built, how it is being built,
and what it currently does.
● Follow a software development process with:
● Management structure (milestones, scheduling)
● Reporting policies
● Tracking

7
Three General Principles of QA
● Know what you should be doing:
● Have explicit requirements and specifications.
● Follow a software development process with:
● Requirements analysis,
● Acceptance tests,
● Frequent user feedback.

8
Three General Principles of QA
● Know how to measure the difference.
● Have explicit measures comparing what is being done against
what should be done.
● Four complementary methods:
1. Formal methods – verify mathematically specified
properties.
2. Testing – explicit input to exercise software and check for
expected output.
3. Inspections – human examination of requirements, design,
code, ... based on checklists.
4. Metrics – measures a known set of properties related to
quality.
9
SQA
● Typical activities of an SQA process
■ Requirements validation.
■ Design verification.
■ Static code checking (inspection/reviews).
■ Dynamic testing.
■ Process engineering and standards.
■ Metrics and continuous improvement.

10
SQA, Verification and Validation
● Assuring that a software system meets a user's needs
● Objectives
● To introduce software verification and validation and to discuss
the distinction between them
● To explain static analysis as a verification technique

11
Verification vs. Validation
● Verification: "Are we building the product right?"
● The software should conform to its specification.
● Syntax (methods, tools, algorithms, ...)
● Validation: "Are we building the right product?"
● The software should do what the user really requires.
● Semantics (all the functionality)
● Two principal objectives
● Discovery of defects in a system
● Assessment of whether the system is usable in an operational
situation
12
Static and Dynamic V&V

13
Verification and Validation
● Software Verification and Validation (V&V) is a
disciplined approach to assessing software products
throughout the product life cycle.
● A V&V effort strives to ensure that quality is built into the
software and that the software satisfies user
requirements.

IEEE 1059, Guide for Software Verification and Validation Plans

14
Why V and V?
● Many historical software disasters
● Google “software disasters”
● According to the US National Institute of Standards and
Technology (NIST) in 2002
● Software bugs, or errors, are so prevalent and so detrimental that
they cost the U.S. economy an estimated $59.5 billion annually, or
about 0.6 percent of the gross domestic product

15
Verification
● Verification
● The process of determining whether or not the products of a given
phase of the software development cycle fulfill the requirements
established during the previous phase. (IEEE Std 729-1983)
● Verification that the products of each software life cycle
phase:
● Comply with previous life cycle phase requirements and products
(e.g., for correctness, completeness, consistency, and accuracy)
● Satisfy the standards, practices, and conventions of the phase
● Establish the proper basis for initiating the next life cycle phase
activities
16
Validation
● Validation
● The process of evaluating software at the end of the software
development process to ensure compliance with software
requirements. (IEEE Std 729-1983)

● Validation that the completed end product complies with established
software and system requirements.

17
V and V Techniques

● Software verification and validation employs review,
analysis, and testing techniques to determine whether a
software system and its intermediate products comply
with requirements. These requirements include both
functional capabilities and quality attributes.
(IEEE 1059, Guide for Software Verification and Validation Plans)

18
V and V techniques

● Static approaches (applicable to verification)
● Reviews & inspections
● Static analysis

● Dynamic approaches (applicable to both verification and
validation)
● Involve running the system
● Essentially amount to testing

19
Exercise

● In a group of three students:

List 3-4 major differences between validation
and verification.

20
Terminology
Warning: many organizations, tools, or standards use inconsistent
terminology ☹.
● Stimulus: an action / request / command sent to a system under
test.
● Response: something that can be observed from a system under
test.
● Test case: a sequence of stimuli and expected responses with a
specified purpose and decisive result.
● Test script: a test case as a set of instructions, for either manual
or automated execution.
● Test suite: a collection of related test cases.
● Verdict: the decision as to the result of a test case.
21
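The terminology above can be made concrete with a small data model. This is a hypothetical illustration, not a standard representation: a test case pairs stimuli with expected responses, and a test suite is simply a collection of related test cases. All names (`Step`, `TestCase`, `TestSuite`, the `login-ok` case) are invented for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    stimulus: str           # action/request sent to the system under test
    expected_response: str  # behaviour we expect to observe back

@dataclass
class TestCase:
    name: str
    purpose: str   # the specified purpose from the definition above
    steps: list    # ordered sequence of Step objects

@dataclass
class TestSuite:
    name: str
    cases: list = field(default_factory=list)

# A one-step test case grouped into a suite (hypothetical example).
login_case = TestCase(
    name="login-ok",
    purpose="Verify a registered user can log in",
    steps=[Step("submit valid credentials", "welcome page shown")],
)
suite = TestSuite(name="authentication", cases=[login_case])
print(len(suite.cases))  # 1
```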
Testing Basics
● What do we need to do testing?
● Test script
● Stimuli to send to system under test (SUT).
● Responses expected from SUT.

● For an automated test execution system, additional
requirements are:
● Test controller, to manage execution.
● Mechanism to read the test script and connect test cases to the
SUT.

22
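The ingredients listed above can be sketched as a toy controller loop. This is a minimal, assumed design: the SUT here is just a Python function, and the "script" is a list of (stimulus, expected response) pairs; a real controller would talk to an external program or service instead.

```python
def sut(stimulus):
    """Toy system under test: upper-cases its input (stand-in for a real SUT)."""
    return stimulus.upper()

# Test script: ordered (stimulus, expected response) pairs.
script = [("ping", "PING"), ("hello", "HELLO")]

def run_script(script, system):
    """Minimal test controller: send each stimulus, compare each response."""
    for stimulus, expected in script:
        actual = system(stimulus)
        if actual != expected:
            return "fail"
    return "pass"

print(run_script(script, sut))  # pass
```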
Test case Verdicts
● A verdict is the declared result of executing a single test.
● Pass: the test case achieved its intended purpose, and the software
under test performed as expected.
● The expected responses were observed in the proper sequence with
the proper timing, and no other anomalies were observed.

● Fail: the test case achieved its intended purpose, but the software
under test did not perform as expected.
● One or more of the expected responses were not observed

● Error: the test case did not achieve its intended purpose.
● Potential reasons:
● An unexpected event occurred during the test case.
● The test case could not be set up properly
23
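The three verdicts can be sketched as a small harness, assuming the same (stimulus, expected response) encoding as before: Pass means the test ran and all responses matched, Fail means it ran but a response mismatched, and Error means the test itself could not proceed as intended. The `divider` SUT is a hypothetical example chosen so that all three verdicts can occur.

```python
def execute(test_case, system):
    """Return the verdict of one test case against a system under test."""
    try:
        for stimulus, expected in test_case:
            if system(stimulus) != expected:
                return "Fail"   # ran to completion, but wrong response observed
        return "Pass"           # all expected responses observed in order
    except Exception:
        return "Error"          # unexpected event during the test case

def divider(x):
    """Toy SUT: integer-divide 10 by the stimulus."""
    return 10 // x

print(execute([(5, 2), (2, 5)], divider))  # Pass
print(execute([(5, 3)], divider))          # Fail
print(execute([(0, 0)], divider))          # Error (ZeroDivisionError raised)
```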
Test Architecture (1)
● Includes defining the set of Points of Control and Observation (PCOs).

[Diagram: a tester or an automated tool executes a test script against the
system under test through one or more PCOs.]

● A PCO could be:
● a device or user interface
● a particular method to call (JUnit)
● a network port
● etc.
24
Test Results
● Documenting test results is a crucial element
of testing.
● When was a test suite run (day, time, etc.)?
● What is the summary count of each type of verdict?
● What are the specific verdicts for each test case?

25
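A results record answering the three questions above might look like the following sketch: when the suite ran, the summary count per verdict, and the per-case verdicts. The field names and the example verdicts are assumptions for illustration, not a standard reporting format.

```python
from collections import Counter
from datetime import datetime, timezone

# Per-case verdicts from a hypothetical suite run.
verdicts = {
    "login-ok": "Pass",
    "login-bad-password": "Pass",
    "logout": "Fail",
    "password-reset": "Error",
}

report = {
    "run_at": datetime.now(timezone.utc).isoformat(),  # when the suite was run
    "summary": dict(Counter(verdicts.values())),       # count of each verdict type
    "verdicts": verdicts,                              # specific verdict per test case
}

print(report["summary"])  # {'Pass': 2, 'Fail': 1, 'Error': 1}
```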
Test Scripts
● What should the format of a test script be?
● natural language? (for manual testing)
● tool dependent?
● a standard test language?
● a programming language?

26
Test Scripts
● natural language? (for manual testing)

27
Photo taken from https://www.linkedin.com/pulse/20140712034521-99719098-test-scenario-vs-test-case-vs-test-script
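For contrast with the natural-language format above, the same kind of test case can be written as a test script in a programming language, here using Python's built-in unittest framework (analogous in style to the JUnit methods mentioned on the test-architecture slide). The `login` function is a hypothetical stand-in for a real system under test.

```python
import unittest

def login(username, password):
    """Toy SUT: accepts exactly one hard-coded credential pair."""
    return username == "alice" and password == "secret"

class LoginTest(unittest.TestCase):
    def test_valid_credentials_accepted(self):
        self.assertTrue(login("alice", "secret"))

    def test_invalid_password_rejected(self):
        self.assertFalse(login("alice", "wrong"))

# Run the suite programmatically and report the overall outcome.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LoginTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Unlike a natural-language script, this form is directly executable, which is what makes automated test execution possible, at the cost of the development effort described on the next slide.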
Test Script Development
● Creating test scripts follows a parallel development process,
including:
● Requirements
● Design
● Debugging
● Configuration management
● Maintenance
● Documentation

● Result: they are expensive to create and maintain, especially
for automated test execution.
28
Summary

● Verification is the process of determining whether or not the
products of a given phase fulfill the requirements established during
the previous phase.
● Validation is the process of evaluating software at the end of the
software development process.
● Static approaches are applicable to the verification process, while
dynamic approaches are applicable to both the verification and
validation processes.
● There are many important terms related to testing: test case,
test suite, test script, etc.

29
Video: How to write a test case?

● https://www.youtube.com/watch?v=BBmA5Qp6Ghk

30
