Professional Documents
Culture Documents
Some of the material in these slides is derived from slides produced by Prof.
Some, Alan, and Bob of U Ottawa, and by Ian Sommerville (from his instructor
resources). Thanks to them.
Outline
What is Software Quality
Assurance?
According to D. Galin,
What is Software Quality?
According to the IEEE,
What is Software Quality?
According to Pressman,
Three General Principles of QA
Three General Principles of QA
● Know what you are doing
● Understand what is being built, how it is being built
and what it currently does.
● Follow a software development process with
● Management structure (milestones, scheduling)
● Reporting policies
● Tracking
Three General Principles of QA
● Know what you should be doing:
● Having explicit requirements and specifications.
● Follow a software development process with
● Requirements analysis,
● Acceptance tests,
● Frequent user feedback.
Three General Principles of QA
● Know how to measure the difference.
● Having explicit measures comparing what is being done with
what should be done.
● Four complementary methods:
1. Formal methods – verify mathematically specified
properties.
2. Testing – explicit input to exercise software and check for
expected output.
3. Inspections – human examination of requirements, design,
code, ... based on checklists.
4. Metrics – measures a known set of properties related to
quality.
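Method 2 (testing) can be made concrete with a minimal sketch: explicit input exercises the software, and the output is checked against the expected value. The discount function and its tests below are hypothetical, for illustration only.

```python
# Hypothetical function under test.
def discount(price: float, percent: float) -> float:
    """Return the price reduced by the given percentage."""
    if not (0 <= percent <= 100):
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

# Testing: explicit input exercises the software; check for expected output.
def test_discount_applies_percentage():
    assert discount(200.0, 25.0) == 150.0

def test_discount_rejects_invalid_percent():
    try:
        discount(100.0, 150.0)
        assert False, "expected ValueError"
    except ValueError:
        pass  # expected behaviour

test_discount_applies_percentage()
test_discount_rejects_invalid_percent()
print("all tests passed")
```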
SQA
●Typical activities of an SQA process
■ Requirements validation.
■ Design verification.
■ Static code checking (inspection/reviews).
■ Dynamic testing.
■ Process engineering and standards.
■ Metrics and continuous improvement.
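The difference between static code checking and dynamic testing, two of the activities above, can be sketched in a toy example: a static check examines source text without executing it (here using Python's `ast` parser as a stand-in for inspection tools), while a dynamic test runs the code and checks its behaviour. This is an illustration only, not a real review process.

```python
import ast

good_source = "def f(x):\n    return x + 1\n"
bad_source = "def f(x) return x + 1"   # syntax error: missing colon

# Static checking: examine the code without executing it.
def syntactically_valid(source: str) -> bool:
    try:
        ast.parse(source)
        return True
    except SyntaxError:
        return False

print(syntactically_valid(good_source))  # True
print(syntactically_valid(bad_source))   # False

# Dynamic testing: execute the code and check the observed behaviour.
namespace = {}
exec(good_source, namespace)
assert namespace["f"](1) == 2
```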
SQA, Verification and Validation
● Assuring that a software system meets a user's needs
● Objectives
● To introduce software verification and validation and to discuss
the distinction between them
● To explain static analysis as a verification technique
Verification vs. Validation
● Verification: "Are we building the product right"
● The software should conform to its specification
● Syntax (Method, tools, algorithm...)
● Validation: "Are we building the right product"
● The software should do what the user really requires
● Semantics (all the functionalities)
● Two principal objectives
● Discovery of defects in a system
● Assessment of whether the system is usable in an operational
situation
Static and Dynamic V&V
Verification and Validation
● Software Verification and Validation (V&V) is a
disciplined approach to assessing software products
throughout the product life cycle.
● A V&V effort strives to ensure that quality is built into the
software and that the software satisfies user
requirements.
Why V and V?
● Many historical software disasters
● google “software disasters”
● According to the US National Institute of Standards and
Technology (NIST) in 2002
● Software bugs, or errors, are so prevalent and so detrimental that
they cost the U.S. economy an estimated $59.5 billion annually, or
about 0.6 percent of the gross domestic product
Verification
● Verification
● The process of determining whether or not the products of a given
phase of the software development cycle fulfill the requirements
established during the previous phase. (IEEE Std 729-1983)
● Verification that the products of each software life cycle
phase:
● Comply with previous life cycle phase requirements and products
(e.g., for correctness, completeness, consistency, and accuracy)
● Satisfy the standards, practices, and conventions of the phase
● Establish the proper basis for initiating the next life cycle phase
activities
Validation
● Validation
○ The process of evaluating software at the end of the software
development process to ensure compliance with software
requirements. (IEEE Std 729-1983)
V and V Techniques
Exercise
Terminology
Warning: many organizations, tools, or standards use inconsistent
terminology ☹.
● Stimulus: an action / request / command sent to a system under
test.
● Response: something that can be observed from a system under
test.
● Test case: a sequence of stimuli and expected responses with a
specified purpose and decisive result.
● Test script: a test case as a set of instructions, for either manual
or automated execution.
● Test suite: a collection of related test cases.
● Verdict: the decision as to the result of a test case.
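These terms can be mapped onto a small automated check. The system under test and its commands below are hypothetical: a test case is an ordered sequence of stimuli with expected responses, and executing it yields a verdict.

```python
# A hypothetical system under test (SUT): maps a command to a reply.
def sut(command: str) -> str:
    return {"ping": "pong", "hello": "world"}.get(command, "error")

# Test case: a sequence of stimuli and expected responses, with a
# specified purpose (check command replies) and a decisive result.
test_case = [("ping", "pong"), ("hello", "world")]

# The verdict is the decision as to the result of the test case.
verdict = "pass" if all(sut(stimulus) == expected
                        for stimulus, expected in test_case) else "fail"
print(verdict)  # prints: pass
```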
Testing Basics
● What do we need to do testing?
● Test script
● Stimuli to send to system under test (SUT).
● Responses expected from SUT.
Test case Verdicts
● A verdict is the declared result of executing a single test.
● Pass: the test case achieved its intended purpose, and the software
under test performed as expected.
● The expected responses were observed in the proper sequence with
the proper timing, and no other anomalies were observed.
● Fail: the test case achieved its intended purpose, but the software
under test did not perform as expected.
● One or more of the expected responses were not observed
● Error: the test case did not achieve its intended purpose.
● Potential reasons:
● An unexpected event occurred during the test case.
● The test case could not be set up properly
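The three verdicts can be sketched as a tiny test runner: Fail means the test ran but an expectation was not met, while Error means the test itself could not complete. This is a sketch of the distinction, not any particular framework.

```python
def run_test(test_fn) -> str:
    """Return a verdict for one test case: 'pass', 'fail', or 'error'."""
    try:
        test_fn()
        return "pass"    # intended purpose achieved, SUT behaved as expected
    except AssertionError:
        return "fail"    # test ran, but an expected response was not observed
    except Exception:
        return "error"   # test did not achieve its purpose (e.g., bad setup)

def passing():  assert 1 + 1 == 2
def failing():  assert 1 + 1 == 3
def erroring(): raise RuntimeError("fixture could not be set up")

print(run_test(passing), run_test(failing), run_test(erroring))
# prints: pass fail error
```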
Test Architecture (1)
● Includes defining the set of Points of Control and Observation (PCOs)
[Figure: a tester or automated tool executes a test script against the system under test through the PCOs]
Test Scripts
● What should the format of a test script be?
● natural language? (for manual testing)
● tool dependent?
● a standard test language?
● a programming language?
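Each format has trade-offs. As one concrete illustration of the last option, here is a test script written in a programming language, using Python's standard unittest module; the add function is a hypothetical system under test.

```python
import unittest

def add(a: int, b: int) -> int:
    """Hypothetical system under test."""
    return a + b

class AddTests(unittest.TestCase):
    """A small test suite: related test cases grouped together."""

    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)      # stimulus -> expected response

    def test_negative_numbers(self):
        self.assertEqual(add(-2, -3), -5)

if __name__ == "__main__":
    unittest.main()  # the runner executes each test case and reports verdicts
```

A script like this can be executed manually from the command line or by an automated tool such as a continuous-integration job.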
Test Scripts
● natural language? (for manual testing)
Photo taken from https://www.linkedin.com/pulse/20140712034521-99719098-test-scenario-vs-test-case-vs-test-script
Test Script Development
● Creating test scripts follows a parallel development process,
including:
● Requirements
● Design
● Debugging
● Configuration management
● Maintenance
● Documentation
Video: How to write a test case?
● https://www.youtube.com/watch?v=BBmA5Qp6Ghk