UNIT-IV
Prepared by
S.Shashikanth
Asst. Professor (C), IT Dept
JNTUH College of Engineering Jagtial
UNIT-IV SYLLABUS
• Testing Strategies : A strategic approach to
software testing, test strategies for conventional
software, Black-Box and White-Box testing,
Validation testing, System testing, the art of
Debugging.
• Product metrics : Software Quality, Metrics for
Analysis Model, Metrics for Design Model, Metrics
for source code, Metrics for testing, Metrics for
maintenance.
• What is Testing?
• What is Debugging?
A Strategic Approach for Software Testing
• Software Testing
• Testing Strategy
A road map that incorporates test planning, test case
design, test execution, and the collection and evaluation
of the resulting data.
• Verification refers to the set of activities that ensures the
software correctly implements a specific function.
• Validation refers to a different set of activities that ensures
that the software that has been built is traceable to customer
requirements.
• Verification & validation encompasses a wide array of
Software Quality Assurance activities.
A strategic Approach for Software testing
• Software development begins at system engineering, which
defines the role of software, and ends at coding.
• Testing proceeds in the opposite direction: it begins at the
unit (code) level and moves outward toward system-level
testing.
• At each level of testing a different testing method is adopted.
• Those methods are Unit Testing, Integration Testing,
Validation Testing and System Testing.
1) Unit Testing: After developing the code, each unit or
component in it is tested individually.
- It ensures that the maximum number of errors is detected and
that the entire code is tested.
2) Integration Testing: The tested units are combined and the
interfaces between them are tested.
3) Validation Testing: The integrated software is checked against
the customer's requirements.
4) System Testing: The system engineer combines the validated
software with the other system elements (hardware, people,
databases) and verifies that the system as a whole functions
correctly.
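To make the first level concrete, here is a minimal unit-test sketch in Python. The function under test, `classify_score`, is a hypothetical component invented for illustration; the point is that each unit is exercised in isolation, including its error handling.

```python
# Hypothetical unit under test: a single, isolated function.
def classify_score(score):
    """Return 'pass' or 'fail' for a score in [0, 100]."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Unit tests exercise this one component individually.
def test_pass_band():
    assert classify_score(75) == "pass"

def test_fail_band():
    assert classify_score(30) == "fail"

def test_boundary():
    assert classify_score(50) == "pass"

if __name__ == "__main__":
    test_pass_band()
    test_fail_band()
    test_boundary()
    print("all unit tests passed")
```

In a real project these tests would live in a test suite (e.g. `unittest` or `pytest`) and run automatically on every change.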
• White-box testing
• Black-box testing
Black-Box Testing Example
• If 0.0 <= x <= 1.0
• Then the test cases are (0.0, 1.0) for valid input
and (-0.1, 1.1) for invalid input.
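This boundary-value example can be sketched directly as a test. The `is_valid` function stands in for the (black-box) implementation under test; the tester only knows the specified range, not the code.

```python
# Boundary value analysis for the condition 0.0 <= x <= 1.0.
# Valid cases sit exactly on the boundaries; invalid cases sit
# just outside them. No knowledge of the internals is used.

def is_valid(x):
    # Stand-in for the implementation under test (a black box).
    return 0.0 <= x <= 1.0

valid_inputs = [0.0, 1.0]        # exactly on the boundaries
invalid_inputs = [-0.1, 1.1]     # just outside the boundaries

for x in valid_inputs:
    assert is_valid(x), f"{x} should be accepted"
for x in invalid_inputs:
    assert not is_valid(x), f"{x} should be rejected"
print("boundary tests passed")
```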
3. Graph-based testing
• It is also called state-based testing.
• Draw a graph of objects and their relations.
• Devise test cases to cover the graph such that each object
and each relationship is exercised and errors are uncovered.
3. Graph-based testing
• A graph is a collection of nodes that represent the objects.
• The relationships between these objects are represented
by links.
• In graphical representations, nodes are drawn as circles.
• Nodes are connected by the following kinds of links:
3.Graph based testing
• Directed link: describes a relationship between two nodes
that holds in one direction only.
• Undirected link: describes a bidirectional (symmetric)
relationship between two nodes.
• Parallel links: used when several different relationships
exist between the same two nodes.
• Nodes and links may also carry weights that describe their
properties.
[Fig: A graph notation showing object nodes (circles) connected
by directed, undirected and parallel links, with link weights.]
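A minimal sketch of graph-based test derivation, assuming a made-up set of objects (the node names are illustrative, not from the text): each link in the object graph yields one test case, and coverage is checked by confirming every object appears in at least one case.

```python
# Objects are nodes; directed links are relationships to exercise.
# These object names are hypothetical illustration values.
graph = {
    "NewFile": ["DocumentWindow"],        # menu selection opens a window
    "DocumentWindow": ["DocumentText"],   # window displays the text
    "DocumentText": [],
}

# One test case per link: exercise the relationship between two objects.
test_cases = [
    (src, dst) for src, links in graph.items() for dst in links
]

# Coverage check: every object must appear in at least one test case.
covered = {node for pair in test_cases for node in pair}
assert covered == set(graph), "some object is never exercised"
print(test_cases)
```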
Advantages & Disadvantages of Black-Box Testing
• Advantages
• Effective on large units of code, where white-box testing
becomes impractical
• Test cases are generated from the user's point of view
• No knowledge of the programming language is needed to
perform a test accurately
• Disadvantages
• Takes more time, since each and every class of input must
be tested
• Test cases may be repeated
• Many of the program paths may remain untested
White Box testing
• Also called glass box testing or open-box
testing
• Involves knowing the internal working of a
program
• Guarantees that all independent paths will be
exercised at least once.
• Exercises all logical decisions on their true
and false sides
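The "true and false sides" requirement can be shown with a tiny white-box example. The function is hypothetical; the two test cases are chosen by reading the code so that the single `if` decision is exercised on both branches.

```python
# White-box (branch coverage) sketch: tests are derived from the
# code's structure, not from the specification alone.

def absolute(n):
    if n < 0:      # the decision to cover on both sides
        return -n
    return n

assert absolute(-3) == 3   # takes the true branch
assert absolute(4) == 4    # takes the false branch
print("both branches exercised")
```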
White-Box Testing
• Advantages
• Testing is effective because the tester has internal
knowledge of the program code
• The code can be optimized
• Unnecessary lines of code, which can hide errors,
can be identified and removed
• Disadvantages
• Very expensive, since it requires a skilled tester with
knowledge of the internals
Validation Testing
• At the end of software development the product is
delivered to the customer.
• Validation testing ensures that the software conforms to
the customer's requirements before that delivery.
System Testing
1) Recovery Testing
2) Security Testing
3) Stress Testing
4) Performance Testing
System Testing
• 1) Recovery Testing
• If the system is affected by faults or failures, it should
resume processing within a stipulated period of time.
• Moreover, the system should be robust enough to
overcome the failures and continue functioning normally.
• To know whether a system is fault-tolerant, try to make
the system fail in various ways and measure its
recovery/resumption capabilities.
• This type of system testing is called recovery testing.
System Testing
• 2) Security Testing
• Computer systems are prone to security attacks.
• Penetration into these systems can come from hackers or
from employees of the organization.
• To carry out security testing, the tester acts as the
penetrator and tries every possible way of breaking the
security.
System Testing
• 3) Stress Testing
• When the developed software is given 20 interrupts/second
where the average rate is 3 interrupts/second, or is subjected
to increasing data rates, requests for maximum memory, or
huge amounts of data to be searched on disk, the system is
being stress tested.
• In stress testing, the aim of the tester is to design test
cases that can overwhelm the program.
• Stress testing determines the maximum load a system can
handle before it fails.
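A minimal stress-test sketch, assuming a stand-in handler and made-up rates (the 20-vs-3 interrupts/second figures echo the text but the handler itself is hypothetical): the driver fires far more requests than the average load and counts incorrect responses.

```python
# Stress-test sketch: hammer a stand-in handler at well above the
# average load and count wrong answers. Handler and rates are
# illustrative assumptions, not the real system.

def handle_interrupt(i):
    # Stand-in for the real interrupt handler.
    return i * 2

def stress(requests_per_burst=20, bursts=50):
    """Fire bursts of requests and count incorrect responses."""
    failures = 0
    for _ in range(bursts):
        for i in range(requests_per_burst):
            if handle_interrupt(i) != i * 2:
                failures += 1
    return failures

if __name__ == "__main__":
    print("failures under stress:", stress())
```

A real stress test would additionally ramp the load upward until the system degrades, recording where it breaks.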
System Testing
• 4) Performance Testing
• It ensures that the software meets the performance
requirements.
• It is performed throughout the entire testing process,
from the lowest level of testing to the highest (unit
testing to system testing).
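A performance check at the unit level can be as simple as timing an operation against a budget. The operation and the one-second budget below are illustrative assumptions, not requirements from the text.

```python
import time

# Simple performance check: measure elapsed time for an operation
# and compare it against an (assumed) performance requirement.

def operation():
    return sum(range(100_000))

start = time.perf_counter()
result = operation()
elapsed = time.perf_counter() - start

# The assumed performance budget the test enforces.
assert elapsed < 1.0, f"too slow: {elapsed:.3f}s"
print(f"completed in {elapsed:.6f}s")
```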
The Art of Debugging
• Once testing has been successfully carried out and
bugs have been uncovered, the next step is the
removal of these bugs; this is called "debugging".
• Debugging Strategies
1) Brute Force Method
2) Backtracking
3) Cause Elimination
4) Automated Debugging
The Art of Debugging
• Brute force
-- Most common and least efficient method
-- Applied when all else fails
-- Memory dumps are taken
-- The debugger tries to find the cause within this mass of
information
• Back tracking
-- Common debugging approach
-- Useful for small programs
-- Beginning at the system where the symptom has been
uncovered, the source code traced backward until the
site of the cause is found.
The Art of Debugging
• Cause Elimination
-- Based on the concept of binary partitioning
-- A list of all possible causes is developed, and tests are
conducted to eliminate each one until the actual cause
is isolated
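Binary partitioning can be sketched as a search: repeatedly halve the list of candidate causes, keeping the half that still reproduces the failure. The `fails` oracle below (rerunning the program on a subset of inputs) and the "record 13" culprit are hypothetical; the sketch also assumes a single cause, so a failure produced by an interaction of causes would need a finer strategy.

```python
# Cause elimination by binary partitioning: halve the candidate list,
# keep the half that still fails, repeat until one cause remains.

def find_cause(candidates, fails):
    """fails(subset) reruns the program on `subset` and reports failure."""
    while len(candidates) > 1:
        mid = len(candidates) // 2
        left, right = candidates[:mid], candidates[mid:]
        candidates = left if fails(left) else right
    return candidates[0]

# Toy failure: the program breaks whenever record 13 is in the input.
records = list(range(20))
culprit = find_cause(records, fails=lambda subset: 13 in subset)
assert culprit == 13
print("isolated cause:", culprit)
```

This is the same idea behind tools like `git bisect`, which binary-search commits instead of input records.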
The Art of Debugging
The Debugging Process
[Fig: The debugging process — test cases are executed and the
results are evaluated; suspected causes are identified, additional
tests are designed, and the cycle repeats until the cause is found
and corrected.]
• McCall’s quality factors: This model classifies all software requirements into 11
software quality factors. The 11 factors are grouped into three categories – product
operation, product revision, and product transition factors.
Product Operation
• According to McCall’s model, product operation category includes five software quality factors, which deal
with the requirements that directly affect the daily operation of the software. They are as follows −
• Correctness
• These requirements deal with the correctness of the output of the software system. They include −
• The output mission (the required outputs and their completeness)
• The required accuracy of the output, which can be negatively affected by inaccurate data or inaccurate calculations.
• Reliability
• Reliability requirements deal with service failure.
• Efficiency
• It deals with the hardware resources needed to perform the different functions of the software system. It
includes processing capabilities (given in MHz), its storage capacity (given in MB or GB) and the data
communication capability (given in MBPS or GBPS).
• Integrity
• This factor deals with the software system security, that is, to prevent access to unauthorized persons, also to
distinguish between the group of people to be given read as well as write permit.
• Usability
• Usability requirements deal with the staff resources needed to train a new employee and to operate the
software system.
Product Revision Quality Factors
• According to McCall’s model, three software quality factors are
included in the product revision category. These factors are as follows −
• Maintainability
• This factor considers the efforts that will be needed by users and
maintenance personnel to identify the reasons for software failures, to
correct the failures, and to verify the success of the corrections.
• Flexibility
• This factor deals with the capabilities and efforts required to support
adaptive maintenance activities of the software.
• Testability
• Testability requirements deal with the testing of the software system as
well as with its operation.
Product Transition Software Quality Factor
• According to McCall’s model, three software quality factors are included in the
product transition category that deals with the adaptation of software to other
environments and its interaction with other software systems. These factors are as
follows −
• Portability
• Portability requirements deal with the adaptation of a software system
to other environments consisting of different hardware, different
operating systems, and so forth. It should be possible to continue using
the same basic software in diverse situations.
• Reusability
• This factor deals with the use of software modules originally designed for one project
in a new software project currently being developed.
• Interoperability
• Interoperability requirements focus on creating interfaces with other software systems
or with other equipment firmware.
Software Quality-ISO 9126 Quality Factors
• The ISO 9126-1 software quality model identifies 6 main quality characteristics, namely:
• Functionality
Functionality is the essential purpose of any product or service. It reflects working nature.
• Reliability
• Reliability requirements deal with service failure.
• Usability
• Usability requirements deal with the staff resources needed to train a new employee and to operate the software
system.
• Efficiency
• It deals with the hardware resources needed to perform the different functions of the software system. It
includes processing capabilities (given in MHz), its storage capacity (given in MB or GB) and the data
communication capability (given in MBPS or GBPS).
• Maintainability
• This factor considers the efforts that will be needed by users and maintenance personnel to identify the reasons
for software failures, to correct the failures, and to verify the success of the corrections.
• Portability
• Portability requirements deal with the adaptation of a software system to other environments
consisting of different hardware, different operating systems, and so forth. It should be possible
to continue using the same basic software in diverse situations.
Product metrics
• Product metrics for computer software help us to
assess quality.
• Measure
-- Provides a quantitative indication of the extent, amount,
dimension, capacity or size of some attribute of a
product or process
• Metric (IEEE 93 definition)
-- A quantitative measure of the degree to which a system,
component or process possesses a given attribute
• Indicator
-- A metric or a combination of metrics that provide insight
into the software process, a software project or a product
itself
Product Metrics for Analysis, Design, Test
and Maintenance
• Product metrics for the Analysis model
Function point Metric
1. First proposed by Albrecht
2. Measures the functionality delivered by the
system
3. FP is computed from the following
parameters:
1) Number of external inputs (EIs)
2) Number of external outputs (EOs)
3) Number of external inquiries (EQs)
4) Number of internal logical files (ILFs)
5) Number of external interface files (EIFs)
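A sketch of the standard function point computation, FP = count_total × (0.65 + 0.01 × ΣFi), using the usual average-complexity weights. All counts and the fourteen value-adjustment ratings below are made-up illustration values, not data from the text.

```python
# Function point computation with average complexity weights.
# Counts and Fi ratings are illustrative assumptions.

weights = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}
counts  = {"EI": 3, "EO": 2, "EQ": 2, "ILF": 1, "EIF": 4}

# Unadjusted count: weighted sum over the five parameter types.
count_total = sum(weights[k] * counts[k] for k in weights)

# Fourteen value-adjustment factors, each rated 0 (no influence)
# to 5 (essential). These ratings are assumed for the example.
fi = [4, 3, 5, 4, 3, 4, 2, 5, 4, 3, 4, 2, 3, 4]

fp = count_total * (0.65 + 0.01 * sum(fi))
print(f"count_total={count_total}, FP={fp:.2f}")
```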
METRICS FOR SOURCE CODE
• HSS (Halstead Software Science)
• Primitive measures that may be derived after the code is
generated, or estimated once the design is complete:
• n1 = the number of distinct operators that appear in a
program
• n2 = the number of distinct operands that appear in a
program
• N1 = the total number of operator occurrences.
• N2 = the total number of operand occurrences
• Program length: N = N1 + N2
• Volume: V = N log2(n1 + n2)
• Effort: e = V/PL, where PL is the program level
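The Halstead measures can be computed directly from the four counts. The counts below are illustrative, and PL is taken from the standard estimator PL = (2/n1) × (n2/N2), an assumption not spelled out in the slide.

```python
import math

# Halstead software science measures from operator/operand counts.
# The counts are illustrative assumptions.

n1, n2 = 10, 8    # distinct operators / distinct operands
N1, N2 = 40, 30   # total operator / operand occurrences

N = N1 + N2                  # program length
n = n1 + n2                  # program vocabulary
V = N * math.log2(n)         # volume
PL = (2 / n1) * (n2 / N2)    # program level (standard estimator)
e = V / PL                   # effort

print(f"N={N}, V={V:.1f}, PL={PL:.3f}, e={e:.1f}")
```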
METRICS FOR MAINTENANCE
• Mt = the number of modules in the current release
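The slide truncates here; the definition of Mt usually leads into the software maturity index (SMI) of IEEE Std 982.1, SMI = [Mt − (Fa + Fc + Fd)] / Mt, sketched below with made-up module counts. As SMI approaches 1.0, the product begins to stabilize.

```python
# Software maturity index (IEEE Std 982.1):
#   SMI = [Mt - (Fa + Fc + Fd)] / Mt
# Mt = modules in the current release, Fa = modules added,
# Fc = modules changed, Fd = modules deleted since the
# preceding release. Counts below are illustrative.

def smi(Mt, Fa, Fc, Fd):
    return (Mt - (Fa + Fc + Fd)) / Mt

value = smi(Mt=120, Fa=6, Fc=10, Fd=2)
print(f"SMI = {value:.3f}")
```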