Tech III IT - SEC-II, Unit-III PPT Slides, First Semester
Text Books: 1. Software Testing Techniques - Boris Beizer; 2. The Craft of Software Testing - Brian Marick

Prof. N. Prasanna Balaji

Unit-III Contents (Sub Topics)
1. Introduction
2. Purpose of Testing: To Catch Bugs; Productivity-Related Reasons; Goals for Testing; 5 Phases in a Tester's Thinking; Testing & Inspection
3. Dichotomies: Testing & Debugging; Functional vs Structural Testing; Designer vs Tester; Modularity vs Efficiency; Programming in Small vs Big; Buyer vs Builder
4. A Model for Testing: Project; Roles of Models for Testing
5. Consequences of Bugs
6. Taxonomy of Bugs: Introduction; Requirements, Features & Functionality Bugs; Structural Bugs; Data Bugs; Coding Bugs; Interface, Integration and System Bugs; Testing & Test Design Bugs

Introduction
What is Testing? Related terms: SQA, QC, Validation, Verification.
• Testing: verification of functionality, for conformance against given specifications, by execution of the software application.
• A test passes: functionality OK. A test fails: application functionality not OK.
• Bug/Defect/Fault: a deviation from the expected functionality. It is not always obvious.

Purpose of Testing
1. To Catch Bugs
• Bugs are due to imperfect communication among programmers: specs, design, low-level functionality.
• Statistics say: about 3 bugs / 100 statements.
2. Productivity-Related Reasons
• Insufficient effort in QA => high rejection ratio => higher rework => higher net costs.
• Statistics: QA costs run from 2% (consumer products) to 80% (critical software).
• Quality => Productivity.

Purpose of Testing (contd.)
3. Goals for Testing
• Primary goal of testing: bug prevention. A prevented bug saves the rework effort [bug reporting, correction, debugging, retesting].
• Where prevention is not possible, testing must reach its secondary goal of bug discovery: detecting the remaining bugs.
• Good test design & tests => clear diagnosis => easy bug correction.
• Test-design thinking: from the specs, write the test specs first, and then the code. This eliminates bugs at every stage of the SDLC.
4. 5 Phases in a Tester's Thinking
• Phase 0: sees no difference between debugging & testing. Today, this thinking is a barrier to good testing & quality software.

Purpose of Testing (contd.)
• Phase 1: says testing is to show that the software works. A failed test shows that the software does not work; but even if many tests pass, we cannot conclude that the software works. The objective is not achievable.
• Phase 2: says testing is to show that the software does not work. One failed test proves that. But we do not know when to stop testing, and tests must be redesigned to test the corrected software.
• Phase 3: says test for risk reduction. We apply the principles of statistical quality control. Our perception of software quality changes when a test passes or fails; consequently, as tests pass, the perceived product risk reduces. Release the product when the risk is under a predetermined limit.

5 Phases in a Tester's Thinking (continued)
• Phase 4: a state of mind regarding what testing can & cannot do, and what makes software testable. Applying this knowledge reduces the amount of testing.
• Testable software reduces effort and has fewer bugs than code that is hard to test.
• Even the most testable software must be debugged, must work, and must be hard to break.
• Phase 2 tests alone will not show that the software works; use statistical methods in test design to achieve good testing at acceptable risks.
Cumulative goal of all these phases: the phases are cumulative and complementary; one leads to the other.

Purpose of Testing (contd.)
5. Testing & Inspection
• Inspection is also called static testing.
• The methods and purposes of testing and inspection are different, but the objective of both is to catch & prevent different kinds of bugs.
• To prevent and catch most of the bugs, we must review, inspect & read the code, do walkthroughs on the code, and then do testing.

Purpose of Testing: some important points
• Test design: after testing & corrections, redesign the tests & test the redesigned tests.
• Bug prevention: a mix of various approaches, depending on factors such as culture, history, application, project size, development environment & language:
  - Inspection methods
  - Design style
  - Static analysis
  - Languages having strong syntax, path verification & other controls
  - Design methodologies & development environment
• It is better to know: the Pesticide Paradox and the Complexity Barrier.

Dichotomies
Dichotomy: a division into two, especially mutually exclusive or contradictory, groups or entities (e.g., the dichotomy between theory and practice). Let us look at six of them:
1. Testing & Debugging
2. Functional vs Structural Testing
3. Designer vs Tester
4. Modularity (Design) vs Efficiency
5. Programming in Small vs Programming in Big
6. Buyer vs Builder

Dichotomies
1. Testing vs Debugging
Testing is to find bugs. Debugging is to find the cause or misconception leading to the bug. Their roles are often confused to be the same, but there are differences in the goals, methods and psychology applied to them:
1. Testing starts with known conditions, uses predefined procedures, and has predictable outcomes. Debugging starts with possibly unknown initial conditions, and its end cannot be predicted.
2. Testing is planned, designed and scheduled. Debugging procedures & duration are not so constrained.
3. Testing is a demonstration of an error or of apparent correctness. Debugging is a deductive process.
4. Testing proves a programmer's failure. Debugging is the programmer's vindication.
5. Testing should be predictable, dull, constrained, rigid & inhuman. Debugging involves intuitive leaps, conjectures, experimentation & freedom.

Testing vs Debugging (contd.)
6. Much of testing can be done without design knowledge. Debugging is impossible without detailed design knowledge.
7. Testing can be done by an outsider to the development team. Debugging must be done by an insider (the development team).
8. A theory establishes what testing can and cannot do. For debugging there are only rudimentary results (how much can be done, the time, the effort, the how, etc. depend on human ability).
9. Test execution and design can be automated. Automation of debugging is still a dream.

Dichotomies (contd.)
2. Functional vs Structural Testing
• Functional testing: treats a program as a black box. Outputs are verified for conformance to specifications from the user's point of view.
• Structural testing: looks at the implementation details: programming style, control method, source language, database & coding details.
• Interleaving of functional & structural testing: a good program is built in layers from the outside. The outside layer is pure system function from the user's point of view; each inner layer is a structure whose outer layer is its function. Example layers: User; Application1, Application2; Malloc(), Link_block(); O.S.; Devices.

Interleaving of functional & structural testing (contd.)
• For a given model of programs, structural tests may be done first and functional tests later, or vice-versa; the choice depends on which seems the natural one.
• Both are useful, both have limitations, and both target different kinds of bugs.
• Functional tests can, in principle, detect all bugs, but would take an infinite amount of time; structural tests are inherently finite, but cannot detect all bugs.
• The art of testing is deciding how much to allocate (%) to structural vs functional testing.

Dichotomies (contd.)
3. Designer vs Tester
• Completely separated in black-box testing; unit testing may be done by either.
• The extent to which the test designer & programmer are separated or linked depends on the testing level and the context.
• Tests are more efficient if the designer, programmer & tester are independent in all of: unit, unit integration, component, component integration, system, and formal system feature testing.
• The artistry of testing is to balance knowledge of the design and its biases against ignorance & inefficiency.
Programmer/Designer as tester:
1. Tests designed by designers are more oriented towards structural testing and are limited to its limitations; they are likely to be biased.
2. The designer tries to do the job in the simplest & cleanest way, trying to reduce complexity.
Independent tester:
1. Tests designed by independent testers are bias-free.
2. With knowledge of the internal test design, the tester can eliminate useless tests, optimize, & produce an efficient test design.
3. The tester needs to be suspicious, uncompromising, hostile, and obsessed with destroying the program.

Dichotomies (contd.)
4. Modularity (Design) vs Efficiency
1. System design and test design can both be modular.
2. A module (a well-defined, discrete component of a system) consists of internal complexity & interface complexity, and has a size.
3. In other words, a module implies a size, an internal structure and an interface.

Modularity vs Efficiency trade-offs:
1. The smaller the component, the easier it is to understand (a design alone may not be enough to understand and implement it; it may have to be broken down to the implementation level, and hence can have bugs). But more components imply more interfaces, which increase complexity & reduce efficiency (=> more bugs likely).
2. Small components/modules can be retested independently with less rework (to check whether a bug is fixed); large modules give higher efficiency at the module level, but imply more rework when a bug occurs, and hence less efficiency.
3. Microscopic test cases need individual setups for data, debugging, execution & organizing; a larger number of test cases implies a higher possibility of bugs in the test cases themselves.
4. It is easier to design large modules & smaller interfaces at a higher level, which is less complex & more efficient.
So: optimize the size & balance the internal & interface complexity to increase efficiency; and optimize the test design by setting the scopes of tests & groups of tests (modules) to minimize the cost of test design, debugging, execution & organizing, without compromising effectiveness.

Dichotomies (contd.)
5. Programming in Small vs Programming in Big
The impact on the development environment comes from the volume of customer requirements.
Small:
1. Means a lack of formality: done by 1 or 2 persons, for a small & intelligent user population (e.g., for oneself, for one's office or for the institute).
2. More efficiently done informally & intuitively.
3. Complete test coverage is easily achieved.
Big:
1. A large # of programmers & a large # of components.
2. Program size implies non-linear effects (on complexity, bugs, effort, rework, quality).
3. The acceptance level could be: test coverage of 100% for unit tests and >= 80% for overall tests.

Dichotomies (contd.)
6. Buyer vs Builder (customer vs developer organization)
• Buyer & builder being the same organization clouds accountability. Separate them to keep accountability clear, even if they are in the same organization; accountability increases the motivation for quality.
The roles of the parties involved:
• Builder: designs for, and is accountable to, the Buyer.
• Buyer: pays for the system and hopes to get profits from the services provided to the User.
• User: the ultimate beneficiary of the system; the User's interests are guarded by the Tester.
• Tester: dedicated to the destruction of the software (builder); tests the software in the interests of the User/Operator.
• Operator: lives with the mistakes of the Builder, the oversights of the Tester, the murky specs of the Buyer, and the complaints of the User.

A Model for Testing
A model for testing, with a project environment and with tests at various levels. First, (1) understand what a project is; then (2) look at the roles of the testing models.
1. PROJECT: an archetypical system (product) that allows tests without complications (even for a large project). Testing a one-shot routine and a very regularly used routine are different things. A model project in the real world consists of the following components:
1) Application: an online real-time system (with remote terminals) providing timely responses to user requests (for services).
2) Staff: a programming staff of manageable size, with specialists in systems design.
3) Schedule: the project may take about 24 months from start to acceptance, with a 6-month maintenance period.
4) Specifications: good and documented; undocumented ones are understood well within the team.

A Model for Testing: PROJECT (contd.)
5) Acceptance test: the application is accepted after a formal acceptance test. At first it is the customer's responsibility, and then the software design team's.
6) Personnel: the technical staff comprises a combination of experienced professionals & junior programmers (1-3 yrs), with varying degrees of knowledge of the application.
7) Standards: programming, test and interface standards (documented and followed). A centralized standards database is developed & administered.

A Model for Testing: PROJECT (contd.)
8) Objectives: the system is expected to operate profitably for > 10 yrs after installation. Similar systems, with up to 75% of the code in common, may be implemented in the future.
9) Source: a new project is a combination of new code (up to 1/3), code from a previous reliable system (up to 1/3), and code re-hosted from another language & O.S. (up to 1/3).
10) History: typically, some developers quit before their components are tested; some work is excellent but poorly documented; unexpected changes (major & minor) come from the customer; important milestones may slip, but the delivery date is met; there are problems in integration, and with some hardware; some components are redone; etc. A combination of glory and catastrophe: a model project is a well-run & successful project.

A Model for Testing
[Diagram: the world vs the model world. The world contains the environment, the program, and nature & the psychology of the tester; the model world contains the environment model, the program model, and the bug model. Tests run against the program yield outcomes that are either expected or unexpected.]

A Model for Testing (contd.)
2. Roles of Models for Testing
1) Overview: the testing process starts with a program embedded in an environment. Human susceptibility to error leads to 3 models (of the environment, the program, and the bugs). Create tests out of these models & execute them. If the result is as expected, it is okay; if unexpected, revise the tests and the bug model, and revise the program. If that fails, modify the program model to include more facts, and modify the program.
2) Environment: includes all hardware & software (firmware, OS, linkage editor, loader, compiler, utilities, libraries) required to make the program run. Usually bugs do not result from the environment itself (with established h/w & s/w); they arise from our understanding of the environment.
3) Program: too complicated to understand in detail; we deal with a simplified overall view. Bugs come from focusing on the control structure while ignoring the processing, or on the processing while ignoring the control structure.

A Model for Testing: Roles of Models (contd.)
4) Bugs (the bug model):
• Categorize bugs by type: initialization, call sequence, wrong variable, etc.
• An incorrect specification may lead us to mistake correct behavior for a program bug.
• There are 9 hypotheses regarding bugs:
a. Benign Bug Hypothesis: the belief that bugs are tame & logical. Weak bugs are logical & exposed by logical means; subtle bugs have no definable pattern.
b. Bug Locality Hypothesis: the belief that bugs are localized. In fact, subtle bugs affect both the component and things external to it.
c. Control Dominance Hypothesis: the belief that most errors are in the control structure. But data-flow & data-structure errors are common too, and subtle bugs are not detectable through the control structure alone (subtle bugs arise from violations of data-structure boundaries & of data-code separation).

A Model for Testing: bug hypotheses (contd.)
d. Code/Data Separation Hypothesis: the belief that bugs respect the separation of code & data in HOL programming. In real systems the distinction is blurred, and hence such bugs exist.
e. Lingua Salvator Est Hypothesis: the belief that the language's syntax & semantics eliminate most bugs. Such features may not eliminate subtle bugs.
f. Corrections Abide Hypothesis: the belief that a corrected bug remains corrected. Subtle ones may not: e.g., a correction to a data structure 'DS', made for a bug in the interface between modules A & B, could impact module C, which also uses 'DS'.

A Model for Testing: bug hypotheses (contd.)
g. Silver Bullets Hypothesis: the belief that a language, design method, representation, environment, etc. grants immunity from bugs. Not from subtle bugs; remember the pesticide paradox.
h. Sadism Suffices Hypothesis: the belief that a sadistic streak, low cunning & intuition (of independent testers) are sufficient to extirpate most bugs. Not the subtle & tough ones: these need methodology & techniques.
i. Angelic Testers Hypothesis: the belief that testers are better at test design than programmers are at code design.

A Model for Testing: Roles of Models (contd.)
5) Tests:
• Formal procedures: input preparation, outcome prediction, documentation of the test, execution & observation of the outcome.
• All of these are subject to error. An unexpected test result may lead us to revise the test and the test model.
6) Testing & Levels: 3 kinds of tests, with different objectives.
1) Unit & Component Testing
• A unit is the smallest piece of software that can be compiled/assembled, linked, loaded & put under the control of a test harness/driver.
• Unit testing verifies the unit against the functional specs & also the implementation against the design structure. Problems revealed are unit bugs.
• A component is an integrated aggregate of one or more units (even an entire system).
• Component testing verifies the component against the functional specs and the implemented structure against the design. Problems revealed are component bugs.

2) Integration Testing
• Integration is the process of aggregating components into larger components.
• Integration testing verifies the consistency of the interactions in a combination of components.
• Examples of integration bugs: improper call or return sequences, inconsistent data validation criteria & inconsistent handling of data objects.
• Integration testing & testing integrated objects are different.
• Sequence of testing: unit/component tests for A and for B; integration tests for A & B; component testing for the (A,B) component.
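The testing sequence above (unit/component tests for A and B first, then integration tests for the A-B combination) can be sketched in miniature; the two components and their record format below are hypothetical examples, not from the text:

```python
# Hypothetical components A and B, used to illustrate the sequence:
# unit tests first, then an integration test of their interface.

def component_a(record: str) -> dict:
    """Component A: parses a 'name,amount' record into a dict."""
    name, amount = record.split(",")
    return {"name": name.strip(), "amount": int(amount)}

def component_b(parsed: dict) -> str:
    """Component B: formats a parsed record for display."""
    return f"{parsed['name']}: {parsed['amount']:d}"

# 1. Unit/component tests for A and B in isolation.
assert component_a("alice, 42") == {"name": "alice", "amount": 42}
assert component_b({"name": "bob", "amount": 7}) == "bob: 7"

# 2. Integration test for the (A, B) combination: checks the interface,
#    e.g. that A's output dict carries exactly the keys B expects.
assert component_b(component_a("carol, 9")) == "carol: 9"
```

An inconsistent interface (say, A renaming the key "amount") would pass both unit tests and fail only the integration test, which is the kind of bug this level targets.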

3) System Testing
• Concerns issues & behaviors that can only be tested at the level of the entire system or a major part of the integrated system (a system is one big component).
• Includes testing for performance, security, accountability, configuration sensitivity, start-up & recovery.
Role of the model of testing:
• The model is used for the testing process until the system behavior is correct, or until the model proves insufficient (for testing).
• The model should be able to express the program; unexpected results may force a revision of the model.
• The art of testing consists of creating, selecting, exploring and revising models.
After understanding a Project, let us now finally see the Consequences of Bugs.

Consequences of Bugs
Consequences (how bugs may affect users) range from mild to catastrophic on a 10-point scale:
• Mild: an aesthetic bug, such as misspelled output or a misaligned print-out.
• Moderate: outputs are misleading or redundant, impacting performance.
• Annoying: the system's behavior is dehumanizing: e.g., names are truncated/modified arbitrarily, bills for $0.00 are sent. Until the bugs are fixed, operators must use unnatural command sequences to get a proper response.
• Disturbing: legitimate transactions are refused; e.g., an ATM may malfunction with a valid ATM card / credit card.
• Serious: losing track of transactions & transaction events, so accountability is lost.

Consequences of Bugs (contd.)
• Very serious: the system performs another transaction instead of the requested one: e.g., credits another account, converts withdrawals to deposits.
• Extreme: the problems are frequent & arbitrary, not sporadic & unusual.
• Intolerable: long-term, unrecoverable corruption of the database (not easily discovered, and it may lead to system shutdown).
• Catastrophic: the system fails and shuts down.
• Infectious: corrupts other systems even when it may not fail itself.

Consequences of Bugs (contd.)
Assignment of severity:
• Assign flexible & relative, rather than absolute, values to the bug types.
• The number of bugs and their severity are factors in determining quality quantitatively.
• Organizations design & use quantitative quality metrics based on the above; the parts are weighted depending on the environment, application, culture, correction cost, current SDLC phase & other factors.
• This helps in deciding when to stop testing & release the product.
Nightmares:
• Define the nightmares that could arise from bugs in the context of the organization/application.
• Quantified nightmares help calculate the importance of bugs.

Consequences of Bugs (contd.)
When to stop testing (this is called the 'bug design process'):
1. List all nightmares in terms of the symptoms & the reactions of the users to their consequences.
2. Convert the consequences of each nightmare into a cost. There could be rework cost; if the scope extends to the public, there could be the cost of lawsuits, lost business, nuclear reactor meltdowns, etc.
3. Order the nightmares from the costliest to the cheapest, and discard those you can live with.
4. Based on experience, intuition, measured data, and published statistics, postulate the kinds of bugs causing each symptom (a bug type can cause multiple symptoms), and order the causative bugs by decreasing probability (judged by intuition, experience, statistics, etc.).
5. Calculate the importance of a bug type as:
   Importance of bug type j = Σ_k C_jk × P_jk
   where C_jk = cost due to bug type j causing nightmare k, and P_jk = probability of bug type j causing nightmare k.
   (Cost due to all bug types = Σ_j Σ_k C_jk × P_jk.)
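As a minimal sketch of step 5, the importance formula Σ_k C_jk × P_jk can be computed from a cost/probability table; every bug-type name, nightmare name, cost and probability below is invented purely for illustration:

```python
# Importance of bug type j = sum over nightmares k of C[j][k] * P[j][k].
# All names and numbers here are made-up illustration values.

costs = {   # C[j][k]: cost if bug type j causes nightmare k
    "data_bug":   {"lost_txn": 40000, "bad_report": 2000},
    "coding_bug": {"lost_txn": 40000, "bad_report": 2000},
}
probs = {   # P[j][k]: probability that bug type j causes nightmare k
    "data_bug":   {"lost_txn": 0.25,  "bad_report": 0.5},
    "coding_bug": {"lost_txn": 0.125, "bad_report": 0.25},
}

def importance(j):
    """Importance of bug type j = sum_k C[j][k] * P[j][k]."""
    return sum(costs[j][k] * probs[j][k] for k in costs[j])

# Step 6 of the procedure: rank bug types by decreasing importance.
ranking = sorted(costs, key=importance, reverse=True)

# Cost due to all bug types = sum_j sum_k C[j][k] * P[j][k].
total_expected_cost = sum(importance(j) for j in costs)
```

With these invented numbers, testing effort would be directed first at the bug type at the head of `ranking`.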

Consequences of Bugs: when to stop testing (contd.)
6. Rank the bug types in order of decreasing importance.
7. Design tests & QA inspection processes that are most effective against the most important bugs.
8. If a test is passed, or when a correction is made for a failed test, some nightmares disappear. As testing progresses, revise the probabilities & the nightmare list, as well as the test strategy.
9. Stop testing when the probability (importance & cost) proves inconsequential.
Important points to note:
• This amounts to designing a reasonable, finite # of tests with a high probability of removing the nightmares; the procedure can be implemented formally in the SDLC.
• Test suites wear out: as programmers improve their programming style, QA improves; know and update the test suites as required.

Taxonomy of Bugs
So far we have seen:
1. The consequences of bugs: causes, nightmares, when to stop testing.
2. The importance of bugs: statistical quantification of their impact.
We will now see:
3. The taxonomy of bugs, along with some remedies, in order to be able to create an organization's own bug-importance model for the sake of controlling the associated costs.

Taxonomy of Bugs .. and remedies
Reference: the IEEE taxonomy [IEEE 87B].
Why a taxonomy? To study the consequences, nightmares, probability, importance and impact of bugs, and the methods of their prevention and correction. Adopt a known taxonomy and use it as the statistical framework on which your testing strategy is based.
6 main categories, with sub-categories:
1) Requirements, Features & Functionality Bugs: 24.8%
2) Structural Bugs: 25.0%
3) Data Bugs: 22.7%
4) Coding Bugs: 9.3%
5) Interface, Integration and System Bugs: 10.2%
6) Testing & Test Design Bugs: 2.3%

1) Requirements, Features & Functionality Bugs
3 types: requirements & specs bugs, feature bugs, and feature-interaction bugs.
I. Requirements & Specs Bugs
• Incompleteness, ambiguity or self-contradiction.
• The analyst's assumptions are not known to the designer.
• Something may be missed when the specs change.
• These are expensive: they are introduced early in the SDLC and removed at the last.
II. Feature Bugs
• Specification problems create feature bugs.
• A wrong-feature bug has design implications.
• A missing feature is easy to detect & correct.
• Gratuitous enhancements can accumulate bugs if they increase complexity; removing features may also foster bugs.

III. Feature-Interaction Bugs
• Arise from unpredictable interactions between feature groups or individual features. Examples: call forwarding & call waiting; federal, state & local tax laws.
• Explicitly state & test the important combinations.
• The earlier these are removed the better, as they are costly if detected at the end.
Remedies
• Use high-level formal specification languages to eliminate human-to-human communication. This is only a short-term support, not a long-term solution.
• Short-term support: specification languages formalize the requirements, so automatic test generation is possible; it is cost-effective.
• Long-term support: even with a great specification language the problem is not eliminated, only shifted to a higher level. Simple ambiguities & contradictions may be removed, leaving the tougher bugs; there is no magic remedy.
• Testing techniques: functional test techniques (transaction-flow testing, syntax testing, domain testing, logic testing, and state testing) can eliminate requirements & specification bugs.

2) Structural Bugs
We look at 5 types, their causes and their remedies:
I. Control & sequence bugs
II. Logic bugs
III. Processing bugs
IV. Initialization bugs
V. Data-flow bugs & anomalies
I. Control & Sequence Bugs
• Paths left out, unreachable code, spaghetti code, ill-conceived switches, rampaging GOTOs, and pachinko code.
• Improper nesting of loops, incorrect loop termination or look-back, missing process steps, duplicated or unnecessary processing.
• Common in old code (assembly language & COBOL) and among novice programmers.
Prevention and control: theoretical treatment, and unit, structural, path & functional testing.
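One of the causes listed above, incorrect loop termination, typically shows up as an off-by-one bound; a small illustrative sketch (the functions are hypothetical, not from the text):

```python
def sum_first_n_buggy(values, n):
    # Control/sequence bug: the loop terminates one iteration early
    # because range(n - 1) stops at index n - 2, so a path (the last
    # element) is left out.
    total = 0
    for i in range(n - 1):
        total += values[i]
    return total

def sum_first_n_fixed(values, n):
    # Correct termination: range(n) covers indices 0 .. n - 1.
    total = 0
    for i in range(n):
        total += values[i]
    return total

data = [10, 20, 30]
assert sum_first_n_buggy(data, 3) == 30   # last element silently dropped
assert sum_first_n_fixed(data, 3) == 60
```

A boundary-value unit test (exercising exactly n elements) is what exposes this class of bug.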

II. Logic Bugs
• Misunderstanding the semantics of the control structures & logic operators.
• Improper layout of cases: including impossible cases, or ignoring necessary ones.
• Using a look-alike operator, improper use of relational operators, confusing Ex-OR with inclusive OR.
• Deeply nested conditional statements & many logical operations in one statement.
Prevention and control: logic testing, careful checks, functional testing.
III. Processing Bugs
• Arithmetic, algebraic & mathematical function evaluation, algorithm selection & general processing.
• Data-type conversion, ignoring overflow, improper use of relational operators, improper simplification.
Prevention: these are caught in unit testing & have only a localized effect; domain testing methods.
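The "confusing Ex-OR with inclusive OR" bug can be made concrete; the access rule below is a hypothetical example, not from the text:

```python
# Suppose a spec says: "grant access if the user is staff OR an admin"
# (inclusive OR). Writing it with exclusive OR (^) is a classic logic bug:

def can_access_buggy(is_staff: bool, is_admin: bool) -> bool:
    return is_staff ^ is_admin        # Ex-OR: denies staff who are admins

def can_access_correct(is_staff: bool, is_admin: bool) -> bool:
    return is_staff or is_admin       # inclusive OR: matches the spec

# The two agree on some input combinations ...
assert can_access_buggy(True, False) == can_access_correct(True, False)
# ... and disagree exactly when both conditions hold:
assert can_access_buggy(True, True) is False
assert can_access_correct(True, True) is True
```

This is why logic testing enumerates all input combinations: the bug is invisible on three of the four cases.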

IV. Initialization Bugs
• Forgetting to initialize work space, registers, or data areas.
• Wrong initial value of a loop-control parameter.
• Accepting a parameter without a validation check.
• Initializing to the wrong data type or format.
• Very common; detected mainly by execution (testing).
Remedies (prevention & correction): programming tools, preprocessors, explicit declaration & type checking in the source language; data-flow test methods help in test design and debugging.
V. Data-Flow Bugs & Anomalies
• Running into an un-initialized variable.
• Not storing modified data; re-initialization without an intermediate use.
Remedies (prevention & correction): data-flow testing methods & matrix-based testing methods.
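A common instance of the "wrong initial value" bug is seeding a work variable with a constant instead of from the data; a hypothetical sketch:

```python
def max_value_buggy(values):
    # Initialization bug: the work space is seeded with 0, which is
    # wrong whenever every input is negative.
    best = 0
    for v in values:
        if v > best:
            best = v
    return best

def max_value_fixed(values):
    # Correct: initialize the work space from the data itself.
    best = values[0]
    for v in values[1:]:
        if v > best:
            best = v
    return best

assert max_value_buggy([-5, -2, -9]) == 0    # 0 was never in the data
assert max_value_fixed([-5, -2, -9]) == -2
```

As the slide says, such bugs are detected mainly by execution: only a test whose data makes the bad seed visible (here, all-negative input) reveals it.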

3) Data Bugs
These depend on the types of data or the representation of the data. There are 4 sub-categories:
I. Generic data bugs
II. Dynamic data vs static data
III. Information, parameter, and control bugs
IV. Contents, structure & attributes related bugs

I. Generic Data Bugs
• Due to the data-object specs, formats, the # of objects & their initial values.
• As common as code bugs, especially as code migrates into data. A data bug introduces an operative-statement bug & is harder to find.
Remedies (prevention & correction):
• Using control tables in lieu of code facilitates handling many transaction types with fewer data bugs; e.g., generalized, reusable components customized from large parametric data for a specific installation.
• Caution: control tables amount to a hidden programming language in the database, and there is no compiler for that hidden control language.
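The "control tables in lieu of code" remedy can be sketched as a table-driven dispatcher; the transaction codes and handlers below are invented examples:

```python
# Transaction handling driven by a control table instead of a chain of
# if/elif statements. Adding a new transaction type means adding a row
# of data, not new control-flow code. (All names are hypothetical.)

def deposit(balance, amount):
    return balance + amount

def withdraw(balance, amount):
    return balance - amount

CONTROL_TABLE = {
    # txn code -> (handler, human-readable description)
    "DEP": (deposit,  "credit the account"),
    "WDL": (withdraw, "debit the account"),
}

def apply_txn(balance, code, amount):
    handler, _desc = CONTROL_TABLE[code]   # table lookup replaces branching
    return handler(balance, amount)

assert apply_txn(100, "DEP", 30) == 130
assert apply_txn(100, "WDL", 30) == 70
```

The caution above still applies: the table is itself a small hidden language, so its entries need validation & testing just as code does.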

II. Dynamic Data vs Static Data
Dynamic data bugs:
• Transitory, and difficult to catch.
• Due to unclean/leftover garbage in a shared resource, or an error in the initialization of a shared storage object.
• Example: a generic & shared variable.
• Prevention: data validation, unit testing.
Static data bugs:
• Fixed in form & content; appear in the source code or the database, directly or indirectly.
• Software that produces object code creates static data tables, so bugs are possible; data is initialized at run time, with the configuration handled by tables.
• Examples: telecom system software with generic parameters, where a generic large program & a site-adapter program set the parameter values; a postprocessor that installs software packages, builds data declarations, etc.
• Prevention: compile-time processing, source-language features, shared data structures.

III. Information, Parameter, and Control Bugs
Static or dynamic data can serve in any of three forms; it is a matter of perspective, and the same variable can play different roles in different contexts:
• Information: dynamic, local to a single transaction or task (e.g., a name, a hash code).
• Parameter: data passed to a call.
• Control: data used in a control structure for a decision.
Bugs:
• Usually simple bugs and easy to catch.
• When a subroutine with good data-validation code is modified, and one forgets to update the data-validation code and the functions using it, these bugs result.
Preventive measures (prevention & correction): proper data-validation code.

IV. Contents, Structure & Attributes Related Bugs
• Contents: a pure bit pattern; bugs are due to misinterpretation or corruption of it.
• Structure: the size, shape & alignment of the data object in memory; a structure may have substructures.
• Attributes: the semantics associated with the contents (e.g., integer, string, subroutine).
Bugs:
• Severity & subtlety increase from contents to attributes, as they get less formal.
• Structural bugs may be due to a wrong declaration, or arise when the same contents are interpreted differently by multiple structures (different mappings).
• Attribute bugs are due to misinterpretation of the data type, e.g., local data migrating to global data, probably at an interface.
Preventive measures (prevention & correction):
• Good source language, documentation & coding style (incl. a data dictionary).
• In an assembly-language program, use field-access macros rather than accessing any field directly.
• Administer data structures globally; strongly typed languages prevent mixed manipulation of data.
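The "same contents interpreted by multiple structures differently" bug can be demonstrated with Python's standard struct module; the 4-byte payload is an invented example:

```python
import struct

# Four bytes written as one little-endian 32-bit unsigned integer ...
payload = struct.pack("<I", 1)          # b'\x01\x00\x00\x00'

# ... read back correctly by a component using the same mapping:
assert struct.unpack("<I", payload)[0] == 1

# ... but a component declaring a different structure over the same
# contents (big-endian) maps the identical bit pattern to another value:
assert struct.unpack(">I", payload)[0] == 16777216
```

The bit pattern (the contents) is never corrupted here; only the structure declaration differs, which is exactly the class of bug described above.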

Taxonomy of Bugs .. and remedies
4. Coding Bugs
• Coding errors create other kinds of bugs. Syntax errors are removed when the compiler checks the syntax.
• Coding errors may be typographical, a misunderstanding of operators or statements, or just arbitrary.
Documentation Bugs
• Erroneous comments can lead to incorrect maintenance.
• Testing techniques cannot eliminate documentation bugs.
• Solution: inspections, QA, automated data dictionaries & specification systems.


Taxonomy of Bugs .. and remedies
5. Interface, Integration and Systems Bugs
There are 9 types of bugs in this category:
1) External Interfaces
2) Internal Interfaces
3) Hardware Architecture Bugs
4) Operating System Bugs
5) Software Architecture Bugs
6) Control & Sequence Bugs
7) Resource Management Bugs
8) Integration Bugs
9) System Bugs

[Slide diagram: user and system components layered over the application software, the O.S., and the drivers.]


Taxonomy of Bugs .. and remedies
5. Interface, Integration and Systems Bugs contd..

1) External Interfaces
- Means to communicate with the outside world: drivers, sensors, input terminals, communication lines.
- The primary design criterion should be robustness.
- Bugs: invalid timing, sequence assumptions related to external signals, misunderstanding of external formats, and non-robust coding.
- Domain testing, syntax testing & state testing are suited to testing external interfaces.

2) Internal Interfaces
- Must adapt to the external interface, and have bugs similar to external interface bugs.
- Bugs come from improper protocol design, input-output formats, protection against corrupted data, subroutine call sequences, and call parameters.
- Remedies (prevention & correction):
  - Test methods of domain testing & syntax testing.
  - Good design & standards: a good trade-off between the number of internal interfaces & the complexity of each interface.
  - Good integration testing tests all internal interfaces with the external world.
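As a sketch of the robustness criterion for external interfaces, the parser below validates an external input line instead of trusting it. The message format and all names are hypothetical; the deliberately malformed inputs at the end are in the spirit of syntax testing.

```python
def parse_reading(line):
    """Parse an external sensor line of the form 'ID:VALUE',
    rejecting malformed input instead of crashing on it."""
    if not isinstance(line, str) or ":" not in line:
        raise ValueError(f"malformed reading: {line!r}")
    ident, _, value = line.partition(":")
    if not ident.isalnum():
        raise ValueError(f"bad sensor id: {ident!r}")
    try:
        return ident, float(value)
    except ValueError:
        raise ValueError(f"bad sensor value: {value!r}")

print(parse_reading("T1:23.5"))   # a well-formed reading
# Syntax-testing style probes: "", "no-colon", ":5", "T1:abc"
# all raise ValueError rather than corrupting downstream state.
```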


Taxonomy of Bugs .. and remedies
Interface, Integration and Systems Bugs contd..

3) Hardware Architecture Bugs
- A software programmer may not see the hardware layer/architecture; software bugs originating from the hardware architecture are due to misunderstanding of how the hardware works.
- Bugs are due to errors in: the paging mechanism, address generation, I/O device instructions, device status codes, device protocols, interrupt handling, I/O device addresses, device data formats, etc.
- Typical mistakes: expecting a device to respond too quickly or waiting too long for a response, assuming a device is initialized, hardware simultaneity assumptions, ignored hardware race conditions.
- Remedies (prevention & correction):
  - Good software programming & testing.
  - Centralization of the hardware interface software; this may localize bugs and make testing simpler.
  - Nowadays hardware has special test modes & test instructions to test the hardware function.
  - An elaborate hardware simulator may also be used.

Taxonomy of Bugs .. and remedies
Interface, Integration and Systems Bugs contd..

4) Operating System Bugs
- Due to:
  - Misunderstanding of the hardware architecture & interface by the O.S.
  - The O.S. not handling all hardware issues.
  - Bugs in the O.S. itself; some corrections may leave quirks. Bugs & limitations in the O.S. may be buried somewhere in the documentation.
- Remedies (prevention & correction):
  - Same as those for hardware bugs.
  - Use O.S. system interface specialists.
  - Use explicit interface modules or macros for all O.S. calls.
  - The above may localize bugs and make testing simpler.

Taxonomy of Bugs .. and remedies
Interface, Integration and Systems Bugs contd..

5) Software Architecture Bugs (called Interactive)
- Subroutines pass through unit and integration tests without these bugs being detected; the bugs appear only when the system is stressed, depend on the load, and are the most difficult to find and correct.
- Due to:
  - Assumption that there are no interrupts; failure to block or unblock an interrupt.
  - Assumption that code is re-entrant, or not re-entrant.
  - Bypassing data interlocks; failure to open or close an interlock.
  - Assumption that a called routine is memory resident, or not.
  - Assumption that the registers and the memory are initialized, or that their contents did not change.
  - Local setting of global parameters & global setting of local parameters.
- Remedies: good design of the software architecture.
- Test Techniques: all test techniques are useful in detecting these bugs, stress tests in particular.
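A toy sketch of one such architecture bug: code wrongly assumed to be re-entrant because it keeps state in a shared global (all names are hypothetical). The `Chatty` object simulates an interrupt or interleaved caller re-entering the routine mid-flight, which is exactly the condition that sequential unit tests never create.

```python
_buffer = []                         # shared module-level state

def format_items_buggy(items):
    """Not re-entrant: an interleaved caller corrupts _buffer."""
    _buffer.clear()
    for item in items:
        _buffer.append(str(item))
    return ",".join(_buffer)

def format_items_fixed(items):
    """Re-entrant: all state is local to the call."""
    return ",".join(str(item) for item in items)

class Chatty:
    """str() sneakily re-enters the formatter, like an interrupt."""
    def __str__(self):
        format_items_buggy(["x", "y"])   # re-entrant call mid-format
        return "chatty"

print(format_items_fixed([Chatty(), 1]))   # chatty,1
print(format_items_buggy([Chatty(), 1]))   # x,y,chatty,1  (corrupted)
```

Called sequentially, the buggy version passes every unit test; only the stressed, interleaved execution exposes it.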

Taxonomy of Bugs .. and remedies
Interface, Integration and Systems Bugs contd..

6) Control & Sequence Bugs
- Due to:
  - Ignored timing; assumption that events occur in a specified sequence.
  - Starting a process before its prerequisites are met; waiting for an impossible combination of prerequisites; not recognizing when prerequisites are met.
  - Specifying the wrong priority, program state or processing level.
  - Missing, wrong, redundant, or superfluous process steps.
- Remedies:
  - Good design: highly structured sequence control is easier to test and to correct.
  - Specialized internal sequence-control mechanisms, such as an internal job control language, are useful.
  - Storing sequence steps & prerequisites in a table, with interpretive processing by a control processor or dispatcher.
- Test Techniques: path testing as applied to transaction flow graphs is effective.
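The table-driven remedy can be sketched as follows (the step names and the tiny dispatcher are hypothetical): prerequisites live in one table, and the dispatcher refuses to start a step before they are met, so "started before its prerequisites" becomes a detectable error instead of a silent sequence bug.

```python
STEPS = {                     # step -> prerequisite steps
    "load":     [],
    "validate": ["load"],
    "commit":   ["load", "validate"],
}

def dispatch(step, done):
    """Run `step` only if all its prerequisites are in `done`;
    return the updated set of completed steps."""
    missing = [p for p in STEPS[step] if p not in done]
    if missing:
        raise RuntimeError(f"{step} started before prerequisites: {missing}")
    return done | {step}

done = set()
for step in ("load", "validate", "commit"):
    done = dispatch(step, done)    # legal order runs to completion
print(sorted(done))
```

Running `dispatch("commit", {"load"})` raises immediately, which is the point: the sequencing rule is data, checked in one place, rather than scattered assumptions.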

Taxonomy of Bugs .. and remedies
Interface, Integration and Systems Bugs contd..

7) Resource Management Problems
- Resources: internal (memory buffers, queue blocks, etc.) and external (discs etc.).
- Due to:
  - Wrong resource used (when several resources have similar structures, or different kinds of resources share the same pool).
  - Resource already in use; resource use forbidden to the caller.
  - Failure to return a resource, or deadlock.
  - Resource not returned to the right pool.
- Remedies:
  - Design: keep the resource structure simple, with the fewest kinds of resources and the fewest pools. Designing a complicated resource structure to handle all kinds of transactions just to save memory is not right.
  - Centralize management of all resource pools through managers, subroutines, macros, etc., with no private resource management.
- Test Techniques: path testing, transaction flow testing, data-flow testing & stress testing.
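A sketch of the centralized-manager remedy (the pool class and names are hypothetical): every acquisition goes through one manager, and a context manager guarantees the resource is returned to the right pool even on error paths, ruling out the "failure to return a resource" bug above.

```python
from contextlib import contextmanager

class Pool:
    """One centralized manager for one kind of resource."""
    def __init__(self, size):
        self.free = list(range(size))

    def acquire(self):
        if not self.free:
            raise RuntimeError("pool exhausted (leaked resources?)")
        return self.free.pop()

    def release(self, r):
        self.free.append(r)     # returned to the right pool

@contextmanager
def borrowed(pool):
    """Acquire on entry, release on exit, even if the body raises."""
    r = pool.acquire()
    try:
        yield r
    finally:
        pool.release(r)

pool = Pool(2)
with borrowed(pool) as r:
    print("using resource", r)
print("free after use:", len(pool.free))
```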

Taxonomy of Bugs .. and remedies
Interface, Integration and Systems Bugs contd..

8) Integration Bugs
- Are detected late in the SDLC, involve several components, and hence are very costly.
- Due to:
  - Inconsistencies or incompatibilities between components.
  - Errors in a method used to directly or indirectly transfer data between components. Some communication methods are: data structures, call sequences, registers, semaphores, communication links, protocols, etc.
- Remedies: employ good integration strategies.
- Test Techniques: those aimed at interfaces: domain testing, syntax testing, and data-flow testing when applied across components.
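A minimal sketch of an integration bug (names and units are hypothetical): each component passes its own unit tests, but they disagree about the unit of a value transferred between them, an inconsistency that only cross-component (data-flow) testing sees.

```python
def read_timeout():                # component A: emits a timeout in seconds
    return 5

def wait_buggy(timeout_ms):        # component B: expects milliseconds
    return timeout_ms / 1000.0     # silently turns 5 s into 0.005 s

def wait_fixed(timeout_s):         # agreed interface: seconds everywhere
    return float(timeout_s)

print(wait_buggy(read_timeout()))  # 0.005 -- the integration bug
print(wait_fixed(read_timeout()))  # 5.0
```

Tested in isolation, both `read_timeout` and `wait_buggy` are "correct"; the bug exists only in the data transferred between them.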

Taxonomy of Bugs .. and remedies
Interface, Integration and Systems Bugs contd..

9) System Bugs
- Infrequent, but costly.
- Due to bugs that cannot be ascribed to a particular component; they result from the totality of interactions among many components: programs, data, hardware & the O.S.
- Remedies: thorough testing at all levels, using the test techniques below.
- Test Techniques: transaction-flow testing; all kinds of tests at all levels, as well as integration tests, are useful.

Taxonomy of Bugs .. and remedies

6. Testing & Test Design Bugs
- Bugs in testing (scripts or process) are not software bugs. It is difficult & time-consuming to identify whether a bug comes from the software or from the test script/procedure.
1) Bugs could be due to:
- Tests that require complicated code, scenarios & databases in order to be executed.
- Though independent functional testing provides an unbiased point of view, this lack of bias may lead to an incorrect interpretation of the specs.
2) Test Criteria:
- The testing process is correct, but the criterion for judging the software's response to tests is incorrect or impossible.
- If a criterion is quantitative (throughput or processing time), the measurement test can perturb the actual value.
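A sketch of a test-criterion bug (the example is hypothetical): the software under test is correct, but the test's expected value, i.e. the oracle, is wrong, so the failure points at the test rather than the code.

```python
def mean(xs):
    """Correct implementation of the arithmetic mean."""
    return sum(xs) / len(xs)

def buggy_check():
    # Test-design bug: the test author expected integer division,
    # so the oracle value is wrong, not the software.
    return mean([1, 2]) == 1       # evaluates to False

def fixed_check():
    return mean([1, 2]) == 1.5     # evaluates to True

print(buggy_check(), fixed_check())
```

When `buggy_check` fails, time is lost deciding whether `mean` or the test is at fault, which is exactly the diagnosis cost described above.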

Taxonomy of Bugs .. and remedies
Testing & Test Design Bugs contd..

Remedies:
1. Test Debugging: testing & debugging the tests themselves; simpler when tests have a localized effect.
2. Test Quality Assurance: monitor quality in independent testing and test design.
3. Test Execution Automation: test execution bugs are eliminated by test execution automation tools rather than by manual testing.
4. Test Design Automation: test design (test scripts etc.) is automated, like the automation of software development; for a given productivity rate, it reduces the bug count.

Taxonomy of Bugs .. and remedies

A word on productivity
At the end of this long study of the taxonomy, we can say: good design inhibits bugs and is easy to test. The two factors are multiplicative and result in high productivity. Good tests work best on good code and good design; a good test cannot work magic on badly designed software.

[Chart: percentage of bugs by activity. Source: Boris Beizer]

Questions from Previous Exams

Q. Specify the factors on which the importance of bugs depends. Give the metric for it.
Ans: Importance of bugs as discussed in chapter 2.

Q. Differentiate between function and structure.
Ans: Dichotomies 2.

Q. Give the differences between functional and structural testing.
Ans: Dichotomies 2: function vs structure.

Q. Give a brief explanation of white box testing & black box testing and give the differences between them.
Ans: Same as for Dichotomies 2: function vs structure.

Q. Briefly explain the various consequences of bugs.
Ans: Consequences as seen from the user point of view.

Q. What are the differences between static data and dynamic data?
Ans: 2nd point in Data Bugs in the taxonomy of bugs.

Q. What are the remedies for test bugs?
Ans: 6th and last point in the taxonomy of bugs: Remedies.

Q. What are the different types of testing? Explain them briefly.
Ans: Levels of testing as mentioned in the model for testing: unit, component, integration, system (possibly adding functional & structural).

Q. What are the principles of test case design? Explain.
Ans: Dichotomies 4.

(End of Unit 1)

Towards Unit 2: Control Flow Graphs and Path Testing
