
Test Planning

Readings: Ron Patton, ch. 17–20; Daniel Galin, ch. 10

Objectives of Test Planning

● Record, organize, and guide testing activities
● Schedule testing activities according to the test strategy and project deadlines
● Provide a way to demonstrate confidence to customers
● Provide a basis for re-testing during system maintenance
● Provide a basis for evaluating and improving the testing process

IEEE Std. 829-1998 Test Plan
● Describes the scope, approach, resources, and schedule of intended testing activities.
● Identifies:
  – test items
  – features to be tested
  – testing tasks
  – who will do each task
  – any risks requiring contingency planning

IEEE Std. 829-1998 Test Plan Outline
1. Test plan identifier
2. Introduction (refers to the Project plan, Quality assurance plan, Configuration management plan, ...)
3. Test items – identify test items, including version/revision level (e.g. requirements, design, code, ...)
4. Features to be tested
5. Features not to be tested
6. Testing approach
7. Significant constraints on testing (test item availability, testing-resource availability, deadlines, ...)
8. Item pass/fail criteria
9. Suspension criteria and resumption requirements

IEEE Std. 829-1998 Test Plan Outline (continued)
10. Test deliverables (e.g. test design specifications, test case specifications, test procedure specifications, test logs, test incident reports, test summary report)
11. Testing tasks
12. Environmental needs
13. Responsibilities
14. Staffing and training needs
15. Schedule
16. Risks and contingencies
17. Approvals
18. References

IEEE Std. 829-1998 Test Design Specification Outline
1. Test design specification identifier
2. Features to be tested – features addressed by this document
3. Approach refinements
4. Test identification – identifier and description of the test cases associated with the design
5. Feature pass/fail criteria – pass/fail criteria for each feature

IEEE Std. 829-1998 Test Case Specification Outline
1. Test case specification identifier
2. Test items – items and features to be exercised by the test case
3. Input specifications – input required to execute the test case (files, databases, etc.)
4. Output specifications
5. Environmental needs
6. Special procedural requirements
7. Intercase dependencies

IEEE Std. 829-1998 Test Procedure Specification Outline
1. Purpose
2. Special requirements
3. Procedure steps
   – Log – how to log results
   – Set Up – how to prepare for testing
   – Start – how to begin procedure execution
   – Proceed – procedure actions
   – Measure – how test measurements will be made
   – Shut Down – how to suspend the testing procedure
   – Restart – how to resume the testing procedure
   – Stop – how to bring execution to an orderly halt
   – Wrap Up – how to restore the environment
   – Contingencies – how to deal with anomalous events during execution

IEEE Std. 829-1998 Test Log Outline
1. Test log identifier
2. Description – information on all the entries in the log
3. Activity and event entries
   – Execution description
   – Procedure results – observable results (e.g. messages), outputs
   – Environmental information specific to the entry
   – Anomalous events (if any)
   – Incident report identifiers (identifiers of any test incident reports generated)

IEEE Std. 829-1998 Test Incident Report Outline
1. Test incident report identifier
2. Summary – items involved, references to linked documents (e.g. procedure, test case, log)
3. Incident description
   – Inputs
   – Expected results
   – Actual results
   – Anomalies
   – Date and time
   – Procedure step
   – Environment
   – Attempts to repeat
   – Testers
   – Observers

IEEE Std. 829-1998 Test Incident Report Outline (continued)
4. Impact – on the testing process
   – S: Show Stopper – testing totally blocked, no bypass available
   – H: High – major portion of the test is partially blocked; test can continue with severe restrictions; bypass needed
   – M: Medium – test can continue but with minor restrictions; bypass needed
   – L: Low – testing not affected; problem is insignificant

IEEE Std. 829-1998 Test Summary Report Outline
1. Test summary report identifier
2. Summary – summarize the evaluation of the test items; references to plans, logs, incident reports
3. Variances – of test items (from specification), plan, procedure
4. Comprehensive assessment – of the testing process against the comprehensiveness criteria specified in the test plan
5. Summary of results – issues (resolved, unresolved)
6. Evaluation – overall evaluation of each test item, including its limitations
7. Summary of activities
8. Approvals – names and titles of all persons who must approve the report

Test Implementation Phase

Assessing Testing Effort
● How do we know when testing can stop?
● How do we assess how successful testing was at removing defects?
● Examples of approaches:
  – Independent test groups
  – Fault seeding technique
  – Mutant-based testing
  – Defect plotting

Estimation Using Independent Test Groups
● Evaluates how many defects are in a software product and the efficiency of defect removal
● Uses independent groups testing the same program
● Example with 2 groups:
  – x = number of faults detected by Test Group 1
  – y = number of faults detected by Test Group 2
  – q = number of common faults detected by both groups
  – n = total number of faults in the program
  – E1 = effectiveness of Group 1 = x/n
  – E2 = effectiveness of Group 2 = y/n
  – Assuming the groups detect faults independently, n = (x*y)/q = q/(E1*E2)
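The two-group estimate above can be sketched as a small routine (a capture-recapture style calculation; the class and method names here are illustrative, not from the slides):

```java
// Two-group fault estimation: n = (x*y)/q.
public class FaultEstimate {

    /** Estimated total faults n = (x * y) / q. */
    static int estimateTotalFaults(int x, int y, int q) {
        if (q <= 0)
            throw new IllegalArgumentException(
                "q must be positive: with no common faults the estimate is undefined");
        return (x * y) / q;
    }

    public static void main(String[] args) {
        // Group 1 found x = 25 faults, Group 2 found y = 30, q = 15 were found by both.
        int n = estimateTotalFaults(25, 30, 15);
        System.out.println("Estimated total faults n = " + n);         // 50
        System.out.println("E1 = " + 25.0 / n + ", E2 = " + 30.0 / n); // 0.5, 0.6
    }
}
```

With these numbers the second form of the formula agrees: q/(E1*E2) = 15/(0.5 * 0.6) = 50.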

Fault Seeding Technique
● To estimate the number of faults in a program
● Before testing, seed the program with a number of typical faults
● After a period of testing, compare the numbers of seeded and nonseeded faults detected:
  – N = number of nonseeded (unintentional) faults
  – S = number of seeded faults placed in the program
  – n = number of actual (nonseeded) faults detected during testing
  – s = number of seeded faults detected during testing
  – N = S * (n/s)
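The seeding estimate can be sketched the same way (names are illustrative, not from the slides):

```java
// Fault seeding estimate: N = S * (n / s).
public class SeedingEstimate {

    /**
     * S = seeded faults planted, n = nonseeded faults detected,
     * s = seeded faults detected. Returns the estimate N = S * n / s.
     */
    static int estimateNonseededFaults(int S, int n, int s) {
        if (s <= 0)
            throw new IllegalArgumentException(
                "at least one seeded fault must be detected");
        return S * n / s;
    }

    public static void main(String[] args) {
        // 20 faults were seeded; testing found 10 of them plus 8 real (nonseeded) faults.
        int N = estimateNonseededFaults(20, 8, 10);
        System.out.println("Estimated nonseeded faults N = " + N);   // 16
        System.out.println("Estimated faults remaining: " + (N - 8)); // 8
    }
}
```

The intuition: if testing found only half of the seeded faults (10 of 20), it presumably also found only about half of the real ones, so 8 detected suggests roughly 16 in total.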

Mutant-Based Testing
● To assess the effectiveness of a test suite at detecting defects
● Mutant = a version of a program obtained by replicating the original program except for one small change (a mutation)
  – the change corresponds to a typical error
● Examples of mutations:
  – Value – change constants, subscripts, parameters by adding/subtracting one, etc.
  – Decision – modify the sense of decisions (e.g. < to >=)
  – Statement – delete/exchange statements
● A mutant is killed by a test suite if the suite reveals it (some test case fails)
● Kill ratio of a test suite = (# mutants killed) / (# mutants)
  – the higher the kill ratio, the better the test suite
● Needs tool support (e.g. Mothra)

Mutant-Based Testing – Example

/* returns index of 1st occurrence of maximum element in an array */
int max(int[] input) {
    int j, result;
    result = 0;
    for (j = 1; j < input.length; j++) {
        if (input[j] > input[result]) result = j;
    }
    return result;
}

Test suite:

    Test Case   Input[0]   Input[1]   Input[2]   Expected
    1           1          2          3          2
    2           3          1          2          0
    3           1          3          2          1

Examples of mutants:

    Mutant   Statement changed to        Result
    1        input[j] < input[result]    Killed
    2        input[j] == input[result]   Killed
    3        input[j] <= input[result]   Killed
    4        input[j] >= input[result]   Live

A test case must be added to kill mutant #4.
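The example can be made runnable by coding mutant #4 by hand (a real tool such as Mothra would generate the mutants automatically):

```java
// The slide's max() and its mutant #4 ('>' changed to '>='), run against the
// slide's three test cases. The mutant returns the LAST occurrence of the
// maximum, so only an input with a repeated maximum can kill it.
public class MutantDemo {

    /** Original: index of the FIRST occurrence of the maximum element. */
    static int max(int[] input) {
        int result = 0;
        for (int j = 1; j < input.length; j++) {
            if (input[j] > input[result]) result = j;
        }
        return result;
    }

    /** Mutant #4: '>' replaced by '>='. */
    static int maxMutant4(int[] input) {
        int result = 0;
        for (int j = 1; j < input.length; j++) {
            if (input[j] >= input[result]) result = j;
        }
        return result;
    }

    public static void main(String[] args) {
        int[][] suite = {{1, 2, 3}, {3, 1, 2}, {1, 3, 2}};
        int[] expected = {2, 0, 1};
        boolean killed = false;
        for (int i = 0; i < suite.length; i++) {
            if (maxMutant4(suite[i]) != expected[i]) killed = true;
        }
        System.out.println("Mutant #4 killed by original suite: " + killed); // false

        // A test case with a duplicated maximum kills it:
        int[] extra = {3, 1, 3};
        System.out.println(max(extra));        // 0 (first occurrence)
        System.out.println(maxMutant4(extra)); // 2 (last occurrence) -> mutant killed
    }
}
```

None of the three original inputs contains the maximum twice, which is exactly why the `>=` mutant survives them.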

Mutant-Based Testing – Equivalent Programs
● An equivalent mutant produces exactly the same outputs as the original for every input
● Example:

int max(int a, int b) {
    int max;
    if (a > b) max = a;
    else max = b;
    return max;
}

Changing a > b to a >= b produces an equivalent program.
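A brute-force comparison over a small input domain illustrates why no test case can kill this mutant (the check itself is a sketch, not part of the slides):

```java
// The mutant (a >= b) is equivalent to the original (a > b): when a == b,
// both branches produce the same value, so the outputs never differ.
public class EquivalentMutant {

    static int max(int a, int b) {        // original
        return (a > b) ? a : b;
    }

    static int maxMutant(int a, int b) {  // mutant: > changed to >=
        return (a >= b) ? a : b;
    }

    /** True if the original and the mutant disagree anywhere on [lo, hi]^2. */
    static boolean differOnDomain(int lo, int hi) {
        for (int a = lo; a <= hi; a++)
            for (int b = lo; b <= hi; b++)
                if (max(a, b) != maxMutant(a, b)) return true;
        return false;
    }

    public static void main(String[] args) {
        System.out.println(differOnDomain(-100, 100)); // false: no input separates them
    }
}
```

In general, detecting equivalent mutants is undecidable, which is one reason mutation testing needs human judgment alongside tool support.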

Defect Plotting
● Helps decide when to stop testing
● Plot the number of defects found per period of testing
● The graph is expected to:
  – peak,
  – then drop and plateau
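One way to turn the peak-then-plateau shape into a stopping rule is sketched below; the window size and threshold are assumed parameters, not part of the slides:

```java
// Heuristic stopping rule for defect plotting: stop once the per-period
// defect counts have peaked and the most recent `window` periods all sit
// at or below `threshold` (i.e. the curve has dropped and plateaued).
public class DefectPlot {

    static boolean canStop(int[] defectsPerPeriod, int window, int threshold) {
        if (defectsPerPeriod.length < window) return false;
        int peak = 0;
        for (int i = 1; i < defectsPerPeriod.length; i++)
            if (defectsPerPeriod[i] > defectsPerPeriod[peak]) peak = i;
        // Peak inside the trailing window means the curve may still be rising.
        if (peak >= defectsPerPeriod.length - window) return false;
        for (int i = defectsPerPeriod.length - window; i < defectsPerPeriod.length; i++)
            if (defectsPerPeriod[i] > threshold) return false;
        return true;
    }

    public static void main(String[] args) {
        int[] counts = {5, 12, 20, 14, 6, 2, 1, 1};  // peak, then drop and plateau
        System.out.println(canStop(counts, 3, 2));                    // true
        System.out.println(canStop(new int[]{5, 12, 20}, 3, 2));      // false: no drop yet
    }
}
```

As with all such heuristics, a flat curve may also mean the tests have stopped finding the defects that remain, so the rule supplements rather than replaces judgment.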