Testing Basics IVS-TRAINING

“© 2005 Infosys Technologies Limited. All rights reserved. The information provided in this presentation is intended for the sole use of the recipient and is for educational purposes only. No part of this presentation may be reproduced or transmitted in any form or by any means, including photocopying and recording, without written permission. Permission must also be obtained before any part of this presentation is stored in a retrieval system in any nature. No responsibility can be accepted by Infosys Technologies Limited, the Editorial Board or contributors for action taken as a result of information contained in this presentation. The views expressed in this presentation by the presenter are not necessarily those of the Editorial Board or Infosys Technologies Limited.”

-1-

Ground Rules
Please mute your mobile phones
Stick to timelines
Help each other in learning, as learning is a continuous process
Please participate actively to make the session interactive

-2-

Session Objectives
Software Testing Life Cycle
Software Testing Techniques
Types of testing
Introduction to Test automation and Test tools
Test Planning and Strategizing

-3-

A simple ‘Coffee Vending machine’ Example
A simple scenario: What are the necessary checks one would like to do, when using a Coffee vending machine?

-4-

Parameters in “Coffee Vending Machine” example
Check if
Correct labels are present for the appropriate dispenser (Coffee, Tea, Milk, Hot water, etc.)
The selected option has an indicator (e.g. a glowing light)
Correct content is dispensed when we press the corresponding option
There is an indicator to indicate the decrease in water level
Relevant messages are displayed
Etc.

-5-

What is Software Testing?
Software testing is a process used to identify the correctness, completeness and quality of developed computer software. It is a set of activities conducted with the intent of finding errors in software.

-6-

What Testing Shows
Errors
Requirements conformance
Performance
An indication of quality

Testing can never be used to show the absence of errors, only their presence.

-7-

Why Testing?
To verify that all requirements are implemented correctly (both for positive and negative conditions)
To identify defects before software deployment
To help improve quality and reliability
To make software predictable in behavior
To reduce incompatibility and interoperability issues
To help marketability and retention of customers

-8-

Cost of Ineffective Testing
A single error can cause death or injury if it fails in case of safety-critical applications.
An AA jet crashed in Colombia (S.A.) because the captain entered an incorrect one-letter computer command that sent the jet into a mountain, killing 158 people aboard.
Ariane 5 was remotely destroyed within 40 seconds of launch, causing a loss of US$ 500 million, because of wrong exception handling.
A bug in the reservation system caused tickets to be issued from NY to London at dirt-cheap rates. The airline had to bear the loss.

-9-


Cost of Ineffective Testing …Continued
Costs of Correcting Defects (Example). Source: IEEE Computer Society.
[Chart: cost to correct a defect by system development phase, rising from $139 at Requirements, $455 at Arch & Design and $977 at Build to $7,136 at Test & Implement and $14,102 at Maintenance.]

-11-

Who Tests the Software?
Developer: understands the system, but will test it "gently" and is driven by "delivery".
Independent tester: must learn about the system, but will attempt to break it, and is driven by quality.

-12-

SDLC Vs. STLC
Software Development Life Cycle: Business Analysis, Detailed Requirements*, High Level Design, Detailed Design & Development, Unit and Integration Testing, Testing (System, SI & UAT), Transition / Rollout.
* Requirements could be defined along many dimensions, e.g.: Functional Reqts., Usability Reqts., System Reqts., Quality Reqts., Performance Reqts., Technical Reqts.
Testing Process: Requirements testability review, User interviews, Testing strategy, Test analysis & design, IT test planning, Functional test planning & scripting, Non-functional test planning, Test data generation, Unit test planning, Unit testing, Integration testing, IT results review, Test bed setup, Functional testing, Functional results review, Non-functional testing, Non-functional results review, Defect tracking (activities owned by IVS).

-13-

V-Model
Requirement Analysis ↔ User Acceptance Testing (User Acceptance Test Plan)
Functional Specification ↔ System Testing (System Test Plan)
High Level Design ↔ Integration Testing (Integrated Test Plan)
Detailed Design / Program Specification ↔ Unit Testing (Unit Test Plan)
CODE

-14-

Testing Lifecycle
The Test Cycle: Requirements Capture → Test Planning & Scenario Design → Test Case Development → Test Execution → Test Result Analysis → Defect Analysis (Is there a defect? Fix) → Closure.

-15-

Requirements Capture
Entry Criteria:
•Functional Requirements Specification
•Non-Functional Requirements (Performance, Security, Usability / Compatibility)
Activities:
•Study software requirements and the existing test process
•Interview stakeholders
•Elicit business objectives and constraints
Deliverables:
•Test Requirements Document
Exit Criteria:
•Test process currently followed by the client is assimilated
•Test requirements documented and signed off by the client

-16-

Analysis
Entry Criteria:
•Test Requirements Document
•Software Integration Plan
•Prioritization of business objectives
Activities:
•Requirements testability review
•Test effort estimation
•Test strategy & approach planning (including consideration for test phases, environment, location, automation, etc.)
Deliverables:
•System Requirement Specification
•User Acceptance Plan
•System Test Plan
Exit Criteria:
•Test effort estimation completed
•Master Test Strategy document signed off by the client

-17-

Planning & Design
Entry Criteria:
•Master Test Strategy and effort estimation
•Business process flows (e.g. Use cases)
•Software functional design
Activities:
•Create detailed test plans for each type of testing
•Document test scenarios and dependencies (including data)
•Create the Test Traceability Matrix
•Define various project metrics (Quality and Testing)
Deliverables:
•Integrated Test Plan
•Unit Test Plan
Exit Criteria:
•Detailed test plans approved by the client
•Test Matrix and list of scenarios approved for coverage

-18-
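The Test Traceability Matrix named in the activities above can be sketched as a simple requirement-to-test-case mapping. This is only an illustrative data structure; the requirement and test-case IDs are invented, not taken from the deck:

```python
# Hypothetical traceability matrix: requirement ID -> covering test cases.
# IDs are made up for illustration.
traceability = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # no test case yet: a coverage gap
}

def uncovered(matrix):
    """Return requirements that have no test case yet, sorted by ID.
    These are the gaps the planning phase must close."""
    return sorted(req for req, cases in matrix.items() if not cases)

gaps = uncovered(traceability)
```

Reviewing `gaps` before sign-off is one concrete way to check the "scenarios approved for coverage" exit criterion.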

Test Case Development
Entry Criteria:
•Detailed test plans
•Test Matrix and prioritized test scenarios
•Test data specifications
•Environment setup
Activities:
•Create test cases; review and approval by the client
•Automate scripts, if required
•Modify existing test data, or create data
•Set up the test environment
•Baseline test cases and scripts
Deliverables:
•Test cases, scripts, data, environment
Exit Criteria:
•Test cases, test scripts and test data approved by the client and baselined

-19-

Test Execution
Entry Criteria:
•Test cases, scripts and data
•New build of the testable application, after unit test
•Test environment operational
Activities:
•Test the environment and connectivity
•Perform a smoke test on the build
•Execute tests as per plan
•Update test plans, if necessary
•Document test results and log defects for failed cases
Deliverables:
•Defect report
Exit Criteria:
•Tests executed as per plan
•Test results and defects logged
•Test traceability matrix updated with execution status

-20-

Test Cycle Closure
Entry Criteria:
•All tests for the cycle executed
•Defect logs updated
•Test results and defect logs
•Updated test traceability matrix
Activities:
•Evaluate cycle completion criteria based on test coverage, time, cost, software quality and critical business objectives
•Test cycle completion report
Deliverables:
•Test Closure report
Exit Criteria:
•Test cycle closure approved by the client

-21-

Test Result Analysis
Entry Criteria:
•Updated test traceability matrix
•Test defect logs
•Project metrics (Quality and Testing)
Activities:
•Test coverage against plan
•Variance from test plans and reasons
•Root cause analysis for defects
•Project metrics analysis
Deliverables:
•Metrics Report
Exit Criteria:
•Quantitative & qualitative recommendations for process improvement
•Project learning and best practices documented

-22-

Basic forms of Testing
VERIFICATION: "Am I building the product right?" Static testing: inspections, reviews & code walkthroughs against the plans (User Acceptance Test Plan, System Test Plan, Integrated Test Plan, Unit Test Plan); specifications are verified with the customer. "Complies with the process to yield a right product."
VALIDATION: "Am I building the right product?" Dynamic testing of functional & non-functional behavior: black box testing (User Acceptance Testing, System Testing, Integration Testing) and white box testing (Unit Testing), down to the CODE. "Validates correctness of software w.r.t. user needs and requirements."

-23-

Testing Techniques (Contd.)
Advantages:
–Captures defects early, so saves rework cost
–Checklist-based approach
–Focuses on coverage
–Group perspective
–Highest probability of finding defects
Disadvantages:
–Time consuming
–Cannot test data dependencies
–High skill levels required

-24-

Testing Techniques (Contd.)
Static Testing / Reviews: testing of an object without execution on a computer.
Scope: Requirements, Feasibility, Technical Architecture, Design, Test Cases, Test Plans, Program Specifications, User Documents.

-25-

Testing Techniques (Contd.)
Dynamic Testing: the process of executing a program or system with the intent of finding errors (input → events → output). Sub-techniques under this are:
 White Box testing – tests that validate the system architecture, i.e. how the system was implemented. Testing based on knowledge of internal structure and logic. E.g. Unit testing, Integration testing.
 Black Box testing – tests that validate business requirements, i.e. what the system is supposed to do. Based on external specifications, without knowledge of how the system is constructed. E.g. System testing, User Acceptance testing.

-26-

Testing Techniques (Contd.)
White Box Testing Vs Black Box Testing
White Box Testing:
 Structure & design based testing
 Program-logic driven testing
 Examines the internal structure of the program
 Advantages: high code coverage – exhaustive (thorough) path testing; program logic is tested; internal boundaries are tested
 Testing is algorithm specific
Black Box Testing:
 Specification based testing
 Business-transaction driven testing
 No concern with internal behavior and program structure
 User's perspective
 Focus on features, not implementation
 Big-picture approach

-27-

Types of Testing
Functional Testing:
• Unit Testing
• Smoke testing / Sanity testing
• Integration Testing
• Interface Testing
• Usability Testing
• System Testing
• Regression Testing
• User Acceptance Testing
  o Alpha Testing
  o Beta Testing
• White Box Testing
• Black Box Testing
• Globalization Testing
• Localization Testing
Non Functional Testing:
• Performance Testing
  o Stress Testing
  o Volume Testing
  o Load Testing
  o Endurance Testing
• Compatibility Testing
• Migration Testing
• Data Conversion Testing
• Security/Penetration Testing
• Installation Testing
• Scalability Testing
• Recovery testing
Other Types of testing:
• Exploratory Testing
• Adhoc Testing
• Mutation Testing
• Comparison Testing
• Destructive/Validation Testing
• Conformance Testing
• Disaster Recovery Testing
• Reliability Testing

-28-

Unit Testing
Testing performed on a single, standalone module or unit of code to ensure correctness of that particular module. The goal of unit testing is to isolate each part of the program and show that the individual parts are correct, so the idea is to write test cases for every method in the module. It focuses on implementation logic, and this type of testing is mostly done by the developers.
This isolated testing provides four main benefits:
• Flexibility when changes are required
• Facilitates integration
• Ensures documentation of the code
• Separation of interface from implementation

-29-
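As a minimal sketch of the slide's point, the following tests a hypothetical account module in isolation, with one test per method behavior. The `deposit`/`withdraw` functions are invented for illustration and are not part of the deck:

```python
import unittest

# Hypothetical unit under test (illustrative only).
def deposit(balance, amount):
    """Return the new balance after depositing a positive amount."""
    if amount <= 0:
        raise ValueError("deposit amount must be positive")
    return balance + amount

def withdraw(balance, amount):
    """Return the new balance after a withdrawal, disallowing overdrafts."""
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

class AccountTest(unittest.TestCase):
    # One test case per method behavior, per the slide's guideline.
    def test_deposit_adds_amount(self):
        self.assertEqual(deposit(100, 50), 150)

    def test_deposit_rejects_non_positive(self):
        with self.assertRaises(ValueError):
            deposit(100, 0)

    def test_withdraw_subtracts_amount(self):
        self.assertEqual(withdraw(100, 40), 60)

    def test_withdraw_rejects_overdraft(self):
        with self.assertRaises(ValueError):
            withdraw(100, 200)

# Run the isolated suite: each unit is exercised without any integration.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(AccountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because the unit is isolated, a failure here points directly at the module itself rather than at its collaborators.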

Integration Testing
The phase of software testing which follows unit testing and precedes system testing, in which individual software modules are combined and tested as a group. It takes as its input modules that have been checked by unit testing, groups them, applies tests defined in an integration test plan, and delivers as its output the integrated system. The purpose of integration testing is to detect any inconsistencies between the software units that are integrated together. This is especially relevant to client/server and distributed systems.
Stubs And Drivers
Stubs and drivers are dummy module interfaces which are used in integration testing. Stubs are pieces of code which have the same interface as the low-level functionality; they do not perform any real computation or data manipulation. Drivers are simple programs designed specifically for testing the calls to lower layers; test drivers permit generation of data in external form to be entered automatically into the system. Stubs are used in the top-down approach while drivers are used in the bottom-up approach.

-30-
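The stubs-and-drivers idea can be sketched in a few lines. The account names and layering here are invented for illustration; only the roles (stub stands in for an unfinished lower layer, driver feeds data into the layer under test) come from the slide:

```python
def fetch_balance_stub(account_id):
    """Stub: same interface as the real low-level data-access call, but it
    returns canned values instead of doing real computation or I/O."""
    return {"A1": 500, "A2": 0}.get(account_id, 0)

def format_statement(account_id, fetch_balance=fetch_balance_stub):
    """High-level unit under top-down integration: the lower layer it calls
    is not ready yet, so the stub stands in for it."""
    return f"Account {account_id}: balance {fetch_balance(account_id)}"

def driver(account_ids):
    """Driver: a simple program that feeds externally supplied data into
    the layer under test, as in bottom-up integration."""
    return [format_statement(acc) for acc in account_ids]

statements = driver(["A1", "A2"])
```

When the real lower layer is ready, it replaces the stub behind the same interface and the integration tests are re-run unchanged.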

Integration Testing Contd…
[Diagram: a banking system with modules CRM, HR, Loans, Financial Account Opening (based on C, C++ and VB) and Online Banking (Java based), illustrating three kinds of interfaces: intra-module interfaces between units within a module, inter-module interfaces between the various modules of a system, and an external interface to an external system.]

-31-

Integration Testing Contd…
Approach for testing interfaces:
Intra-Module – Test technique: White Box. Test environment setup: development environment. Test approach: identify the pre-process and post-process to validate the interface.
Inter-Module – Test technique: Black Box. Test environment setup: isolated development environment. Test approach: check for parameter / procedure called, database updation.
External – Test technique: Black Box. Test environment setup: simulate the live environment. Test approach: identify external / internal interfaces and validate against the external system.

-32-

Types Of Integration Testing
Big-bang Integration (non-incremental):
• All components are combined in advance
• Correction is difficult because isolation of causes is complicated
Incremental Integration: continuous testing of an application by constructing and testing small components.
• Top-Down Strategy: an incremental approach which can be done depth-first or breadth-first; stubs are used until the actual program is ready.
• Bottom-up Strategy: the process starts with low-level modules, where critical modules are built first; a cluster approach and test drivers are used; often works well in less structured applications.

-33-

System Testing
System testing is a black-box testing technique which is primarily aimed at end-to-end testing of the application.
• It is carried out by a non-development team
• System testing performs environment testing, including live/simulated user data
• The whole system, including the functional and non-functional requirements, is tested
• It helps in finding compatibility errors and performance limitations
[Diagram: a bank application (Open Account, Deposit Amount, Withdraw Amount, Close Account) moving through Unit Testing, Integration Testing, Sanity Testing, Regression Testing, Performance Testing and User Acceptance Testing.]

-34-

System Testing Contd…
Types Of System Testing: Functional Testing, Configuration Testing, Performance Testing, Installation Testing, Volume Testing, Disaster and Recovery Testing, Reliability Testing.
The scope of the session is restricted to the first three types of system testing.

-35-

System Testing Contd…
Functional Testing – Sanity Test
• A very basic, minimal number of tests to verify the product for feature / protocol compliance
• Typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort
Functional Testing – Regression Testing
• Regression testing refers to the continuous testing of an application for each new release
• Regression testing is done to ensure proper behavior of an application after fixes or modifications have been applied to the software or its environment, and that no additional defects are introduced by the fix
• The regression testing scope increases with new builds

-36-
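A regression suite can be sketched as recorded (input, expected) pairs that are re-run against every new build; an empty failure list means the change introduced no regression. The `interest` function and its cases are invented for illustration:

```python
# Hypothetical function under regression (illustrative only).
def interest(principal, rate_percent):
    """Simple interest for one period, rounded to cents."""
    return round(principal * rate_percent / 100.0, 2)

# Baseline behavior recorded from an earlier, accepted release.
REGRESSION_CASES = [
    ((1000, 5), 50.0),
    ((0, 5), 0.0),
    ((1234, 7.5), 92.55),
]

def run_regression(fn, cases):
    """Re-run every recorded case; return the list of failing cases."""
    failures = []
    for args, expected in cases:
        actual = fn(*args)
        if actual != expected:
            failures.append((args, expected, actual))
    return failures

failures = run_regression(interest, REGRESSION_CASES)
```

As the slide notes, the suite only grows: each new build adds cases for the features and fixes it ships, so automation quickly pays for itself here.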

System Testing Contd…
Performance Testing
• Number of concurrent users accessing at any point in a given time
• System's performance under a high volume of data
• Stress testing for systems which are being scaled up to larger environments or implemented for the first time
• Operational-intensive transactions (most frequently used transactions)
• Volume-intensive transactions (for both volume and stress testing)
Installation Testing
• Basic installation
• Installation of various configurations
• Installation on various platforms
• Regression testing of basic functionality

-37-
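The "concurrent users" bullet above can be sketched with threads: N simulated users invoke a transaction and their response times are collected for comparison against an agreed threshold. The transaction is a stand-in (`time.sleep`), not a real workload, and real load tests would use a dedicated tool rather than this toy harness:

```python
import threading
import time

def transaction():
    """Stand-in for an operational-intensive transaction (illustrative)."""
    time.sleep(0.01)
    return "ok"

def user(latencies, lock):
    """One simulated concurrent user: time a single transaction."""
    start = time.perf_counter()
    transaction()
    elapsed = time.perf_counter() - start
    with lock:
        latencies.append(elapsed)

latencies, lock = [], threading.Lock()
threads = [threading.Thread(target=user, args=(latencies, lock))
           for _ in range(20)]  # 20 concurrent users
for t in threads:
    t.start()
for t in threads:
    t.join()

# A simple pass/fail signal: worst observed response time.
max_latency = max(latencies)
```

The same skeleton scales the user count up for stress testing, or swaps the payload for a volume-intensive transaction.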

Acceptance Testing
Acceptance testing is one of the last phases of testing and is typically done at the customer's site. The focus is on a final verification of the required business function and flow of the system. For software developed under contract, acceptance testing involves evaluating the software against the acceptance criteria defined in the contract. For software not developed under contract, testers usually perform the tests, which ideally are derived from the User Requirements Specification to which the system should conform.
Alpha & Beta Testing
Alpha testing and beta testing are forms of acceptance testing, performed in the production environment:
• Alpha testing is performed by end users within the company but outside the development group
• Beta testing is performed by a sub-set of actual customers outside the company

-38-

When do you stop testing?
• When time runs out
• Depending on appropriate calculations derived from statistics available from test cases
• Target test coverage attained
• Predicted density of errors left drops below a threshold
• Certain number of errors found *
• Error detection rate drops below a threshold *
* Infosys Process Capability Baseline

-39-
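The "error detection rate drops below a threshold" criterion above can be checked mechanically from the defect log. The daily counts, window and threshold here are invented for illustration; real values would come from the project's own baseline:

```python
# Defects found on successive days of a test cycle (illustrative numbers).
defects_per_day = [9, 7, 4, 2, 1, 0]

def ready_to_stop(history, window=3, threshold=1.5):
    """True when the average detection rate over the last `window` days
    has fallen below `threshold` defects/day."""
    recent = history[-window:]
    return sum(recent) / len(recent) < threshold

stop = ready_to_stop(defects_per_day)
```

In practice this signal is combined with the other bullets (coverage attained, predicted residual density) rather than used alone.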

Pros & Cons of stopping Testing
Negative (testing should not stop):
• Show stoppers and critical bugs: when some components fail, testing cannot progress till they are corrected
• Error levels are on the higher side compared to expectations
• Too many errors found in a short duration of testing
• Key components are missing
• Test environment is unstable
Positive (testing can stop):
• Testers fail to find defects for long hours
• Defects found are too minor/cosmetic in nature

-40-

Test Automation Complex and time-consuming tests Tests requiring a great deal of precision Preferred for: Tests involving many simple. repetitive tests Tests involving many data combinations One-time Testing only tests peripheral devices assessment tests (look Not preferred for: Subjective and feel based) -41- .

Simple Vs Sophisticated Automation
[Chart: comparing the cost of simple and sophisticated automation, split into implementation cost and maintenance cost.]

-42-

Testing Tools -43- .

Categories of Testing Tools
• Functional / Regression
• Web Testing: testing the web sites
• Load / Stress: performance testing under high load / stress
• Data Generation: generation of test data
Benefits
• Reduces manpower and time
• More coverage within the same testing time
• Testing products which are very difficult to test manually, e.g. load tests
• Can look inside the software for memory leaks, redundant code, etc.

-44-

Test Tools (Client Server)
Source Code Testing: BoundsChecker (NuMega), Pure Coverage (Rational), Purify (Rational), JProbe (Sitraka Software), ATTOL Coverage (ATTOL Software)
Functional Testing: QA Run (Compuware), WinRunner / QTP (Mercury), Team Test (Rational), QA Center (Compuware)
Performance Testing: OptimizeIt (VM Gear), Load Runner (Mercury), QA Load (Compuware)

-45-

Test Tools (Web Applications)
Functional Testing: Rational Robot (Rational), Silk Test (Segue Software)
Performance Testing: Silk Performer (Segue Software), Web Load (Radview)
Link & HTML Testing: NetMechanic (Monte Sano), Linkbot (Tetranet), Site Inspector, Doctor HTML (Imageware)

-46-

Test Tools (Network & Security)
Security & Communication: SNMP Test Suite (Interworking Labs Inc.), Sniffer Pro (Network Associates Inc.), Lophtcrack, Saint, Satan, Sara, TCP Wrappers, TCP Dump (SwTech)

-47-

Going Forward… -48- .

Test Planning/Strategizing Includes
• Testing objectives and goals
• Test strategy/approach based on customer priorities
• Test environment (hardware, software, network, communication, etc.)
• Features to test, with priority/criticality
• Test deliverables
• Test procedure – activities and tools
• Test entry and exit criteria
• Test organization and scheduling
• Testing resources and infrastructure
• Test measurements/metrics

-49-

Test Planning/Strategizing Benefits
• Sets clear and common objectives
• Helps prioritize tests
• Facilitates technical tasks
• Helps improve coverage
• Provides structure to activities
• Improves communication
• Streamlines tasks, roles and responsibilities
• Improves test efficiency
• Improves test measurability

-50-

Test Optimization
• Categorize the entire gamut of testing into Sanity / Regression / Performance and Stress
• An efficient review process helps improve coverage
• Use tools to do the testing (both automation and simulators)
• Look out for automation if it helps (leverage 24 hours a day)
• Use a Traceability Matrix
• Tracking and reporting of bugs and defects – defining a process
• Revisit the test coverage and categorization on an ongoing basis
• A strategy needs to be formulated for mitigating risk
• A strategy for re-testing (essentially regression testing) also needs to be in place

-51-

Traits of a good Tester
• Destructive creativity
• Detective skills
• Appreciating the users' perspective
• Ability to adapt to and understand requirements change
• An eagerness to embrace new technologies
• Good communication skills
• Good problem-solving skills
• A skeptical, but not hostile, attitude
• A good eye for detail
• Good domain understanding

-52-

Questions? -53- .

Thank You!!
IVS-TRAINING
Please note that submission of Course and Instructor feedback is mandatory for availing attendance for the Course. Any doubts or suggestions for improvement can be forwarded to: IVS_TRAINING@infosys.com

-54-
