Manual Testing
Fault: A condition that causes the software to fail to perform its required function.
Error: The difference between the actual output & the expected output.
To discover defects.
To prevent the user from encountering problems.
To prove that the s/w has no defects.
To learn about the reliability of the software.
To ensure that product works as user expected.
To stay in business
To avoid being sued by customers
To detect defects early, which helps in reducing the cost of fixing those defects.
Testing is the process of creating, implementing & evaluating tests. Testing measures
software quality.
Testing can find faults. When they are removed software quality is improved.
Reviews mean re-verification. Reviews have been found to be extremely effective for
detecting defects, improving productivity & lowering costs. They provide good check
points for the management to study the progress of a particular project. Reviews are also a
good tool for ensuring quality control. In short, they have been found to be extremely
useful by a diverse set of people and have found their way into the standard management &
quality control practice of many institutions. Their use continues to grow.
Quality Assurance:
Quality assurance measures the quality of processes used to create a quality product.
Software QA involves the entire s/w development process: monitoring & improving the
process, making sure that any agreed-upon standards & procedures are followed, and
ensuring that problems are found and dealt with.
AREAS OF TESTING:
Equivalence Class
For each piece of the specification, generate one or more equivalence classes.
Label the classes as valid or invalid.
Generate one test case for each invalid equivalence class.
Generate test cases that cover as many valid equivalence classes as possible.
Eg: In LIC there are different types of policies.
Here we decide which customer comes under which policy & write TCs for valid & invalid classes.
Eg: In LIC,
When the user applies for type-5 insurance, the system asks for the age of the
customer. Here the age limit is greater than 40 yrs. & less than 60 yrs. (boundaries: 40 and 60).
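The rule above can be turned into boundary value test cases by probing just below, on, and just above each boundary. A minimal sketch, assuming a hypothetical `is_eligible` validator for the 40-60 age rule (the function name is not from the notes):

```python
# Hypothetical validator for the LIC type-5 rule quoted above:
# age must be greater than 40 and less than 60 (exclusive bounds).
def is_eligible(age):
    return 40 < age < 60

# Boundary value analysis: probe each boundary and its neighbours.
boundary_cases = {
    40: False,  # on the lower boundary: rejected
    41: True,   # just above the lower boundary: accepted
    59: True,   # just below the upper boundary: accepted
    60: False,  # on the upper boundary: rejected
}
for age, expected in boundary_cases.items():
    assert is_eligible(age) == expected
```

Defects cluster at boundaries (off-by-one mistakes such as `>=` instead of `>`), which is why these four values are worth more than many mid-range ones.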
Grey Box Testing:
This is a combination of both black box and white box testing. The tester should
have knowledge of both the internals and externals of the function: good knowledge of
white box testing & complete knowledge of black box testing.
Grey box testing is especially important with web & internet applications, because
the internet is built around loosely integrated components that connect via relatively well-
defined interfaces.
PHASES OF TESTING V MODEL
(V-model diagram: each development phase on the left arm is paired with a test phase on
the right arm: BRS with Acceptance Test, Design with Integration Test, Build with
System/Unit Test. Verification runs down the left arm; validation runs up the right arm.)
V MODEL
V stands for verification & validation. It is a suitable model for large-scale
companies to maintain the testing process. This model defines a co-existence relation
between the development process and the testing process.
PHASES ARE
1) Unit Testing
2) Integration Testing
3) System Testing
4) User Acceptance Testing
1) Unit Testing
The main goal is to test the internal logic of the module. In unit testing the tester
is supposed to check each and every micro function. All field-level validations
are expected to be tested at this stage of testing. In most cases the developer
will do this.
In unit testing both black box & white box testing are conducted, by
developers.
Depends on LLD
Follows white box testing techniques.
Basic path testing
Loop coverage
Program technique testing
Approach:
i. Equivalence Class
ii. Boundary value analysis
iii. Error guessing
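As a sketch of what unit testing a single micro function looks like, here is a hedged example using Python's `unittest`; the `validate_policy_number` field validation is hypothetical, not taken from the notes:

```python
import unittest

# Hypothetical field-level validation: a policy number must be a 6-digit string.
def validate_policy_number(value):
    return isinstance(value, str) and len(value) == 6 and value.isdigit()

class TestValidatePolicyNumber(unittest.TestCase):
    def test_accepts_valid_number(self):
        self.assertTrue(validate_policy_number("123456"))

    def test_rejects_wrong_length(self):
        self.assertFalse(validate_policy_number("123"))

    def test_rejects_non_digits(self):
        self.assertFalse(validate_policy_number("12a456"))
```

Run with `python -m unittest <file>`; each test exercises one micro function in isolation, which is the point of this phase.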
2) Integration Testing:
The primary objective of Integration Testing is to discover errors
in the interfaces between modules / sub-systems, e.g. between the
app server and the database server.
Here many unit-tested modules are combined into sub-systems. The
goal is to see whether the modules can be integrated properly. Follows white
box testing techniques to verify the coupling of the corresponding modules.
Approach
i. Top-down approach --- this is used for new systems.
ii. Bottom-up approach --- this is used for existing systems.
Top-down Approach:
Testing the main module without the sub-modules being ready is called the top-down
approach. A temporary program used in place of a sub-module is called a stub.
Bottom-up approach:
Testing the sub-modules without the main module being ready is called the bottom-up
approach. A temporary program used in place of the main module is called a driver.
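A minimal sketch of both ideas in Python (all names are hypothetical): a stub stands in for a missing sub-module during top-down testing, and a driver stands in for a missing main module during bottom-up testing.

```python
# --- Top-down: the main module is ready, the tax sub-module is not. ---
def tax_stub(amount):
    """Stub: temporary stand-in for the unfinished tax sub-module.
    Returns a fixed, predictable value instead of real logic."""
    return 10.0

def invoice_total(amount, tax_fn):
    """Main module under test; the sub-module is passed in."""
    return amount + tax_fn(amount)

assert invoice_total(100.0, tax_stub) == 110.0  # main module logic verified

# --- Bottom-up: the sub-module is ready, the main module is not. ---
def real_tax(amount):
    """Finished sub-module, tested before any main module exists."""
    return amount * 0.1

def driver():
    """Driver: temporary caller that exercises the sub-module directly."""
    return real_tax(200.0)

assert driver() == 20.0  # sub-module verified via the driver
```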
3) System Testing:
The primary objective of system testing is to discover errors when the system is
tested as a whole. System testing is also called End-to-End testing. The tester is expected
to test from login to logout, covering various business functionalities. Conducted by test
engineers; depends on the SRS.
Approach:
Identify the end-to-end business life cycle.
Design the test data.
Optimize the end-to-end business life cycle.
4) Acceptance testing:
Acceptance testing is done to get acceptance from the client. The client will
use the system against the business requirements, testing with the real-life
data of the client.
Approach:
Building a team with real-time users, functional users and developers.
Execution of business test cases.
WHAT IS A TEST CASE ?
Test cases are valuable because they are repeatable, reproducible under the
same/different environments.
F Functionality Testing
U Usability Testing
R Reliability Testing
R Regression Testing
P Performance Testing
S Scalability Testing
C Compatibility Testing
1) Functionality Testing
Testing Approach :
Equivalence class
Boundary value analysis
Error guessing
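Error guessing is driven by experience of the inputs that typically break software. A sketch, with a hypothetical `parse_quantity` handler (not from the notes):

```python
# Hypothetical defensive input handler for a quantity field.
def parse_quantity(raw):
    """Return a positive integer quantity, or None for any bad input."""
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return None
    return value if value > 0 else None

# Error guessing: inputs experience says are likely to cause failures.
assert parse_quantity("7") == 7        # normal input
assert parse_quantity("") is None      # empty string
assert parse_quantity(None) is None    # missing value
assert parse_quantity("abc") is None   # non-numeric content
assert parse_quantity("-1") is None    # negative quantity
assert parse_quantity("0") is None     # zero quantity
```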
2) Usability Testing:
To test the ease(comfort, facility) and user-friendliness of the system.
Approach:
Qualitative approach
i. Each and every function should be available from all the pages of the
site.
ii. User should be able to submit request within 4-5 actions.
iii. Confirmation message should be displayed for each submit.
Quantitative approach:
3) Reliability Testing
The objective is to find the mean time between failures / the time available under a specific
load pattern, and the mean time for recovery.
Eg:
23 hours/day availability & 1 hour for recovery (system).
City bank has 4 servers in each region; every 6 hrs. it switches servers.
Approach
4) Regression Testing
To check that new functionalities have been incorporated correctly without breaking
the existing functionalities.
The bugs need to be communicated and assigned to developers who can fix them. After
the problem is resolved, fixes should be re-tested, and a determination made regarding
requirements for regression testing, to check that fixes did not create problems elsewhere.
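The idea of re-running the old cases after a fix can be sketched like this (the discount function and its test data are hypothetical):

```python
# Hypothetical function after a fix: a new 'VIP20' discount code was added.
def discount(price, code):
    rates = {"SAVE10": 0.10, "VIP20": 0.20}
    return round(price * (1 - rates.get(code, 0.0)), 2)

# Existing cases: must still pass after the fix (the regression check)...
old_cases = [(100.0, "SAVE10", 90.0), (100.0, "UNKNOWN", 100.0)]
# ...plus the case that drove the new functionality.
new_cases = [(100.0, "VIP20", 80.0)]

for price, code, expected in old_cases + new_cases:
    assert discount(price, code) == expected
```

Keeping the old cases in the loop is the whole point: the new code must pass without failing anything that passed before.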
5) Performance Testing
Performance parameters;
Stress testing:
Finding the break point of the application: the max. no. of users that the application can
handle at the same time.
Approach:
RCQE
Repeatedly working on the same functionality.
Critical Query Execution.
To emulate peak load.
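Peak load can be emulated in miniature with threads. A sketch, assuming a hypothetical shared counter as the "functionality" being exercised by many simultaneous users:

```python
import threading

counter = 0
lock = threading.Lock()

def user_session(requests):
    """One simulated user repeatedly working on the same functionality."""
    global counter
    for _ in range(requests):
        with lock:  # the operation must remain correct under concurrent load
            counter += 1

USERS, REQUESTS_EACH = 50, 200
threads = [threading.Thread(target=user_session, args=(REQUESTS_EACH,))
           for _ in range(USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# At the break point, requests would start failing; here none are lost.
assert counter == USERS * REQUESTS_EACH  # 50 * 200 = 10000
```

Real stress tools drive the application itself rather than an in-process counter, but the shape is the same: many concurrent sessions, then a check that nothing was dropped at peak load.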
Volume testing:
Testing with the total load that the customer wants (not all at the same time). The load is
increased continuously until it reaches the customer's required load.
6) Scalability testing:
To find the maximum number of users the system can handle (the customer will give the
max. no.).
Classification:
Network scalability
Server scalability
Application scalability
7) Compatibility testing:
How a product will perform over a wide range of hardware, software & network
configurations, and to isolate the specific problems.
Approach: ET Approach.
Environment Selection :
Understanding the end users application environment.
Importance of selecting both old browser & new browser.
Selection of the operating system.
V model is the most suitable way to follow for deciding when to start writing test
cases and conduct testing.
Testing limitations:
Tester responsibilities :
Follow the test plans, scripts etc, as documented.
Report faults objectively and factually.
Check tests are correct before reporting s/w faults.
Assess risk objectively.
Prioritize what you report.
Communicate the truth.
We can't test everything. There is never enough time to do all the testing you would
like, so what testing should you do?
Prioritize tests so that, whenever you stop testing, you have done the best testing in the
time available.
Tips :
Possible ranking criteria (all risk based)
Test where a failure would be most severe
Test where failures would be most visible.
Take the help of the customer in understanding what is most important to him.
What is most critical to the customers business.
Areas changed most often.
Areas with most problems in the past.
Most complex areas, or technically critical.
Software :
Before starting the analysis we first check the feasibility of the project/work/system.
If it is found feasible, we proceed to the SDLC phases.
Finance feasibility
Cost feasibility
Resource feasibility
Ability to accept
Analysis
Design
Coding
Testing
Analysis :
i. Requirements analysis is done to understand the problem the software system
is to solve.
ii. Understanding the requirement of the system is a major task.
iii. Analysis focuses on identifying what is needed from the system.
iv. The main goal of the requirements specification is to produce the SRS document.
v. Once understood, the requirements must be specified in the document.
Design :
i. Purpose of the design is to plan a solution of the problem specified by the
requirement documents.
ii. The first step of this phase is moving from the problem domain to the solution domain.
iii. The output of this phase is the design document.
iv. This document is similar to a blueprint.
Coding:
i. Once the design is complete, most of the major decisions about the system
have been made.
ii. The goal of the coding phase is to translate the design into code.
iii. Coding affects both testing & maintenance. Well-written code can reduce
the testing & maintenance effort, because the testing and maintenance costs of
s/w are much higher than the coding cost.
So the goal of coding should be to reduce the testing & maintenance effort.
Testing :
i. Testing is the major quality control measure used during s/w development. Its
basic function is to detect errors in the s/w.
ii. After coding, computer programs are available that can be executed for
testing purposes; different levels of testing are used.
iii. The starting point of testing is unit testing. A module is tested separately. This
is done by the coder himself simultaneously along with the coding of the
module.
iv. After this, modules are gradually integrated into subsystems, which are then
integrated to form the entire system. Here we do integration tests.
v. System testing: the system is tested against the requirements to see if all the
requirements are met as specified by the documents.
vi. Acceptance testing : client side on the real-life data of the client.
Drawback:
Once the requirements are frozen, they cannot be changed, i.e. changes cannot be made
after the requirements are frozen.
Uses:
It is well suited for routine types of projects where the requirements are well
understood, & for small projects.
2. Prototype Model:
In this model the requirements are not freeze before any design or can proceed.
The prototype is developed based on the currently known requirements.
It is a sample of how the actual system will look.
(Diagram: Requirement Analysis, Design, Code, Test in sequence.)
3. Iterative Model:
In this model we can make changes at any level, but all the four phases of
SDLC will take place again.
(Diagram: the Analysis, Design, Code, Test cycle repeated for each iteration.)
4. Spiral Model :
In this model system is divided into modules and each module follows phases
of SDLC. It is good & successful model.
(Diagram: each module, e.g. Module 2 and Module 3, passes through Analysis, Design, Code & Test.)
TEST LIFE CYCLE(TLC)
TLC PHASES:
System study
Scope/Approach/Estimation
Defect Handling
GAP Analysis
1. System study:
Domain:
In domain, there may be different types of domains like banking,
finance, Insurance, Marketing, Real-time, ERP, SEIBEL,
Manufacturing etc.
2. Software:
3. Scope/Approach/Estimation:
Scope covers what is to be tested and what is not to be tested.
Eg: A test case document contains the following columns:
Module
TC no.
Pre-condition
Description
Expected output
Actual output
Status
Remarks
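The columns above can be carried as a simple record; a sketch with hypothetical values, deriving Status by comparing expected and actual output:

```python
# Hypothetical test case following the column layout listed above.
test_case = {
    "Module": "Login",
    "TC no.": "TC_001",
    "Pre-condition": "User account exists",
    "Description": "Login with valid credentials",
    "Expected output": "Home page is displayed",
    "Actual output": "Home page is displayed",
    "Status": "",
    "Remarks": "",
}

# Status is derived: Pass when the actual output matches the expected output.
test_case["Status"] = (
    "Pass" if test_case["Actual output"] == test_case["Expected output"] else "Fail"
)
assert test_case["Status"] == "Pass"
```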
TYPES OF REVIEWS:
Peer review: review by a peer at the same level.
Team lead review
Team Manager review
REVIEW PROCESS:
iii. Output:
Raise the defect
Take a screen shot & save it.
9. Defect Handling :
Submit to developer
GAP Analysis:
Finding the difference between the client requirement & the application
developed.
Deliverables:
Test plan
Test scenarios
Defect reports
BRs Vs SRs.
SRs Vs Test Case.
TC vs. Defect.
Defect is open / closed.
What is Test Plan : A software project test plan is a document that describes
the objectives, scope, approach & focus of a software testing effort. The completed
document will help people outside the test group understand the why & how of product
validation.
WHAT IS DEFECT:
Anyone who has been involved in the software development lifecycle, or who is using the
software, can report a defect. In most cases defects are reported by the testing team.
A short list of people expected to report bugs.
Testers / QA engineers.
Developers.
Technical support.
End users.
TYPES OF DEFECTS:
Cosmetic flaw
Data corruption
Data loss
Documentation issue.
Incorrect operation.
Installation problem.
Missing feature.
Slow performance
Unexpected behavior
Unfriendly behavior
Priority:
The relative importance of the defect: how fast the developer has to take up the defect.
The general rule for fixing defects is that it depends on the severity: all high-severity
defects should be fixed first.
This may not be the same in all cases: sometimes even though the severity of a bug is
high, it may not be taken up as high priority.
At the same time, a low-severity bug may be considered high priority.
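That severity-vs-priority relationship can be sketched as a small rule set (the rules and labels here are illustrative assumptions, not a standard):

```python
def assign_priority(severity, business_impact):
    """severity / business_impact: 'low', 'medium' or 'high' (illustrative)."""
    if business_impact == "high":
        return "high"    # e.g. cosmetic flaw on the home page: low severity, high priority
    if severity == "high" and business_impact == "low":
        return "medium"  # severe defect, but in a rarely used area
    return severity      # default: priority follows severity

assert assign_priority("high", "medium") == "high"  # high severity fixed first
assert assign_priority("low", "high") == "high"     # low severity, high priority
assert assign_priority("high", "low") == "medium"   # high severity, lower priority
```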
2. BLACK BOX TESTING:
Not based on any knowledge of internal design or code. Tests are based on
requirements and functionality.
3. INTEGRATION TESTING:
4. FUNCTIONAL TESTING:
Black box type of testing. This type of testing should be done by testers. This
does not mean that the programmers should not check that their code works
before releasing it.
5. REGRESSION TESTING:
6. SYSTEM TESTING:
Black box type testing that is based on over all requirements specifications.
Covers all combined parts of a system.
7. ACCEPTANCE TESTING:
8. RECOVERY TESTING:
Testing how well a system recovers from crashes, hardware failures or other
catastrophic (sudden calamity) problems.
9. SECURITY TESTING:
How well the system protects against unauthorized internal or external access.
10.COMPATABILITY TESTING:
11.ALPHA TESTING:
12.BETA TESTING:
Testing when development and testing are essentially completed and final
bugs and problems need to be found before final release. Typically done by end-
users or others not by programmers or testers.
13.SANITY TESTING:
This is done before full testing begins: checking whether the application is stable,
i.e. whether the build released by the development team is good enough to write test
cases for and to conduct complete testing on.
14.SMOKE TESTING:
After testing, checking whether the major & medium or critical issues are closed or not.
15.MONKEY TESTING:
Testing like a monkey: no proper approach, taking any function and testing it.
Coverage of the main activities during testing is called monkey testing (e.g. when
given only one day for testing).
16.MUTANT TESTING:
Here we deliberately inject defects into the application and test whether they are detected.
An approach for integration testing, checking for errors between modules or sub-modules.
19.AD-HOC TESTING:
Doing a short cut way, does not following a sequential order mentioned in the
test cases or test plan.
20.PATH TESTING:
SOFTWARE QUALITY:
BRS:
It specifies the needs of the customer.
It is the total business logic document.
SRS:
It specifies the functional specifications of the system to be developed.
HLD:
High level design document.
It specifies interconnection of modules.
LLD:
It specifies Internal logic of sub-modules.
TESTING TEAM:
Quality Control
Quality Analyst
Test Manager
Test Lead
Test Engineers
VERIFICATION:
Typically involves reviews and meetings to evaluate (estimate, calculate) documents,
plans, code, requirements and specifications. This can be done with checklists,
issue lists, walkthroughs & inspection meetings.
VALIDATION:
Typically involves actual testing and takes place after verifications are completed.
SEVERITY:
PRIORITY:
Now we are using the V model, and we also include some other methods, like
prototype and spiral, in a single application.
Quality s/w is reasonably bug-free, delivered on time and within budget, meets
requirements and/or expectations, and is maintainable.
SEI:
Software Engineering Institute.
Initiated by the U.S. defense department to help improve software development
processes.
CMM:
Capability Maturity Model: a model developed at the SEI for judging the maturity of an
organization's software processes.
A good test engineer has a "test to break" attitude (approach, manner), an ability to
take the point of view of the customer, a strong desire for quality, and attention to detail.
A test case is a document that describes an input action or event and an expected
response, to determine if a feature of an application is working correctly.
When to stop testing: this can be difficult to determine. Common factors in deciding when to stop are
Deadlines (release deadlines, testing deadlines, etc.)
Test cases completed with a certain percentage passed
Test budget depleted (used up)
Bug rate falls below a certain level
Beta or alpha testing period ends.
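Those factors can be combined into a simple stop-decision sketch (the thresholds are illustrative assumptions, not standard values):

```python
def can_stop_testing(pass_rate, budget_left, bugs_per_day, deadline_reached):
    """Combine the common stop factors; any hard limit ends testing."""
    return (
        deadline_reached                               # release/testing deadline hit
        or budget_left <= 0                            # test budget depleted
        or (pass_rate >= 0.95 and bugs_per_day < 1.0)  # quality criteria met
    )

assert can_stop_testing(0.97, 5000, 0.5, False)      # quality criteria met
assert not can_stop_testing(0.80, 5000, 4.0, False)  # keep testing
assert can_stop_testing(0.80, 0, 4.0, False)         # budget depleted
```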
PRODUCT:
Developing s/w for the general market, without interaction with a specific client
before the product release.
PROJECT:
Developing a product based on the client needs or requirements.
WHAT IS A TEST PROCEDURE ?
It covers version control: the process used to control, co-ordinate and track the
requirement documentation, the problems faced, change requests, the design, and the
tools to be used; also the changes made, and who made them.
TESTING TECHNIQUE:
TESTING METHODOLOGIES: