Manual Testing
Software Quality:
SQA (Software Quality Assurance) comprises the concepts a company follows to develop software. An SQA team is responsible for monitoring & measuring the strength of the development processes.
Software Project:
A set of problems assigned by the client, solved by software people through the process of software engineering, is called a software project. In short: the problem, the people & the process make up the project. A software-related problem solved by software engineers through the software engineering process is a software project.
Requirements Gathering:
In this stage, the Business Analyst studies the requirements of the client / customer & prepares the Business Requirement Specification (BRS) document.
Analysis:
In this stage, the Sr. Analyst prepares the Software Requirement Specification (S/w RS) document with respect to the corresponding BRS document. This document consists of two sub-documents: the System Requirement Specification (SRS) & the Functional Requirement Specification (FRS). The SRS contains details about software & hardware requirements. The FRS contains details about the functionality to be used in the project.
Designing:
In the designing phase, designers create two kinds of documents: the High Level Document (HLD) & the Low Level Document (LLD). The HLD consists of the main modules of the project from root to leaf; it is accompanied by multiple LLDs. An LLD consists of the sub-modules of a main module along with data flow diagrams, ER diagrams, etc. These documents are prepared by technical support people or designers, called internal designers.
Testing:
(Figure: V-Model, mapping development stages such as coding to testing stages: development testing, port testing, test software changes during maintenance & test efficiency.)
In the above refined form of the V-Model, small & medium scale organizations maintain a separate testing team only for the Functional & System testing stage.
In the Analysis stage, analysts prepare the requirement documents, & after completing document preparation they conduct reviews on the documents for completeness & correctness. This review focuses on the below factors:
After completion of the Analysis phase & its reviews, the project-level designers start the logical design of the application in terms of external & internal design (HLDs & LLDs). In this stage, they conduct reviews for the completeness & correctness of the design documents. This review focuses on the below factors:
After completion of the design & its reviews, software programmers start coding, converting the logical design into the physical construction of the software. During this coding stage, programmers conduct Unit Testing through a set of White box testing techniques. Unit Testing is also known as Module / Component / Program / Micro testing.
3) Mutation Testing
Mutation means a change in the program. White box testers perform small changes in the program to estimate the test coverage on that program. Mutation testing can decide whether the test coverage is adequate or not.
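The idea can be sketched in a few lines of Python. All names here are illustrative (real tools such as mutmut generate mutants automatically): if a deliberately mutated program still passes the test suite, the suite's coverage is too weak.

```python
# A minimal sketch of mutation testing (illustrative code, not a real tool).

def add(a, b):
    """Original program under test."""
    return a + b

def add_mutant(a, b):
    """Mutant: the '+' operator deliberately changed to '-'."""
    return a - b

def weak_suite(func):
    # Only one data point; too weak to notice the mutation.
    return func(0, 0) == 0

def better_suite(func):
    return func(0, 0) == 0 and func(2, 3) == 5

assert weak_suite(add) and weak_suite(add_mutant)          # mutant survives
assert better_suite(add) and not better_suite(add_mutant)  # mutant killed
```

A surviving mutant tells the tester which behavior the suite never exercises; a killed mutant is evidence the coverage is adequate for that change.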
4) Integration Testing
After completion of the development & testing of dependent modules, programmers combine them to form a system. During this integration, they conduct Integration testing on the integrated modules w.r.t. the HLD.
a) Top-Down Approach
(Figure: the main module is integrated & tested first; a stub temporarily stands in for an under-construction sub-module while Sub 1 & Sub 2 are connected.)
b) Bottom-Up Approach
(Figure: the sub-modules Sub 1 & Sub 2 are integrated & tested first; a driver temporarily stands in for the under-construction main module.)
c) Hybrid (Sandwich) Approach
(Figure: combines both of the above; a driver & a stub are used together while integrating Main, Sub 1, Sub 2 & Sub 3.)
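Stubs & drivers can be sketched in Python (the module names are illustrative, not from any real project):

```python
# A minimal sketch of stubs (top-down) & drivers (bottom-up).

def sub1(x):
    """A finished sub-module."""
    return x * 2

def sub2_stub(x):
    """Stub: a dummy stand-in for the still-unfinished Sub 2."""
    return 0

def main(x, sub2=sub2_stub):
    """Top-down: the real Main is tested with Sub 2 stubbed out."""
    return sub1(x) + sub2(x)

def driver():
    """Bottom-up: a driver calls the finished Sub 1 before Main exists."""
    return sub1(5)

print(main(3))    # exercises Main + Sub 1 with a stub for Sub 2 -> 6
print(driver())   # exercises Sub 1 through the driver -> 10
```

The stub returns a fixed dummy value so Main can be integrated early; the driver is a throwaway caller so Sub 1 can be integrated before its caller exists.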
* Build: The fully integrated set of all modules, in an executable (*.exe) form, is called a build.
1) Usability Testing
2) Functional Testing
3) Performance Testing
4) Security Testing
Of the above, 1 & 2 are core-level tests & 3 & 4 are advanced-level tests.
1) Usability Testing
In general, the testing team (TT) starts test execution with Usability testing. During this test, the testing team validates the user-friendliness of the screens of the build. During Usability testing, the TT applies two types of sub-tests:
2) Functional Testing
The major part of Black box testing is Functional testing. During this test, the testing team concentrates on whether the build "meets customer requirements". Functional testing is classified into the below sub-tests.
During this test, Test Engineers validate the correctness of every functionality in terms of the below coverages.
c) Recovery Testing
It is also known as Reliability testing. During this test, the testing team validates whether the application changes from an abnormal state back to a normal state or not.
d) Compatibility Testing
It is also known as Portability testing. During this test, the testing team validates whether the application build runs on the customer's expected platforms or not. During this test, Test Engineers mostly find backward compatibility issues.
Forward compatibility -> the application is ready to run, but the operating system does not support it.
Backward compatibility -> the operating system supports it, but the application has some internal coding problems that prevent it from running on that operating system.
e) Configuration Testing
It is also known as Hardware compatibility testing. During this test, the testing team validates whether the application build supports hardware devices of different technologies or not.
f) Inter-Systems Testing
During this test, the testing team validates whether the application build co-exists with other existing software or not, & also tests whether any deadlock situation occurs.
g) Installation Testing
During this test, the testing team installs the application build, along with its supporting software, onto customer-site-like configured systems. During this test, the testing team observes the below factors:
Setup program execution to start the installation
Easy interface during installation
Amount of disk space occupied after installation
h) Parallel / Comparative Testing
During this test, the testing team compares the application build with competitive products in the market.
i) Sanitation / Garbage Testing
During this test, the testing team tries to find extra features in the application build w.r.t. the customer requirements.
* Defects
During these tests, the testing team reports defects to the developers in terms of the below categories:
j) Re-testing
Re-testing is the re-execution of a test with multiple test data to validate a function. E.g., to validate multiplication, Test Engineers use different combinations of input in terms of min, max, -ve, +ve, zero, int, float, etc.
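The multiplication example above can be sketched as a single check re-executed over a table of data combinations (the function & data are illustrative):

```python
# Re-testing sketch: one multiplication check re-executed with many
# data combinations (min/max, negative, positive, zero, int, float).

def multiply(a, b):
    return a * b

test_data = [
    (0, 5, 0),              # zero
    (-3, 4, -12),           # negative
    (2, 3, 6),              # positive integers
    (2.5, 4, 10.0),         # float
    (10**9, 2, 2 * 10**9),  # a large value standing in for "max"
]

for a, b, expected in test_data:
    assert multiply(a, b) == expected, f"mismatch for {a} x {b}"
print("all", len(test_data), "data combinations passed")
```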
k) Regression Testing
Regression testing is the re-execution of tests on a modified build to ensure that the bug fixes work & that no side effects have occurred. Test Engineers conduct this test using automation.
l) Error, Defect & Bug
A mistake in the code is an Error. Due to errors in coding, Test Engineers get mismatches in the application build, called Defects. If a defect is accepted by the developers to be solved, it is called a Bug.
Testing Documents
Test Policy
Test Strategy
Test Methodology
Test Plan
Test Cases
Test Procedure
Test Script
Defect Report
The above figure shows the various levels of documents prepared during project testing.
Test Policy is documented by Quality Control. Test Strategy & Test Methodology are
documented by Quality Analyst or Project Manager. Test Plan, Test Cases, Test Procedure,
Test Script & Defect Report are documented by Quality Assurance Engineers or Test
Engineers.
Test Policy & Test Strategy are Company Level Documents. Test Methodology, Test
Plan, Test Cases, Test Procedure, Test Script, Defect Report & Final Test Summary Report
are Project Level Documents.
1) TEST POLICY:
This document is developed by Quality Control people (Management). In this document, Quality Control defines the "Testing Objective".
Testing Standards : one defect per 250 lines of code or per 10 FP (Function Points)
Signature: (C.E.O)
TTM: Testing Team Measurements (how much testing is over & how much is yet to be completed)
PCM: Process Capability Measurements (learnings carried from old projects to upcoming projects)
2) TEST STRATEGY:
This is a company-level document, developed by Quality Analyst or Project Manager category people; it defines the "Testing Approach".
Components:
Testing Issues:
3) TEST METHODOLOGY:
PET Process:
A process involving experts, tools & techniques. It is a refined form of the V-Model. It defines the mapping between development & testing stages. In this model, organizations maintain a separate team for Functional & System testing, & the remaining stages of testing are done by development people. This model was developed at HCL & is recognized by the QA Forum of India.
TESTING PROCESS
(Figure: Level-0: Sanity / Smoke / Tester Acceptance Test / Build Verification Test -> Test Automation (if selected, otherwise manual execution) -> test cycles -> Test Closure -> Level-3: Final Regression / Release Testing / Pre-Acceptance / Post-Mortem testing -> Sign Off.)
4) TEST PLANNING:
After finalization of the possible tests for the current project, Test Lead category people concentrate on preparing the test plan document, to define work allocation in terms of what, who, when & how to test. To prepare the test plan document, the test plan author follows the below approach:
(Figure: the development documents & the Test Responsibility Matrix feed into: Team Formation -> Identify Tactical Risks -> Prepare Test Plan -> Review Test Plan -> System Test Plan.)
1] Team Formation:
In general, the test planning process starts with testing team formation. To define a testing team, the test plan author depends on the below factors:
1. Availability of testers
2. Test duration
3. Availability of test environment resources
2] Identify Tactical Risks:
After testing team formation, the plan author analyses the possible risks & their mitigations (e.g., falling back to ad hoc testing).
# Risk 7: Lack of communication between Test Engineers -> Test team, & Test team -> Development team
After completion of testing team formation & risk analysis, the test plan author concentrates on the Test Plan document in IEEE format.
The above (3), (4) & (5) decide which modules are to be tested -> What to test?
12) Staff & Training: names of the selected Test Engineers & the training requirements for them
13) Responsibilities: work allocation to every member of the team (dependable modules are given to a single Test Engineer)
14) Schedule: dates & times of testing the modules
The above (14) specifies -> When to test?
15) Risks & Mitigations: possible testing-level risks & the solutions to overcome them
16) Approvals: signatures of the test plan author & Project Manager / Quality Analyst
After completion of the plan document preparation, the test plan author conducts a review for completeness & correctness. In this review, the plan author follows the below coverage analysis:
5) TEST DESIGNING:
(Figure: BRS -> Use Cases -> Test Cases; S/w RS functional specifications -> Functional Test Cases; HLDs, LLDs & coding -> *.exe build.)
From the above model, Test Engineers prepare test cases depending on the corresponding use cases, & every test case defines a test condition to be applied.
To prepare test cases, Test Engineers study the use cases with the below approach:
Steps:
Use Case I:
A login process takes a user id & password to validate users. During this validation, the login process allows a user id that is alpha-numeric & from 4 to 16 characters long, & a password in lowercase alphabets from 4 to 8 characters long.
Case study:
Test Case 1) Successful entry of user id
BVA (size): min = 4 characters (pass); max = 16 characters (pass); min-1 = 3 characters (fail); max+1 = 17 characters (fail)
ECP (type): valid - alphabets & numbers; invalid - special characters
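The BVA & ECP checks for this login use case can be sketched in Python. The validator functions below are illustrative stand-ins for the real login screen, written directly from the size & type rules in the use case:

```python
# A sketch of BVA (size) & ECP (type) for the login use case:
# user id is alpha-numeric, 4-16 chars; password is lowercase, 4-8 chars.
import re

def valid_user_id(uid):
    return re.fullmatch(r"[A-Za-z0-9]{4,16}", uid) is not None

def valid_password(pwd):
    return re.fullmatch(r"[a-z]{4,8}", pwd) is not None

# BVA (size) on the user id: both boundaries pass, their neighbours fail.
assert valid_user_id("a" * 4) and valid_user_id("a" * 16)
assert not valid_user_id("a" * 3) and not valid_user_id("a" * 17)

# ECP (type): alpha-numerics are the valid class, specials the invalid one.
assert valid_user_id("user1234")
assert not valid_user_id("user@123")
assert valid_password("secret") and not valid_password("Secret12")
```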
During test design, Test Engineers write the list of test cases in IEEE format:
01) Test Case ID: a unique number or name
02) Test Case Name: the name of the test condition to be tested
03) Feature to be tested: module / function / feature
04) Test Suit ID: the batch ID in which this case is a member
05) Priority: the importance of the test case {Low, Med, High}
(The step-by-step test procedure table also carries Actual, Result & Comments columns, which are filled in during test execution.)
11) Test case pass or fail criteria: when this case is treated as pass or fail
Note: Test Engineers follow the list of test cases along with their step-by-step procedures only.
Example 1:
Prepare a test procedure for the below test case: "Successful file save operation in Notepad".
b) Input Domain based Test Case design (E-R diagrams / Data Models)
In general, Test Engineers prepare most test cases depending on the use cases or functional requirements in the S/w RS. These functional specifications provide functional descriptions with input, output & process, but they do not provide information about the size & type of the input objects. To collect this type of information, Test Engineers study the data model (E-R diagram) of the responsible modules. During the data model study, Test Engineers follow the below approach:
Steps:
(Example data model fields: A/C No - critical; A/C Name - non-critical; Balance - critical; Address - non-critical.)
front-end as well
6) Accuracy of the data in the database as a result of external factors, e.g. file attachments
7) Meaningful help messages (manual support testing)
1) BR based coverage
2) Use Cases based coverage
3) Data model based coverage
4) User Interface based coverage
5) TRM based coverage
At the end of this review, the Test Lead prepares the Requirements Traceability Matrix or Requirements Validation Matrix (RTM / RVM).
The RTM / RVM defines the mapping between the customer requirements & the test cases prepared to validate those requirements.
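The requirement-to-test-case mapping can be sketched as a simple dictionary (all requirement & test case IDs here are illustrative), which also makes it easy to spot uncovered requirements:

```python
# A minimal sketch of an RTM / RVM as a mapping from requirement IDs
# to the test cases that validate them.
rtm = {
    "BR-01 login":  ["TC-01", "TC-02", "TC-03"],
    "BR-02 search": ["TC-10", "TC-11"],
    "BR-03 report": [],   # requirement with no covering test case yet
}

uncovered = [req for req, cases in rtm.items() if not cases]
print("requirements without coverage:", uncovered)  # -> ['BR-03 report']
```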
6) TEST EXECUTION:
After completion of test case selection & review, the testing team concentrates on the build release from the development side & test execution on that build.
(Figure: the development team releases the initial build from the build server to the testing environment over FTP (File Transfer Protocol); on a stable build, testers run Level-0 Sanity / Smoke, then the test cycles, up to Level-3 Final Regression.)
During test execution, Test Engineers receive modified builds from the development team. To distinguish old & new builds, the development team maintains a unique version number system that is understandable to the testing team. For this version controlling, developers use version control tools (e.g., Visual SourceSafe).
During this Sanity testing, the Test Engineer observes the below factors on that build:
1) Understandable
2) Operable
3) Observable
4) Consistency
5) Controllable
6) Simplicity
7) Maintainable
8) Automation
Because of the above 8 testability factors, Sanity testing is also known as Testability testing / Octangle testing.
e) Test Automation:
If test automation is possible, then the testing team concentrates on test script creation using the corresponding testing tools. Every test script consists of navigation statements along with the required checkpoints.
(Figure: stable build -> test automation (selective automation); during execution each test is marked Pass, Fail, Skip, Blocked, or Partial Pass / Partial Fail.)
Case I:
If the development team resolves a bug whose severity is high, Test Engineers re-execute all P0, all P1 & carefully selected P2 test cases on the modified build.
Case II:
If the bug severity is medium, then all P0, carefully selected P1 & some P2 test cases.
Case III:
If the bug severity is low, then some P0, P1 & P2 test cases.
Case IV:
If the development team releases a modified build due to sudden changes in the project requirements, then Test Engineers re-execute all P0, all P1 & carefully selected P2 test cases w.r.t. that requirement modification.
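Cases I-III above can be sketched as a selection rule. "Carefully selected" is a tester's judgement call; the list slices below only stand in for it, and all IDs are illustrative:

```python
# A sketch of severity-based regression test selection (Cases I-III).

def select_regression_cases(severity, suites):
    """suites: dict mapping priority ('P0'/'P1'/'P2') -> test case IDs."""
    if severity == "high":     # Case I: all P0, all P1, selected P2
        return suites["P0"] + suites["P1"] + suites["P2"][:1]
    if severity == "medium":   # Case II: all P0, selected P1, some P2
        return suites["P0"] + suites["P1"][:1] + suites["P2"][:1]
    # Case III (low): some cases from every priority bucket
    return suites["P0"][:1] + suites["P1"][:1] + suites["P2"][:1]

suites = {"P0": ["t1", "t2"], "P1": ["t3", "t4"], "P2": ["t5", "t6"]}
print(select_regression_cases("high", suites))  # -> ['t1', 't2', 't3', 't4', 't5']
```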
h) Level – 3 (Final Regression / Pre-Acceptance testing)
(Figure: gather the regression requirements for the final regression cycle.)
7) TEST REPORTING:
(Figure: test reporting happens throughout the Level-0 to Level-3 test execution stages.)
15) Suggested fix (optional): the tester tries to provide a suggestion to solve this defect
16) Fixed by: PM or Team Lead
17) Resolved by: the developer's name
18) Resolved on: the date of solving by the developers
19) Resolution type: (see the types below)
20) Approved by: signature of the Project Manager (PM)
Defect Age: The time gap between “reported on” & “resolved on”
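Defect age is just a date difference, sketched below with illustrative dates:

```python
# Defect age: the gap between "reported on" & "resolved on".
from datetime import date

reported_on = date(2024, 3, 1)
resolved_on = date(2024, 3, 9)
defect_age = (resolved_on - reported_on).days
print("defect age:", defect_age, "days")  # -> defect age: 8 days
```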
Defect submission process:
(Figure: Test Engineer -> Test Lead -> development Team Lead -> Developer; if a high-severity defect is rejected, it is sent back to the testing team for negotiation.)
After receiving the defect report from the testers, the developers review the defect & send a resolution type back to the tester as a reply:
9) No plan to fix it: neither accepted nor rejected, but the developers want extra time before fixing
10) Fixed: the developers accepted & resolved it
11) Fixed indirectly: accepted, but not to be resolved in this version (deferred)
12) User misunderstanding: needs extra negotiation between the testing & development teams
Types of defects:
01) User Interface bugs (low severity):
1) Spelling mistakes (high priority)
2) Improper alignment (low priority)
02) Boundary related bugs (medium severity)
1) Doesn't allow the valid type (high priority)
2) Allows the invalid type also (low priority)
03) Error handling bugs (medium severity)
1) Doesn't provide an error message window (high priority)
2) Improper meaning of the error message (low priority)
04) Calculations bugs (high severity)
1) Final output is wrong (low priority)
2) Dependent results are wrong (high priority)
05) Race condition bugs (high severity)
1) Dead lock (high priority)
2) Improper order of services (low priority)
06) Load conditions bugs (high severity)
1) Doesn't allow multiple users to access / operate (high priority)
2) Doesn't support the customer-expected load (low priority)
07) Hardware bugs (high severity)
1) Doesn't handle the device (high priority)
2) Wrong output from the device (low priority)
08) ID control bugs (medium severity)
1) Logo missing, wrong logo, version number mistakes, copyright window missing, developers' names missing, tester names missing
09) Version control bugs (medium severity)
1) Differences between two consecutive build versions
10) Source bugs (medium severity)
1) Mistake in help documents – Manual support
8) TEST CLOSURE:
After completion of all possible test cycle executions, the Test Lead conducts a review to estimate the completeness & correctness of the testing. In this review, the Test Lead checks the below factors with the Test Engineers:
1) Coverage Analysis
a) BR based coverage
b) Use Cases based coverage
c) Data model based coverage
d) User Interface based coverage
e) TRM based coverage
2) Bug density
a) Module A has 20% of the total bugs found
b) Module B has 20% of the total bugs found
c) Module C has 40% of the total bugs found
d) Module D has 20% of the total bugs found
After completion of User Acceptance testing & the resulting modifications, the Test Lead concentrates on creating the final Test Summary Report. It is a part of the Software Release Note (S/w RN). This final test summary report consists of the below details:
• Bug Description
• Feature
• Found By
(Case study figure: on Windows 2000, a client connects through a DSN to the database on the local host.)
This product maintains a default administrator who creates new users, & every valid user searches data in the existing database using a login & search keys.
FUNCTIONAL POINTS: -
→ Last name only
→ Last name with first name
→ Last name with first name & d-o-b
→ Last name with first name & age
→ Customer id only
The "search records" window consists of "start search" & "stop search" buttons.
Allows the last name as full or partial
Allows the customer id as full or partial with * as a wild card
(Example: a customer id entered as "2286*" or "86*" matches records by partial id, with * as the wild card.)
Test Responsibility Matrix (applicable quality factors):
Access control: √
Audit trail: ×
Continuity of processing: √
Correctness: √
Coupling: ×
Data integrity: √
Ease of use: √
Ease of operation: √
Reliability: √
Portable: ×
Performance: ×
Service levels: √
Maintainable: √
Methodology: √
Note: Test Responsibility Matrix of size 11 x 1 (11 by 1)
→ Installation testing
→ Security testing
→ Compliance testing
7. Entry criteria: -
→ Are the necessary documents available?
→ Is the "X" product ready for release from the developers?
→ Is the supporting database available to search the required records?
→ Is the test environment ready?
8. Suspension criteria:
→ A database disconnection may require suspension of testing.
→ Suspension of testing is mandatory when the record-search process goes into an infinite loop.
→ If the admin fails to create a new user, a decision can be made to continue testing the "searching records" module using the admin or other existing valid users.
9. Exit Criteria: -
→ Ensure that the "X" product provides the required services.
→ Ensure that all test documents are complete & up to date.
→ Ensure that all high-severity bugs are resolved.
14. Responsibilities:
(Table columns: Document / Report, Responsibility, Effort)
15. Schedule:
(Table columns: Task, Effort, Start date, End date)
16. Risks & Mitigations:
2. Lack of time → mitigation: over-time work
17. APPROVALS: -
Signature of the PM & signature of the Test Lead.

Test case 1:
BVA (size)
Test case 6: Unsuccessful selection of this option because the current user is not an admin.
Test case 10: Unsuccessful new-user creation because the given user id is not unique.
Test case 11: Successful closing of the new-user-creation window using Cancel (after entering the user id & after entering the password).
ECP (type): valid - a-z, A-Z, 0-9; invalid - special characters

BVA (range) & ECP (type) for the d-o-b field:
Day: min → 01, max → 31; ECP (type): valid - 0-9; invalid - a-z, A-Z, special characters
Month: min → 01, max → 12
Year: min → 00, max → 99

BVA (range): min → 01, max → 22; ECP (type): valid - 0-9; invalid - a-z, A-Z, special characters
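The BVA (range) & ECP (type) rules above for a dd/mm/yy date-of-birth field can be sketched in Python (the validator is illustrative, standing in for the real input screen):

```python
# A sketch of BVA & ECP checks for a dd/mm/yy date field:
# day 01-31, month 01-12, year 00-99, digits only.

def valid_dob(day, month, year):
    parts = (day, month, year)
    if not all(p.isdigit() and len(p) == 2 for p in parts):
        return False   # ECP: only the digits 0-9 are the valid type
    d, m, y = (int(p) for p in parts)
    return 1 <= d <= 31 and 1 <= m <= 12 and 0 <= y <= 99

# BVA on the day: boundary values pass, their outside neighbours fail.
assert valid_dob("01", "06", "90") and valid_dob("31", "12", "00")
assert not valid_dob("00", "06", "90") and not valid_dob("32", "06", "90")
# ECP on the type: alphabets & specials fall in the invalid class.
assert not valid_dob("ab", "06", "90") and not valid_dob("1*", "06", "90")
```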
Test case 18: Successful entry of age "To".
Test case 19: Successful display of matched records with full last name & all other fields blank.
Test case 20: Successful display of matched records with full last name & first name.
Test case 21: Successful display of matched records with full last name, first name & d-o-b.
Test case 22: Successful display of matched records with full last name, first name & age.
Test case 23: Unsuccessful search operation due to an invalid combination of filled fields.
Test case 25: Unsuccessful search operation due to age "From" greater than age "To".
Test case 26: Successful display of records with search key as partial last name & other fields blank.
Test case 27: Successful display of records with search key as partial last name & first name.
Test case 28: Successful display of records with search key as partial last name, first name & d-o-b.
Test case 29: Successful display of records with search key as partial last name, first name & age.
Test case 30: Successful display of records with search key as customer id.
Test case 31: Successful display of records with search key as partial customer id with * as wild card.
Test case 32: Unsuccessful display of records due to no matching records in the database w.r.t. the given search keys.
Test case 33: Unsuccessful display of records due to "too many records to display" when the number of matched records is greater than 1000.
Test case 35: Successful termination of the search operation on clicking the "stop search" button.
Test case 36: Successful closing of the records window by clicking OK after searching.