
ISTQB FOUNDATION LEVEL

SUMMARY DOCUMENT

Instructor: Tạ Thị Thinh


Email: thinhtt0204@gmail.com
Zalo/Phone: 0986775464
Skype: ta.thinh0204
Website: qr-solutions.com.vn

Contents
CHAPTER 1. Foundation Testing
1.1.1 Test objectives
1.2 Why is Testing Necessary?
1.3 Seven Principles of Testing
1.4 Test Process (test activities)
1.5 Good communication
CHAPTER 2. Testing Throughout The Software Development Lifecycle
2.1 Software Development Lifecycle Models
2.2 Test levels
2.3 Test Types
2.4 Maintenance Testing
Chapter 3. Static Testing techniques
3.1 Static testing (Review and Static Analysis)
3.2 Review Process
Chapter 4. Test design techniques
4.1 Categories of test design
4.2 Black-box Test Techniques (Specification based/ Requirement based)
4.3 White-Box Test Design (Structure-based)
4.4 Experience-based techniques
Chapter 5: Test Management
5.1 Test independence
5.2 Test Planning and Estimation
5.3 Test Monitoring and Control
5.4 Configuration Management
5.5 Risks and Testing
5.6 Defect Management
Chapter 6: Tool Support for Testing

CHAPTER 1. Foundation Testing
Tips for answering exam questions:
- Wording usually found in correct answers: "should be", "may be", "can be"
- Wording usually found in incorrect answers: "must be", "have to", "only", "all", "full", "prove"

1.1.1 Test objectives


Objective: Phase
1. Evaluate work products (URD, SRS, design, test documents, code): Review
2. Verify all requirements: All phases
3. Validate the test object: Acceptance testing
4. Build confidence
5. Prevent defects: Early phases (reviewing documents, designing test cases)
6. Find defects and failures: Development phase
7. Ensure that no new defects were introduced (regressions) by changes: Maintenance phase
8. Provide information: Smoke test (entry check), final test
9. Reduce risk: All phases
10. Comply with contractual or legal requirements: When required by the contract

Test methods:
- Verification: compare against the initial documents ("Do it right": build the product right)
- Validation: compare against the user's needs or expectations ("Do the right thing": build the right product)

Technique for reviewing requirements and raising Q&A: 5W1H (Who, What, When, Where, Why, How)
Technique for designing test cases: cover both successful and unsuccessful cases (valid and invalid)
[Diagram: a requirement change or bug fix triggers regression testing of the affected areas]

1.1.2 Testing and Debugging


Debugging is the development activity that finds, analyzes, and fixes defects (removes or repairs them).

1.2 Why is Testing Necessary?


- Helps to reduce the risk of problems (a risk is a failure or poor non-functional quality that could occur in the future and result in negative consequences)
- Contributes to the quality of the product
- Helps to meet contractual or legal requirements

1.2.2 Quality Assurance and Testing

Quality Assurance (QA):
- Purpose: prevent defects and provide confidence
- Actions: build and follow processes, training, measurement

Testing (Quality Control):
- Purpose: find defects
- Actions: review, run tests, design test cases

Both QA and testing are part of Quality Management.

1.2.3 Errors, Defects, and Failures

Defect = Bug = Fault (something incorrect in documents or source code).
A person makes an error (mistake), which can produce a defect in a document or in the code; when a defect in the code is executed, it may cause a failure.
1.2.4 Defects, Root Causes and Effects
Identifying the root causes of failures (errors/mistakes) can:
- Reduce the occurrence of similar defects in the future
- Lead to process improvements that prevent a significant number of future defects

1.3 Seven Principles of Testing


1. Testing can show that defects are present, but cannot prove that there are no defects (testing never "proves" anything)
2. Exhaustive testing is impossible
Testing everything (all combinations of inputs and preconditions) is not feasible, except in trivial cases
3. Early testing
Performing test design and review activities early makes it possible to find defects when they are cheap to find and fix
4. Defect clustering (defect density)
A small number of modules usually contains most of the defects
5. Pesticide paradox
If the same tests are repeated over and over again, they eventually stop finding new defects
6. Testing is context dependent
Testing is done differently in different contexts
7. Absence-of-errors fallacy
Verifying all specified requirements and fixing all defects found does not help if the system does not fulfill the users' needs and expectations

1.4 Test Process (test activities)

Test activity: Tasks

Test planning:
- Create and update the test plan: scope, objectives, approach, schedule; define entry and exit criteria

Test monitoring:
- Compare actual progress with the plan
- Measurement: progress, quality, ...
- Create test reports

Test control:
- Make decisions (corrective actions)
- Evaluate the exit criteria for test execution

Test analysis:
- Review and evaluate the test basis (raise Q&A)
- Identify test conditions (the list of features to be tested)

Test design:
- Design and prioritize test cases
- Identify test data and the test environment (tools, infrastructure)

Test implementation:
- Test automation: test procedures, test scripts, test suites
- Build or verify the test environment
- Prepare test data

Test execution:
- Run tests
- Analyze anomalies (discrepancies)
- Report defects
- Log test results
- Retest and run regression tests

Test completion:
- Check whether all defect reports are closed
- Create change requests for unresolved defects
- Create a test summary report
- Collect and hand over testware
- Analyze lessons learned
- Improve the test process

1.4.4 Traceability between the Test Basis and Test Work Products
- Analyzing the impact of changes

- Calculating requirement coverage (see the sketch below)

- Making testing auditable

- Meeting IT governance criteria

- Improving the understandability of test status reports
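
A minimal sketch (Python; the requirement IDs and the mapping are made up) of calculating requirement coverage from a traceability matrix between requirements and test cases:

```python
# Hypothetical traceability matrix: requirement ID -> test cases that cover it.
traceability = {
    "REQ-1": ["TC-01", "TC-02"],
    "REQ-2": ["TC-03"],
    "REQ-3": [],          # not covered by any test case yet
    "REQ-4": ["TC-04"],
}

covered = [req for req, tests in traceability.items() if tests]
coverage = len(covered) / len(traceability) * 100

print(f"Requirement coverage: {coverage:.0f}%")          # 75%
print("Uncovered requirements:",
      [req for req, tests in traceability.items() if not tests])
```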

1.5 Good communication


- Start with collaboration rather than battles.

- Remind everyone of the common goal of better quality systems

- Emphasize the benefits of testing

- Communicate test results and other findings in a neutral (objective), fact-focused way

1.5.2 Tester’s and Developer’s Mindsets

Developer:
- Objective: design and build a product
- Mindset: more interested in designing and building solutions than in contemplating what might be wrong with those solutions; it is difficult to find mistakes in one's own work, although developers should be able to test their own code

Tester:
- Objective: verify and validate the product, finding defects prior to release
- Mindset: curiosity, professional pessimism, a critical eye, attention to detail, and a motivation for good and positive communication and relationships

CHAPTER 2. Testing Throughout The Software Development Lifecycle

2.1 Software Development Lifecycle Models


Several characteristics of good testing:
- For every development activity, there is a corresponding test activity
- Each test level has test objectives
- Test analysis and design for a given test level begin during the corresponding
development activity
- Reviewing documents as soon as drafts are available

2 models:

1. Sequential development models: V-model


- integrates the test process throughout the development process
- early testing.
- includes test levels associated with each corresponding development phase
- overlapping of test levels may occur
2. Iterative and incremental development models
- Establishing requirements, designing, building, and testing a system in pieces
- series of cycles
- Involve changes
- Regression testing is increasingly important
Software development lifecycle models must be selected and adapted to the context of
project and product characteristics

2.2 Test levels


Component testing (unit testing):
- Objective: focuses on components that are separately testable
- Special: requires mock objects, stubs, and drivers (see the sketch after this overview)
- Test basis: detailed design, code
- Environment: development environment with frameworks, debug tools, etc. (cannot find operational defects)
- Test types: functionality, non-functional characteristics, structure-based (white-box) testing
- Approach: test-first approach, test-driven development (TDD)

Integration testing:
- Objective: focuses on interactions and interfaces
- Special: two different levels: component integration testing and system integration testing
- Test basis: global design, use cases
- Environment: a specific integration environment
- Test types: functional and non-functional
- Approach: big-bang integration, or incremental integration (top-down, bottom-up)

System testing:
- Objective: focuses on the whole system
- Test basis: requirement specifications, use cases
- Environment: corresponds to the production environment
- Test types: functional and non-functional, plus data quality characteristics

Acceptance testing:
- Objective: establishing confidence and validating the system
- Special: four forms:
  + User acceptance testing (UAT), by end users
  + Operational acceptance testing (OAT), by administrators
  + Contractual and regulatory acceptance testing
  + Alpha and beta testing: alpha testing is performed at the developing organization's site, beta testing at customers' locations
- Test types: all
2.3 Test Types


1. Software quality characteristics (ISO/IEC 25010, formerly ISO 9126):
- Functional testing (suitability: what the system does). Includes: completeness, correctness (accuracy), appropriateness (usefulness)
- Non-functional testing (quality attributes: how the system does it). Includes:
  + Performance (time behavior: load test, stress test, volume test)
  + Compatibility
  + Usability (how easy is it to use?)
  + Reliability (recoverability)
  + Security
  + Maintainability (how easy is it to modify?)
  + Portability (how easy is it to install?)

2. Change-related testing:
- Confirmation testing (retest): confirms whether the original defect has been successfully fixed
- Regression testing: re-running tests to detect unintended side effects (newly introduced issues)

3. White-box (structural) testing: covered in Chapter 4

2.4 Maintenance Testing


[Diagram: each maintenance trigger (modification, migration, retirement) is followed by regression testing]

Maintenance testing focuses on:


+ testing the changed parts
+ testing unchanged parts that might have been affected by the changes (regression test)
2.4.1 Triggers for Maintenance
- Modification: enhancements, corrective and emergency changes, changes of the
operational environment, upgrades, and patches for defects
- Migration: data conversion (e.g., when the old application is retired)
- Retirement

2.4.2 Impact Analysis for Maintenance


- identify the areas in the system that will be affected by the change

- identify the scope of existing tests for regression test.

- Impact analysis may be done before a change is made, to help decide if the change
should be made

Chapter 3. Static Testing techniques

3.1 Static testing (Review and Static Analysis)


3.1.2 Benefits of Static Testing
- Detecting defects prior to dynamic testing
- Identifying defects which are not easily found by dynamic testing
- Preventing defects
- Increasing productivity (velocity)
- Reducing cost and time
- Improving communication

3.1.3 Differences between Static and Dynamic Testing


Static testing:
- Finds defects without executing the code
- Includes: reviews (manual) and static analysis (with tools, e.g., a compiler, Jenkins)
- Typical defects found:
  + Reviews: requirement defects, design defects, incorrect interface specifications
  + Static analysis: coding defects, deviations from standards (coding conventions), security vulnerabilities, maintainability defects
- Static analysis also calculates code metrics, e.g., cyclomatic complexity = number of single conditions + 1 (worked example below)

Dynamic testing:
- Finds failures by executing the code (running the built package)
- Includes techniques to design test cases, test data, test inputs, and expected results
- Covers retesting, regression testing, automated testing, and dynamic analysis
- Typical problems found: failures, poor non-functional quality (performance, security), insufficient code coverage, memory leaks
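
A small worked example (Python; the function is hypothetical) of the cyclomatic-complexity rule quoted above (number of single conditions + 1):

```python
def classify(score, is_member):
    # Single conditions: (score >= 50), (score >= 80), (is_member) -> 3 conditions.
    # Cyclomatic complexity = 3 + 1 = 4, so at least 4 independent paths exist.
    if score >= 50:
        result = "pass"
    else:
        result = "fail"
    if score >= 80 and is_member:   # "and" joins two single conditions
        result = "pass with bonus"
    return result
```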

3.2 Review Process


3.2.1 & 3.2.2 Review Process & Responsibility
Review phase: main tasks / roles and responsibilities

Planning:
- Main tasks: define the scope and objectives; estimate time and effort; select people and roles; define entry and exit criteria; check the entry criteria
- Roles: Management (plans the review, decides and monitors, assigns staff, budget, and time); Facilitator/moderator (leads the review, creates the review plan)

Initiate review (kick-off meeting):
- Main tasks: distribute the work products; explain; answer any questions
- Roles: Facilitator/moderator (runs the meeting)

Individual review (preparation):
- Main tasks: self-review; note comments and potential defects
- Roles: Reviewers and review leader; Scribe (recorder) collects the defects

Issue communication and analysis (review meeting):
- Main tasks: communicating; analyzing; evaluating
- Roles: Facilitator/moderator (runs the meeting); Scribe (recorder) records new potential defects

Fixing (rework):
- Main tasks: creating and updating defect reports; fixing defects; communicating defects to the reviewers
- Roles: Author

Reporting (follow-up):
- Main tasks: checking that defects have been addressed; gathering metrics; checking that the exit criteria are met; accepting the work product
- Roles: Facilitator/moderator

3.2.3 Review Types & 3.2.4 Applying Review Techniques


Review types:

Informal review (pair review):
- Purpose: find minor problems; a cheap way to find defects
- Leader: none
- Process: no defined process; may be undocumented; commonly used in Agile projects

Walkthrough (demo):
- Purpose: exchanging ideas, training, considering alternative solutions
- Leader: the author
- Process: follows the review process; optional: individual preparation, review report and defect reports

Technical review (peer review):
- Purpose: gaining consensus, evaluating the work products
- Leader: ideally a trained moderator
- Process: follows the review process; optional: review meeting, management participation

Inspection:
- Purpose: find as many defects as possible; improve the process
- Leader: a trained moderator
- Process: follows the most formal review process; mandatory: based on rules and checklists, entry and exit criteria, metrics are collected

Review techniques:
- Ad hoc: little or no guidance; results depend on reviewer skills
- Scenarios and dry runs: better guidelines; other defect types (e.g., missing features) may be missed
- Role-based / perspective-based: reading from different stakeholder viewpoints
- Checklist-based: a list of questions derived from past defects or standards; defects outside the checklist may be missed

3.2.5 Success Factors for Reviews


Process-related factors:
- Clear objectives
- Suitable review types and review techniques
- Checklists are used
- Reviews are conducted on small chunks
- Adequate time
- Adequate notice
- Management supports the review process

People-related factors:
- The right people are involved
- Testers are seen as valued reviewers
- Adequate time and attention to detail
- Defects found are handled objectively
- The meeting is well managed
- An atmosphere of trust
- Negative body language is avoided
- Adequate training

Chapter 4. Test design techniques

4.1 Categories of test design


A test case includes inputs, expected results, step-by-step instructions, and pre-conditions.

Categories of test cases or test data:
- Valid (the system works): successful, happy, normal, constructive
- Invalid (the system does not work): unsuccessful, unhappy, abnormal, negative

High-level test case:
- A test case without concrete test data (inputs), outputs, or step-by-step instructions
- Used in early phases or when requirements are poor
- Suited to experienced testers

Detailed (low-level) test case:
- A test case with concrete test data (inputs), outputs, and step-by-step instructions
- Requires detailed requirements
- Suited to inexperienced testers

4.1.2 Categories of Test Techniques and Their Characteristics
Black-box testing (specification-based or requirement-based):
- Tests are designed from documents
- Formal / systematic
- Techniques:
  1. Equivalence partitioning
  2. Boundary value analysis
  3. Decision tables
  4. State transition testing
  5. Use case testing

White-box testing (structure-based):
- Tests are designed from how the software is constructed
- Measures code coverage
- Formal / systematic
- Techniques:
  1. Statement coverage
  2. Decision coverage
  3. Path coverage
  4. LCSAJ
  5. Condition coverage
  6. Condition/decision coverage
  7. Condition determination coverage
  8. Multiple condition coverage

Experience-based testing:
- Tests are designed from knowledge or experience
- Finds defects that were missed by black-box and white-box techniques
- Informal
- Techniques:
  1. Error guessing
  2. Exploratory testing
  3. Checklist-based testing

4.2 Black-box Test Techniques (Specification based/ Requirement based)


4.2.1 Equivalence partitioning / classes (EP)
- Divide (partition) the inputs, outputs, etc. into areas (partitions)
- Pick one value from each area; test both valid and invalid areas
4.2.2 Boundary value analysis (BVA)
- Test at the edges of each equivalence partition (see the sketch below)
- Two-point boundary values: the minimum and maximum of the partition
- Three-point boundary values: just before, at, and just over the boundary
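
A minimal sketch (Python with pytest; the `is_adult_fare(age)` rule accepting ages 18-64 is hypothetical) combining equivalence partitioning and two-point boundary values:

```python
import pytest

def is_adult_fare(age: int) -> bool:
    """Hypothetical rule under test: adult fare applies for ages 18-64."""
    return 18 <= age <= 64

# Equivalence partitions: below 18 (invalid), 18-64 (valid), above 64 (invalid).
# Two-point boundary values around the valid partition: 17/18 and 64/65.
@pytest.mark.parametrize("age, expected", [
    (10, False),   # representative of the lower invalid partition
    (17, False),   # boundary: just below the minimum
    (18, True),    # boundary: minimum of the valid partition
    (40, True),    # representative of the valid partition
    (64, True),    # boundary: maximum of the valid partition
    (65, False),   # boundary: just above the maximum
    (80, False),   # representative of the upper invalid partition
])
def test_adult_fare_partitions_and_boundaries(age, expected):
    assert is_adult_fare(age) == expected
```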

4.2.3 Decision tables


- Covers combinations of inputs, situations, or events
- Input conditions are expressed as TRUE (T) or FALSE (F)

Example: Gmail login. Full decision table = all combinations of inputs = 2 x 2 x 2 = 8 columns.

Conditions              R1 R2 R3 R4 R5 R6 R7 R8
Valid username?         F  F  F  F  T  T  T  T
Valid password?         F  F  T  T  F  F  T  T
Enough storage space?   F  T  F  T  F  T  F  T
Actions
Login succeeds          F  F  F  F  F  F  T  T
Restricted mode on

Collapsed decision table ("-" means "don't care"):

Conditions              R1 R2 R3 R4
Valid username?         F  T  T  T
Valid password?         -  F  T  T
Enough storage space?   -  -  F  T
Actions
Login succeeds          F  F  T  T
Restricted mode on      -  -  T  F

The collapsed table can be executed almost directly as a data-driven test (see the sketch below).
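
A sketch (pytest; the `login` function is a hypothetical stand-in for the real system) with one test case per column of the collapsed decision table:

```python
import pytest

def login(valid_username: bool, valid_password: bool, enough_space: bool):
    """Hypothetical implementation of the rules in the collapsed decision table."""
    if not (valid_username and valid_password):
        return {"success": False, "restricted": False}
    # Credentials are valid; restricted mode is switched on when space is low.
    return {"success": True, "restricted": not enough_space}

@pytest.mark.parametrize("username_ok, password_ok, space_ok, success, restricted", [
    (False, True,  True,  False, None),   # R1: invalid username (restricted: don't care)
    (True,  False, True,  False, None),   # R2: invalid password (restricted: don't care)
    (True,  True,  False, True,  True),   # R3: valid login, low space -> restricted mode on
    (True,  True,  True,  True,  False),  # R4: valid login, enough space
])
def test_login_decision_table(username_ok, password_ok, space_ok, success, restricted):
    result = login(username_ok, password_ok, space_ok)
    assert result["success"] == success
    if restricted is not None:            # "-" in the table means the outcome is not specified
        assert result["restricted"] == restricted
```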

4.2.4 State transition testing

Four basic parts:
- State, transition, event, action (the action may or may not be present)
State transition testing is widely used for embedded software and automotive systems.
Two types of test cases (see the sketch below):
- A typical scenario (a normal situation from start to end) that covers every state / every transition
- Specific sequences of transitions: "N-1 switch" coverage corresponds to sequences of N transitions (0-switch coverage = single transitions)

4.2.5 Use case testing


- Tests the whole system
- Applied from the system test level upward
- Describes interactions between actors (users, systems) and the system
- Useful for uncovering defect types such as:
  + Integration defects caused by interaction and interference
  + Defects in the process flows during real-world use of the system

One use case includes:
+ 1 basic flow (main flow)
+ n alternate flows (exceptions)
+ some error flows
Test case types:

GUI test cases:
- Purpose: test each field or item on a screen; confirm that data is in the correct format
- Test level: integration testing
- Techniques: equivalence partitioning or boundary value analysis, checklists, experience-based techniques

Function test cases:
- Purpose: test combinations of inputs, events, and pre-conditions
- Test level: integration testing
- Techniques: decision tables, state transition testing

Flow test cases:
- Purpose: test the system end to end, in a test environment corresponding to the production environment
- Test level: system testing or system integration testing; acceptance testing
- Techniques: use case testing

4.3 White-Box Test Design (Structure-based)


4.3.1 Statement coverage (statement testing)
Lines of code consist of statements, comments (//, /* */), and blank lines.
Statement coverage = the percentage of executable statements exercised.
4.3.2 Decision coverage / decision testing (branch coverage)
TRUE and FALSE are the decision outcomes.
Decision coverage = the percentage of decision outcomes exercised (see the sketch after the summary).
4.3.3 Path coverage (path testing)
Path coverage = the percentage of paths exercised.
4.3.4 LCSAJ coverage (Linear Code Sequence And Jump)

Summary of white-box coverage measures:
- Control flow: statement coverage, decision coverage, path coverage, LCSAJ
- Data flow: condition coverage (% of condition outcomes exercised), condition/decision coverage, condition determination coverage, multiple condition coverage
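
A short sketch (Python; the `discount` function is hypothetical) illustrating the difference between statement coverage and decision coverage:

```python
def discount(price: float, is_member: bool) -> float:
    """Hypothetical function with one decision (a single if)."""
    rate = 0.0
    if is_member:          # decision with two outcomes: True and False
        rate = 0.5
    return price * (1 - rate)

def test_member():
    # This single test executes every statement (100% statement coverage),
    # but only the True outcome of the decision (50% decision coverage).
    assert discount(100.0, True) == 50.0

def test_non_member():
    # Adding the False outcome brings decision coverage to 100%.
    assert discount(100.0, False) == 100.0
```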

4.4 Experience-based Techniques


4.4.1 Error Guessing
Design tests from past failures or common developer mistakes.
4.4.2 Exploratory Testing (free testing, monkey testing, random testing)
- Informal tests (no process, no documents) are designed, executed, logged, and evaluated dynamically
- Often session-based: write a test charter containing guidelines for testing within a defined time-box
- Most useful when there are few or inadequate specifications, when there is time pressure, and when testers are experienced
4.4.3 Checklist-based Testing
A list of questions used as reminders of what to check (questions derived from standards or common defects).

Chapter 5: Test Management

5.1 Test independence

Levels of test independence (from lowest to highest):
- No independence: the author (developer) tests their own code
- Other developers or testers within the development/test team
- An independent test team or group within the organization
- Testers from the business organization
- Testers external to the organization (outsourced)

Benefits of test independence include:


- Recognize different kinds of failures (unbiased)
- Verify assumptions

Drawbacks of test independence include:


- Isolation from the development team
- Developers may lose a sense of responsibility for quality
- Independent testers may be seen as a bottleneck or blamed for delays in release
- Independent testers may lack some important information

5.1.2 Tasks of a Test Manager and Tester


Planning:
- Test manager (leader): write and update the test plan; coordinate; share the testing perspective
- Tester: review and contribute to test plans; create the detailed schedule

Analysis and design:
- Test manager: initiate; support
- Tester: analyze, review, and assess the test basis; identify test conditions

Implementation and execution:
- Test manager: choose tools; set up configuration management; decide; prioritize
- Tester: design, set up, and verify test environment(s); design test cases and test procedures; prepare and acquire test data; execute tests and evaluate the results; automate tests (decide, implement); evaluate non-functional characteristics; review tests developed by others

Monitoring and control:
- Test manager: monitor test progress and results, and check the status of the exit criteria; create test progress reports; adapt planning; take corrective actions (decisions)
- Tester: use test management tools

5.2 Test Planning and Estimation


5.2.1 Purpose and Content of a Test Plan (IEEE 829)
- scope, objectives, and risks
- test approach: test activities (test process), test levels, test types, test techniques, ...
- resources (people, tools, environment)
- Schedule
- Selecting metrics
- Define entry & exit criteria
- Budgeting for the test activities
- Determining the level of detail for test documentation

5.2.2 Test Strategy and Test Approach


1. Analytical: requirement-based or risk-based
2. Model-based: based on a model of some aspect of the product (e.g., for embedded systems)
3. Methodical: systematic use of error guessing or of checklists derived from a standard (e.g., ISO 25010)
4. Process-compliant: follows an external process such as Agile (rules, user stories, acceptance criteria)
5. Consultative: guided by experts or users
6. Regression-averse: heavy use of test automation
7. Reactive / dynamic / heuristic: exploratory testing

5.2.3 Entry Criteria and Exit Criteria
Entry criteria (checked by a smoke test): when to start a test level?
Check the availability (readiness) of:
- Documents
- Prior test levels having met their exit criteria
- Test environment
- Test tools
- Test data

Exit criteria: when to stop a test level? When to release? How much testing is enough?
Check five criteria:
- Coverage (thoroughness)
- Defects (functional and non-functional, e.g., reliability)
- Cost / effort
- Time
- (Most important) residual risks: open issues, serious open defects, untested areas

5.2.5 Factors Influencing the Test Effort (mh, md, mm)


Test effort estimation predicts the amount of test-related work.
+ Product characteristics
+ Development process characteristics
+ People characteristics
+ Test results/ Test outcomes
5.2.6 Test Estimation Techniques
1. Metrics-based: based on historical data from similar projects, or on typical values (see the sketch below)
2. Expert-based: predictions by the owners of the tasks or by experts (test manager, PM), often using a Work Breakdown Structure (WBS)
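
A tiny metrics-based sketch (Python; the historical productivity figure, test-case count, and regression factor are invented) showing how historical data turns into an effort estimate:

```python
# Hypothetical historical metric from a similar project:
# on average 1.5 hours were spent per executed test case (design + run + report).
hours_per_test_case = 1.5

planned_test_cases = 240   # size of the new project, estimated from the test basis
regression_factor = 1.2    # assumed extra effort for retests and regression runs

estimated_hours = planned_test_cases * hours_per_test_case * regression_factor
print(f"Estimated test effort: {estimated_hours:.0f} man-hours "
      f"(~{estimated_hours / 8:.0f} man-days)")
```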

5.3 Test Monitoring and Control


Test monitoring:
- Compare actual progress with the plan to assess test progress, quality, cost, and time
- Create test reports

Test control:
- Take decisions (corrective actions):
  + Re-prioritizing
  + Changing
  + Re-evaluating
  + Setting an entry criterion for bug fixing
5.3.1 Metrics Used in Testing
Metrics can be collected during and at the end
of test activities in order to assess:
+ Progress
+ Quality
+ Approach
+ Effectiveness
5.3.2 Contents for Test Reports (IEEE 829)
- Summary of testing
- Analysis
- (Variances) Deviations from plan
- Metrics
- (Evaluation) Residual risks

5.4 Configuration Management


The purpose is to establish and maintain the integrity of the system.
All test items are uniquely identified, version controlled, tracked for changes, and related
to each other

5.5 Risks and Testing


Risk: an event that could happen in the future and would have negative consequences
Risk level = likelihood (probability) x impact (harm); a small calculation sketch follows at the end of this section.

Project risk: a risk to the project's capability (scope, cost, time), related to management and control:
- People
- Tools
- Customer
- Technical issues
- Schedules
- Budget
- Work products: SRS, code, design, test documents
Project risks are handled by the project manager and the test manager: mitigate or reduce the risk.

Product risk (quality risk): a risk to the quality of the product, directly related to the test object:
- Failures in the delivered software
- The potential that the software/hardware could cause harm to an individual or a company
- Poor software characteristics (e.g., functionality, reliability, usability, performance)
- Poor data integrity and quality
- Software that does not perform its intended functions
Product risks are handled by testers through risk-based testing (5.5.3), which drives:
+ Test techniques
+ Levels and types of testing
+ The extent of testing
+ The prioritization of testing
+ Any activities in addition to testing
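
A tiny illustration (Python; the risk items and the 1-5 scales are made up) of computing risk level = likelihood x impact and using it to prioritize testing:

```python
# Hypothetical product risks scored on 1-5 scales for likelihood and impact.
risks = [
    {"item": "payment calculation wrong", "likelihood": 2, "impact": 5},
    {"item": "slow search on large data", "likelihood": 4, "impact": 3},
    {"item": "typo on help page",         "likelihood": 3, "impact": 1},
]

for r in risks:
    r["level"] = r["likelihood"] * r["impact"]   # risk level = likelihood x impact

# Higher risk level -> test earlier and more thoroughly (risk-based testing).
for r in sorted(risks, key=lambda r: r["level"], reverse=True):
    print(f'{r["item"]}: risk level {r["level"]}')
```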

5.6 Defect Management


Incidents: the discrepancies between actual and expected outcomes
An incident must be investigated and may turn out to be a defect
Incident reports have the following objectives:
- Provide information to enable fixing defects.
- Provide a means of tracking quality
- Provide ideas for test process improvement
A defect report includes fields such as (see the sketch after this list):
- A title and a short summary
- Date
- Identification of the test item (version) and environment
- A description including logs, database dumps, screenshots, or recordings
- Expected and actual results
- Severity (impact)
- Priority (business importance), urgency
- State
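
A small sketch (a Python dataclass; the field names and values are illustrative only) of how the defect report fields above might be represented in a defect management tool:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DefectReport:
    title: str                 # short summary of the problem
    date_reported: date
    test_item: str             # identification of the test item and its version
    environment: str
    description: str           # may reference logs, dumps, screenshots, recordings
    expected_result: str
    actual_result: str
    severity: str              # impact on the system
    priority: str              # business importance / urgency
    state: str = "New"         # defect lifecycle state

report = DefectReport(
    title="Login fails with valid credentials",
    date_reported=date(2024, 1, 15),
    test_item="WebShop v2.3.1",
    environment="Chrome 120 / staging",
    description="See attached server log and screenshot.",
    expected_result="User is logged in and redirected to the dashboard.",
    actual_result="HTTP 500 error page is shown.",
    severity="High",
    priority="Urgent",
)
print(report.state)   # "New"
```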

Chapter 6: Tool Support for Testing
6.1 Test Tool Category

Tools that support the management of testing and testware:
- Test management tools
- Requirements management tools
- Defect management tools
- Configuration management tools
- Continuous integration tools (D)

Tools that support static testing:
- Tools that support reviews
- Static analysis tools (D)

Tools that support test execution and logging (test automation):
- Test execution tools
- Coverage tools
- Test harnesses (D)
- Unit test framework tools (D)

Tools that support performance measurement and dynamic analysis:
- Performance testing tools
- Monitoring tools
- Dynamic analysis tools (D)

Tools that support test design and implementation:
- Test design tools
- Test data preparation tools

Tools that support specialized testing needs:
- Usability testing tools
- Security testing tools
- Portability testing tools

(D) marks tools that are more likely to be used by developers.

6.1.2 Benefits and Risks of Test Automation


Potential benefits of using tools:
- Reduction in repetitive manual work
- Greater consistency and repeatability
- More objective assessment
- Easier access to information
Potential risks of using tools
- Expectations may be unrealistic
- The time, cost and effort may be under-estimated
- Version control of test assets may be neglected
- Risk from Vendor, open source, new platform
6.1.3 Special Considerations for Test Automation Tools

Data-driven scripting technique:
- Data files store the test inputs and expected results in a table or spreadsheet
- Supported by capture/playback tools (see the sketch below)

Keyword-driven scripting technique:
- Data files store the test inputs, expected results, and keywords in a table or spreadsheet
- The scripts that implement the keywords are written manually
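
A minimal data-driven sketch (pytest; the CSV columns and the `add` function are hypothetical) where the test data lives in a spreadsheet-style file and one generic script runs every row:

```python
import csv
import io
import pytest

# In practice this would be an external .csv/.xlsx file maintained by testers.
TEST_DATA = io.StringIO("""a,b,expected
1,2,3
10,-4,6
0,0,0
""")

def add(a: int, b: int) -> int:
    """Hypothetical function under test."""
    return a + b

rows = list(csv.DictReader(TEST_DATA))

@pytest.mark.parametrize("row", rows, ids=lambda r: f'{r["a"]}+{r["b"]}')
def test_add_data_driven(row):
    # One generic script: the same steps run for every data row.
    assert add(int(row["a"]), int(row["b"])) == int(row["expected"])
```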

