
OWNER: IBM

VERSION: 0.5

Agile DevOps Test Methodology Guidebook
IBM AGILE DevOps Test Methodology Guidebook

Revision History:

21/09/2016  Version 0.1  Mark A. Snedecor, Raghavendra, Manjula
            First draft version.

21/10/2016  Version 0.2  Mark A. Snedecor, Manjula
            Updated the Agile testing phases, tasks, and deliverables.

25/10/2016  Version 0.3  Mark A. Snedecor, Manjula
            Reorganized and improved each section; updated metrics details.

23/11/2016  Version 0.4  Manjula
            Incorporated comments from the IoT teams: China team (Yong Duo
            Wang/China/IBM, Feng H Hong/China/IBM), North America team
            (Glen Young/Arlington/IBM), and the AT&T team from India.
            Added a new section to cover Service Virtualization.

05/12/2016  Version 0.5  Manjula
            Incorporated comments from Anita Wall and Mark.

IBM Confidential Copyright IBM Corporation 2016

Table of Contents

1 Introduction ...................................................................................................................... 5
1.1 Document Purpose .................................................................................................. 5
1.2 Intended Audience................................................................................................... 5
1.3 How to Access the Tools......................................................................................... 6
1.4 Skills Required......................................................................................................... 7
1.5 Available Training Sites and Collateral ..................................................................10
2 Leading Practices............................................................................................................11
3 Introduction to Agile DevOps Testing ...............................................................................12
3.1 Agile DevOps Testing Overview.............................................................................12
3.2 Agile Testing – Principles and Practices ................................................13
3.3 Agile Testing Quadrants.........................................................................20
3.4 Agile Testing vs. Traditional Testing ......................................................24
3.4.1 What is Traditional Testing .................................................................24
3.4.2 What is Agile Software Testing...........................................................25
3.4.3 How is Agile Testing Different from Traditional Testing? ..................25
3.5 Agile Testing - Advantages and Disadvantages ....................................................26
4 Agile Configuration Selection ...........................................................................................29
5 IBM Recommended Operating Model - Large Complex Agile DevOps Testing Projects ....30
5.1 Agile Testing Methodologies..................................................................................30
5.2 IBM Recommended Operating Model Overview ....................................................36
5.3 CTD Driven Test Plan Optimization........................................................................38
6 Agile DevOps Test Framework and Life Cycle – For a Large Complex Testing Project......39
6.1 Agile DevOps E2E Test Framework .......................................................................39
6.2 Agile DevOps Testing Life Cycle............................................................................41
6.2.1 Conceptualization Phase ....................................................................................42
6.2.2 Release Planning Phase .....................................................................................43
6.2.3 Grooming the Backlog Phase .............................................................................46
6.2.4 Sprint Planning Phase ........................................................................................47


6.2.5 Sprint Testing Phase...........................................................................................49


6.2.6 Hardening Sprint Phase ......................................................................................59
6.3 A Day in the Life of an Agile Tester........................................................................62
7 Agile DevOps Test Optimization ......................................................................................64
8 Agile DevOps Test Automation ........................................................................................70
8.1 Agile Test Automation Pyramid .............................................................................71
8.2 Agile Test Automation through Behavior-Driven Development ............................73
8.3 Agile Test Automation – Assessment Criteria .......................................................74
8.4 Agile Test Automation – Leading Practices...........................................................75
8.5 Agile Test Automation Workflow with DevOps Tools............................................80
9 Service Virtualization .......................................................................................................82
10 Agile Testing Tools supporting an E2E DevOps Framework ..........................................83
11 Distributed Agile Testing Process .................................................................................85
12 Agile DevOps Test – Governance and Metrics Framework ............................................87
Appendix 1 – 1 Sizing and Estimations ...................................................................................95
1.1 Agile Testing - Sizing and Estimation, An Introduction.........................................95
1.2 T-Shirt Sizing ..........................................................................................................96
1.3 Planning Poker .......................................................................................................98
1.4 Scaled Agile Estimation Models........................................................................... 100


1 Introduction
1.1 Document Purpose

The purpose of this Guidebook is to:

 Provide Agile testing teams a high-level overview of all the methodologies and
processes required to deliver testing services effectively in a DevOps / Agile
environment. The testing team performs testing throughout the Agile life cycle for the
applications in scope, regardless of which delivery partner performs the development.
 Highlight leading practices in the DevOps / Agile space.
 Outline the preferred processes to ensure quality and consistency. The scope of this
document is limited to the testing phases only; it does not cover development
standards or processes.
 Describe the set of recommended tools that enables the test team to perform their
required tasks, including where those tools are located.
 Guide functional testers on where to find and how best to use the recommended
tools to accomplish day-to-day automation testing tasks.
 Guide Test Executives and Test Managers on the skills recommended for functional
and automation roles.

1.2 Intended Audience

Description:

The target audience for this Guidebook includes:

 All resources responsible for completing or supporting testing activities, including
software testing professionals, software quality experts, and software developers.
 The DevOps architects and tools management team responsible for installing,
configuring, and maintaining the tools.
 New and existing members charged with creating or evolving test automation
artifacts such as automation test plans, test cases, and test scripts.
 Test Executives and Test Managers charged with building and maintaining test
teams with the talent required to support the automation needs of their organization.


Intended Audience: Agile DevOps Test Methodology Guidebook

The target audience for this Guidebook includes all resources responsible for completing or
supporting testing activities. The table below marks each role as Primary (P) or Secondary (S)
for three responsibility areas.

Role                    Completing or        Installing, configuring   Supporting the
                        supporting testing   and maintaining           automation needs of
                        activities           the tools                 the organization
Test Executives         P                    S                         P
Test Managers           P                    S                         P
Test Leads              P                    S                         P
Functional Testers      S                    S                         P
Business Users          S
Business Analysts       S
UAT Testers             S
Product Owners          S
DevOps Architects       S                    P                         S
Tools Management Team   S                    P

1.3 How to Access the Tools


• IBM Functional Coverage Unified Solution (IBM FOCUS) (CTD) –
  https://star.haifa.ibm.com/FoCuS/index.html

• CTD Link –
  https://w3-connections.ibm.com/wikis/home?lang=en-us#!/wiki/W4bbb8b568eba_4c68_8c32_24febf3609ea/


1.4 Skills Required

The tables below describe the required skills for testing teams in a DevOps / Agile environment,
covering the roles, their responsibilities, the required skills, and the recommended training.

Test Manager
  Responsibilities:
    o Manage governance and escalations
    o Manage contract and financials
    o Provide input to the release plans and forecast new scope for the team
    o Manage service delivery
    o Communicate with stakeholders to approve QA process changes and improvements
  Required Skills: Test Management – Expert; Agile – Expert; DevOps – Intermediate;
    Resource Planning – Expert; Project Management – Expert; Financial Management – Expert
  Augmented Training: Agile, DevOps

Test Lead
  Responsibilities:
    o Participate in the daily standup to review the backlog and ensure items are allocated
    o Assign user stories to testers for test case design, then review for accuracy
    o Ensure the team has all screenshots and that test case designs are correct and meet
      the acceptance criteria of the story
    o Work with the product management team, scrum master, technical leads, and dev
      leads to jointly drive the project
    o Work with teams to provide estimates
    o Create the Test Plan / Strategy at discovery and update it sprint by sprint for story updates
    o Enforce the definition of Done
    o Manage defect triage
  Required Skills: Test Planning – Expert; Agile – Expert; DevOps – Intermediate;
    Resource Capacity Planning – Expert; Test Estimation – Expert
  Augmented Training: Agile, DevOps

Functional Tester (FT)
  Responsibilities:
    o Create the initial combinatorial model based on inputs
    o Conduct workshops with client SMEs to verify the initial models
    o Refine and finalize combinatorial models based on SME input
    o Help with creation and maintenance of planning documentation
    o Participate in the clarification process; raise clarifications and track them to closure
    o Create, maintain, and execute automated scripts
    o Escalate issues and concerns to the lead and consultant
    o Help ensure the knowledge base is kept up to date
    o Report the status of the testing effort daily to the lead
  Required Skills: Modeling – Expert; Test Automation – Intermediate; Cucumber –
    Intermediate; Gherkin – Intermediate; Agile – Intermediate; DevOps – Intermediate
  Augmented Training: Modeling, test automation tools, Cucumber, Gherkin, Agile

Lead Automation Architect
  Responsibilities:
    o Provide technical leadership and strategic direction in building out the test
      automation frameworks based on a specific set of guiding principles, architectural
      recommendations, standards, and best practices
    o Identify and navigate industry trends and new technology in test automation where
      applicable
    o Liaise with development and DevOps architects to set automation standards and
      govern compliance with those standards; act as a technology evangelist and educate
      the test automation analysts across the enterprise on the guiding principles,
      architecture, standards, and best practices
    o Lead a core team of top-talent architects to build a test automation framework
      including various architectural levels of parent scripts, child scripts, high- and
      low-level function libraries, test data, and business rules
    o Design and develop enterprise-wide, reusable testing productivity components,
      frameworks, and accelerators based on the specified recommendations, while
      maintaining focus on details and a wide perspective across multiple lines of business
    o Work with the Continuous Integration team to integrate automated testing into the
      daily development process and deployment pipeline
    o Provide technical leadership and strategic direction for the Automation Practice
    o Demonstrate deep thinking and thought leadership, combined with a vision for how
      complex framework design elements can combine and communicate in a
      reusable/modular approach to achieve elegant but focused solutions to complex
      testing or validation problems across multiple enterprise applications
  Required Skills: Agile; DevOps; Advanced Object-Oriented Programming; Design
    Thinking; Algorithms; Design Patterns; Extreme Programming; Kanban; Lean
    Six Sigma; Scrum practices
  Augmented Training: Agile, DevOps


Test Automation Analyst
  Responsibilities:
    o Work with the App-Dev team(s) to collaboratively deliver test automation solutions
      that help the development teams achieve their testing goals within the Agile
      SCRUM sprint cycle
    o Write and maintain test automation scripts, keywords, and components for functional
      UI testing, and verify that all software functions as desired, according to user
      requirements and established guidelines
    o Perform ad-hoc manual/exploratory testing
    o Set high standards of excellence following "Clean Code" principles, champion test
      automation best practices, and actively participate in the product design and
      development process
    o Document defects and track issue resolutions
    o Work with the team to continually improve adherence to the established automation
      best practices, framework architecture, standards, and principles
  Required Skills: Understanding of automation scripting design elements and principles,
    such as:
     Test data and data sharing mechanisms
     Object management and identification
     Parent driver scripts and the parent driver script API
     Child scripts
     Constants definitions and usage
     Application-specific business knowledge, architecture, and business rules
      management systems
     High-level function libraries and organizational structure
     Environment settings and configuration management
     Object repository
     Low-level functions
     Mobile execution interface
     Runtime and reporting engine
     Real-time reporting and triage support system
     Error/exception handling
     Output files and shared data dictionary
  Augmented Training: Agile; DevOps; basic script programming skills; basic script
    debugging skills

Tools Management Engineer
  Responsibilities:
    o Create and manage Docker images for installing the service virtualization (SV) tool
    o Create the SV setup/install task
    o Create the deploy task to deploy virtualized interfaces
    o Maintain licenses
    o Perform tool administrative duties pertaining to user permissions and user profiles
    o Work with configuration managers, DevOps engineers, network engineers, and
      development and test engineers as needed to ensure tool setup and compliance
      needs are met
  Required Skills: Windows and Linux technologies – Expert; Docker containers –
    Intermediate; Bamboo or Jenkins – Intermediate; previous experience setting up an
    SV tool – Intermediate
  Augmented Training: DevOps / Continuous Integration / Continuous Deployment;
    Service Virtualization tool


Service Virtualization Architect/Engineer
  Responsibilities:
    o Identify the interfaces to virtualize
    o Identify and review all the information required for virtualization
    o Define the technical steps to deploy virtualized interfaces
    o Work with the Tools Management Engineer to configure the Bamboo task for
      installing SV
    o Work with the Tools Management Engineer to configure the Bamboo task to deploy
      virtualized interfaces
    o Assist with troubleshooting SV issues
  Required Skills: SV – Expert; developing virtualized interfaces using out-of-the-box
    capabilities – Expert
  Augmented Training: Service Virtualization

Service Virtualization Developer Specialist
  Responsibilities:
    o Review that all the prerequisite information required for virtualization is complete
    o Review the test data variations required for the virtualized services
    o Create the virtualized services
    o Test the virtualized services
    o Test that all the scripts work with all the test data variations
    o Provide the developed virtualized services to the SV architect, along with any
      configuration information
    o Troubleshoot and fix SV issues
  Required Skills: SV – Expert; developing virtualized interfaces, including customization
    and interfaces not supported out of the box – Expert
  Augmented Training: Service Virtualization; advanced SV capabilities, including two-way
    SSL and other specialized security interfaces; customization for the more complex
    interfaces

1.5 Available Training Sites and Collateral


Description:

Before proceeding with this Guidebook, you should have a basic understanding of software
development life cycle (SDLC). A basic understanding of software testing (manual or
automation) will be beneficial.

The section below describes the available training sites and collateral.

 Test Methodology in Agile/DevOps from Agile Academy

https://agileacademy.mybluemix.net/courses/agile_devops_testing/

 Agile/DevOps Practices from Agile Academy


https://agileacademy.mybluemix.net/practices

 Required Training / Information


 Agile Testing Methodologies for Software Development
 Acceptance Test Driven Development
http://guide.agilealliance.org/guide/atdd.html
 Agile Methodologies for Software Development
http://www.versionone.com/agile-101/agile-development-methodologies-scrum-kanban-lean-xp/
 Behavior Driven Development
https://dzone.com/articles/brief-comparison-bdd
 Selenium Training
http://qtpselenium.com/home/course/training/selenium-tutorial

2 Leading Practices

Description:

The future of Agile/DevOps test practices requires a cultural shift in the people,
processes, and technology involved in development, testing, and operations.

 Test Case Optimization

   Use combinatorial test design / test case optimization tools (e.g., IBM FoCuS /
    CTD) to help solve the fragmentation problem, reducing the number of test cases
    while improving test coverage.
   Create models by examining the points of variability in a system.
   Generate Behavior-Driven Development (BDD) scenarios from optimized user
    stories.
   Apply the approach to both regression and progression testing.
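To make the first practice concrete, the following is a minimal sketch of the greedy pairwise (2-way) selection that combinatorial test design tools such as IBM FoCuS / CTD automate. The factor names and values are invented for illustration, and real CTD tools use far more sophisticated algorithms, constraint handling, and higher interaction strengths.

```python
from itertools import combinations, product

def pairwise_tests(params):
    """Greedy 2-way (pairwise) test selection.

    params: dict mapping factor name -> list of values.
    Returns a reduced list of test cases (dicts) in which every pair of
    values across any two factors appears at least once.
    """
    names = list(params)
    # Enumerate every factor-pair / value-pair that must be covered.
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add((i, va, j, vb))
    tests = []
    while uncovered:
        # Pick the full combination that covers the most uncovered pairs.
        best, best_cov = None, -1
        for combo in product(*(params[n] for n in names)):
            cov = sum(1 for (i, va, j, vb) in uncovered
                      if combo[i] == va and combo[j] == vb)
            if cov > best_cov:
                best, best_cov = combo, cov
        tests.append(dict(zip(names, best)))
        uncovered = {(i, va, j, vb) for (i, va, j, vb) in uncovered
                     if not (best[i] == va and best[j] == vb)}
    return tests

# Hypothetical points of variability for an application under test.
factors = {
    "browser": ["Chrome", "Firefox", "Safari"],
    "os": ["Windows", "macOS"],
    "network": ["3G", "WiFi"],
}
suite = pairwise_tests(factors)
# Exhaustive testing needs 3 x 2 x 2 = 12 cases; the pairwise suite covers
# every two-factor interaction with substantially fewer.
```

This illustrates the "reduced number of test cases with improved coverage" trade-off: instead of all 12 exhaustive combinations, only enough cases are kept to exercise every interaction between any two factors.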


 Hardening Sprints
 Minimize the use of Hardening Sprints (End-to-End Testing)

 Automation leveraging Open Source Tools


 Plan automation early to align with the testing process and tool/staff capabilities
 Maximize the use of automation to complete regression testing during the Sprint
 Design automation approach to meet project’s long term requirements
 Focus on test design to ensure repeatability and suitability for automation
 Automate smoke test, GUI / mainframe regression, service / component level
testing and component performance testing
 Automated tests should be supplemented with manual Exploratory and
Showcase testing
 For legacy systems, create an automated set of build and regression tests before
any user stories are coded

 Establishing a Continuous Testing (CT) ecosystem, which involves:

   Integrating the automation suite with Continuous Integration and build tools
    (CI/CD) to enable CT and centralized execution and reporting
   Classifying the automation suite into multiple layers of tests to enable feedback
    at each checkpoint. Tests include:
     Build Verification Tests (Smoke Tests) (Automated):
      o Service / component
      o Limited GUI regression
      o Component performance
     Application Functional Verification Tests (Automated & Manual):
      o SIT (automated progression and regression)
      o Performance
      o Exploratory testing
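The layering above can be sketched in test code itself, independent of any specific CI product. In this minimal Python unittest example (the application functions and test names are hypothetical), each layer is composed into its own suite so the pipeline can run the fast layer on every commit and the full layer at a later checkpoint.

```python
import unittest

def service_health():
    """Stand-in for a real health-check call (hypothetical)."""
    return {"status": "up"}

def checkout(cart):
    """Stand-in for a real end-to-end checkout flow (hypothetical)."""
    return sum(cart.values())

class SmokeTests(unittest.TestCase):
    # Build-verification layer: fast checks run on every commit.
    def test_service_is_up(self):
        self.assertEqual(service_health()["status"], "up")

class FunctionalTests(unittest.TestCase):
    # Functional-verification layer: slower checks run in the scheduled
    # SIT / regression stage of the pipeline.
    def test_checkout_totals_the_cart(self):
        self.assertEqual(checkout({"book": 12, "pen": 3}), 15)

# Per-layer suites: the CI server wires SMOKE to the commit build and
# FULL to the later functional-verification checkpoint.
SMOKE = unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTests)
FULL = unittest.TestSuite([
    SMOKE,
    unittest.defaultTestLoader.loadTestsFromTestCase(FunctionalTests),
])
```

The design choice here is that layer membership is declared once in the suite definitions, so each checkpoint in the pipeline simply runs the suite for its layer and reports centrally.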

3 Introduction to Agile DevOps Testing


3.1 Agile DevOps Testing Overview


Agile testing is a software testing practice that follows the principles of agile software
development. Agile development integrates testing into the development process, rather
than keeping it as a separate and distinct SDLC phase.

Agile Testing involves a cross-functional Agile team actively relying on the special expertise
contributed by testers. Testing is not a separate phase and is interwoven with all the
development phases such as requirements, design, coding and test case generation. Testing
takes place simultaneously through the Development Life Cycle.

Furthermore, because testers participate in the entire development lifecycle in conjunction
with cross-functional team members, they can contribute to building software that meets the
customer requirements with better design and code.

Agile testing covers all the levels of testing and all types of testing.

3.2 Agile Testing – Principles and Practices

The principles of Agile testing are:

 Testing moves the project forward:


On traditional projects, testing is usually treated as a quality gate, and the QA/Test group often
serves as the quality gatekeeper. It’s considered the responsibility of testing to prevent bad
software from going out to the field. The result of this approach is long, drawn out bug scrub
meetings in which we argue about the priority of the bugs found in test and whether or not
they are sufficiently important and/or severe to delay a release.
On Agile teams, the product is built well from the beginning, using testing to provide feedback
on an ongoing basis about how well the emerging product is meeting the business needs.
This sounds like a small shift, but it has profound implications. The adversarial relationship
that some organizations foster between testers and developers must be replaced with a spirit
of collaboration. It’s a completely different mindset.


 Testing is not a phase:

On Agile teams, testing is a way of life: Agile teams test continuously. The test team tests
along with the development team to ensure that the features implemented during a given
iteration are actually done. Testing is not deferred to a later phase.
Continuous testing is the only way to ensure continuous progress. Agile testing provides
feedback on an ongoing basis so that the final product meets the business demands.


 Everyone tests:

On traditional projects, the independent testers are responsible for all test activities. In Agile,
getting the testing done is the responsibility of the whole team: testers execute tests, but so
do developers.
The need to get all the testing done in an iteration may mean that the team simply cannot do
as much in each sprint as they originally thought. If that is the case, then Agile has made
visible the impedance mismatch between test and dev that already existed, which means the
team was not going as fast as they thought. They appeared to be going quickly because the
developers were going fast. But if the testing isn't done, then the features aren't done, and
the team simply does not have the velocity they think.
Another way of thinking about this idea is that testing is the "Herbie" on the team. The Theory
of Constraints says that the whole team can only go as fast as its slowest part. To go faster,
the team has to widen the throughput of the slowest part of the process. Eliminate the
bottleneck: everyone tests, not just the designated testers.

 Shortening Feedback Loops:

How long does the team have to wait for information about how the software is behaving?
Measure the time between when a programmer writes a line of code and when someone or
something executes that code and provides information about how it behaves. That is a
feedback loop.
If the software isn't tested until the very end of a long release, the feedback loops stretch out
and can be measured in months. That's too long.
Shorter feedback loops increase agility. In Agile projects the software is ready to test almost
from the beginning, and Agile teams typically employ several levels of testing to uncover
different types of information.
Automated unit tests check the behavior of individual functions/methods and object
interactions. They are run often and provide feedback in minutes. Automated acceptance
tests usually check the behavior of the system end-to-end. They are typically run on
checked-in code on an ongoing basis, providing feedback within a short duration. Agile
projects favor automated tests because of the rapid feedback they provide.
Manual regression tests take longer to execute and, because a human must be available,
may not begin immediately. Feedback time increases to days or weeks.
Manual testing, particularly manual exploratory testing, is still important. However, Agile
teams typically find that the fast feedback afforded by automated regression is a key to
detecting problems quickly, thus reducing risk and rework.
In Agile testing, all roles, including analysts, developers, the business team, and testers,
review the working software and give feedback regularly and frequently. Continuous
feedback shortens the feedback response time and thus reduces the cost of fixing issues.


 Keep the Code Clean:


This principle is an example of the discipline that Agile teams have. It takes tremendous
internal discipline to fix bugs as they are found. If it's a genuine bug, as opposed to a new
story, it is fixed within the same iteration in which it is raised. This ensures clean code at any
milestone of development.
To do otherwise is like cooking in a filthy kitchen: it takes longer to wade through the mess to
do the cooking, and the resulting food may or may not be edible.


 Lightweight Documentation:
Instead of comprehensive test documentation, Agile testers
 Use reusable checklists to suggest tests.
 Focus on the essence of the test rather than the incidental details.
 Use lightweight documentation styles/tools.
 Capture test ideas in charters for exploratory testing.
 Leverage documents for multiple purposes.
Leveraging One Test Artifact for Manual and Automated Tests
Rather than investing in extensive, heavyweight, step-by-step manual test scripts in Word or a
test management tool, we capture expectations in a format supported by automated test
frameworks. The test can be executed manually but, more importantly, that same test artifact
becomes an automated test when the programmers write a fixture to connect the test to the
software under test.

 Test-Last vs. Test-Driven:
In traditional environments, tests are derived from project artifacts such as requirements
documents. The requirements and design come first, and the tests follow; executing those
tests happens at the end of the project. This is a "test-last" approach.
The tests provide concrete examples of what it means for the emerging software to meet the
requirements. Defining the tests with the requirements, rather than after, and using those
tests to drive the development effort, gives us much clearer done criteria and a shared focus
on the goal. This test-first approach can be seen in the BDD, TDD, and ATDD practices.
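A minimal test-first sketch using Python's unittest (the discount rule, story, and function name are invented for illustration): the tests are written alongside the requirement and act as the concrete done criteria, and the implementation below them is then written to make the tests pass.

```python
import unittest

# Written FIRST, with the (hypothetical) requirement:
# "orders over 100 get 10% off". These tests ARE the done criteria.
class TestDiscount(unittest.TestCase):
    def test_ten_percent_off_orders_over_100(self):
        self.assertAlmostEqual(apply_discount(200.0), 180.0)

    def test_no_discount_at_or_below_threshold(self):
        self.assertAlmostEqual(apply_discount(100.0), 100.0)

# Written SECOND: the minimal implementation that makes the tests pass.
def apply_discount(total):
    return total * 0.9 if total > 100 else total
```

In a test-last approach, the assertions would only be written (and run) after the feature was built; here they exist first and fail until the implementation satisfies them.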


 "Done Done," not just done:

In traditional environments that have a strict division between development and test, it is
typical for developers to say they are "done" with a feature when they have implemented it,
but before it is tested.
The feature isn't "done" until it's been tested and any bugs have been fixed. That's why it's
said that a given software release is usually "90% done" for 90% of the project. (Or, in other
words, the last 10% of the effort takes 90% of the time.) Agile teams don't count something
as "done," and ready to be accepted by the Product Owner or customer, until it has been
implemented and tested.


The concrete testing practices are:

 Automated Unit / Integration Tests –
o They are code-facing, written by programmers in support of the programming effort.
o They are usually created using one of the xUnit frameworks.
o They express expectations of the internal behavior of the code.
o They isolate the elements under test.
o They are executed quickly and often, with every change.
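As a minimal xUnit-style sketch in Python's unittest: the OrderService class and its payment gateway are hypothetical names, and the mock object isolates the element under test from its collaborator, so the test expresses only the internal behavior of the unit.

```python
import unittest
from unittest.mock import Mock

class OrderService:
    """Hypothetical code under test; the payment gateway is a collaborator."""
    def __init__(self, gateway):
        self.gateway = gateway

    def place(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        return self.gateway.charge(amount)

class TestOrderService(unittest.TestCase):
    # Code-facing expectations about internal behavior; the gateway
    # collaborator is replaced by a mock to isolate the unit.
    def test_charge_is_delegated_to_the_gateway(self):
        gateway = Mock()
        gateway.charge.return_value = "receipt-1"
        service = OrderService(gateway)
        self.assertEqual(service.place(25), "receipt-1")
        gateway.charge.assert_called_once_with(25)

    def test_rejects_non_positive_amounts(self):
        with self.assertRaises(ValueError):
            OrderService(Mock()).place(0)
```

Because nothing real is called through the gateway, tests like these run in milliseconds and can be executed on every change.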

 Automated System-Level Regression Tests –

o They are business-facing, written by various members of the team in collaboration.
o They express expectations about externally verifiable behavior and represent
executable requirements.
o They are mostly performed end-to-end and are executed as part of the continuous
integration process.
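A business-facing check written as an executable requirement can be sketched in given/when/then form; the ShoppingCart domain below is invented for illustration, and in practice such scenarios are often authored in Gherkin and executed by a tool like Cucumber.

```python
class ShoppingCart:
    """Hypothetical system under test."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def test_cart_total_reflects_all_added_items():
    # Given an empty cart
    cart = ShoppingCart()
    # When the customer adds two items
    cart.add("book", 12.50)
    cart.add("pen", 1.50)
    # Then the total reflects both items
    assert cart.total() == 14.00

test_cart_total_reflects_all_added_items()
```

The same given/when/then wording that a business user can read is what the continuous integration process executes, which is what makes the test an executable requirement.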

 Exploratory Testing –
Exploratory testing is all about learning the software, designing tests, and executing tests,
using feedback from the last test to inform the next. It can be rigorous. Two key things
distinguish good exploratory testing as a disciplined form of testing:

o Using a wide variety of analysis/testing techniques to target vulnerabilities from
multiple perspectives.
o Using charters to focus effort on the vulnerabilities that are of most interest to
stakeholders.

 Version Tests with Code –


 Collaborative Testing –
It is important for testers to collaborate with all the other project stakeholders. It is observed that
isolation usually leads to duplicated and wasted effort. By integrating testing and development,
more solid code is produced more quickly. Certainly there are contexts where independent testing
is required, such as with safety-critical systems. But that doesn’t mean the independent testers
should be the only ones testing.
Testing isn't a phase but rather a way of working, so that at any given point in a project the
work done to date meets the stakeholders' expectations; that requires a whole-team effort.

3.3 Agile Testing Quadrants

As in the case of traditional testing, Agile testing also needs to cover all the test levels.
There are a variety of test types with different purposes. Some of them are:
 Component Tests (Unit Tests)
 Functional Tests (User Story Tests)
 Non-functional Tests (Performance, Load, Stress, etc.)
 Acceptance Tests

Tests can be fully manual, fully automated, a combination of manual and automated, or
manual supported by tools.

To decide which tests to perform when, you need to determine whether a test is:
 Business-facing or technology-facing
 Supporting development (support programming) or verification only (critique product)

Combining these aspects of the testing types, the Agile Testing Quadrants are derived.


The business-facing tests are meant to verify functionality from the user’s perspective. Technology-
facing tests are meant to verify the structure and design of the code as well as technological
capabilities such as performance or security. On the left-hand side are the types of tests that
are meant to support and guide the end-to-end development process. And finally, there are those
tests that critique the delivered product; these are the ones that ensure fulfillment of both
functional and non-functional user requirements.
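One way to visualize this taxonomy is as a simple lookup structure. The sketch below records common placements of test types per quadrant; the example lists are illustrative placements often cited for the quadrant model, not a mandated inventory.

```python
# Illustrative sketch of the Agile Testing Quadrants as a lookup table.
# The example test types are common placements, not an exhaustive list.
AGILE_TESTING_QUADRANTS = {
    "Q1": {"facing": "technology", "purpose": "support development",
           "examples": ["unit tests", "component tests", "static analysis"]},
    "Q2": {"facing": "business", "purpose": "support development",
           "examples": ["functional tests", "story tests", "prototypes"]},
    "Q3": {"facing": "business", "purpose": "critique product",
           "examples": ["exploratory testing", "usability testing", "UAT"]},
    "Q4": {"facing": "technology", "purpose": "critique product",
           "examples": ["performance tests", "load tests", "security tests"]},
}

def quadrants_for(facing: str):
    """Return the quadrant ids on one axis of the model."""
    return [q for q, info in AGILE_TESTING_QUADRANTS.items()
            if info["facing"] == facing]
```

For example, `quadrants_for("business")` returns the two business-facing quadrants, Q2 and Q3.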

To further expand the concept, consider these different types of tests in the test quadrants as you
develop your coverage strategy. On the left, you will see a common approach for applications
and the types of tests they use. You’ll notice that the largest quantity of tests tends to be in
Quadrant 2, which is where system tests and integration tests are found. Teams should begin to
shift their focus to have more tests at the Quadrant 1 level, which includes unit tests,
component tests and the use of static analysis. We would also like to see fewer tests in Quadrant 3;
one reason is that Quadrant 3 tests tend to be more difficult to automate.

Exploratory and usability-type tests are sometimes difficult to automate. Another area where we
could potentially have fewer tests is acceptance testing. If our system testing is done
in a way that demonstrates the acceptance criteria are already met, and the customer has
confidence in those tests, they will be more comfortable having a smaller number of acceptance
tests to execute.

Now let us consider when to start and when to stop testing. In order to be production ready at the
end of each iteration, we need to be testing as soon as the project begins. We never really
stop testing, but we certainly want to know that we’ve done sufficient testing to be production
ready. We need to constantly assess risks for any deviations from the plan, consider test
metrics that demonstrate the thoroughness of our tests, and constantly reassess the inherent
costs and defect discovery associated with our strategy for test coverage
across all the testing levels.


The test levels, which are logically related to each other and indicate the broad test coverage
areas, are often defined in the Definition of Done document. Since testing in Agile is no longer a
separate phase by itself, all test levels overlap with each other. The user stories, which contain
requirements and design descriptions, and the development and testing activities all overlap with
each other. During an iteration, here is the sequence of tests:

 Unit testing, typically done by the developer, though sometimes the tester also helps
 Feature acceptance testing, which may be broken into a couple of activities:
o Feature verification testing, which is often automated
o Feature validation testing, which involves everyone and is mostly done manually

Automated tests are often run as parallel regression testing to check whether any of the
previously developed features have broken. The continuous integration framework also
helps check whether any builds are failing due to code compilation errors. System-level testing
may also start as soon as user stories are done.

Other tests, such as performance, reliability and usability tests, may also run in parallel to
the system tests. Acceptance testing may also include alpha and beta tests, depending on the
type of product being developed, either within an iteration or at the end of the release. User
Acceptance Testing at the end of the iteration is a best practice. In regulated environments,
certain regulatory testing may also be the norm to satisfy audit compliance rules.

The Definition of Done states the exit criteria for an application delivery, or the conditions under
which testers can mark a user story as complete. Various test levels are incorporated in the
Definition of Done in Agile software development. The following list gives examples for each level.

 Unit testing

o 100% unit test coverage with reviews
o Cyclomatic complexity analysis of code using tools like SONAR
o Defects in an ‘acceptable’ state to the Product Owner
o Unit test code reviewed
o All unit tests checked in
o All unit tests automated
o Performance characteristics are within agreed limits
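As a concrete illustration of the unit-level criteria above, here is a minimal automated unit test in Python's `unittest` style. The discount function, its behavior and its test values are invented for the example; they are not part of the methodology.

```python
# Illustrative only: a minimal automated unit test. The unit under test
# (apply_discount) is a hypothetical example, not an IBM artifact.
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical unit under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    # Positive test: expected behavior for a valid input.
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    # Negative test: invalid input must be rejected.
    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Such tests would be code-reviewed, checked in alongside the production code, and run automatically (e.g. via `python -m unittest`) in the continuous integration build.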

 Integration testing

o Defects found are reported and counted
o No major regressions found
o All regression tests are automated and checked in to SVN
o Acceptance criteria tested with both positive and negative tests based on agreed
parameters
o Quality risks identified and in an acceptable state


 System testing

o All stories in a release are tested end-to-end
o All user personas covered, if applicable
o Testing done in a staging or production-like environment
o Performance testing done
o Quality risks covered and closed, if applicable
o Defects in an “acceptable” state to the Product Owner
o Regression tests are automated and checked in
o Release interfaces thoroughly checked

 User Story

The definition of done for user stories may be determined by the following criteria:

o Coding tasks completed
o Code reviews completed
o Exploratory testing completed and signed off
o Regression tests reviewed and passed
o Unit tests written and passed
o Technical Design Specification updated
o Defects are in an “acceptable” state to the Product Owner
o User story accepted by the Product Owner

 Iteration

The definition of done for the iteration may include the following:

o Regression tests run and passed
o Smoke / automation tests run (if applicable)
o Demo / review completed
o Retrospective completed
o Documentation approved and stored

 Release

The definition of done for a release, which may span multiple iterations, may include
the following areas:

o Coverage: The extent of coverage is determined by the new or changed content in
the release and its complexity, size and associated risks.
o Quality: The number of defects found per day or per transaction is called defect
intensity, and the number of defects relative to the number of user stories is called
defect density. Both parameters should remain within permissible limits.


The consequences of these limits, which may raise residual risk, must be fully
understood and accepted.
o Time: Release go/no-go business decisions are evaluated against the pre-set
delivery date.
o Cost: For a positive return on investment, the development and maintenance cost
of the product should be significantly lower than the projected total sales of the
product. Defects that escape after the product has been released lower the return
on investment.
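The defect intensity and defect density measures defined under the Quality criterion above reduce to simple arithmetic. The sketch below uses invented figures and thresholds; the actual permissible limits are whatever the team agrees with the Product Owner.

```python
# Illustrative sketch of the release-quality metrics defined above.
# All numbers and thresholds are assumptions, not prescribed limits.

def defect_intensity(defects_found: int, days: int) -> float:
    """Defects found per day of testing."""
    return defects_found / days

def defect_density(defects_found: int, user_stories: int) -> float:
    """Defects found per user story in the release."""
    return defects_found / user_stories

# Example: 18 defects over a 12-day test window covering 30 user stories.
intensity = defect_intensity(18, 12)   # 1.5 defects per day
density = defect_density(18, 30)       # 0.6 defects per user story

# Hypothetical permissible limits agreed with the Product Owner.
assert intensity <= 2.0 and density <= 1.0, "release quality gate breached"
```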

3.4 Agile Testing vs. Traditional Testing

3.4.1 What is Traditional Testing?

Traditional testing is a late, manual, sequential or phased approach, in which progress is seen
as flowing steadily downwards (like a waterfall) through the phases of conception, initiation,
analysis, design, construction, testing, production/implementation and maintenance.
Testing starts only after the development and build phases are complete, creating a testing
“bottleneck” at the end.

When a project running on the traditional approach starts to run out of time and money, testing is
the only phase left. This means good projects are forced to cut testing short, and quality suffers.

Since working software isn’t produced until the end of the project, you never really know where
you are on a Waterfall project; the last 20% of the project always seems to take 80% of the
time. With this you’ve got:

 Schedule risk, because you never know if you are going to make it until the end
 Technical risk, because you don’t actually get to test your design or architecture until late
in the project
 Product risk, because you don’t even know if you are building the right thing until it’s too
late to make any changes

And finally, most importantly, it’s just not a great way to handle change.


3.4.2 What is Agile Software Testing?

Agile testing is a software testing practice that follows the principles of agile software
development. It is an early and automated process. The goal of Agile testing is to deliver early
and often.

Agile testing involves all members of a cross-functional agile team, with special expertise
contributed by testers, to ensure delivery of the business value desired by the customer at
frequent intervals, working at a sustainable pace. Agile development recognizes that testing is
not a separate phase, but an integral part of software development, along with coding.

Agile teams use a “whole-team” approach to “baking quality in” to the software product. Testers
on agile teams lend their expertise in eliciting examples of desired behavior from customers and
collaborating with the development team to turn those into executable specifications that guide
coding. Testing and coding are done incrementally and iteratively, building up each feature
until it provides enough value to release to production. Agile testing covers all types of testing;
the Agile Testing Quadrants provide a helpful taxonomy for identifying and planning the
testing needed.

3.4.3 How is Agile Testing Different from Traditional Testing?

The following points distinguish Agile Testing from the testing done in traditional SDLC
methods:

Attribute: Testing Phase
Agile Testing: Agile Testing is not a separate phase and occurs concurrently with
development. Testers and developers work in one team.
Traditional Testing: Testing is a separate phase. All levels and types of testing can begin
only after the completion of development. Testers and developers work together but
belong to different teams.

Attribute: Testing Levels
Agile Testing: Testing is a best practice: continuous testing with overlapping test levels.
Traditional Testing: Testing is often overlooked. Testing is a timed activity and test levels
cannot overlap.

Attribute: Early Testing
Agile Testing: Testers are involved in release planning and iteration planning, which adds
value by identifying testable user stories, outlining the acceptance criteria, conducting
estimations, defining the test levels to be performed, etc.
Traditional Testing: Traditional Testing is requirement-specific testing. The requirements
are already outlined by the stakeholders, so testers begin with test planning directly,
outlining the test strategy, coming up with high-level test scenarios, writing test cases,
etc. They are required to complete testing within a stipulated timeline, which sometimes
leaves them with less testing time if the development phase extends.

Attribute: Collaboration
Agile Testing: The success of an Agile project depends on close collaboration between
various cross-functional teams. Agility enforces the idea of ONE team and ONE goal. All
the various stakeholders – Product Owners, Developers, Testers, Technical and Domain
Experts – come together to function as a single team with the primary goal of realizing a
product that meets the end-user expectations.
Traditional Testing: Testers may be assigned to testing tasks at the module or feature
level. They fully concentrate on the given module and may not have insight into other
modules and potential interdependencies.

Attribute: Customer Involvement
Agile Testing: Because the product is implemented and evolved in each iteration, the
product is available to the end customers in every release (continuous integration).
Customers can give early feedback, which can be incorporated into upcoming iterations.
Traditional Testing: The actual product is not visible to end customers until later, when
the internal testing is done. Thus, product issues related to usability, functionality, etc.,
are encountered pretty late in the development cycle.

Attribute: Cost
Agile Testing: Agile teams are smaller, and activities tend to be more transparent among
the group. Testers, developers and business representatives coordinate on a regular
basis, thus identifying defects early in requirements, design, etc., and minimizing
corrective maintenance costs at a later stage.
Traditional Testing: The team involved is medium to large, and hence transparency is
limited. Project artifacts are transferred from one team to another, e.g., design to
development to testing; hence the scope for identifying defects early is minimized. The
cost to fix such defects is higher in the later stages.

3.5 Agile Testing - Advantages and Disadvantages

Benefits of Agile Testing

 Testing requirements are discussed and refined as a team (during stand-ups/Scrums),
allowing the combined team to better address the business and technical aspects of each
requirement. This enables overall alignment and prevents misunderstandings.
 The Agile process often requires having entry and exit criteria for Stories (a compression
of things to do in a particular release/iteration). Agile testing ensures that each
requirement is well defined and measurable, allowing test specialists to determine
whether the requirement was actually completed or not.


 The test team participates in the big-picture requirements-writing stage, thus ensuring
that testing estimates aren’t overlooked.
 Automated tests are fully leveraged to implement regression testing.
 Quality becomes the combined team’s responsibility, rather than solely that of the
testing team. The entire team agrees on the testing strategies, test cases and defect
prioritization plan.

Proven benefits of Agile software testing

 Agile Testing offers an early and automated process; the goal is to deliver early and often.
 Creates a repeatable and reliable testing process.
 Thousands of tests are continuously executed.
 Lean and waste-free: improved workflow, low WIP backlog, no deadlocked test queues.
 Eliminates the big-bang integration at the 11th hour.
 Constant readiness state and CM control.
 Evaluates system-wide changes throughout the project.
 Most Agile testing tools are free open source.
 10x more efficient / effective than traditional testing.

 Agile Testing Cost of Quality –

 Agile Testing is 10x better than code inspections.
 Agile Testing is 100x better than traditional testing.
 Agile Testing is done earlier “and” 1000x more often.


Agile Testing Disadvantages and Mitigations

Agile testing proves to be the best testing methodology only if the requirements are clear to
the project sponsors. If the big-picture requirements are unclear, the details can become
muddied. For new products, the software architecture normally takes a path based on the
initial requirements. If the requirements frequently change (as allowed for in Agile), the
following scenarios can occur:
 Challenges in estimating and sizing requirements. Sometimes testing gets short
shrift since it’s logically the last task in completing the user Story. Therefore, any delay
in the prior development task risks impacting test timelines.
 The test team is sometimes prevented from executing a test case for the whole
iteration, leaving the team struggling to finish the task.
 Not asking the right questions. It is very dangerous for the test team not to ask
questions, especially at the point where the user Story is picked up for implementation.
Daily team meetings can avoid this problem.
 Addition of new user Stories into the current iteration. The test team should be included
in the addition of the new user Story, to build up appropriate commitments and
estimations in order to avoid misalignment and protracted timeframes.


4 Agile Configuration Selection


The table below describes the factors that affect what type of Agile methods can be applied
to a project. It outlines key considerations that need to be carefully reviewed and considered
before deciding how to move forward with a project. At IBM we see most projects falling under
Agile Distributed and Scaled. In this document, we focus on the Agile DevOps testing
methodology for Distributed and Scaled projects.

Agile Pure (Complexity Level: Low)

Application Development Complexity:
 Single isolated application
 No internal or third-party interfaces
 Low complexity
 Non-mission-critical application
 All functionality is contained within the same architecture
 Single Scrum team

Deployment Process:
 Product Owner will sign off for UAT after the demo of the new features has been
completed in the Sprint
 No hardening sprints needed for E2E testing or left-over regression testing

Agile Scaled (Complexity Level: Low - Medium)

Application Development Complexity:
 1-2 interfaces
 1-2 third-party interfaces
 Medium complexity
 1-2 mission-critical applications
 2-3 functionalities not contained within the same architecture
 Scrum teams between 3 and 5

Deployment Process:
 Product Owner will sign off for UAT after the demo of the new features has been
completed in the Sprint
 Hardening sprints may be required due to the complexity and size of the scrum teams
 High probability that hardening sprints are needed to complete integrated regression
testing
 3rd-party integration testing may be required
 End-to-end testing may be required

Agile Distributed and Scaled (Complexity Level: Medium - High)

Application Development Complexity:
 2-3 internal and 3rd-party interfaces
 Medium complexity
 2-3 mission-critical applications
 2-3 functionalities not contained within the same architecture
 Scrum teams between 5 and 10

Deployment Process:
 Hardening sprints probably required due to the complexity and size of the scrum teams,
and to complete integrated regression testing
 UAT may be required, but should be integrated with SIT
 3rd-party integration testing most likely needed
 End-to-end testing will be required

Agile ERP Distributed and Scaled (Complexity Level: High)

Application Development Complexity:
 Mission-critical applications
 3+ internal and 3rd-party interfaces
 High-complexity business processes
 Functionality not contained in the same architecture
 Large number of Scrum teams (> 10)

Deployment Process:
 Multiple hardening sprints required due to the complexity and size of the scrum teams,
and to complete integrated regression testing
 UAT will be required, but should be integrated with SIT
 3rd-party integration testing must be done
 A lot of end-to-end testing required


5 IBM Recommended Operating Model – Large Complex Agile DevOps Testing Projects

5.1 Agile Testing Methodologies

Testing practices are well defined for every project, whether Agile or not, to deliver quality
products. Traditional testing principles are quite often used in Agile Testing. One of them is
Early Testing, which focuses on –
 Writing test cases to express the behavior of the system.
 Early defect prevention, detection and removal.
 Ensuring that the right test types are run at the right time and as part of the right test level.

In all the Agile methodologies discussed, Agile Testing is itself a methodology, and in all of
them test cases are written before coding. Here we focus on Scrum as the Agile Testing
methodology. The other commonly used Agile Testing methodologies are –

 Test-Driven Development (TDD): Test-Driven Development is based on coding
guided by tests.

 Acceptance Test-Driven Development (ATDD): Acceptance Test-Driven Development
is based on communication between the customers, developers and testers, and is driven
by pre-defined acceptance criteria and acceptance test cases.

 Behavior-Driven Development (BDD): In Behavior-Driven Development, testing is
based on the expected behavior of the software being developed.
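To make the TDD cycle concrete, here is a minimal red-green-refactor sketch in Python. FizzBuzz stands in for a real user story; it is purely illustrative and not part of the methodology.

```python
# Minimal TDD-style sketch: the expectations (the asserts below) are
# written first and drive the implementation. FizzBuzz is illustrative.

def fizzbuzz(n: int) -> str:
    """Implementation written only after the tests below were defined."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# Step 1 (red): these expectations existed before any code and failed.
# Step 2 (green): the implementation above makes them pass.
# Step 3 (refactor): clean up while the tests keep passing.
assert fizzbuzz(3) == "Fizz"
assert fizzbuzz(10) == "Buzz"
assert fizzbuzz(15) == "FizzBuzz"
assert fizzbuzz(7) == "7"
```

The same red-green rhythm applies in ATDD and BDD; only the level of the expectations changes, from unit behavior to acceptance criteria to business-readable scenarios.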


The industry is moving to business-driven testing strategies to increase quality and speed at a
lower cost. The key lever to achieve this is removing redundant effort, and automation is the
key enabler.


Now let us understand more about the BDD approach. UAT in Agile Testing is handled with
a BDD approach.

Agile Testing demands a high level of automation to achieve velocity. BDD (Behavior-Driven
Development) has emerged as a leading technique in Agile Testing projects. BDD evolved
from ATDD (Acceptance Test-Driven Development) and describes scenarios in a non-technical,
user-friendly way, which gives business users, analysts and developers a better description of
how the function to be developed should actually behave. It is more a mindset than a tool. It
describes a cycle of interactions with well-defined outputs, resulting in the delivery of
working, tested software.

BDD is a level of behavioral abstraction above the code implementation. BDD frameworks act
as a “bridge” between business and technical language. BDD shifts from thinking in “tests” to
thinking in “behavior”. It focuses on the behavior of an application from the point of view of its
stakeholders. Here the tests are written first as a collaboration between business stakeholders,
business analysts, the QA team and developers. It is driven by business value and sets a
benchmark in simple, plain, descriptive English-style grammar.


BDD with Agile/Scrum

• Identify a project where the team can work together and get support from each other
(business person (business value and risk), developer (solution) and tester (problem))

• Usage-centered design

• In order to achieve this you need to understand who exactly will use it and why.

• Enable the team from a BDD and Cucumber tool perspective

• Enable the team on IBM-specific tools / frameworks for BDD implementation

• Enable the team with Agile methodology

• Establish the tool set & configuration for the team

• Prototype using ubiquitous language

• Cucumber BDD uses the Gherkin Given-When-Then language

• GIVEN, an initial context

• WHEN, the occurrence of an event

• THEN, the expected outcome

• Parameters in quotes, " "

• Connectives – And and But (the latter typically in the Then condition)

• Development and test automation through examples

• Use examples and set expectations before development

• Use the same example (Cucumber feature file) to integrate with the OTF tool for
automation
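The Given-When-Then structure above can be sketched as a tiny stand-alone runner. Real projects would use Cucumber (or a Python port such as behave); this sketch only illustrates the shape of a scenario, and the class, method names and banking example are all invented for illustration.

```python
# Minimal illustrative Given-When-Then runner; not a Cucumber replacement.
class Scenario:
    def __init__(self, name: str):
        self.name = name
        self.context = {}          # shared state across the three steps

    def given(self, description: str, setup):
        setup(self.context)        # GIVEN: establish an initial context
        return self

    def when(self, description: str, event):
        event(self.context)        # WHEN: the occurrence of an event
        return self

    def then(self, description: str, check):
        # THEN: verify the expected outcome
        assert check(self.context), f"{self.name}: THEN failed"
        return self

# Scenario: withdrawing cash reduces the account balance.
Scenario("cash withdrawal") \
    .given("an account with a balance of 100",
           lambda ctx: ctx.update(balance=100)) \
    .when("the user withdraws 30",
          lambda ctx: ctx.update(balance=ctx["balance"] - 30)) \
    .then("the remaining balance is 70",
          lambda ctx: ctx["balance"] == 70)
```

In Cucumber itself, the scenario text lives in a Gherkin feature file and each Given/When/Then line is bound to a step definition in code.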


Clients are adopting BDD because the practice enables them to produce more valuable software
with fewer defects at a faster rate. BDD, along with open source tools such as Cucumber and
Selenium, increases collaboration and automation in agile delivery.

Cucumber is a popular open source tool that runs automated tests written in BDD format.
Gherkin is the language Cucumber uses to define test scenarios. Cucumber works with
automation tools like Selenium to provide web browser automation. Cucumber also works with
SOAP/UI and many other open source tools and programming languages, enabling API
automation and much more.


Generate Behavior-Driven Development (BDD) test scenarios

The descriptions can be used by test designers to communicate with testers in order to build
better test plans, and they provide history for a new planner to understand and execute on the
application.

 In Cucumber BDD, descriptions take into account user actions and test steps,
which are then mapped to application objects to execute automated tests using
Selenium.

 Free-form descriptions can be used to generate custom test cases for manual test
execution.


5.2 IBM Recommended Operating Model Overview


Below is the Business-Driven Development model for large complex projects integrated with
automation. It spans from test design to automated test cases. The following sequence of steps
is used to do testing in the IBM-recommended BDD-with-automation approach, as compared to
a manual BDD approach:

 Create the “Done” code for the given user stories / requirements.
 Create test models by examining the points of variability in a system, and perform model-
based test planning.
 Generate Behavior-Driven Development (BDD) scenarios from optimized user stories.
 Create an optimized test plan for SIT / UAT using the IBM FOCUS tool.
 Compare the results with the previous sprint by generating delta test cases using the
FOCUS tool.
 Generate the automation definition using a custom automation framework.
 Automation includes both regression and progression. Progression-enabled automation
should start from Sprint 0. Automated tests should be supplemented with manual
exploratory and showcase testing. The optimized test cases are iteratively executed
manually or using an automated test suite (Selenium / UFT / Perfecto / Crowd).
o Automate smoke tests, GUI / mainframe regression, service / component level
testing and component performance testing.
o For legacy systems, create an automated set of build and regression tests before
any user stories are coded.
 Perform defect management to fix and record the defects.


Now let us see IBM’s integration architecture that drives automation across the testing
lifecycle (QA and test case automation).


5.3 CTD-Driven Test Plan Optimization

Description:

CTD-driven (Combinatorial Test Design) test plan optimization is essential to improving QA cost
and enabling agile development. IBM’s approach to test plan optimization uses the IBM FOCUS
tool to deliver the lowest test case volume with the highest test coverage.


6 Agile DevOps Test Framework and Life Cycle – For a Large Complex Testing Project

6.1 Agile DevOps E2E Test Framework

The IBM-recommended Agile Test Framework below shows an E2E Agile DevOps testing
approach that is used for interaction across the client organization from a continuous integration,
continuous testing and continuous deployment perspective. It should be noted that the principles
of DevOps applied to the development teams and various departments or applications can
also be applied to the testing department. For example, test automation scripts, when first
checked into the BitBucket code repository, should be tied to a task so that they can be
triggered as part of a smoke build test. Changes to test automation script code may also trigger
the tasks tied to certain smoke test runs as part of the check-in process, such as running the
affected test(s) to ensure that the tests themselves are not broken.

Since the testing effort grows with each sprint, and testers need to ensure that the new
changes implemented do not affect other parts of the application or existing functionality,
regression testing holds a very important place in agile testing.

The recommended automation tools represent an adoption of more progression-based testing,
i.e., automation testing that occurs earlier in the SDLC, versus regression-based testing.
Identifying defects in the layers of code that are introduced earlier in the SDLC reduces project
risk. Applying automation testing at the component, service and API levels will lessen the burden
of the UI automation that is typical of regression-based testing. In addition, refining the scope
of planned tests through test case optimization is a risk- and cost-reduction practice that is
highly recommended.


6.2 Agile DevOps Testing Life Cycle

The diagram below describes the Agile DevOps Testing Lifecycle. It provides an overview of the
lifecycle, highlighting the phases, major activities, key deliverables and key players. The entry /
exit criteria are detailed for each phase, along with the quality gates put in place to ensure those
criteria are met before moving to the next phase.

Now let us understand each of the phases in detail: the activities involved, the responsibilities
of the test team, and the artifacts created during each phase.


6.2.1 Conceptualization Phase

Description:

A conceptualization meeting is held where the PO, Dev and Test leads decide which Narratives
and Epics need to be included in the release, thus creating the release (or product) backlog,
which is a prioritized (1-N) list of all of the work being requested for a project. It is imperative
that this list is ordered from the highest business value items to the least valuable. Because the
scrum teams will work from this list in order of priority, it provides focus and allows the teams to
ensure they are always delivering the most important features and functionality.

Once the Epics and Narratives are chosen, the PO, Dev and Test will assign the user stories to
be associated with the Epics. The PO will identify the Minimum Marketable Feature
(MMF – the smallest set of functionality that is valued by customers and returns value when
independently released) user stories. The PO will then determine in which Potentially Shippable
Product (PSP) they would prefer the Epic / user stories to be completed for a release.

Backlog items will be represented as user stories. Most product backlogs will be a mixture of
functional, non-functional, technical and knowledge-acquisition user stories. After the initial
release planning session, project teams will look ahead approximately 3 to 4 sprints.

A Test Strategy is created by the Test lead which includes the testing objectives, the testing
framework, methods of testing new functionality, the total estimated time and resources required
for the project, and the testing environments required.


6.2.2 Release Planning Phase

Description :

Release Planning is the activity where epics and features are discussed and understood,
decomposed into smaller stories; which are then estimated in story points. At the end whole team
(consisting of Dev, BA, Test) commits to the value of the next release, with a release being defined
as one or many iterations within a specific timeframe. Release planning provides the program
with a baseline schedule with ultimate milestone of release deployment by the Agile team (Both
Dev and Test).

The Product Owner (PO) presents the current product or project status and provides a view of
the product/project roadmap. The PO, collaborating with the team (both Dev and Test), defines
the epics, features and user stories targeted for the release.

The Test Lead also participates in user story creation, contributing ideas on the possible behavior
of the system and helping create testable user stories. This builds an understanding of the system
in its real environment and clarifies what stakeholders actually want as the outcome. This results
in faster freezing of requirements and also reduces the probability of later changes in the
requirements.

The test team participates along with the Dev team in estimating the user stories, provides the
test effort, and plans the testing activities for the release. Each user story will have a relative
estimate assigned during the initial release planning session, and the program/project team will
align user stories to sprints based on the combined average velocity (the number of story points
a team can complete in a sprint) of all the scrum teams. The resulting release plan will thus show,
at a high level, the intended scope by sprint and the approximate number of sprints it will take to
cover the intended scope. The user stories, now created, are referenced under the appropriate
PSP. Once the test team has reviewed the backlog and agrees that it is manageable, the test
team creates the Test Plan.
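The velocity arithmetic described above can be sketched in a few lines. This is an illustrative sketch only, not part of the methodology; the function name and the numbers in the example are hypothetical.

```python
# Illustrative sketch: estimating how many sprints a release needs from
# story-point estimates and the combined average velocity of all scrum teams.

def sprints_needed(backlog_points, team_velocities):
    """Return the approximate number of sprints to burn down the backlog.

    backlog_points  -- total relative estimate of the release backlog
    team_velocities -- average story points per sprint for each scrum team
    """
    combined_velocity = sum(team_velocities)  # combined average velocity
    if combined_velocity <= 0:
        raise ValueError("velocity must be positive")
    # Round up: a partially filled final sprint still has to be run.
    return -(-backlog_points // combined_velocity)

# Example: a 240-point backlog and three teams averaging 20, 25 and 15
# points per sprint -> 240 / 60 -> 4 sprints.
```

A plan like this stays approximate: the release plan is revisited as actual velocity data accumulates sprint by sprint.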

Test planning should start during release planning and be updated during each sprint. Test
planning should cover the following tasks (the leading text of some items was lost in the source):

 …environment, test tools, test data and configurations.

 …such as predecessor tasks, expertise and training.


 …tools, activities, tasks, teams, test types, test levels and constraints.

 …the customer/user importance and dependencies.

The resulting artifact is the release backlog. The Dev and Test teams execute a release backlog
refinement – further decomposition, high-level estimation, prioritization, dependency identification,
and placing stories into iteration buckets.

Sprint Zero involves preparation activities before the first sprint. A tester needs to collaborate
with the Dev team on the following activities:

 Identifying scope

 Dividing user stories into sprints

 Creating the system architecture

 Planning, acquiring and installing tools (including testing tools)

 Creating the initial test strategy for all the test levels

 Defining test metrics

 Specifying the acceptance criteria, also called the definition of “Done”

 Defining exit criteria

 Creating the Scrum board

 Setting the direction for testing throughout the sprints

The team builds an iteration plan for the first couple of iterations and the roadmap for the rest of
the iterations. Dependencies are understood, and there is a plan to address them.

Every sprint-end need not have a release; a release can come after a group of sprints. The main
criterion of a release is to deliver business value to the customer. The release planning session at
the beginning of the project need not produce a detailed release plan for the entire project; the
plan can be updated continually as relevant information becomes available.


Objective:

The objective is to ensure stakeholders are aware of and agree on the scope of the product, and
to identify whether the scope is likely to fit in the timeframe requested by the PO. The team
decides on the sprint length with release planning as an input. Release planning is the basis of
the test approach and test plan for the release. It provides a view, based on velocity, of when
increments of a working product will be delivered.


6.2.3 Grooming the Backlog Phase

Description:

The Prioritization Grooming Meeting is held to groom the backlog and help streamline the sprint
planning meeting; it consists of the PO, Dev, and Test. This usually includes adding new stories
and epics, extracting stories from existing epics, and estimating effort for existing stories.

The PO identifies the candidate user stories (based on priority) for the next sprint planning. The
Dev and Test team representatives help the PO alter the list based on technical feasibility.

User stories must also have acceptance criteria and a definition of done (DoD). Acceptance
criteria are the set of requirements pertinent to one user story that must be met for it to be
completed. The definition of done is a set of criteria, common across multiple related user
stories, that must be satisfied for them to be considered closed.

The User Stories and the Defects are updated with the Acceptance Criteria given by the PO.

The Team Grooming meeting, consisting of the PO, Dev, and Test, is held to review the
prioritization of the acceptance criteria. Here the team also assigns points to the user story
based on its acceptance criteria.

Objective:

The objective is to ensure there is a good understanding of the next priority for the sprint, to
ensure the PO, Dev, and Test teams agree on how the feature will be implemented, and to
assign points to the user story.

User stories provide a statement of need instead of a solution or requirement, enabling the agile
teams to find the best solution they can deliver, as opposed to having it prescribed to them.

They also provide emphasis on the value or outcome that is delivered at the end of the iteration.

Acceptance Criteria and Definition of Done are important because they provide a clear view to
the whole team of the conditions that need to be satisfied in order to declare the user story
done. Acceptance Criteria is just one aspect of the Definition of Done.
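The relationship between story-level acceptance criteria and the team-wide Definition of Done can be sketched as a simple check. This is an illustrative sketch only; every name and criterion below is hypothetical.

```python
# Illustrative sketch: a user story is "done" only when its own acceptance
# criteria AND the team-wide Definition of Done are both satisfied.
# All names and criteria here are hypothetical examples.

TEAM_DEFINITION_OF_DONE = {"code reviewed", "unit tests pass", "deployed to QA"}

def story_is_done(met_criteria, acceptance_criteria):
    """Acceptance criteria are story-specific; the DoD applies to every story."""
    required = set(acceptance_criteria) | TEAM_DEFINITION_OF_DONE
    return required <= set(met_criteria)

# Story-specific acceptance criteria for a hypothetical login story:
login_story_ac = {"valid user can log in", "invalid password is rejected"}
```

Meeting the acceptance criteria alone does not close the story: the union with the DoD is what must be satisfied, mirroring the point that acceptance criteria are just one aspect of the Definition of Done.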


6.2.4 Sprint Planning Phase

Description:

Once the User Stories and Defects have been groomed and are determined Ready-Ready, the
User Stories can be assigned to a Sprint.

A Sprint Planning meeting is held at the beginning of every Sprint. The Scrum Master will
facilitate this meeting. The goal of this meeting is for the Dev and Test teams to determine their
plan for the sprint in terms of what user stories will be committed to, the tasks and hours
required to code and test each story and what will be shown at the sprint demo.

During sprint planning, it is imperative for the Product Owner to identify a sprint goal, discuss
the business/stakeholder objectives for the sprint and the business value expected to provide a
context for the team activities during the sprint. The sprint backlog is created with the user
stories picked up from the product backlog for implementation in that particular sprint.

User stories, representing both functional and non-functional requirements, are not testing work
products in principle. But in Agile projects, the testers participate in user story creation.
Testers write the user stories that bring value to the customer and cover different possible
behaviors of the system. Testers also ensure that all the user stories are testable and


have acceptance criteria. The Dev and Test teams can also identify user stories to
capture continuous improvement efforts. Both should have a good understanding of the
technical details involved in delivering the story, so that they know the scope of the work and
share the same understanding of what the story is about. The test team should know how the
story will be tested and whether there are any impediments to testing the stories.

The Dev and Test teams should decide on the effort estimate for each of the selected user
stories, and must know what the sprint goals are so they can contribute to the prioritization
process. This priority order ensures the scrum team(s) focus on delivering the highest
business value items first.

Testers update the test plan with estimates for testing effort and durations in the sprint.
This ensures provision of time for the required testing during the time-boxed sprints and also
provides accountability for the testing effort.

During Sprint Planning, the functional testers should:

 Determine the testability of the user stories selected for the sprint
 Create acceptance tests
 Decide and discuss the number of testing types needed
 Estimate the time for test case creation and execution for each user story
 Break user stories into different testing tasks
 Decide each story's test coverage
 Define the acceptance criteria for the user stories
 Understand and plan for user story automation, and support various levels of testing
 Identify test automation candidates

Objective:

The Sprint Planning Phase identifies the User Stories and Defects which will be delivered
(Done-Done) in a Sprint, as well as the development and testing of the tasks assigned in the
Sprint. Baselining of user stories occurs at each sprint planning meeting.


6.2.5 Sprint Testing Phase

Description:

Once Release and Sprint planning are complete, the team is ready to execute the Sprint. The
goal during the Sprint execution is to produce potentially shippable working software that
provides business value to the customer. During Sprint testing, the following takes place.

Daily Standup (Daily Scrum) - During Sprint Testing, a Daily Standup is held each morning,
led by the Scrum Master. Each team member goes around the room and states their status on
their tasks (whether they are done or in progress, whether anything is holding them up, and if
they are not done, when they plan to be done). The meeting is time-boxed. The test team is
also given a few minutes to state its status.

When Sprint Testing begins, as the developers carry out story analysis for design and
implementation, testers perform test analysis for the stories in the sprint backlog. Testers create
the required test cases – both manual and automated tests.


All the members of the Scrum team should participate in testing.

 The developers execute the unit tests as they develop code for the user stories. Unit tests
are created in every sprint, before the code is written. The unit test cases are derived from
low-level design specifications.
 The functional testers support the unit testing. The test team tests the functional and non-
functional features of the user stories.
 They mentor the other members of the scrum team with their expertise in testing, so that
the entire team has collective accountability for the quality of the product.
 Test execution is performed with tester and developer working hand in hand. Defects
are logged in the Defect Management tool and tracked on a daily basis. Defects can be
conferred on and analyzed during the scrum meeting. Defects are retested as soon as they are
resolved and deployed for testing.
 Test results are collected and maintained.

The test team is responsible for developing automation scripts and schedules automation testing
with the Continuous Integration (CI) system. Automation gains importance due to short
delivery timelines. Test automation can be accomplished by utilizing various open-source or
paid tools available in the market. This proves effective in ensuring that everything that needs to
be tested is covered. Sufficient test coverage can be achieved with close communication
with the team. The test team also reviews CI automation results, sends reports to the
stakeholders, and executes non-functional testing for approved user stories.

The test team will collaborate with the customer on acceptance tests, gather data specifications,
automatically design and optimize test cases by building test models in the IBM FOCUS tool,
create BDD scenarios in Gherkin format, automate and execute them in Selenium, and
participate in defect identification / fixes, the demo, and the retrospective.

The team coordinates with the customer and product owner to define acceptance criteria for
acceptance tests. At the end of the sprint, the customer and/or end users carry out User
Acceptance Testing and provide feedback to the scrum team. This forms an input to the next sprint.

Note: The functional testers in collaboration with the business will be responsible for building
the models and generating the BDD scenarios.
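As a rough illustration of the BDD idea referenced above (this is a hand-rolled sketch, not Cucumber, Gherkin tooling, or Selenium; all step text and names are hypothetical), plain-language Given/When/Then steps can be matched by pattern to executable step functions:

```python
import re

# Minimal sketch of the BDD mechanism: each Gherkin-style step line is
# matched against registered patterns and dispatched to a step function.

STEPS = []

def step(pattern):
    """Register a step function under a regex pattern."""
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

@step(r"a cart containing (\d+) items")
def given_cart(ctx, n):
    ctx["cart"] = int(n)

@step(r"the user adds (\d+) more items")
def when_add(ctx, n):
    ctx["cart"] += int(n)

@step(r"the cart holds (\d+) items")
def then_check(ctx, n):
    assert ctx["cart"] == int(n), ctx["cart"]

def run_scenario(lines):
    """Execute the first matching step function for each scenario line."""
    ctx = {}
    for line in lines:
        for pattern, fn in STEPS:
            m = pattern.search(line)
            if m:
                fn(ctx, *m.groups())
                break
    return ctx

# A scenario written in Gherkin style:
scenario = [
    "Given a cart containing 2 items",
    "When the user adds 3 more items",
    "Then the cart holds 5 items",
]
```

Real BDD frameworks add feature files, reporting, and hooks on top of this dispatch idea; the value for testers is that the scenario text stays readable by the business while remaining executable.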

After the task statuses have been reviewed, the Dev team will give a demo to the team and the
PO of the features that were completed. A Root Cause Analysis discussion is held immediately
after the Sprint Review. The PO, Dev team, and Test will go over the defects introduced during
the sprint as well as the major defects reported by the business during the sprint. The goal is to
identify why the defects were introduced.


As most of the testing in Agile projects is automated, the tools generate the necessary test
results logs. Testers review the test results logs. The test results need to be maintained for
each sprint / release. A Test Summary can also be prepared that contains the issues found
and the corresponding resolutions.

After the Sprint Review meeting is completed, the Sprint Retrospective is held. Testers
also contribute to the Sprint Review and Retrospective reports.

Objectives:

The Sprint Testing Phase identifies the User Stories and Defects which will be delivered (Done-
Done) in a Sprint, as well as the development and testing of the tasks assigned to the Sprint.


Description:

The diagram below gives the release-wise and E2E test activities managed during Sprint
Testing. It also describes the multiple variations of scrum teams during Sprint Testing in a
large application development project, and highlights the multiple deployment components
involved in such projects. This is a good example of a complex Agile project.


Description:

The diagram below describes at a high-level the Agile Test Approach, highlighting the principles
around the Agile lifecycle during the Sprint Testing phase. The purpose of this diagram is to
provide the testing team a pictorial view of the activities that occur before, during and after a
Sprint lifecycle.


Description:

The workflow below describes the Sprint testing process flow. It highlights the Agile testing
lifecycle in a DevOps environment.

The diagram will serve as a guide to the testing team which details the list of activities and tasks
that need to be performed during the Sprint Testing Phase.

The process starts with the functional tester(s) analyzing the user stories to determine if they
are testable. It also explains what goes into the Continuous Integration (CI) / Continuous
Deployment (CD) orchestration bucket from a testing perspective (Smoke Tests, Service /
Component / API level Tests, Limited GUI / Mainframe Regression & Performance Testing
where applicable).

It also describes the Agile Testing Lifecycle Overview, highlighting the Phases, Major Activities,
Key Deliverables and Key Players.

The combined team, including both development and testing, takes responsibility for
analyzing the User Stories / Requirements. Together, they define the Sprint goal.


The Test team defines the testing scope (i.e., test plan). It is then validated and approved
by the entire team and the client. Simultaneously, while the development team starts the
implementation of modules (in the very first Sprint), the test team begins work on the test
case design. These are properly documented either in a testing tool or in an Excel
spreadsheet that is handed over to the development team and project sponsor from the
business side to review. This is to ensure that test coverage is as complete as possible.

Once the test case review and any modifications are completed for a particular Sprint, the
test team then begins testing on the QA environment. Defects found during testing are
logged properly in a defect tracking tool. Depending on the severity and priority of defects,
fixing them can be delayed but then is taken care of in upcoming Sprints. At the end of each
Sprint, the team determines, along with the project sponsor, which defects are to be fixed in
the current iteration. This iteration continues until all planned Sprints are completed.

The test team, along with the development team and business organization, defines which main
flows (test cases) will be automated. When code is ready to test (after the end of each
Sprint), the test team works with development to execute test cases on the development
environment, in order to identify early-stage defects so developers can fix them during
the next round, on a priority basis. This process is then repeated throughout the
development process. Automated test cases are run daily throughout the SDLC.

White-board/stand-up meetings are conducted daily, involving members of all teams
associated with product development, support and testing. This helps to resolve the issues
faced by team members and provides a clear picture of progress in both the coding and
testing areas. Agile promotes the introduction of requirements at all stages/iterations of the
SDLC; however, the testing team determines when to end this process to ensure product
stability (see Figure below).

Manual Testing: Regression


The manual testing team plays an important role in determining product quality. Once the
majority of coding is completed (in the last planned Sprint), the manual regression test cycle starts.
This is key since it helps to determine the stability of the application before it is pushed to
preproduction deployment. Code and defect fixes continue during this phase, but this is to add
value to the product. Preproduction deployment is dependent on the approval of the test
manager’s testing report; deployment only occurs if the report depicts green status for all
application modules. Although a dedicated testing team exists in the Agile environment, unit
testing by an individual developer is beneficial to find early defects and minimize the rework of
both the testing and development team.

Automated Testing: Regression


Automation is a critical component of Agile testing. It would otherwise be impossible to keep pace
with the Agile development schedule. Automation is also used to run regression testing. The
combined team (developers, product owners and testers) usually predetermines, at the start of
the project, which parts of the software will be tested using automation. Continuous
integration/builds, unit, functional and integration test execution as well as continuous or
automated deployment are common areas where automation may work better than traditional
tests.


The entire project team agrees up front about which of the main flows will be automated.
Automated tests consist of unit tests, capable of verifying even the most minute segment of
software. Automation is required primarily to determine the stability of modules developed in
each Sprint. Any defect found is reported and fixed, based on its priority.

Furthermore, it is possible to execute the test set multiple times per day, per hour or even
more frequently if needed.

Best Practices in Agile testing –

Effective Agile projects generally address the following important elements:

 Include testers and QA managers as members of the Agile development team.
 Include testers as active contributors in planning and requirements analysis.
 Promote the importance of testers and encourage continuous feedback sharing with
the programmers and the customer.
 Testers actively participate in meetings to define the main business flows.
 Testers work on short-iteration activities alongside developers.
 Encourage traceability between the requirements, test cases and bugs.
 Testers contribute to user story improvements.
 Leverage the specialized skills of test-driven development, including unit testing,
continuous integration and unit-level testing.
 Leverage automation testing as a key way to do regression testing.

Description:

The workflow below describes the Sprint Execution process end-to-end. The diagram illustrates
how to build models, refine the models, generate BDD scenarios, and automate those
scenarios. The functional testers, in collaboration with the business, will be responsible for
building the models and generating the BDD scenarios.

Note: The required training on modeling, BDD using Gherkin / Cucumber, and Selenium should
be completed in order to understand this section.


Description:

Below is a list of Entry and Exit Criteria as part of Continuous Integration and Development
testing.


6.2.6 Hardening Sprint Phase

Description:

After all tasks / new features have been developed and tested, defect fixes from previous
releases have been merged, and the code is frozen, the Hardening Sprint can commence. In a
hardening sprint, the team stops focusing on delivering new features or architecture, and
instead spends its time stabilizing the system and getting it ready to be released.

The PO, Dev, and Test will decide how many sprints will be needed for the Hardening Sprint. The
Hardening Sprint will be performed in an Integration environment that is a replica of Production.
Full regression, integration testing, end-to-end testing, security testing, and performance testing
can be performed as part of the Hardening Sprint.

Note: The use of Hardening Sprints has to be evaluated per release, depending on the size and
complexity of the release.

The target area of the regression test is the complete set of (end-to-end) business functions that
a system encompasses.

Regression testing essentially checks if the previous functionality of the application is working
coherently and that the new changes executed have not introduced new bugs into the
application. These tests can be implemented on a new build for a single bug fix or even when
there is a significant change executed in the original functionality. Since there can be many
dependencies in the newly added and existing functionalities, it becomes essential to check that
the new code conforms with the older code and that the unmodified code is not affected in any
way. In agile development, since there are many build cycles, regression testing becomes more
relevant as there are continuous changes that are added to the application.

For effective regression testing in agile development, it is important that the testing team builds a
regression suite right from the initial stages of software development and then keeps building on
it as sprints add up. A few things to determine before a regression test plan is built are:

 Identify which test cases should be executed.
 Identify what improvements must be implemented in the test cases.
 Identify the time to execute regression testing.
 Outline what needs to be automated in the regression test plan, and how.
 Analyze the outcome of the regression testing.

Along with this, the regression test plan should also take into account Performance Testing to
ensure that the system performance is not negatively affected due to changes implemented in
the code components.
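Sprint-level regression selection can be sketched as mapping each test case to the components it covers, then running only the tests that touch changed components. This is an illustrative sketch, not a prescribed tool or process; the test names and component map are hypothetical.

```python
# Illustrative sketch: pick sprint-level regression tests by impact.
# The full end-to-end suite is still run periodically; this only narrows
# the sprint-level run to tests touching changed components.

TEST_COVERAGE = {
    "test_login":         {"auth"},
    "test_checkout":      {"cart", "payments"},
    "test_search":        {"catalog"},
    "test_order_history": {"auth", "orders"},
}

def select_regression_tests(changed_components):
    """Return tests whose covered components intersect the changed set."""
    changed = set(changed_components)
    return sorted(t for t, covered in TEST_COVERAGE.items() if covered & changed)
```

Keeping such a coverage map current is itself a sprint activity: as stories add components, the regression suite and its mapping grow with them.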

In the agile environment, Regression Testing is performed under two broad categories:


 Sprint level Regression testing: This regression test is focused on testing the new
functionalities that are implemented since the last release.
 End to End Regression testing: This test incorporates end-to-end testing of ‘all’ the
core functionalities of the product.

Considering that the build frequency in agile development is accelerated, it is critical for the
testing team to execute the regression testing suite within a short time span. Automating the
regression test suite makes sense here to ensure that the testing is completed within a given
time period and that the tests are error free. Adding the regression testing suite to the
continuous integration flow also helps, as it prevents developers from checking in new code
before the correct working of the existing functionality has been automatically evaluated.

Typically, there are two rounds of Regression Testing in the integrated environment. The first
round entails going through and testing the whole product to find as many issues as possible.
Once all issues are resolved and tested, the second round of Regression Testing begins. If an
issue is found during the second round of Regression Testing, and the scenario was not
included in the test, then the PO team needs to determine whether the issue is high priority and
whether to include the fix in the next release. If the issue was found through the execution of
automated regression scripts or manual test cases, then it needs to be resolved and tested. Dev
then needs to determine what the fix impacts and what needs to be re-tested during regression
testing. If the issue impacts everything, then the second round needs to start all over again (the
PO needs to approve). If the issue can be isolated, then the Dev team will test all areas not
impacted, and once the fix is resolved, the team will re-test all impacted areas.

Objective:

The objective is to ensure the quality of the end-to-end process of the product.


Description:

The below image depicts a sample complex Agile project from project inception through
Hardening Sprints. It highlights the different stages of the Agile Testing lifecycle step-by-step
and the activities that occur during each of the phases.


6.3 A Day in the Life of an Agile Tester

Description - The diagram below shows the list of testing activities that an Agile Tester
performs in a day.


Description:

The diagram below describes the testing activities that take place in a DevOps Waterfall
environment.


7 Agile DevOps Test Optimization

Testing is an important but expensive part of the development process of systems. This is
usually a labor-intensive process that requires a great deal of time and resources.

An approach to lowering the cost of system testing is the use of test suites generated from
combinatorial designs. This approach involves identifying the parameters that define the
space of possible test scenarios, and selecting test scenarios in such a way as to cover all
the pairwise (or t-wise) interactions, also known as levels of interaction, between these
parameters and their values.

Test Optimization is a solution that provides flow-through, intelligence-driven automation from
business requirements through execution to quality outcomes. This approach supports the
velocity of Agile and DevOps models.

What are the current challenges?

• Requires re-recording of scenarios on every change
• Recording / framework is application specific
• Difficult to tell which scenarios need to be re-recorded when there is no functionality change
• Recording is done manually and is error prone relative to what is expected

Areas of challenge:

Transform

Optimize

Reduce

Quality

Useable

Efficiencies


Transform - How do I fundamentally change the automation equation?

Description – The diagram shows IBM’s Combinatorial Test Design (CTD) approach to providing
an optimized test solution.

Optimize - How do I optimize and test only what needs to be tested?

Combinatorial test design (CTD) is an effective test design technique for coping with the
verification challenge of increasingly complex software systems. CTD takes a systematic test
planning approach to modelling the things that need to be tested. It uses advanced
mathematical algorithms to optimize the number of test cases while ensuring coverage of
conditions and interactions. CTD is based on the observation that in most cases, the
appearance of a bug depends on the interaction between a small number of features of the
system under test. In CTD, the test space is manually modeled by a set of attributes, their
respective values, and restrictions on the value combinations. The aggregate of attributes,
values, and restrictions is called a combinatorial model.

IBM Functional Coverage Unified Solution (IBM FOCUS), IBM’s CTD tool, is an advanced
test planning tool for improving the testing of an application. IBM FOCUS uses Combinatorial
Test Design (CTD) to generate an efficient and optimized test plan that provides consistent
coverage across the test space at a known depth, while significantly reducing the required
resources. IBM FOCUS is independent of the application's domain, and can be applied at
different levels of testing. IBM FOCUS can also read existing tests, analyze their functional
coverage, select a subset of the tests that maintains the same coverage, and generate new
tests to close the coverage gaps. IBM FOCUS requires a user definition of the test space, and


provides advanced review and debugging capabilities to verify that the test space was defined
correctly.

IBM FOCUS includes an attribute weighting capability. However, in many cases it cannot be
used to skew attribute coverage across test cases because it adheres first to the coverage
requirements, and only then to the weights. Because the coverage requirements impose a
minimal number of occurrences for each value and the total number of tests is also minimized
by CTD, in some cases the requested distribution will not be reached without inflating the test
plan. IBM FOCUS avoids inflating the test plan.

Benefits of CTD:

• Optimize for different scopes and types of testing.
• Articulate the risks as changes to the testing approach are needed.
• Reach quality goals or close quality gaps.
• Optimize the number of test cases needed to achieve the objectives.

A CTD model is created by examining the points of variability in a system. It contains:

• Attributes: points of variability
• Values: different states of an attribute
• Restrictions: rules that determine which combinations of values are included in, and which are
excluded from, the model
• Interaction levels: the level of combinations of attribute values that should be tested together
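A minimal sketch of such a combinatorial model follows, assuming hypothetical attributes and one example restriction. A real CTD tool such as IBM FOCUS would then select a small subset of these combinations that still covers every pairwise interaction; this sketch only enumerates the valid test space that the model defines.

```python
from itertools import product

# Illustrative CTD model sketch: attributes, values, and a restriction.
# All attribute names, values, and the restriction rule are hypothetical.

ATTRIBUTES = {
    "browser": ["Chrome", "Firefox", "IE"],
    "payment": ["card", "paypal"],
    "user":    ["guest", "registered"],
}

def violates_restrictions(test):
    # Example restriction: guests may not pay with PayPal.
    return test["user"] == "guest" and test["payment"] == "paypal"

def valid_test_space():
    """Yield every attribute-value combination permitted by the restrictions."""
    names = list(ATTRIBUTES)
    for values in product(*(ATTRIBUTES[n] for n in names)):
        test = dict(zip(names, values))
        if not violates_restrictions(test):
            yield test

tests = list(valid_test_space())
# 3 * 2 * 2 = 12 combinations, minus the 3 guest+paypal ones -> 9 valid tests.
```

Pairwise optimization would shrink the nine valid combinations further while guaranteeing that every pair of attribute values still appears together in at least one test.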


Reduce - How do I release new Products faster?

IBM Functional Coverage Unified Solution (IBM FOCUS), IBM’s CTD tool, ensures a reduced
and optimized test plan with maximum and consistent test coverage across the test space at
a known depth, while significantly reducing the required resources.

Quality - How do I deliver quality at a lower cost?

Using Test Optimization techniques, you can deliver quality test plans at a low cost.


Useable - How do I deploy a reusable and easy to use framework?

Efficiencies - Where can I find further efficiencies in my test enterprise?

Search, Tag and Model (STAM) -

 Quickly deal with thousands of existing tests:
 Based on machine learning and text analytics
 Clustering, semi-automatic text search, and tagging

 Semi-automatic:
 Test analysis, reduction, transition, and transformation
 Preparation of test data for test automation tools


High level Process:

Value Proposition:

The value proposition of STAM lies in reducing the large manual effort associated with
generating CTD attributes and values.

Future Releases:

STAM will use machine learning and cognitive algorithms to process existing test cases. This
will allow more comprehensive support for the formats that STAM can analyze, so that test
cases that do not conform to a strict language syntax can also be used to generate CTD
attributes and values.

To improve quality and “shift left,” we need to find defects and ambiguities as early as
possible:

1. In a diverse set of artifacts representing requirements, specifications, designs, test
plans, and test cases
2. Looking for correctness, ambiguity, omissions, and consistency within and across a
large number of documents
3. With the ability to check for defects frequently: on demand, and triggered by changes
or schedules

Please refer to the CTD Training link attached below –


8 Agile DevOps Test Automation

Description –

Automation testing is given high importance in Scrum teams. Testers devote time to creating,
executing, monitoring, and maintaining automated tests and their results. Because changes can
occur at any time in Scrum projects, testers need to accommodate testing of changed features
as well as the associated regression testing. Test automation helps manage the test effort
associated with change, and automated tests at all levels facilitate continuous integration.
Automated tests run much faster than manual tests at no additional effort, which lets manual
testing focus on exploratory testing, product vulnerabilities, and predicting defects.

By its very nature, agile test automation is not exploratory: the main role of automation is to
save time and reduce costs. Automation testing is not meant to uncover new and innovative
defects; it aims mostly at confirming existing behavior.

Automated tests should be:


• Concise – Test should be as simple as possible, but no simpler.
• Self Checking – Test should report its results such that no human interpretation is necessary.
• Repeatable – Test can be run repeatedly without human intervention.
• Robust – Test produces same result now and forever. Tests are not affected by changes in the
external environment.
• Sufficient – Tests verify all the requirements of the software being tested.
• Necessary – Everything in each test contributes to the specification of desired behavior.
• Clear – Every statement is easy to understand.
• Efficient – Tests run in a reasonable amount of time.
• Specific – Each test failure points to a specific piece of broken functionality (e.g. each test
case tests one possible point of failure).
• Independent – Each test can be run by itself or in a suite with an arbitrary set of other tests in
any order.
• Maintainable – Tests should be easy to modify and extend.
• Traceable – Tests should be traceable to the requirements; requirements should be traceable
to the tests.
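A minimal sketch of several of these properties in practice, using Python's standard unittest and a hypothetical pricing function: each test is concise, self-checking (its assertion reports the result), repeatable, and independent (it builds its own inputs).

```python
import unittest

def apply_discount(price, pct):
    """Hypothetical system under test: apply a percentage discount."""
    return round(price * (1 - pct / 100), 2)

class DiscountTests(unittest.TestCase):
    # Self-checking: the assertion reports pass/fail, no human review needed.
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    # Independent: builds its own input, runs alone or in any order.
    def test_zero_discount_is_identity(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

# Run with: python -m unittest <this module>
```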

Automation of Testing activities

Automation of testing activities reduces the burden of repeated work and results in cost savings.
Automate:
 Test Data Generation
 Test Data Loading
 Build Deployment into Test Environment
 Test Environment Management
 Data Output Comparison


Recommended skills mix of a test automation team:

 Subject Matter Experts (60%)
– Plan and design tests
– Manage test data

 Implementers (30%)
– Build scripts based on modules and utilities

 Coders (10%)
– Write modules and utilities
– Could be developers “on loan”
Key Characteristics of Functional Test Automation

 Test suites supporting unattended execution


 Recovery Management for un-interrupted script execution
 Provision to dynamically alter test conditions for every run
 Extended functionality: Test evidence, metrics, customized reporting

Test Automation Approach

 Plan automation early to align with the testing process and tool/staff capabilities
 Design automation approach to meet project’s long term requirements
 Focus on test design to ensure repeatability and suitability for automation

8.1 Agile Test Automation Pyramid

The agile test automation pyramid is a graphical strategy guide for implementing automated
software testing. This model splits types of testing into three layers based on the return on
investment (ROI) offered by automating that particular type. The bottom layer (the largest part of
the pyramid structure) includes unit testing, representing the idea that automating it offers the
best ROI to the organization. Unit tests involve testing small units of code. They are the least
expensive to write and maintain, and they provide value to the team multiple times per day.
Component tests, in this particular model, provide the next greatest benefit and user interface
testing the least.
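One lightweight way to make the pyramid actionable is to track test counts per layer and flag an inverted pyramid automatically; the inventory numbers below are hypothetical.

```python
def pyramid_shape_ok(counts):
    """Return True when the suite respects the pyramid shape:
    unit tests >= component/service tests >= UI tests."""
    return counts["unit"] >= counts["component"] >= counts["ui"]

# Hypothetical suite inventory, e.g. gathered from test-runner reports.
inventory = {"unit": 480, "component": 120, "ui": 30}
assert pyramid_shape_ok(inventory)

# An inverted pyramid (UI-heavy suite) fails the check.
assert not pyramid_shape_ok({"unit": 40, "component": 60, "ui": 300})
```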

In automated testing, software tools execute pre-scripted tests on a software application or
program component to ensure it is functional. Automating testing makes it possible to run tests
quickly and repeatedly, and the software involved can also provide reports and compare results.
Automation helps deal with problems resulting from manual testing, including missed deadlines,
quality issues, and human error. The order of testing can also impact its success, and that of
the agile development project as a whole.

The test automation pyramid must be adhered to:

 Mandate and monitor TDD participation
 Invest in more service virtualization (for simulation and regression testing)

The diagram below describes IBM’s approach to agile test automation, which depicts the
following principles:

 Automation must be finished in each Sprint
 Keep the automation test bed current
 Both service-level and GUI automation are in play


8.2 Agile Test Automation through Behavior-Driven Development

• Next Gen Automation platform provides a standardized method to reuse-recycle code and
scripts. Reusable code is made available at a common level for easy implementation.
• Tester-friendly script-less Automation reduces the complexity of automating scripts.
• Integration with CTD ensures that a unique set of detailed test cases is taken forward for
automation of test execution with assured coverage
• Early automation to ensure that automated testing is the default option
• Drive toward Behavior Driven Development & generation of test automation

Value Proposition -

• “Each one automates” philosophy embeds automation in the DNA of the tester within the
Sprints; promotes faster ROI by incremental automation and reduces costs
• Smarter automation reduces implementation time & allows faster extension into new
applications & services
• Focus on automation of processes and application components for better business function
validation
• Integrate automation to leverage virtualized interfaces and environments to give an early
testing advantage
• Embeds test automation as the test engine for DevOps and Agile client service (easy
integration with Jenkins, UrbanCode Deploy, RBF, etc.)


8.3 Agile Test Automation – Assessment Criteria

This section provides guidance on how to determine whether an application is a high-value
automation candidate. The selection criteria for choosing test automation candidates are
presented in two parts. The first set of criteria filters out potential candidates based upon basic
foundational prerequisites. The second set is business selection criteria that are applied only if
the potential candidate satisfies the foundational prerequisites.

A. Foundational Prerequisites
1. Are the automation tools in place for this application?
For example: Is there a tool in place for automating the IVR?
2. Does the data exist that is required to drive predictable, expected results?
3. Is the application planned to be sunset, or planned to be migrated to another
application, including its data?

B. Business Selection Criteria (Main Drivers are Quality | Speed | Productivity)


1. Is the application considered critical? For example:
a. Does the application have customer facing or significant customer-impact,
i.e., a customer portal, IVR?
b. Is this a new product offering, a highly regulated area, an area where the risk
of widespread business outage could lead to significant revenue or
reputation/brand impact?
2. Does the application undergo frequent, ongoing functional releases and global roll-
outs? For example:
a. Does the application's functionality change at least monthly?
b. Does this product have a current, ongoing delivery roadmap/release forecast
(versus near-term sunset plans)?

3. Is the application part of a DevOps initiative?


4. Are the QA costs growing in terms of headcount or test cases? For example: Will
the application require a high volume of tests that would be costly to manually
execute?

Sample Test Automation Strategy

The attached file is an example of a test automation strategy that includes the automation
process, test methodology, framework infrastructure, maintenance approach, test results
analysis, roles & responsibilities, environmental requirements, and risks & mitigations, and can
be customized and applied to client applications.

Example Test Automation Strategy.doc

Sample Automation Assessment Tool


The following attached file is a representative Excel-based evaluation tool for evaluating the
client's internal applications to determine their fitness for test automation, and the priority of
automation, for the purposes of identifying high-value automation target applications.

Sample Automation Criteria Tool.xlsm

8.4 Agile Test Automation – Leading Practices

Migration toward Script-less Automation

Script-less automation is somewhat of a misnomer. The idea is that there exists an interface that
allows non-technical testers to record a workflow; behind the scenes, the tool figures out what
the user is doing and writes a script that repeats the same actions, so the test can be run
regularly. While this does work in principle, in practice it quickly becomes nearly impossible to
manage: once you have many of these automated tests, the whole system becomes
unmanageable. The reason is that whatever tool is generating the script code from the
recorded actions does so with little to no built-in intelligence. So every time the UI changes in
any form or fashion, even if it is just a simple attribute change on the web objects, the recorded
test case must be either re-recorded or appropriately edited to ensure that it will run again.
When you consider the pace, volume, and spread of changes across an organization, it is easy
to see how this can balloon out of control.

Almost no tool that claims to provide script-less automation takes the above situation into
consideration, and the business quickly discovers that someone technical enough to write code
must still be writing automation code to make the script-less components work. In short, a script
coder is still needed no matter what.

Therefore, the key is not to achieve script-less automation, but to migrate towards it by applying
various principles of framework design: data-driven and keyword-driven approaches, modular
components, and keywords/actions and sub-keywords that can be assembled to the needs of
your application under test (AUT). Separation of concerns and standardized function signatures,
in combination with data structures and data-sharing mechanisms, should be used to isolate
and limit changes and impacts while making the entire testing system more flexible and
adaptable to the needs of your department, organization, or application.
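The framework principles above can be sketched as a tiny keyword-driven runner: keywords map to functions, and a test case is plain data, so UI or naming changes are absorbed inside the keyword implementations rather than in every script. All keyword and step names here are invented.

```python
# Minimal keyword-driven runner sketch: a registry of keywords,
# and test cases expressed as data rows that non-programmers can edit.
KEYWORDS = {}

def keyword(name):
    """Register a function under a keyword name."""
    def register(fn):
        KEYWORDS[name] = fn
        return fn
    return register

@keyword("open_session")
def open_session(ctx, user):
    ctx["user"] = user

@keyword("add_item")
def add_item(ctx, item):
    ctx.setdefault("cart", []).append(item)

@keyword("assert_cart_size")
def assert_cart_size(ctx, expected):
    assert len(ctx["cart"]) == int(expected)

def run_case(steps):
    """Execute one data-driven test case, sharing context between steps."""
    ctx = {}
    for kw, *args in steps:
        KEYWORDS[kw](ctx, *args)
    return ctx

# A test case is just data: keyword name plus arguments.
case = [("open_session", "alice"), ("add_item", "book"),
        ("add_item", "pen"), ("assert_cart_size", "2")]
```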

Adoption of Open Source Tools

Open source tools have taken the testing industry by storm in just the last year, driven by the
pace at which new devices are being added to the world market, a phenomenon termed the
'device explosion'. As new devices, browsers, and operating systems enter the universe of
possible combinations, the testing tools must accommodate all, or certainly many, of those
possible device/browser/OS combinations.

Most of the licensed tools cannot keep pace with all new devices because their own internal
development teams simply cannot vet every single device with their testing platform fast
enough.

However, with an open source tool, you have a whole planet of developers contributing to the
tool: vetting devices, building plugins, writing emulators, and so on. The open source tool can
therefore post updates that enable testing on all the new devices much faster, at a pace that
allows them to be adopted into the enterprise testing matrix more quickly.

Common Tool Sharing

Many large organizations make their tool selections according to what best fits their needs at
the time; sometimes there are no official tool selection criteria at all, just whatever the latest
trend is when the choice must be made. This limits the sharing and integration that can be done
across departments or lines of business.

The solution is to run a comprehensive tool selection process that takes into consideration a
broad spectrum of criteria, then standardize on those tools and enforce that standardization
across the organization, with built-in tool communication protocols and interfaces so that all
departments and lines of business can follow a strong and well-defined DevOps approach.

Progression-Based Testing / Shift to API & Service level Testing

Most companies devote the majority of their automation effort to testing the user interface
before an application is released. To reduce the risk of a defect making it into production, there
must be a large number of UI test cases. It should be noted that UI automated test cases are
inherently unstable, because the slightest change in the UI can drastically change the success
or failure of a given test script. So in order to keep the automated test scripts in good working
order, a fair amount of effort must be expended in maintaining them once they are built. As your
list of test scripts grows, so does the time that must be spent maintaining them. Very quickly,
most organizations find themselves in a situation of diminishing returns, where the cost of
maintaining the scripts outweighs the cost of re-writing them from scratch using a better
framework approach.

The solution to the above problem is to significantly increase the number of test cases and
automation scripts around API testing, web service testing, and component-level testing. In fact,
the majority of automated test script code should be written in these domains. These types of
testing scripts are much simpler to write, significantly easier to maintain, and drastically more stable.
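A sketch of what this shift looks like: the test calls the application's service layer directly, with the downstream HTTP transport faked, so no UI is involved. The function and endpoint names are illustrative.

```python
import json
from unittest.mock import Mock

def get_order_total(http, order_id):
    """Hypothetical application code under test: fetch an order over
    an injected HTTP client and sum its line items."""
    body = http.get(f"/orders/{order_id}")
    order = json.loads(body)
    return sum(item["qty"] * item["price"] for item in order["items"])

# The 'downstream service' is a stub: no UI, no network, fully stable.
http = Mock()
http.get.return_value = json.dumps(
    {"items": [{"qty": 2, "price": 5.0}, {"qty": 1, "price": 3.5}]})

total = get_order_total(http, 42)
```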

Diminished UI Automation Testing


As your business places higher emphasis on API, web service, and component-level testing,
you will find that the number of UI scripts you must write can decrease while still giving you an
equal amount of test coverage and an equal or lower business risk that Severity Level 1 and 2
defects will make it into production. Furthermore, since API, web service, and component-level
test scripts are all very stable, the time your test script writers spend maintaining UI tests will
decrease drastically, because the business has a much smaller list of UI test cases. The
responsibility for extensive UI testing will fall more squarely within UAT.

Everyone Automates

Adopting tools that allow non-programmers to contribute to the automation effort, and that allow
automation to take place earlier, i.e., closer to the requirements phase, is key to increasing test
coverage. Current BDD (Behavior-Driven Development) tools, like Cucumber, make it simple
for business analysts and business-minded functional testers to create automation that is written
in a natural language style such as Gherkin.
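In the spirit of Cucumber-style BDD, plain-language steps can bind to code through pattern matching. The toy runner below illustrates the idea only; it is not Cucumber's or behave's real API, and the step wording is invented.

```python
import re

# Registry of (pattern, handler) pairs: each natural-language step
# binds to a Python function via a regular expression.
STEPS = []

def step(pattern):
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

@step(r"a balance of (\d+)")
def given_balance(ctx, amount):
    ctx["balance"] = int(amount)

@step(r"I withdraw (\d+)")
def when_withdraw(ctx, amount):
    ctx["balance"] -= int(amount)

@step(r"the balance is (\d+)")
def then_balance(ctx, amount):
    assert ctx["balance"] == int(amount)

def run_scenario(lines):
    """Match each scenario line to its step and execute it in order."""
    ctx = {}
    for line in lines:
        for pattern, fn in STEPS:
            m = pattern.search(line)
            if m:
                fn(ctx, *m.groups())
                break
    return ctx

scenario = ["Given a balance of 100",
            "When I withdraw 30",
            "Then the balance is 70"]
```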

Simulate Interfaces to Accomplish Earlier E2E Testing

There is value in performing end-to-end testing early and often in the test cycle. The early
detection of interface-related integration defects improves quality and time-to-market. Using a
service virtualization tool to simulate responses from 3rd-party or downstream applications or
interfaces allows testers to exercise functionality within the same sprint in which it is built,
rather than waiting until dependent services become available.

Model-Based Practices

Once you have a very strong automation framework in place, with a wide variety of modules,
actions, keywords, and components, it becomes important to extend their usage by using them
to build a model of your enterprise application: its various dependencies, branches, overlaps,
extensions, interfaces, hooks, and so on. If your test system can model or map all of these, you
will have a very powerful capability for discovering defects in your application early, before they
make it into pre-production or production. Furthermore, you will be able to maintain your stack
of test scripts with ease and little effort.
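A minimal sketch of the idea: model the application as a state graph, then derive one test step per transition so that every modeled dependency and branch is exercised (all-transitions coverage). The states and actions are hypothetical.

```python
# Hypothetical state model of an application's main flows:
# state -> list of (next_state, action) transitions.
MODEL = {
    "login":    [("browse", "submit valid credentials")],
    "browse":   [("cart", "add item"), ("login", "log out")],
    "cart":     [("checkout", "pay"), ("browse", "continue shopping")],
    "checkout": [],
}

def transition_tests(model):
    """Derive one (from_state, action, to_state) test step per edge:
    simple all-transitions coverage of the model."""
    return [(src, action, dst)
            for src, edges in model.items()
            for dst, action in edges]

steps = transition_tests(MODEL)
```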

Supportive Environments for Test Automation -

The continuous build/test/deploy process provides three environments:

 Active Development Environment - local, on developers’ machines
 Integration Environment - code, scripts, and test data, held and managed on a server that
supports version control, code check-out and check-in, and a GCI process


 Test Production Environment - where test scripts are actively running against either an
active in-development project or an active in-production project.

Considerations of a Fully Automated Process -

Benefits

 The location of in-development test automation code that the developers are working on is
separate from where the code is stored, which is further separate from any
production/release tests that are running.
 All approved code changes are tracked with a change ID.
 Change ID’s can be tracked through another issue tracking system with additional change
details and discussion as a community effort (Agile, Scrum, Extreme Programming, Social
Software Development, etc.).
 A good version control system will support a GCI process.
 Following a GCI process allows for defect discovery earlier in the development process and
holds developers accountable for issues they introduce accidentally during the check-in
process.
 This is a move towards Shift Left as a trend for the whole industry.
 Change ID history is kept in the long term repository and can be referenced.
 Less error prone due to automated checks and balances and verification points.
 Less time for defect discovery.

Risks

 When there are many change requests at the same time, the system can become very slow.
 When there are many change requests that include dependencies, the whole automated
ADLC system can come crashing down due to build failures.
 Requires a configuration manager role that can help educate and fill the gaps in managing
the automatic build, test & deploy migration steps.
 Integration of changes to the build must be coordinated among engineers due to
dependencies, e.g. code/configuration changes/data, etc.
 Hardware growth must also be considered, to support the auto-generated VMs.

Pre-conditions on One-Click Automation Execution

 Active network connection.


 Test Orchestration Tool must be in place.
 Test Scripts should:
 Exist


 Be stable, i.e., generate consistent results from run to run


 Be integrated into the test orchestration tool
 Test Data should be prepared and available for the test script.
 Test automation platform installed and configured on the orchestration server.
 Collector of test results needs to be hooked up to the target reporting server.
 Target test environment (Virtual or Physical) should be created or generally available.
 An automated process or method in place for resetting or decommissioning the test
environment at the end of the script(s) execution.
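The pre-conditions above can themselves be automated as a pre-flight gate that runs before the one-click execution; the check names and environment fields in this sketch are illustrative.

```python
def preflight(env):
    """Pre-flight gate sketch: verify one-click preconditions before
    kicking off the suite. Returns the list of failed checks."""
    checks = {
        "scripts_exist": bool(env.get("scripts")),
        "test_data_ready": env.get("data_loaded", False),
        "platform_configured": env.get("platform_ok", False),
        "environment_up": env.get("env_status") == "available",
    }
    return [name for name, ok in checks.items() if not ok]

# An empty failure list means it is safe to trigger the run.
ready = preflight({"scripts": ["smoke.py"], "data_loaded": True,
                   "platform_ok": True, "env_status": "available"})
```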


8.5 Agile Test Automation Workflow with DevOps Tools

The following steps describe the typical flow for setting up and maintaining automation scripts,
including the tools involved and the interactions between the automation engineers and the
DevOps engineers.

Continuous Testing - Step by Step approach

1. The testing team schedules a meeting with the DevOps team to give them an overview of
the testing activities that have to be automated
2. The testing team then connects with the DevOps team to set up the required testing tools
in the environment
3. Once the testing tools are set up, the testing team gives the DevOps engineer an overview
of what the testing scripts will look like and how the scripts will be executed from the
command line
4. DevOps creates a repo in BitBucket for the testing team to push their scripts to
5. The testing team creates testing scripts and pushes them to BitBucket
6. The testing team defines the tests that have to be executed at different stages
7. The testing team gives a demo to the DevOps team on how to use the testing scripts and
on the manual process to update test cases in test tools like JIRA/Zephyr
8. A contract between the testing team and the DevOps team is signed on the format of the
scripts, their trigger point, and the output of the scripts
9. Once the contract is signed between the teams, the DevOps team tries to trigger test
cases from the command line of the server
10. Once DevOps is able to execute these tests from the command line, they generally add
them to the Bamboo tasks
11. The DevOps team further writes scripts or configures plugins to update test results in the
test tools like JIRA/Zephyr
12. The testing team confirms the automatic execution of tests
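The command-line 'contract' in steps 8-9 boils down to this: DevOps runs one agreed entry point and trusts only its exit code. A sketch, with a trivial stand-in for the real suite command:

```python
import subprocess
import sys

def trigger(cmd):
    """CI-side trigger sketch: run the agreed command; pass/fail is
    judged solely by the process exit code (0 = pass)."""
    return subprocess.run(cmd).returncode == 0

# Stand-ins for the real suite entry point agreed with the testing team.
passed = trigger([sys.executable, "-c", "assert 2 + 2 == 4"])
failed = trigger([sys.executable, "-c", "raise SystemExit(1)"])
```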

Introducing new test cases

1. The DevOps team generally need not be aware of new test cases as long as the trigger
point and output remain the same and there are no changes to the contract
2. If the process to trigger test cases changes, the DevOps team should be informed. The
testing team and DevOps team again sign a contract on the format of the scripts, their
triggers, and the required libraries
3. Once the contract is updated, the DevOps team implements the changes
4. Once the changes are implemented, the testing team confirms the successful execution of
the test cases

Branching Strategy for Test Scripts

1. The DevOps team generally creates a test-main-[testing type]* branch and a
test-dev-[testing type] branch
2. Testers push their changes to the test-dev-[testing type] branch
3. Continuous Testing is integrated with the test-main branch
4. Generally, testers create a pull request to merge code from the test-dev-* branch to the
test-main-* branch
5. An automatic trigger is set up on the pull request to execute the job and post results on the
BitBucket pull request UI page
6. At least two code reviewers must approve pull requests from test-dev-* to test-main-*
branches, where one reviewer is from the DevOps team and the other is the lead from the
testing team
7. The reviewer from the DevOps team needs to make sure that all changes are as per the
contract before accepting the pull request
8. After merging the changes, the DevOps team verifies by manually running the job, and
testers confirm the successful execution of the test cases

Failure in test cases due to the changes in code

1. The testing team lead must be one of the approvers of the pull request from the dev
branch to the product-level main branch
2. The reviewer from the testing team generally verifies that automated test cases are not
failing in the lower-level branch

*Note: test-main refers to the main code stream (branch) of the test automation code on the
DevOps server. Likewise, test-dev refers to the development code stream (branch) of the test
automation code on the DevOps server.


9 Service Virtualization

This section outlines the guidance strategy for Service Virtualization (SV). It contains all relevant
information regarding the recommended tool, best practices for setting up the SV tool, and
industry-wide leading processes for using the tool.

Service Virtualization acts as a catalyst for DevOps by simulating constrained or unavailable
systems across the software development lifecycle (SDLC). This allows developers, testers,
and performance teams to work in parallel to accelerate application delivery, as well as to
“shift left” application testing to improve application quality. CA Service Virtualization was
previously known as CA Lisa. It alleviates data synchronization issues for interfaces, lack of
availability of test systems, and interfaces to test systems where there is a charge, so that
end-to-end automation of a larger number of test cases can be executed.

How Service Virtualization is being used

Component-Level Testing: This is testing an application component that doesn’t have a user
interface. It is another way of automating the progression and regression testing of an
application. Component-level testing calls functions with supplied parameters and compares
the results to the desired values. It speeds up automated testing, since many of the automated
component tests can be generated automatically; and through the capture of a baseline, the
tester or developer can run delta-compare tests to make sure there were no unintended code
changes to the module or service [in a Service-Oriented Architecture (SOA)] under test.
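The capture-a-baseline, delta-compare flow can be sketched as follows; the component and its inputs are stand-ins, and real tooling would persist the baseline rather than hold it in memory.

```python
def component(x):
    """Stand-in for the module or service under test."""
    return {"doubled": x * 2, "sign": 1 if x >= 0 else -1}

def capture(inputs):
    """Record the component's output for each input."""
    return {repr(i): component(i) for i in inputs}

def delta(baseline, current):
    """Outputs that changed since the baseline: should be empty after a
    change that was not meant to alter behavior."""
    return {k: (baseline[k], current[k])
            for k in baseline if current.get(k) != baseline[k]}

baseline = capture([-2, 0, 3])   # captured before the code change
current = capture([-2, 0, 3])    # re-captured after the code change
```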

Stubbing: This is the simulation of other application interfaces, components, services,
databases, or modules that the test team’s application under test requires, so that the
application can be tested effectively early in the life cycle, where it is cheaper and faster to
remove defects. It is the process of simulating an interface when an application under test has
dependencies on external services that are not readily available.
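At its simplest, a virtualized interface is canned request/response data standing in for the unavailable dependency. This generic sketch is not the CA tool's API; the paths and payloads are invented.

```python
# Canned request -> response pairs, recorded from (or specified for)
# the real downstream service.
CANNED = {
    ("GET", "/customers/7"):  (200, {"id": 7, "status": "active"}),
    ("GET", "/customers/99"): (404, {"error": "not found"}),
}

def virtual_service(method, path):
    """Answer like the real dependency would; respond 501 for anything
    that has not been virtualized yet."""
    return CANNED.get((method, path), (501, {"error": "not virtualized"}))

status, body = virtual_service("GET", "/customers/7")
```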

RACI Supporting the Recommended Operating Model


Responsible (R), Accountable (A), Consulted (C), Informed (I)

Roles: Tools Management Engineer; Solution Architect (Project); Project Manager (Project);
SV Developer/Specialist; SV Architect/Engineer; DevOps Engineer; QEAD

Activities
Initial launch of SV
Initial discussions to onboard SV R C C C I
Select and Mobilize trained resources R I I I
Define implementation design & architecture of SV I R C C C
Initial setup of SV application I A R R C
Define virtualized interfaces requirements template for the project teams I R C
Define virtualized interfaces deployment process I R C C C
Setup automated process to deploy virtualized interfaces I C C R R
Implementation Design Documentation I R C C C
Process Documentation I R C C C
Project On-Boarding documentation I R C C C
Operations Documentation I R C C C
Communication & Launch Plans R C C I I C C
Project On boarding
Submit virtual interfaces requirement in the pre-defined template I I I R R
Evaluate project requirements I R C I I
Coordinate with project team on the requirements, scope of work & SLAs R I C A I
Setup Bamboo task to deploy virtualized interfaces for the project I C A R
Create virtualized interfaces I A R I I I
Deploy SV instance for the project I I C I I R C
Deploy virtualized interfaces from Bamboo C C I I R R
Verification of the setup I A C C C R C

10 Agile Testing Tools Supporting an E2E DevOps Framework

This section helps in developing a cohesive sprint tools strategy, from both a tactical and a
strategic perspective, that provides a basis for improved productivity and execution.

The tools discussed in this section, along with standard DevOps practices, will help the Client
optimize automation efforts while achieving their business goals: increased flexibility and
shortened cycle times to create environments.

• Tool Selection Methodology –

For each tool category, perform market research, via Gartner & Forrester, to determine a short
list of market leaders that also meet DevOps tool chain integration requirements.

 Create criteria for each tool category based upon the following:
o Client stated requirements
o Industry Best Practices for tool functionality
o Automation Best and Leading Practices
o Vendor & product qualifications (feature diversity, continuity, reputation)
 Rate each tool against the criteria using:
o Direct expert experience
o Market research
o Vendor confirmations

• Tools Strategy –

 Leverage and expand upon existing toolsets within the Client environment, and evaluate
additional tools & capabilities to make tactical recommendations and to optimize.
 Develop tools framework covering testing capabilities and linking with related process
areas such as Dev/Ops.
 Implement tools framework

The Tools Framework below gives sample test automation tools that fit into the Agile/DevOps
categories.


11 Distributed Agile Testing Process

A Basic Agile Framework –


Enabling Distributed Agile Testing


A primary location would exist where the product owner and solution architect are located, with
interactions with the multiple secondary sites as described in the diagram below.

12 Agile DevOps Test – Governance and Metrics Framework

Governance Model -

The Governance Model provides the linkage between the Test team (QA) and the
Stakeholders

• Develops the communication plan between QA Team and its stakeholders


• Develops the meeting plan between QA team and its stakeholders
• Establishes the metrics and reporting framework as a basis to provide transparency
• Establishes the Quality Index to judge overall quality and improvement

IBM’s approach aligns governance and metrics to drive a central QA Mandate.

Governance institutes metrics & reporting accountability at every level, which in turn
drives accurate data & measurements that flow through the enterprise.


Capabilities

 Governance structure braced with a cascading, comprehensive metrics program to track
and manage organizational maturity
 Metrics framework covering operational, financial, and quality facets to drive toward a
central QA objective and measure success
 Metrics dashboard design and deployment recommendations
 Enterprise Quality Index comprised of key quality prevention and business impact
indicators

Value Proposition

 Mature the Client’s QA organisation through steady-state operational effectiveness
measures and IBM defect prevention and value metric thought leadership
 Institution of a Quality Dashboard aligned to the Client’s QA priorities to enable critical
business decisions and technological and operational insight at every organizational level
 Incentivize a team commitment to enterprise quality by stakeholders through Quality Index
implementation

Agile Testing Metrics Framework -

Description:

The diagram below depicts a complete software metrics program that encompasses multiple
dimensions. A comprehensive metrics program covers three distinct focus areas (Operations,
Financial, and Quality). Depending on the QA objective and the level of governance, the degree
of coverage and types of metrics will vary across dashboards.


Description:

Organizing a complete metrics framework – a typical Testing Center of Excellence has metrics
that align to Value and Cost from a Financial perspective, Quality Assurance (defect
prevention) from an Operations perspective, and Quality Control (defect detection) from a
Quality perspective.


Agile metrics are different from Waterfall metrics –

Below are the key Agile DevOps Test metrics, which cover the Quality Control and
Quality Assurance aspects.


Each metric below is listed with its name, definition, and formula.

Agile Financial Value Metrics

Metric: Customer Satisfaction (CSAT)
Definition: Customer Satisfaction may be measured directly by survey and expressed as a
percentage, such as Percent of Customers Completely Satisfied. It is a measure of the
customer’s satisfaction with the service received.
Formula: CSAT is often determined by a single question in follow-up surveys along the lines
of “How would you rate your overall satisfaction with the service you received?”, graded on
a scale of one to five, with one representing “very dissatisfied” and five “very satisfied.”

Metric: Delivered Business Value
Definition: Business Owners determine the business value planned for the release; the
Business Owner and the development team determine the actual business value achieved.
Planned vs. actuals is expressed as a percentage.
Formula: % of business value achieved = actual business value achieved / planned business
value

Metric: Satisfaction with Vendor
Definition: Similar to CSAT, Satisfaction with Vendor assesses the quality of vendor
contributions and service from the perspective of the business.
Formula: Often determined by surveys evaluating the performance of the vendor from the
business’ perspective.

Agile Financial Cost Metrics

Metric: Monthly Spend vs. Budget
Definition: Compares the amount spent on the project against the resources allotted.
Formula: Dollar spend on the project for the month / project budget for that month

Metric: QA Cost as a % of Projects
Definition: Identifies how much is spent in QA relative to the total spend for IT. Higher
percentages may indicate over-spending in QA and inefficient processes.
Formula: Dollar spend on QA for a particular project or set of projects / total dollar
spend for that project or set of projects

Metric: Work Effort
Definition: Serves as a measurement of productivity in terms of cost per test case. It is a
rough indicator of test team output and should be considered against the complexity of the
project or story points.
Formula: Actual effort in hours * blended rate / total number of test cases

Agile Quality Control (Testing) Metrics

Metric: Defects by Severity, by Status
Definition: Defect Severity is a classification of defects indicating the level of negative
impact on the application under test. Defect by Status is a classification of defects
indicating the number of defects in each status. The defect statuses used will be in
accordance with the standard set in the Defect Management Process.
Formula: Number of defects, severity level of defects, defect status, and reporting date


Metric: Delivered Defect Density (per Iteration/Release)
Definition: Based on A) defects reported by the Product Owner and B) completed user story
points per iteration/release.
Formula: (Defects reported by Product Owner) / (completed user story points per
iteration/release)

Metric: Defect Volume (Total # of Defects)
Definition: Number of defects from all independent tests conducted.
Formula: Total # of defects from system test and user acceptance test, plus defects
identified in the post-delivery phase, for each release

Metric: Defect Age
Definition: Measured in days as a simple average of the age of all defects in a sprint.
Used to measure the effectiveness of the team in resolving outstanding issues. This is a
negative indicator and should tend toward the lowest amount possible.
Formula: Date defect was opened versus date defect was fixed, or: sum of the number of days
all valid defects were open / total number of defects
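As an illustration, the Defect Age and Delivered Defect Density formulas above can be sketched
in Python. All data values are hypothetical, and the function names are ours, not part of any
tool:

```python
from datetime import date

# Hypothetical sprint defects as (opened, fixed) date pairs.
defects = [
    (date(2016, 9, 1), date(2016, 9, 4)),
    (date(2016, 9, 2), date(2016, 9, 10)),
    (date(2016, 9, 5), date(2016, 9, 6)),
]

def defect_age(defects):
    """Simple average of days each valid defect stayed open (lower is better)."""
    return sum((fixed - opened).days for opened, fixed in defects) / len(defects)

def delivered_defect_density(po_defects, completed_story_points):
    """Defects reported by the Product Owner per completed story point."""
    return po_defects / completed_story_points

print(defect_age(defects))              # (3 + 8 + 1) / 3 = 4.0
print(delivered_defect_density(6, 40))  # 0.15
```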

Agile Quality Assurance (Testing) Metrics

Metric: Defect Removal Efficiency (rate)
Definition: Based on A) the number of defects resolved in the iteration/release and B) the
number of defects found in the iteration/release.
Formula: (No. of defects resolved / no. of defects found) per iteration/release

Metric: Defect Reopen Rate
Definition: Measures the ability of the resolution team to improve its internal processes
and knowledge to minimize recurrence of failures in processes, tools, and skills.
Formula: Total number of defects reopened per iteration/release / total number of defects
per iteration/release

Metric: Defect Injection Rate
Definition: Based on A) the total number of in-process defects, B) defects reported by the
Product Owner, and C) completed user story points per iteration/release.
Formula: (Total in-process defects + defects reported by Product Owner) / (completed user
story points per iteration/release)

Metric: Defect Rejection Rate
Definition: Determines the efficiency of the testing team in terms of the number of invalid
defects raised by the testers. It measures the ability of the staff to correctly identify
defects.
Formula: ((Total no. of rejected defects) / (total no. of defects reported by the testing
team)) * 100

Metric: Follow-On Stories
Definition: Measures how well a story was completed within a sprint and how much change was
generated; could also be measured in story points. A low number implies that the team
(including the product owner) is getting the stories done right the first time.
Formula: Follow-on stories created after user demo / stories committed for the sprint

Metric: First Time Right – Initial Code
Definition: Measures the quality of the artifacts entering test.
Formula: # of test cases that ran without error the first time / total test cases

Metric: Defects / $1M Spend
Definition: Assesses the overall number of defects in proportion to project investment.
Formula: Total defects / $1M portfolio spend


Metric: Requirements Quality
Definition: Measures the integrity of requirements writing and communication.
Formula: # of requirements ambiguities or defects / total # of defects

Metric: Requirements Stability Index
Definition: Reflects the number of alterations to requirements, or volatility, after
sign-off.
Formula: Total # of requirements modified, added, or deleted per project or sprint / total
# of requirements signed off after requirements gathering per project or sprint

Metric: Test Case Requirement Coverage
Definition: Identifies how many test cases were created per story point.
Formula: # of test cases completed / story point

Metric: Story Point Estimation Accuracy
Definition: The number of user story points completed compared to the total user story
points committed in a sprint by a team. This metric indicates whether a team is under- or
over-committing every sprint, and demonstrates the team’s progress in estimating its work
effectively.
Formula: Number of story points accepted by the product owner / total story points
committed for the sprint

Metric: Product Owner Facetime
Definition: Demonstrates that the product owner is actively involved with the team. Higher
is better.
Formula: Face time with the product owner during the sprint / number of story points

Metric: Test Case Execution Ratio
Definition: Measures coverage of test case execution by determining the ratio or percentage
of test cases executed compared to the number of test cases initially planned for execution.
Formula: Total test cases executed / total test cases planned per sprint, or: (number of
test cases passed in the current iteration/release) / (number of test cases identified in
the current iteration/release)
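The defect-rate formulas above lend themselves to a short sketch. The numbers below are
hypothetical iteration data, not taken from any project:

```python
def defect_removal_efficiency(resolved, found):
    """(Defects resolved / defects found) per iteration or release."""
    return resolved / found

def defect_reopen_rate(reopened, total):
    """Share of defects that had to be reopened."""
    return reopened / total

def defect_rejection_rate(rejected, reported):
    """Percentage of reported defects that were invalid."""
    return rejected / reported * 100

# Hypothetical iteration: 50 defects found, 45 resolved, 3 reopened, 5 rejected.
print(defect_removal_efficiency(45, 50))  # 0.9
print(defect_reopen_rate(3, 50))          # 0.06
print(defect_rejection_rate(5, 50))       # 10.0
```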

Agile Operations Metrics

Metric: % Test Cases Automated
Definition: Measures test cases that were automated out of the total automatable test cases
executed; a rough proxy for automation progression. Care must be taken to ensure that the
test cases in the denominator are the automatable ones, not the total.
Formula: # of automated test cases executed / total test cases executed

Metric: Functional Test Execution Pass Rate
Definition: The number of functional test cases that passed.
Formula: # of test cases passed during a particular timeframe

Metric: Sprint Burndown Chart
Definition: Used to measure the actual and estimated amount of work to be done in a sprint.
Formula: The standard burndown chart measures task-hours or story points over time, not
issues. It should contain: an X axis displaying working days, a Y axis displaying remaining
effort, the ideal effort as a guideline, and the real progress of effort.

Metric: Effort Deviation
Definition: Measures the deviation of planned effort from actual effort.
Formula: (Actual effort (in PH) / planned effort (in PH)) * 100


Metric: Schedule Variance
Definition: Aims to reduce schedule variation by tracking it from the beginning of the
project through the end, thereby reducing time overruns. Mainly used as an indicator of the
capability to meet milestones.
Formula: ((Actual end date – planned end date) / (planned end date – planned start date))
* 100

Metric: Time to Value – Story
Definition: The entire duration between the time a story is received in the team backlog
and the time it is considered delivered by the product owner.
Formula: Σ (delivery time – start date of sprint) for each release / total # of releases


Appendix 1 – Sizing and Estimations

1.1 Agile Testing - Sizing and Estimation, An Introduction

Sizing is a relative number, based on the relativity of one item versus the other item(s) in
the backlog. Sizing takes into account the relativity, complexity, effort, and uncertainty of
items in the backlog, to form an initial, good-enough, coarse-grained view of what each item
entails. This is where techniques such as Planning Poker are used. Sizing is done prior to
Sprint planning, to determine what could potentially fit into the time box.

Estimating is a task-level view of the work we can get done. It is the number of ideal effort
hours it will take to complete a specific task in order to get acceptance or achieve the
“Done” state of that item in the backlog. These estimates are tracked daily in terms of hours
or ideal days (how much we have left to do).

Estimation – Using Story points

Estimation is the practice whereby whole teams collaborate to agree on the relative size of user
stories. This is commonly done using story points.

Why do we need this?

 Agile teams need to understand the relative size of the user stories they can commit to.
 The aggregated story points from all user stories completed in an iteration become the
actual team velocity for that iteration.
 Velocity is used to plan future iterations and forecast when user stories can be
completed. Teams can average their velocity over multiple iterations.
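A minimal sketch of velocity-based forecasting as described above, assuming hypothetical
per-iteration story point totals and a hypothetical backlog size:

```python
import math

# Hypothetical story points completed in the last four iterations.
completed = [21, 18, 24, 19]

# Actual velocity, averaged over multiple iterations.
velocity = sum(completed) / len(completed)

# Forecast how many more iterations a hypothetical backlog will need.
backlog_points = 123
iterations_needed = math.ceil(backlog_points / velocity)

print(velocity)           # 20.5
print(iterations_needed)  # 6
```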

Who participates?

 The agile whole team including the product owner, iteration manager, subject matter
experts, analysts or other team members as appropriate.

How do we do this?

1. The team agrees in advance what technique will be used for estimation: story points
(Planning Poker or Fibonacci), T-shirt sizing, or another technique. It is recommended that
teams use one technique for all their iterations, for consistency in the estimation.
2. The team agrees on how much time to discuss each story (suggested: 3 minutes per
iteration-size story in a 2-week iteration).
3. The estimation occurs in Release Planning and can be refined or continued during
Backlog Refinement. Estimation during Iteration Planning can be done but as an
exception.
4. Each member provides their view on the sizing in a way the other team members can
see it.


5. The team discusses the outliers and tries to converge to a number or size.
6. Remember that compromise (meeting half-way) is key and that the sizing is “relative”.
7. Once agreement is reached, the sizing is documented.

Hints and Tips

 The number of story points is influenced by factors like how big the story is and how
complex it is.
 Stories sized too large to fit into an iteration will need to be decomposed into smaller
stories.
 Story points and velocity are specific to a team; as such, they cannot be compared
across teams.
 For information on Planning Poker: https://www.planningpoker.com/
 For information on T-shirt sizing:
https://www.mountaingoatsoftware.com/blog/estimating-with-tee-shirt-sizes
 Estimation discussion sometimes can provide additional clarity to the acceptance criteria
of the user story and its assumptions. Use common sense when you time-box your
estimation.
 Story point estimation is a team view of the size of a story, it is not that of an individual.

1.2 T-Shirt Sizing

When we talk in agile webinars about capacity planning with the agile methodology (how does
it work, and why does it matter?), three questions are commonly asked about estimating the
size of user stories:

 What is T-shirt sizing?


 Is a medium story twice as big as a small story?


 How does estimating stories help with release planning?

These are important and related questions, because a team’s historical throughput of estimated
stories (velocity) is what determines the duration of a release and therefore the likely delivery
date.

Now let’s take a look at the answers to these three questions.

What is T-shirt sizing?

T-shirt sizing is a way to practice relative sizing. By comparing stories, you can break them into
buckets of extra-small, small, medium, large, and extra-large.

Estimating in relative buckets is more important than estimating absolute time or effort. We want
to understand how things compare to each other in a rough sense, and not waste time on false
precision.

Is a medium story twice as big as a small story?

Each team will establish the size of a story relative to the others. For one team, a medium may
be twice as big as a small; for another team, it may be three times as big. The important thing is
that the relative sizes are consistent over time. Once this is established, we can use our
estimations to achieve predictable releases.

How does estimating stories help predict a release date?

We estimate the relative size of a story and then track a team’s actual time to delivery (velocity).
That is, how many small, medium and large stories do we complete in an average iteration?
Based on established velocity, we can predict how long a new set of relatively sized work will
take us. For example, if I’m training to run a marathon, I know I average about one mile in 10
minutes. Therefore, I can estimate that it will take me about four hours (26 miles x 10 minutes
per mile = 260 minutes, or a little over four hours) to complete the marathon. My first mile might
be faster than my last, but based on my training I know my average (velocity) and I can make a
good general prediction.
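Under the simplifying assumption that each t-shirt bucket drains independently at its
historical rate, the release-duration prediction above can be sketched as follows (all
velocities and backlog counts are hypothetical):

```python
import math

# Hypothetical historical throughput (velocity) per iteration, by size.
velocity = {"S": 4, "M": 3, "L": 1}   # stories completed per iteration

# Hypothetical remaining backlog, by size.
backlog = {"S": 12, "M": 9, "L": 4}

# Iterations needed is driven by the slowest-draining size bucket.
iterations = max(math.ceil(backlog[size] / velocity[size]) for size in velocity)
print(iterations)  # max(3, 3, 4) = 4
```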

1.3 Planning Poker


Planning poker is a crowd-based technique for estimating the amount of effort or relative size of
a story. The goal is NOT to develop an indisputable estimate, but rather obtain a collaborative
estimate in a fast and cost-effective way, i.e. capitalizing on the ‘wisdom of the crowd’.

What do we do?

 Prerequisite – All work activity is ‘carded’.


 Each player has a set of estimating cards that contain ‘story points’. Multiple methods of
sequencing story points can be used. The simplest is: 1, 2, 3, 5, 8, 13, 20, 40 and 100
(Recommended sequence).
 Remember story points are NOT units of time, they are relative sizings. As the size (and
number) increases, so does the level of uncertainty.
 The facilitator reads the first story aloud. Any questions related to the story are
answered.
 Each player selects a card representing their estimate of the first story. They share their
estimates simultaneously.
 If all estimators selected the same value, that becomes the estimate. If not, the
estimators discuss their estimates. The high and low estimators should especially share
their reasons.
 After a discussion, the team re-estimates and the facilitator notes down any assumptions
that have been agreed upon. (Discussions are time boxed to keep things moving
forward.)
 After a couple of rounds, the estimates will either converge or the team will reach an
agreement based on the majority or estimate average, and the estimate will be written
down on the story card.
 Continue with Story 2. This and all subsequent estimates will be relative to the first
story.
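The rounds described above can be sketched as a small simulation. The player names and
estimates are invented for illustration:

```python
# Hypothetical rounds of simultaneously revealed estimates for one story.
rounds = [
    {"Ana": 5, "Raj": 8, "Mei": 3, "Tom": 5},  # no consensus: discuss
    {"Ana": 5, "Raj": 5, "Mei": 5, "Tom": 5},  # converged
]

def play(rounds):
    for estimates in rounds:
        values = set(estimates.values())
        if len(values) == 1:
            return values.pop()  # unanimous value becomes the estimate
        # Otherwise the high and low estimators share their reasoning,
        # and the team re-estimates in the next (time-boxed) round.
    # If rounds run out, fall back to agreement by majority or average.
    last = list(rounds[-1].values())
    return round(sum(last) / len(last))

print(play(rounds))  # 5
```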

For example, you can use a “perfect day” (i.e., 8 uninterrupted hours) as 1 point. A point is
a relative measure that can be used for agile estimation of size. The team decides how big a
point is and, based on that size, determines how many points each work item is. To make
estimation go fast, use only full points (1, 2, 3, 5, 8, and so on) rather than fractions of
a point, such as 0.25 or 1.65 points. To get started, look at 10 or so representative work
items, give the smallest the size of one point, and then go through all other work items and
give them a relative point estimate based on it. Note that points are used for high-level
estimates, so do not spend too much time on any one item; this is especially true for work
items of lower priority. Learn from past estimates: retrospectives are a time for the team to
incorporate insights from past iterations, including the accuracy of their estimates.


Why is it important?

 By using Planning poker, the group avoids being influenced by hearing a number before
everyone shares their estimate. Planning poker should force people to think
independently and propose their numbers simultaneously.

Who attends?

 The iteration manager, product owner, the core team, and any extended team members
that will be engaging in the work

Tips for geographically-dispersed teams

 Utilize a video-conference, e.g. Skype, and a collaboration tool like Sametime chat to
collect estimates
 Respect time-zone differences

A short presentation (~9 minutes) on Planning Poker, which has been used with teams for years,
is available here:

http://w3.tap.ibm.com/medialibrary/media_view?id=141183&back=search&backTo=%2Fmedi
alibrary%2Fsearch%3Fqt%3D2015+development+roadmap+enablement

External Links:

 https://www.mountaingoatsoftware.com/agile/planning-poker


 http://guide.agilealliance.org/guide/poker.html
 http://renaissancesoftware.net/papers/14-papers/44-planing-poker.html
 http://www.agileadvice.com/2012/06/04/referenceinformation/the-planning-game-an-
estimation-method-for-agile-teams/
 http://theagilepirate.net/archives/109
 https://www.planningpoker.com/ - Great resource for distributed teams

1.4 Scaled Agile Estimation Models


Several challenges associated with traditional velocity-based planning become exacerbated as
agile projects begin to scale up.

Key Challenges:

 A single story point is unlikely to represent the same amount of work across teams and
sprints.
 Bottom-up story point data is frequently not available for estimating work during program
and portfolio planning.
 Yesterday’s model requirements may not hold for one or many of the multiple teams
involved in large-scale agile projects.

SAFe Method

 Story point velocity must be normalized to a point, so that estimates for features or epics
that require the support of many teams are based on rational economics.
 Requires all teams to use 2-week sprints, and assumes about 20% of the time goes to
planning, demos, company functions, training, and other overhead.


 This leaves 8 workdays for each member in a 2-week sprint (further adjusted for any
personal vacation, company holidays, part time work, etc.)
 The algorithm used by SAFe for normalizing teams to a common, starting story point and
velocity baseline is as follows:
 For every developer and tester on the team, give the team eight points (adjust for
part-timers).
 Subtract one point for every team member vacation day and holiday.
 Find a small story that would take about a half-day to code and a half-day to test and
validate (as team effort). Call it a 1.
 Estimate each story relative to that one.
 SAFe uses a hybrid scheme relating relative sizes (story points) to ideal time units
(IDDs). Normalized story points in SAFe represent work in ideal time units: one normalized
story point is equivalent to 1 IDD. SAFe ties 1 standard story point to 1 IDD for all teams.
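The capacity arithmetic in the first two bullets above reduces to a one-line calculation. A
hedged sketch, with hypothetical team numbers:

```python
def team_capacity(members, days_off):
    """SAFe starting velocity baseline: eight points per developer/tester,
    minus one point per team-member vacation day or holiday."""
    return members * 8 - days_off

# Hypothetical 6-person team with 5 total vacation/holiday days in the sprint.
print(team_capacity(6, 5))  # 43
```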

Mike Cohn’s Method

 Start all teams together in a joint planning poker session for an hour or so.
 Have them estimate ten to twenty stories.
 Then make sure each team has a copy of these stories and their estimates and that they
use them as baselines for estimating the stories they are given to estimate.
 Let the agile team work for a few sprints, and then start using its average velocity to
forecast, assuming that the velocity has stabilized.

Calibrated Normalization Method (CNM)

 CNM can be applied to small (even single-team) agile projects, very large agile projects
consisting of multiple portfolios and programs, and enterprises with a large number of
independent projects.
 CNM deals with various challenges in estimating large-scale agile projects, and offers
advantages over centralized and semi-distributed methods.
 CNM is not tied to any specific agile scalability approach or framework, such as
SAFe. However, CNM can be used in conjunction with SAFe.
 SAFe’s normalization method, 1NM, can be considered as a special case of CNM.


 CNM promotes local, decentralized, and autonomous decision making at the team level
by allowing teams to use their own story point scales.
 CNM offers a solid foundation for consistent estimation across large-scale agile projects.

Calibrated Normalization Method (CNM)


Bottom-up estimation (from teams to programs up to portfolios)

Key Features/Attributes:

Step 1: Decide the Normalization Basis for an enterprise - All teams, projects, programs and
portfolios across the enterprise should agree on the ideal hour equivalent of 1 Normalized Story
Point (NSP). The number of hours decided is called the Normalization Basis.

Step 2: Estimate the relative sizes of stories using relative sizing techniques - Relative
sizes of stories are commonly estimated in story points by using techniques such as Planning
Poker. Because a story point only has meaning in the context of the team that did the
estimation, it is referred to as a Team Story Point (TSP).

Step 3: Determine the Calibration size for each team

 In CNM, each team calibrates the size of one TSP by using a sample of up to 3
stories from its sprint backlog for each sprint. This process determines the
average number of hours per TSP, or Calibration Size, for a team.
 Calibration Size = (Total estimated hours of effort for up to 3 sample stories) /
(Total Team Story Points for same sample stories) = Team Hours per Team
Story Point (TSP)
Step 4: Normalize the story points and enter NSP for each story in agile project
management tool - Using what is referred to as the Point Conversion Factor, which is the ratio:
Calibration Size / Normalization Basis.

Now that story points are normalized, the team can enter and use these values in their agile
project management tools to ensure that all story point roll-ups, progress bars, math, and
reports are meaningful and correct across large-scale agile projects with several teams,
programs, and portfolios.
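Steps 1 through 4 above can be sketched with hypothetical numbers (a 6-hour Normalization
Basis and three invented sample stories):

```python
# Step 1 (hypothetical): the enterprise agrees 1 NSP = 6 ideal hours.
NORMALIZATION_BASIS = 6.0

# Steps 2-3: calibrate with up to 3 sample stories from the sprint backlog.
sample_hours = [12, 20, 10]  # estimated hours of effort per sample story
sample_tsp = [2, 4, 2]       # Team Story Points for the same stories

calibration_size = sum(sample_hours) / sum(sample_tsp)       # hours per TSP
point_conversion_factor = calibration_size / NORMALIZATION_BASIS

# Step 4: normalize each story's TSP estimate into NSP.
def to_nsp(tsp):
    return tsp * point_conversion_factor

print(calibration_size)  # 42 / 8 = 5.25 hours per TSP
print(to_nsp(8))         # 8 * (5.25 / 6) = 7.0
```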
