
Metrics for Agile Project


Version 1.0

Bangalore
February 2009


This is a controlled document and should be used strictly for internal purposes. Many external sources have been referred to in preparing this document. It should not be shared outside TCS; any person sharing the document outside TCS will be solely responsible for any consequences that may arise.

This document must not be copied in whole or in part by any means.


DOCUMENT RELEASE NOTICE

This Metrics for Agile Project, Version 1.0, is released for use within Tata Consultancy
Services Ltd with effect from March 2009.

This manual is subject to TCS Document Control Procedure.

Approved By: ____________________ Date: ____________________

(DAG)

Authorized By: __________________ Date: ____________________

(Sub OU Head, TCS Bangalore)


DOCUMENT REVISION LIST


Client : TCS - Internal
Document Name : Metrics for Agile Project
Document Details : This document helps the Agile and quality teams define metrics for an
Agile project.

Version   Primary Author(s)                              Reviewed By     Description of Version   Date
1.0       Joja R.S., Senthilkumar S, Arunkumar Ayyavoo   Sridhar Vanka   Initial                  23 Feb 2009

Revision History

Rev No.   Revision No.   Revised by   Page No.   Action Taken (Add/Modify/Delete)   Rationale for revision


List of Abbreviations and Acronyms

BMI Backlog Management Index


CMM Capability Maturity Model
CMMI Capability Maturity Model Integration
COE Centre of Excellence
COQ Cost of Quality
CR Change Request
DAG Delivery Assurance Group
EBV Earned Business Value
FP Function Point
IPMS Integrated Project Management System
iQMS Integrated Quality Management System
KLOC Kilo Lines of Code
MDP Metrics Deployment Plan
MR Maintenance Request
OU Head Operating Unit Head
PCB Process Capability Baseline
PCE Phase Containment Effectiveness
PIP Process Improvement Proposal
PL Project Leader
PMR Project Management Review
PPQL Project Productivity and Quality Leader
PSU Project Start Up
PWU Project Wind-up
QAG Quality Assurance Group
RCA Root Cause Analysis
ROI Return On Investment
RTI Response Time Index
SEPG Software Engineering Process Group
SLA Service Level Agreement
SMP Software Metrics Program
SPC Statistical Process Control
SPI Software Process Improvement
SubOU Head Sub Operating Unit Head
TAT Turn Around Time
TDCE Total Defect Containment Effectiveness
UMP Uniform Metrics Program


Table of Contents

1. INTRODUCTION
   1.1 WHY METRICS FOR AGILE PROJECTS
   1.2 OBJECTIVES
2. OVERVIEW OF AGILE METHODOLOGY - SCRUM AS EXAMPLE
3. EXECUTION OF AGILE SCRUM
4. METRICS SELECTION FOR AGILE PROJECTS
   4.1 METRICS FROM TCS SMP AND THEIR APPLICABILITY IN AN AGILE PROJECT
   4.2 SPECIAL METRICS FOR AGILE PROJECTS
5. APPENDIX A
   5.1 DEFINITIONS AND GENERAL INFORMATION
       5.1.1 Burn Down Chart
       5.1.2 Velocity
       5.1.3 Business Value Metrics
       5.1.4 Agile vs iQMS
   5.2 EXAMPLE: AGILE SCRUM PROJECT AND METRICS DEFINED
       5.2.1 Overview of Projects
       5.2.2 Distributed Agile – Development Life Cycle and Execution Model
       5.2.3 Metrics Management
6. REFERENCES


1. Introduction

This document details the metrics to be collected by projects using Agile methodologies (hereinafter referred to as Agile projects), such as eXtreme Programming (XP), Scrum, DSDM, and others. The Agile team can adopt and customize some of the metrics mentioned in the Software Metrics Program (SMP) to reflect the progress and quality of its work. Automation of the various stages of the project is key to the success of an Agile project and its metrics collection.

Though there are different Agile techniques, all follow a similar process to achieve their goals. Hence, the same metrics can be used across Agile methods such as Scrum, XP, DSDM, and others. To keep this document simple and intuitive, Scrum is used as the example.

1.1 Why Metrics for Agile Projects

Agile projects are executed under a stringent time schedule. Additionally, project progress and outcomes are constantly monitored and evaluated by the customer. Hence, Agile projects should not capture the whole set of metrics considered in a typical project that follows iQMS. However, to measure the efficiency of an Agile project and to understand the team's effectiveness during the project life cycle, the following aspects need to be captured:

1. Project status with respect to the outcome defined for a particular iteration
2. Productivity improvement of the team as iterations progress
3. Quality of the outcome as iterations progress

1.2 Objectives

The objective of this document is to describe the various metrics possible in an Agile project execution cycle. It also gives a detailed view of the TCS SMP metrics that can be used in an Agile project. The document considers the uniqueness of Agile execution and offers a list of special metrics outside the purview of the TCS SMP.


2. Overview of Agile Methodology - Scrum as Example

Agile techniques follow an iterative, incremental process of software development. Agile methodologies generally promote a project management process that encourages frequent inspection and adaptation; a leadership philosophy that encourages teamwork, self-organization and accountability; a set of engineering best practices that allow for rapid delivery of high-quality software; and a business approach that aligns development with customer needs and company goals.

A scrum is a team pack in rugby: everybody in the pack acts together with everyone else to move the ball down the field. Scrum holds that the systems development process is an unpredictable, complicated process that can only be roughly described as an overall progression. Cookbook, step-by-step approaches do not work because they are not adequately defined and do not cope with the unpredictability of systems development. Scrum requires active, thoughtful development and management; assessing, adjusting and thinking are the characteristics of a successful scrum. Scrum enables component-based development and tolerates on-the-job learning. In fact, Scrum is a knowledge-creating process, where tacit knowledge is created and shared as the work progresses. Collaborative teamwork ensures knowledge sharing and creation.

Why Scrum is Powerful

- Focus on the team's work only
- Daily communication of status
- Enables low-overhead empirical management
- Makes impediments visible
- Ability to make decisions and remove impediments in real time
- Easy-to-understand framework
- Access to the output product in short cycles

Why Scrum

- Our customers and users conduct business in an ever-changing, competitive environment. They can never provide a "final spec" because their needs are constantly evolving. The best we can do is to evolve a product as their needs evolve.
- Minimized time-to-market via frequent releases.
- Scrum formalizes "empirical", chaos-tolerant software development practices originating from industrial process control and biochemical process research.
- Scrum enables a project team to determine when the system is "good enough" for its application, so effort need not be expended to create a system more robust than its environment demands.
- Increased productivity (depending on the team, environment, project, Agile experience, and so on).
- Continuous development process improvement.

- Improved communication within the development team and between the scrum team and the customer.
- Scrum builds a "successful team" attitude that everybody likes to work in.

Terminology used in Scrum:

Product Backlog    A prioritized list of high-level requirements.
Product Owner      The person responsible for maintaining the Product Backlog by representing the interests of the project stakeholders.
Sprint             A time period (usually 2 to 4 weeks) in which development occurs on a set of backlog items that the Team has committed to.
Sprint Backlog     A list of tasks to be completed during the sprint.
Scrum Master       The person responsible for the Scrum process, making sure it is used correctly and maximizes its benefits.
Team               A cross-functional group of people responsible for managing itself to develop the product.
Burn Down Chart    Daily progress for a sprint over the sprint's length.

Scrum Roles and Responsibilities

A chicken and a pig are together when the chicken says, "Let's start a restaurant!"
The pig thinks it over and says, "What would we call this restaurant?"
The chicken says, "Ham n' Eggs!"
The pig says, "No thanks, I'd be committed, but you'd only be involved!"

Define the team as consisting of pigs (people who are assigned work) and chickens (people who are interested / involved).

- Each scrum focuses on one, self-contained area of work
- All staff performing work in this area

The three committed (pig) roles are:

- Product Owner

  - Achieves initial and ongoing funding for the project by creating the project's initial overall requirements, ROI objectives and release plans. This list of requirements is called the Product Backlog.
  - Ensures the most valuable functionality is produced first and built upon by frequently prioritizing the Product Backlog.

- Scrum Master
  - Conducts the scrum meeting
  - Empirically measures progress, makes decisions, and removes impediments that may slow down or stop work
  - Asks all pigs three questions in the Daily Scrum Meeting (the Scrum Master should also be a pig):
    - What did you do since the last scrum?
    - What got in the way of your work?
    - What will you do before the next scrum?
  - Must be able to make immediate decisions
  - Better to ask forgiveness than permission (the Scrum Master should ensure that the team takes responsibility for its work)
  - Must resolve work impediments as soon as possible
  - Identifies the initial backlog

- The Team
  - Self-managing, self-organizing and cross-functional
  - Responsible for figuring out how to turn the Product Backlog into an increment of functionality within an iteration (sprint) and for managing their own work to do so
  - Collectively responsible for the success of each iteration

The chicken roles are only interested in the project. Only pig roles have the authority to do what is necessary for project success.

3. Execution of Agile Scrum

The prioritized Product Backlog is the starting point of an Agile Scrum project. A change in the Product Backlog reflects a change in business requirements. All work is done in sprints. Each sprint is an iteration of up to 30 consecutive calendar days. Each sprint is initiated with a sprint planning meeting, where the Product Owner and the team get together to collaborate on what will be done in the next sprint. The sprint planning meeting is usually restricted to eight hours, and its output is the Sprint Backlog. Every day the team gets together for a 15-minute meeting called the daily scrum, in which each team member answers three questions. The purpose of this meeting is to synchronize the work of all team members and to schedule any meetings the team needs to move its work forward.

At the end of a sprint, a sprint review meeting is held. This is a four-hour time-boxed meeting in which the team presents what was developed during that sprint. After the sprint review and before the next sprint planning meeting, the Scrum Master holds a sprint retrospective meeting with the team. At this three-hour time-boxed meeting, the Scrum Master encourages the team to revise, within the Scrum process framework and practices, its development process to make it more effective and enjoyable for the next sprint. The sprint planning meeting, the daily scrum, the sprint review and the sprint retrospective constitute the empirical inspection and adaptation practices of Scrum.

During each sprint, the following activities are carried out:

- Frequent risk mitigation plans are developed by the development team; risk mitigation and management (risk analysis) happen at every stage and with commitment.
- The burn down chart, a publicly available chart showing the remaining work in the sprint backlog, is updated every day and gives a simple view of sprint progress.
- Continuous identification of impediments provides management and the team with a daily opportunity to re-engineer, and to track the effectiveness of the re-engineering as work attempts to progress within the re-engineered environment.

When more than one scrum team works simultaneously on a project, it is referred to as a scaled project, and the mechanism employed to coordinate the work of these teams is called the scaling mechanism.

4. Metrics selection for Agile Projects

The TCS SMP details all the metrics that should be considered by projects, depending on project requirements and project type. An Agile project can use many of these metrics, together with some unique metrics, to measure how much capability is delivered in its shorter iterations. In addition to the applicable metrics from the TCS SMP, the team should use special/unique metrics that truly reflect progress and quality.


4.1 Metrics from TCS SMP and their applicability in an Agile project

Consensus on the metrics may be arrived at through discussion with the Agile team. The selection of metrics from the TCS SMP should be as per the following criteria:

- Metrics data should be easily measurable and should have a high degree of accuracy
- Metrics should be meaningful and understandable to the whole team, to maintain consistency
- Metrics should give emphasis to customer concerns or feedback
- Metrics should give clear action items on analysis, which in turn should help the team improve the process
- Metrics should be defined by focusing on the customer's business goals or expectations

The definitions of the metrics and their formulae are as per the SMP and are therefore not repeated in this document. The following table lists the metrics taken from the SMP and their applicability to different types of projects.

Applicability is shown for four project types, in the order: Development/Enhancement Project, Maintenance Project, Testing Project, Agile Scrum Project ("Y" indicates an applicable metric, "N" a not applicable metric).

Metric: Effort slippage | Goal: Improve project planning
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, Y, N
Remarks: In an Agile project the effort for each sprint is fixed. If a feature or user story cannot be completed in a sprint, it is dropped after mutual discussion. Hence the effort slippage is always zero percent.

Metric: End-Timeliness | Goal: Improve project planning
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, Y, N
Remarks: The delivery date cannot change, but the scope can change to deliver iterations on time.

Metric: Schedule slippage | Goal: Improve project planning
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, Y, N
Remarks: The delivery date cannot change, but the scope can change to deliver iterations on time. On-time delivery metrics make more sense if there is a commitment to deliver due to a business emergency; otherwise they are not required.

Metric: Total Defect Containment Effectiveness (TDCE) | Goal: Increase defect containment
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: This metric is used to compare Agile and non-Agile projects; an Agile project usually has far fewer production errors because of the test-driven approach. It is calculated as TDCE = Pre-delivery defects / (Pre-delivery defects + Post-delivery defects). In certain cases the project cannot measure pre-delivery defects: in Agile the developer and tester teams work hand in hand, and measuring pre-delivery defects may affect the behaviour of the team (against the principle of team effectiveness in Agile), so this needs to be looked at on a case-to-case basis. Pre-delivery defects can be controlled by the following metrics: the number of failed test cases in a build, the percentage of build failures, and the build verification test for functionality (critical test cases need to be developed to verify this). This ensures that existing functionality is not affected by the changes made.

Metric: Defect Density | Goal: Decrease software defect density
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: It can be defects per story point or any other size measure, as applicable. User acceptance defect density can be measured if pre-delivery defects are not captured.

Metric: Review effectiveness | Goal: Increase defect containment
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: Beneficial for projects doing reviews or pair programming. This metric has been found to work well in most Agile projects.

Metric: Productivity (Velocity) | Goal: To improve estimation
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: Described in detail in the special metrics for Agile. Velocity = number of units completed in an iteration; average productivity = velocity / number of team members. Related measures: number of features delivered versus planned, percentage of features delivered versus planned, and number of tests/test points completed.

Metric: % Review Effort | Goal: Manage Cost of Quality
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: This metric can be captured.

Metric: % Rework Effort | Goal: Reduce rework
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: This metric can be captured. Post-delivery defects are easier to capture than pre-delivery defects.

Metric: Size deviation | Goal: Improve estimation and planning
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, N
Remarks: Agile work is incremental and iterative, so this metric is not required; usually the size is not affected.

Metric: % Effort Spent on CRs | Goal: Improve planning
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: CRs can be counted as post-delivery CRs after an iteration. Usually there are no significant changes to the requirements within an iteration; if a significant change is required, it is discussed and planned within the iteration. There are cases where many CRs are raised after an iteration is delivered, so this metric is useful to measure how well the customer understood the requirements at the very beginning of the iteration.

Metric: In-process Review Efficiency | Goal: Increase defect containment
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, N
Remarks: This is taken care of as part of the retrospective meeting of an Agile project, so it does not have to be measured explicitly.

Metric: Phase Containment Effectiveness (PCE) | Goal: Increase defect containment
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: Tracks the phases that are the source of the most defects, helping to focus on areas requiring attention and to establish mitigation steps, thereby addressing the root cause. The frequency of collection can be the end of the iteration, and the metric can be discussed during the iteration retrospective. It has to be decided on a case-to-case basis, as usually one SDLC cycle is completed within the iteration; the Agile project can measure it, but only after discussion with the team.

Metric: Defect Free Delivery | Goal: To improve the quality of the application delivered
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): Y, N, N, Y
Remarks: Gives the ratio of the number of deliveries made with no high-severity defects in acceptance testing to the total number of deliverables. In an Agile project we can take the number of features delivered with "done" status out of the total number of features delivered.

Metric: % Bad fix | Goal: Improve customer service
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, Y, N, Y
Remarks: Can be measured for Agile maintenance projects; it can be collected for a project at an overall level.

Metric: Incident Arrival Rate | Goal: To improve planning
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, Y, N, Y
Remarks: Can be measured for Agile maintenance projects; it shows the stability of the system over a period of time.

Metric: BMI - Backlog Management Index (if the queue is maintained by TCS) | Goal: Improve customer service
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, Y, N, Y
Remarks: Can be measured for Agile maintenance projects; it helps in capacity planning.

Metric: % Compliance to SLA | Goal: Improve customer service
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, Y, N, Y
Remarks: Can be measured for Agile maintenance projects, based on inputs such as how the team is structured and the project characteristics.

Metric: Turn Around Time | Goal: Improve customer service
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, Y, N, Y
Remarks: Measures Date/Time of completion minus Date/Time of reporting. It can be measured depending on the project characteristics. This is a maintenance metric.

Metric: Response Time Index | Goal: Improve customer service
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, Y, N, Y
Remarks: Considers the working hours spent on problem requests. It can be measured depending on the project characteristics. This is a maintenance metric.

Metric: % Defects Rejected | Goal: Improve testing accuracy
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, N, Y, Y
Remarks: Testing project metric (number of bugs rejected by the development team / total number of defects raised against the development team).

Metric: Test coverage | Goal: Improve testing effectiveness
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, N, Y, Y
Remarks: Testing project metric (% of requirements covered with test cases).

Metric: Testing progress | Goal: To improve planning
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, N, Y, Y
Remarks: Testing project metric (units completed for the given period / total units, where units mean scenarios, test cases, and so on).

Metric: No. of test conditions identified and documented by criticality | Goal: Improve process quality
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, N, Y, Y
Remarks: Testing project metric (helps in estimating later phases and similar projects).

Metric: Frequency of common test cases | Goal: Improve testing efficiency
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, N, Y, Y
Remarks: Testing project metric (helps in deciding whether to automate those parts, if feasible).

Metric: Number of test cases (test coverage) | Goal: To improve the testability and stability of the application
Applicability (Dev/Enh, Maintenance, Testing, Agile Scrum): N, N, N, Y
Remarks: The existence of a large number of test cases indicates greater testability and stability of the code. Most Agile projects use tools to ensure this.
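Where a formula is given above (for example TDCE or defect density per story point), the computation is straightforward once the sprint's defect counts and size are known. The following Python snippet is a minimal illustrative sketch only; the function names and sample figures are assumptions for this example and are not prescribed by the SMP.

```python
def tdce(pre_delivery_defects: int, post_delivery_defects: int) -> float:
    """Total Defect Containment Effectiveness = pre / (pre + post)."""
    total = pre_delivery_defects + post_delivery_defects
    return pre_delivery_defects / total if total else 1.0

def defect_density(defects: int, story_points: float) -> float:
    """Defects per story point (any agreed size measure can be substituted)."""
    return defects / story_points if story_points else 0.0

if __name__ == "__main__":
    # Hypothetical sprint figures, for illustration only.
    print(f"TDCE: {tdce(pre_delivery_defects=45, post_delivery_defects=5):.2f}")                  # 0.90
    print(f"Defect density: {defect_density(defects=12, story_points=60):.2f} per story point")   # 0.20
```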

4.2 Special Metrics for Agile projects


The special metrics for Agile projects can be defined by considering the Agile team's effectiveness, as it is an important success factor of an Agile project. The applicability of Business Value metrics to Agile projects is also detailed in this section. The mandatory special metrics for all Agile projects are:

1) Burn Down Chart (Refer Appendix A)


2) Velocity (Refer Appendix A)

Agile team's effectiveness (how effective the team is in meeting user requirements)

Metrics should truly reflect the capabilities of an Agile team, yet not diverge too much from the metrics collected from other projects:

- The team's ability to show how much of the originally planned functionality was actually delivered (the delivery date cannot change, but the scope may change to deliver the iteration on time, so it is extremely important to measure this).
- Customer feedback, measured and tracked to arrive at process improvements for the IT and business relationship (more frequent customer involvement, with both the authority and the responsibility to make decisions on the scope and functionality of each iteration, is a characteristic of Agile projects).
- The team's ability to ensure defect-free delivery (Agile teams continuously integrate their code, do daily builds, and insist on early testing; this emphasis on early and frequent testing means fewer bugs reach the production environment, but quality metrics still have to be measured to ensure defect-free delivery, with the objective of reducing defects).
- If the team prioritizes and reprioritizes tasks at the start of an iteration, the prioritization must be value driven. In this case the team can report the value it delivers (in terms of increased revenue, cost savings or other determinants) rather than progress against the plan.

The Operational excellence metrics for Agile projects can be broadly classified as shown in the following table:

1. Category: Cost Management
   Metrics: Cost-benefit analysis (ROI); Functionality used versus billed; Business Value metrics (refer Appendix A)
   Benefit: The business will always need to know the rough cost of a project, how long it will take, and what value will be delivered, and metrics should be collected to arrive at these values. Understanding the business value of each requirement and prioritizing accordingly is required; usually this is captured by the business team and not the development team. Functionality used versus billed shows the capability of the business team in identifying the required functionality at an early stage.

2. Category: Project Management
   Metrics: Burn Down Chart (refer Appendix A); Release burn-up chart
   Benefit: The Burn Down Chart shows how much work remains to be done over time rather than what has been completed, acknowledging that the requirements on an Agile project will continue to change. The release burn-up chart shows the release plan for the project and the requirements to be delivered in each release; it is updated in real time, so slippages are easily visible and can be addressed early. It is created at the beginning (Iteration 0) and managed/updated at each release.

3. Category: Productivity
   Metrics: Velocity (number of units completed in an iteration); Average productivity (velocity / number of team members); Number of features delivered versus planned; Percentage of features delivered versus planned; Number of tests or test points completed; Number of story points completed
   Benefit: These metrics help to measure the time it takes for the Agile team to stabilize.

4. Category: Quality
   Metrics: Post-delivery defect density, i.e. the number of defects found in production (by priority, severity and source) per size (size can be in number of features or test points); Number of impediments. (All other metrics under this category can be taken from Section 5 of this document.)
   Benefit: The real cost of fixing a bug is directly proportional to how early it is found. The number of impediments helps to measure management effectiveness, that is, how quickly issues are closed (an ageing metric can also be considered); in Agile projects, all issues are to be closed within the iteration. It also shows the areas in which the team has been receiving the most impediments, which is useful for prioritizing and performing root cause analysis to minimize risk. This metric can be captured on a periodic basis, as decided by the Scrum Master.

Agile metric values, estimation processes, iteration durations and so on can vary from team to team. Consolidating and comparing these metrics across projects may therefore not be very beneficial; they are intended for metrics analysis and improvement within the project. Customer satisfaction and post-delivery defect density can be compared across all projects in a company. All other metrics can be analyzed to improve the internal processes within the team.
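As an illustration of the number-of-impediments and ageing measures mentioned in the table above, the sketch below counts open impediments and their age in days. It is a minimal sketch; the Impediment record, its field names and the dates are hypothetical and not prescribed by this document.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Impediment:
    raised_on: date
    closed_on: Optional[date] = None  # None while the impediment is still open

def open_impediments(items: List[Impediment]) -> int:
    """Count impediments not yet closed; in Agile, all should close within the iteration."""
    return sum(1 for imp in items if imp.closed_on is None)

def ageing_days(items: List[Impediment], as_of: date) -> List[int]:
    """Age of each impediment in days; open items are aged up to the 'as_of' date."""
    return [((imp.closed_on or as_of) - imp.raised_on).days for imp in items]

if __name__ == "__main__":
    # Hypothetical data for one iteration, for illustration only.
    items = [
        Impediment(date(2009, 2, 2), date(2009, 2, 3)),
        Impediment(date(2009, 2, 5), date(2009, 2, 9)),
        Impediment(date(2009, 2, 10)),  # still open
    ]
    print("Open impediments:", open_impediments(items))              # 1
    print("Ageing in days:", ageing_days(items, date(2009, 2, 12)))  # [1, 4, 2]
```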


5. Appendix A
5.1 Definitions and General Information
5.1.1 Burn Down Chart

The Burn Down Chart shows how much work remains to be done over time, rather than what has been completed, acknowledging that the requirements on an Agile project will continue to change.

The vertical axis of the chart represents the work needed to build the prioritized requirements, usually expressed in person-days. The horizontal axis represents time, with the units in months, to match the iterations planned and completed.
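A burn down chart can be produced from a simple daily record of the remaining work. The sketch below is illustrative only; the figures are hypothetical, and the "ideal" line is just a straight-line reference from the initial estimate down to zero.

```python
# Remaining work (person-days) in the sprint backlog, recorded once per day.
remaining = [40, 38, 35, 35, 30, 26, 22, 19, 14, 9, 4, 0]

sprint_days = len(remaining) - 1
ideal = [remaining[0] * (sprint_days - day) / sprint_days for day in range(sprint_days + 1)]

for day, (actual, planned) in enumerate(zip(remaining, ideal)):
    print(f"Day {day:2d}: remaining {actual:5.1f}  ideal {planned:5.1f}")
```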

5.1.2 Velocity

Velocity tracks the team's ability to deliver work units in a fixed timeframe. However, there is no standard for the size of an individual story or unit of work (for example, hours of work per standard story), so velocity is a relative measurement. It can show how a team progresses over a period of time, but it cannot be used as a benchmark across companies, or even across projects within a single company.

5.1.3 Business Value metrics

Measurement of the business value delivered is a key concern for every organization, regardless of the methodology used. Agile projects can easily provide this measurement, as they constantly review and prioritize the requirements hand in hand with the customer.

Most Agile projects use value-driven prioritization of requirements to determine the scope of an iteration. If requirements can be relatively weighted by priority, the team can report the value that it delivers in an iteration. At the start of the project, the joint business and IT team ranks requirements by priority and assigns a relative weight to each. Thus, each requirement can be estimated to deliver a certain percentage of the overall value of the project. At the end of an iteration, the team can report not only the number of days burned on the project plan and the number of features delivered, but also the percentage of the overall business value that those features represent.


Earned business value (EBV) can be used to track the value of the requirements being delivered. Agile earned business value is calculated as:

EBV = percentage of value delivered / percentage of cost consumed

When this metric is greater than one, the work should continue. When it is less than one, costs may be exceeding the value delivered, so the team (IT and business together) should evaluate the remaining requirements and determine which are truly necessary. However, for projects that have significant architecture and infrastructure work upfront, the metric stays below one in the initial iterations, because certain early activities contribute to the success of the project without directly delivering business value.

Agile projects assume that requirements will change, and studies have shown that many non-Agile projects deliver features that were identified as requirements at the start of the project but are never used. Teams that deliver only the features that are necessary to the customer's business therefore have very little waste, which is captured by the waste avoidance (WA) metric:

WA = 1 - (number of stories implemented from the initial requirements backlog / total number of stories in the initial requirements backlog)

This metric is useful for fixed-price projects or for projects with a strong upfront cost estimate. The team can determine a cost-per-iteration metric by dividing the overall cost of the project by the number of iterations. Dividing the cost per iteration by the average number of features or story points that the team can deliver in an iteration (the team's velocity) gives the average cost of an individual feature. The team can therefore transform the WA metric into a cost-savings metric and communicate with the business in terms that it can appreciate.
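A minimal sketch, with hypothetical figures, of how EBV, WA and the derived cost-per-feature value described above could be computed:

```python
def ebv(percent_value_delivered: float, percent_cost_consumed: float) -> float:
    """Earned business value: above 1 suggests continuing; below 1 prompts re-evaluation."""
    return percent_value_delivered / percent_cost_consumed

def waste_avoidance(initial_stories_implemented: int, initial_stories_total: int) -> float:
    """WA = 1 - (stories implemented from the initial backlog / total initial stories)."""
    return 1 - initial_stories_implemented / initial_stories_total

def cost_per_feature(total_project_cost: float, iterations: int, velocity: float) -> float:
    """Cost per iteration divided by the average features (or story points) per iteration."""
    return (total_project_cost / iterations) / velocity

if __name__ == "__main__":
    # Hypothetical project figures, for illustration only.
    print(f"EBV: {ebv(60, 50):.2f}")               # 1.20 -> value is running ahead of cost
    print(f"WA:  {waste_avoidance(80, 100):.2f}")  # 0.20 -> 20% of the initial scope was never needed
    print(f"Cost per feature: {cost_per_feature(600000, 12, 25):,.0f}")  # 2,000
```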

5.1.4 Agile vs iQMS

Agile methods fit well with Level 2 CMMI processes, and Agile engineering practices fit fairly smoothly with Level 3. Levels 3, 4 and 5 are brought into the Agile project process with the help of iQMS. The bridge between Agile and CMMi is provided through processes such as contract review, the unified project plan, peer reviews, causal analysis and so on; iQMS therefore bridges the gap between CMMi and Agile processes. To achieve this, Agile has to adopt just enough documentation rather than minimal documentation, maintain repositories to store the knowledge gained on the job (Wikis and portals to enable sharing), and use regular video conferencing, telecons, webinars, net meetings and VoIP.


Agile Practices and TCS iQMS

Rating legend: √√ = Excellent, √ = Good, ? = Doubtful, blank = Poor

Practice or Concept                   RUP   XP    Scrum   iQMS
Business Owner Buy-In                 √√    √     √       √
Project team/Customer Interaction     √√    √√    √
Project team/Management Interaction   √     √√    √√
Project team/Support Interaction      √     √     √       √
Risk Management                       √√    √     √√      √
Large Teams                           √√    √     √√
Small Teams                           √√    √     √
Architecture-Centric                  √√    ?     ?       √
Validation-Centric                    √√    √     √
Quality of Code                       √     √√    √√

Executing Agile methods with the development approach adapted by TCS addresses most of the process areas at CMM Level 3; the only notable exceptions are organizational areas such as Organizational Process Focus (OPF) and Organizational Process Definition (OPD), which can be addressed independently as required.

Instead of being an overhead, CMMI tends to reduce risk in Agile development by providing practices under its various process areas, such as Integrated Teaming, Organizational Environment for Integration, Supplier Agreement Management, Product Integration, and so on. Combining the customer centricity of Agile methods with the process rigour of CMMi truly represents the best of both worlds, with no additional overhead. For Agile companies, the CMMI Generic Practices can be used to institutionalize Agile practices; conversely, Agile development is an operational tool for identifying improvement opportunities in a CMMI Level 5 company.


5.2 Example: Agile Scrum Project and Metrics Defined

TCS has been the IT partner of a leading financial services provider in the UK for the last 10 years. TCS has successfully delivered various development, migration and testing projects across the customer's business divisions. In the current strategic relationship, TCS reached a new milestone through the seamless execution of development and testing engagements in a challenging Agile environment. This has been achieved through a strong focus on quality, supported by a clearly defined metrics and governance program. The purpose of this case study is to give an overview of the metrics followed in these projects, along with the execution methodology.

5.2.1 Overview of Projects

Project: Policy Administration System
Scope: Application development and testing
Technology: SOA architecture; C# (GUI); Java (business services); DB2 (database); Mainframe (batches)
Approach: Newly developed; Agile methodology from project start-up; co-sourced development model; onsite-offshore model with equally established roles at onsite and offshore; 4-week iterations with multiple scrums

Project: Claims Management System
Scope: Application development and regression testing
Technology: Client-server architecture; Visual Basic 6.0; Oracle and DB2 (database)
Approach: Existing complex system; transition from Waterfall to Agile methodology; complete ownership of development; onsite-offshore model; 4-week iterations with a single scrum

Project: External Applications
Scope: Application development
Technology: SOA architecture; Java (GUI and business services); Oracle and DB2 (database)
Approach: Existing application suite with Agile methodology already in place; complete ownership of development from offshore; complete offshore model with direct coordination with the client manager from offshore; 4-week iterations with a single scrum


5.2.2 Distributed Agile – Development Life Cycle and Execution Model


TCS and the customer jointly defined the processes and methodologies to execute projects effectively in a distributed Agile environment, to ensure smooth coordination between the onsite and offshore teams, and to overcome the challenges caused by the time zone difference. Processes were tailored by blending the XP and Scrum Agile methodologies, with a focus on a co-sourced, distributed onsite-offshore model.


5.2.3 Metrics Management

The TCS relationship and quality groups jointly developed a customized metrics program for distributed Agile development. Brainstorming sessions were conducted with the quality team and the project managers to arrive at the metrics. The following metrics are collated and presented to the customer on a periodic basis:

Metrics for Development Projects

Metric: Burn Down Chart | Goal: Improve project planning
Unit: % | Procedure/Tool used: XPlanner and Excel tracking sheet
Collection frequency: Every day | Reporting frequency: Monthly (end of each iteration)
Details: For details refer Appendix A.

Metric: Number of revisions to requirements | Goal: Improve project planning
Unit: Number | Procedure/Tool used: XPlanner and Excel tracking sheet
Collection frequency: End of each task | Reporting frequency: Monthly (end of each iteration)
Details: The number of revisions to the requirement is tracked against each task; analysis is done at the end of the iteration.

Metric: Defect Density | Goal: Decrease software defects
Unit: Defects/Size | Procedure/Tool used: Excel templates of the branch
Collection frequency: End of every release | Reporting frequency: Once in two months
Details: (Number of defects / size of the deliverable) * 100. Refer Section 1.1 of the SMP.

Metric: Velocity | Goal: To reduce the cost
Unit: % | Procedure/Tool used: Project-specific templates
Collection frequency: End of every iteration | Reporting frequency: Monthly (end of each iteration)
Details: For details refer Appendix A.

Metric: Review effectiveness | Goal: To reduce the cost
Unit: % | Procedure/Tool used: IPMS / Excel templates of the branch
Collection frequency: End of the iteration | Reporting frequency: Monthly (end of each iteration)
Details: (Number of defects found during review / (number of defects found in review and testing)) * 100. Refer Section 1.9 of the SMP.

Metric: Code Coverage | Goal: To reduce the cost
Unit: % | Procedure/Tool used: JUnit, NUnit and similar tools
Collection frequency: End of each task | Reporting frequency: Monthly (end of each iteration)
Details: Calculated using the tool; an important measure to log for each task.

Metrics for Testing Projects

Metric: Defect arrival rate by severity | Goal: Decrease software defects
Unit: % | Inputs: Defect log | Procedure/Tool used: IPMS / Excel templates of the branch
Collection frequency: End of delivery | Reporting frequency: Monthly
Remarks: Number of defects found in each release across the different severities; used for trend analysis.

Metric: % Defects Rejected | Goal: Improve testing accuracy
Unit: % | Inputs: Defect logging / defect tool | Procedure/Tool used: IPMS / Excel / defect logging tool
Collection frequency: After the RCA of the defect by the development team | Reporting frequency: Monthly
Remarks: Number of bugs rejected by the development team / total number of bugs raised against the development team.

Metric: Test script effectiveness | Goal: To improve planning
Unit: % | Inputs: Test plan and actual status | Procedure/Tool used: Excel
Collection frequency: Every release / as per client-driven periodicity | Reporting frequency: Every release
Remarks: (Number of defects found by executing regression and functional test scripts in one functional area) / (number of defects found during UAT in the same functional area + number of defects found by executing regression and functional test scripts in that functional area).

Metric: Test coverage (manual and automated) | Goal: Improve process quality
Unit: Number | Inputs: Test cases | Procedure/Tool used: Excel
Collection frequency: Monthly / as per client-driven periodicity | Reporting frequency: Monthly
Remarks: Helps in traceability.

Metric: Productivity | Goal: Improve testing efficiency
Unit: Number | Inputs: Test cases | Procedure/Tool used: Excel
Collection frequency: Monthly | Reporting frequency: Monthly
Remarks: Number of test cases executed per iteration.
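Two of the ratio metrics above (% Defects Rejected and test script effectiveness) are simple to compute once defect counts are logged. A minimal illustrative sketch follows; the counts are hypothetical and not tied to IPMS or any particular tool.

```python
def defects_rejected_pct(rejected_by_dev: int, total_raised_against_dev: int) -> float:
    """% Defects Rejected = bugs rejected by development / total bugs raised, as a percentage."""
    return 100.0 * rejected_by_dev / total_raised_against_dev

def test_script_effectiveness(found_by_scripts: int, found_in_uat: int) -> float:
    """Share of defects in a functional area caught by regression/functional scripts rather than UAT."""
    return 100.0 * found_by_scripts / (found_in_uat + found_by_scripts)

if __name__ == "__main__":
    # Hypothetical release figures, for illustration only.
    print(f"% Defects Rejected:        {defects_rejected_pct(6, 120):.1f}%")      # 5.0%
    print(f"Test script effectiveness: {test_script_effectiveness(38, 2):.1f}%")  # 95.0%
```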

Common Metrics (Taken from the TCS SMP)

Goal: Manage PMR Process
  Metric: PMR Compliance - Unit: %; Inputs: PMR status of projects; Procedure/Tool used: Manual / Excel templates of the branch; Collection frequency: Monthly; Reporting frequency: Monthly; Statistical method: Trend analysis; Output (where the metric is reported): Q25
  Metric: PMR Coverage - Collection frequency: Quarterly; Reporting frequency: Quarterly; Statistical method: Trend analysis

Goal: Manage Audit Process
  Metric: NCRs per entity - Unit: Ratio; Inputs: NCRs and open NCRs of the branch; Procedure/Tool used: APT / Excel templates of the branch; Collection frequency: Monthly; Reporting frequency: Monthly; Statistical method: Trend analysis; Output: Q25
  Metric: Average number of days for open NCRs - Unit: Average number
  Metric: Audit Compliance - Unit: %; Inputs: Audit status of projects and audit schedule
  Metric: Audit Coverage - Unit: %; Inputs: Audit status of projects and audit schedule; Collection frequency: Quarterly; Reporting frequency: Quarterly

Goal: Process Innovation
  Metric: PIPs received per 100 associates - Unit: Index; Inputs: PIPs on PAL; Procedure/Tool used: Manual; Collection frequency: Monthly; Reporting frequency: Monthly; Statistical method: Trend analysis; Output: ISR
  Metric: PIP conversion rate % (SPI) - Unit: %; Collection frequency: Quarterly; Reporting frequency: Quarterly; Statistical method: Trend analysis

Goal: Manage Customer Satisfaction
  Metric: Number of customer complaints - Unit: Number; Inputs: Projects report in the CA report; Procedure/Tool used: Excel templates of the branch; Collection frequency: Monthly; Reporting frequency: Monthly; Statistical method: Trend analysis; Output: Q25; Remarks: Projects upload complaints into IPMS as soon as they are received from the customer
  Metric: Customer Satisfaction Index - Unit: %; Inputs: IPMS; Procedure/Tool used: IPMS; Collection frequency: Half-yearly; Reporting frequency: Half-yearly; Statistical method: Trend analysis; Output: Q25; Remarks: Projects upload CSS data into IPMS as soon as it is received from the customer
  Metric: CSS Coverage - Unit: %; Inputs: Status of CSS from projects; Procedure/Tool used: Manual; Collection frequency: Monthly; Reporting frequency: Half-yearly; Statistical method: Trend analysis; Output: Q25

Goal: Manage Cost of Quality
  Metric: COQ (appraisal and preventive effort) - Unit: %; Inputs: IPMS; Procedure/Tool used: IPMS / UMP / Excel templates of the branch; Collection frequency: Monthly; Reporting frequency: Quarterly; Output: Q25

6. References

1. Ken Schwaber, Agile Project Management with Scrum
2. Ken Schwaber, The Enterprise and Scrum
3. Liz Barnett, Agile Metrics for Agile Development Projects, Forrester
4. www.controlchaos.com
5. blogs.conchango.com - Colin Bird's blog
6. Dr Peter Lappo and Henry C.T. Andrew, Assessing Agility
7. Org_it_(TCS-TP-017)_Agile_Process_Handbook[1]
