
For: Application Development & Delivery Professionals

Agile Metrics That Matter

by Diego Lo Giudice, September 9, 2013

Key Takeaways

The Precision Of Upfront Software Development Estimation And Planning Is A False Assumption
Project managers and dev teams are forced to accurately forecast how long a development effort will really take and then draw up a concrete plan. Then project governance and metrics must respect the forecast and follow the plan in excruciating detail! In truth, we can't forecast software with high precision, and changes will happen in due course.

Business Value Is Easy To Define, But Hard To Relate To Operational Delivery Metrics
Measuring business value metrics is an emerging art. Few firms define or track business-specific value metrics to help understand whether software development creates value for customers and the business and to prescribe corrective actions if it's not. Establishing that relationship in all contexts, from business to development teams, is not easy.

Agile Teams Remix Old Metrics With New Ones
The iron triangle (cost, scope, and deadline) won't go away, but Agile teams take a more flexible approach to those metrics, fixing costs and deadlines but keeping scope variable. They focus on metrics like cycle time, technical debt, and mean time to repair that shift things toward speed to delivery, keep quality high, and improve processes.

Agile Teams Should Define And Balance Metrics Based On Complexity
With metrics, one size doesn't fit all. We're constantly asked which metrics count; it turns out that the ability to use certain metrics depends on the complexity (people, practice, and technology) of your project. Use Forrester's framework to determine what that complexity is and which metrics really count for you.

Forrester Research, Inc., 60 Acorn Park Drive, Cambridge, MA 02140 USA Tel: +1 617.613.6000 | Fax: +1 617.613.5000 | www.forrester.com


Agile Metrics That Matter


Benchmarks: The Agile And Lean Playbook
by Diego Lo Giudice with Jeffrey S. Hammond, Kurt Bittner, and Rowan Curran

Why Read This Report

Tomorrow's great application development and delivery (AD&D) leaders will be those who focus on delivering constant value and incremental improvement to their businesses. As changes to development processes in general, and Agile in particular, enable faster development of modern applications, measuring the value that teams deliver is becoming more important. Traditionally, application leaders have managed on the basis of cost, scope, and effort, but these basic measures aren't as useful in an Agile context. This report exposes the metrics that successful Agile teams use to measure their progress and explores why traditional measurement approaches often lead development teams astray. To succeed in a modern application world, AD&D leaders need to understand that a one-size-fits-all approach to metrics does not work; instead, they need to gauge the complexity of their projects, the availability of best practices, and the skill levels of their development teams in order to select the right metrics to measure project progress.

Table Of Contents
Modern Application Development Requires A Modern Set Of Metrics
Determine Your Metrics According To Project Complexity
Recommendations: Define, Adapt, And Combine Your Metrics To Continuously Improve
What It Means: Link Metrics To Clear Goals And Manage The Life Cycle
Supplemental Material

Notes & Resources
Forrester interviewed a number of vendors, industry experts, and clients for this research, including 407 ETR, Ci&T, CollabNet, Dominion Digital, Evolve Beyond, IBM, Rally Software, Scrum.org, Serena Software, Tom Gilb & Kai Gilb, XebiaLabs, and a global automotive manufacturer.

Related Research Documents
Navigating The Agile Testing Tool Landscape, July 18, 2013
Use A Metrics Framework To Drive BPM Excellence, September 21, 2012
Justify Agile With Shorter, Faster Development, February 8, 2012

2013, Forrester Research, Inc. All rights reserved. Unauthorized reproduction is strictly prohibited. Information is based on best available resources. Opinions reflect judgment at the time and are subject to change. Forrester, Technographics, Forrester Wave, RoleView, TechRadar, and Total Economic Impact are trademarks of Forrester Research, Inc. All other trademarks are the property of their respective companies. To purchase reprints of this document, please email clientsupport@forrester.com. For additional information, go to www.forrester.com.


Modern Application Development Requires A Modern Set Of Metrics

Application software development is evolving into a strategic business process for many enterprises that develop customer-facing systems of engagement.1 In this new role, AD&D pros have a great opportunity to shift conversations with business stakeholders toward value delivered instead of zero-sum haggling over cost, scope, and deadlines. But that's easier said than done. We've spoken with a number of experts and practitioners, and one theme ran through all of our interviews: Measuring the value of software development and relating it to business impact and/or business value is really hard.

"Our Agility Path program helps clients track business metrics such as revenue per employee along with Agile operational IT metrics. You need to have a big human brain sitting between business and IT to link those metrics and reflect how changes on the IT side affect business value. Agility Path will make that link easier." (Ken Schwaber, president, Scrum.org)

"In the context of our business transformation to introduce more automation and digital equipment on our highways, we adopted Agile everywhere we could and linked it to our enterprise architecture team's new efforts. While it's easy to see that teams are very busy and show just how busy they are, it's hard to quantify how much value they are delivering." (Keith Mann, group architect, 407 ETR)

Measuring the value delivered by development teams isn't exactly a new concept.
Tom and Kai Gilb have been working on the EVO (as in "evolutionary") method for project management (PM) since the early 1980s and were among the first Agilists to define a PM method that focused on value for stakeholders in addition to delivering working software.2 New thought leaders are building on EVO; while case studies prove the benefits of focusing on value delivery, low adoption of EVO by development teams shows that most mainstream Agile teams aren't thinking along these lines yet.3 The reason for this adoption gap lies in differing concepts of value delivery: what the business considers valuable is not always what Agile developers focus on when they think about delivering value. All too often, developers concentrate on stuffing as much new code and as many new features as possible into every release without regard to the level of business value generated.

Traditional Development Metrics Assume Slow, Controlled, Predictable Change

Software development leaders have historically focused on a core set of metrics that measure various aspects of what's come to be known as the iron triangle: deadline, cost, and scope, with quality occasionally added to the mix. Many application delivery organizations still base their measurement programs on these metrics. The main goal in using these traditional metrics is to manage projects against an estimate that sets the deadline and the scope (or, alternatively, the cost) in stone. The assumption is that project managers and development teams can accurately forecast how long any given development effort really takes. But the truth is that:


Software development is not a predictable process. We often see naive comparisons between software development and cooking or a factory assembly line. But we know that we can follow a cooking recipe step by step and get a predictable outcome, and industrial processes are scripted to the smallest degree. With software development, it's more like ordering the salmon and getting calamari instead. In modern application development, the outcome is not known beforehand and might be different from the planned result because the ingredients keep changing! (Think Iron Chef, with surprise ingredients at the start that may even change partway into the process.) "Most engineering disciplines operate under relatively manageable uncertainty. By contrast, software development and delivery are dominated by human creativity, market change, complexity, and much higher levels of uncertainty." (Walker Royce, chief software economist, IBM)4

You can't always forecast deadline, scope, and effort with precision. Barry Boehm noted that the further a project progresses, the more accurate the estimates for the remaining effort and time become.5 NASA came to the same conclusion: At the beginning of the project life cycle, before gathering requirements, estimations have an uncertainty factor of 4x. This means that the actual duration can be either four times or one-quarter of the initial estimate (see Figure 1).6 There is an exception to this: small, repeatable projects whose requirements don't change.
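NASA's uncertainty factor translates directly into simple arithmetic. The Python sketch below, using an invented 90-day estimate, shows how wide the band around an early estimate really is:

```python
def uncertainty_range(estimate_days, factor=4.0):
    """Bounds on actual duration implied by an uncertainty factor:
    at project start the factor is 4x, so the real duration may fall
    anywhere from one-quarter to four times the estimate."""
    return estimate_days / factor, estimate_days * factor

# A 90-day estimate made before requirements are gathered:
low, high = uncertainty_range(90)
print(low, high)  # 22.5 360.0
```

As the project progresses and the factor shrinks toward 1x, the band narrows; this is the cone of uncertainty Figure 1 depicts.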

One size of development metrics can't fit all. Over the next few years, we expect that enterprises will use at least two different development approaches, depending on whether they deliver fast-changing systems of engagement, more stable systems of record, or software-critical systems of operation. In addition, enterprises must consider the size, scale, and complexity of the organizational structures within their software delivery organization when selecting metrics that matter. It's no surprise that different organizations will focus on different metrics: the ones that work for them. "Given the level of complexity and size of our organization, we are happy when we can prove at the PMO level that we are delivering the features the business requires on time and on budget; that's value for the business. We aim for more, but are not that mature yet." (Program manager, global automotive manufacturer)

Measuring on false assumptions creates mistrust and skepticism. Unfortunately, many stakeholders demand precision and detail early on because it gives them the illusory comfort of progress. As the gap between reported and true progress reveals itself, stakeholder trust erodes. "We've tried to estimate budgets with a precise process but always ended up with different (actual) numbers and lost trust that IT could ever fix this. We now review and adjust estimates as the projects unfold and are having a better conversation with IT." (Business manager, large EMEA asset management firm)


Figure 1 Uncertainty Turns Into Certainty In Different Ways For Different Roles

Source: Karmona Pragmatic Blog (http://blog.karmona.com)


Source: Forrester Research, Inc.

False Precision Distracts Teams From Delivering Value

When development teams start with a false level of planning precision, it's easy for the results to go awry. One reason: Project managers and app dev leaders create and use metrics like percentage of project budget spent and percentage of scope delivered versus estimated to determine whether projects are on track. And most project governance revolves around measuring variance from plans in excruciating detail, especially when projects use waterfall processes. As a result:

Fixed costs, deadlines, and effort sacrifice real business needs. When the most important metrics are a fixed date of delivery and a fixed scope or size, teams tend to take shortcuts as deadlines get closer. They ignore defined requirements, sacrifice testing time, and focus on delivering as much working code as possible to meet a hard deadline. The alternative, making late scope changes and missing cost and deadline objectives, is also unpalatable. Unfortunately, 70% of respondents to our Q1 2013 Application Life-Cycle Management Online Survey claim that, when projects fail, it's due to new and changing requirements. If requirements change, percentage complete and percentage of budget spent metrics have little true value.

Rigid plans discourage or defer change, even when it's the right thing. Change management in traditional PM methodologies is a formal process that requires several steps, including (but not always limited to) a change request, an estimation of effort, and approval by a change review board. Traditional PM methodologies like the PMI's allow for change, but developers cannot flexibly accommodate it in short cycles. Stakeholders cannot quickly reprioritize requirements, because they're not directly involved in what's going on. The net result is that change usually gets pushed to future project work or releases; in the meantime, the company implements the wrong thing.

Dotting the i's and crossing the t's becomes a primary goal . . . Instead of focusing on delivering a working minimum viable product (MVP) with core business features, traditional processes use metrics that measure whether everyone is following the plan that was laid out at the start. Because the team assumes that the initial plan was correct, the value of completing tasks is flat, and checking boxes equals progress.

. . . so teams manage to the letter of the law instead of the spirit of the law. All too often, when teams focus on the process instead of the result, they end up managing to metrics in unforeseen ways. The team can meet the metric, but the result may be the opposite of what's intended (see Figure 2)! "We have found that measuring throughput and output is extremely dangerous, because it incentivizes the wrong behavior time and time again. For example, measuring the number of features delivered leads to more features being delivered. The additional features create waste, or wasted effort, and may not lead to better business outcomes." (Gabrielle Benefield, CEO, Evolve Beyond)

Figure 2 Traditional Metrics Can Lead To Misbehavior

Metric: Number of features delivered
Potential misbehavior: Might not lead to a better business outcome; might lead to unnecessary development as well as higher maintenance costs because of a greater probability of defects; delays time-to-market for the necessary features.

Metric: Number of defects fixed
Potential misbehavior: Creation of unnecessary, low-quality code and higher maintenance (external provider doing both).

Metric: Number of lines of code delivered
Potential misbehavior: More lines of code written necessitates more testing.

Metric: Number of function points delivered
Potential misbehavior: No innovation or architecture improvement.

Metric: Number of incidents handled
Potential misbehavior: Teams focus on patching instead of deep refactoring; systematic faults go unaddressed.

Metric: Reduction in the number of open bugs
Potential misbehavior: Defects spike: bugs are retriaged but not fixed; system degradation and higher incident rates.

Source: Forrester Research, Inc.


Agile Projects Use Different Measures

What have we learned from the first 50 to 60 years of measuring software development projects? For many organizations, it seems like the answer is "Not much." Changing deeply rooted metrics that were created to support waterfall projects is like convincing Italians to replace fusilli and linguine with hamburgers and ribs. But take heart! An increasing number of organizations are thinking differently: They accept that, no matter how good a team is, it just can't estimate all development projects with the same precision. That's especially true when new technologies, like mobile apps or scale-out public cloud infrastructure, are involved. These teams are also changing their measurement practices to reflect the belief that it's a waste of time to spend weeks or months specifying requirements up front. No matter how good the team is, requirements will change frequently during the initial stages of development. Many of these teams are the ones adopting Agile practices most aggressively; in the process, they are changing the focus of the metrics they use to measure success. In our interviews, inquiries, and discussions, we've found an emerging set of metrics grouped around measuring progress, quality, efficiency, and the realization of value and benefits (see Figure 3). Within those four categories, we see that:

Every Agile project measures velocity. Teams usually define velocity as the number of story points from the backlog delivered per sprint.7 Properly sizing user stories and tasks is the key to effectively determining velocity, but there's no standard sizing approach.8 Most Agile teams calculate story points through planning poker, using a Fibonacci series for story size.9 Others use ideal days, development hours, T-shirt sizes, or more complex, formal approaches like function points or estimated lines of code. But be careful when measuring velocity; as with any metric, it can create its own measurement antipattern.10 "Velocity is great when teams use it right: It can give useful information on capacity once the team becomes consistent, and when peaks happen, it gives an indication of potential problems like bugs or too much rework. On the other hand, velocity is based on teams' subjective sizing methods; when velocity is used to compare the productivity of different teams, it can become a dysfunctional metric." (Larry Maccherone, VP of analytics, Rally)
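As a concrete illustration, velocity is just completed story points per sprint, usually averaged over a few recent sprints to estimate capacity. The Python sketch below uses hypothetical sprint data; the three-sprint window is a common convention, not a prescription from this report:

```python
def velocity(completed_points, window=3):
    """Average story points delivered over the last `window` sprints.
    Only fully completed stories count toward the total."""
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

# Hypothetical history of story points completed per sprint:
history = [21, 34, 29, 31, 30]
print(velocity(history))  # 30.0
```

Used this way, velocity informs the team's own capacity planning; as Maccherone warns, comparing the number across teams is where it turns dysfunctional.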

Agile projects extend quality metrics in new ways. A good sign of an effective Agile team is a high ratio of automated test coverage. This requires an increased focus on quality metrics, especially those that relate to the level of test automation. Typical metrics that we see used are regression testing automation coverage, the number of automation execution failures, and the frequency of failure of commit tests.11 The latter will help you understand whether your automated tests are doing a good job of capturing defects during feature development. The number of defects in production, and how that number trends over time, is also a good overall indicator of quality.
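The two automation metrics named above are simple ratios. The sketch below, with invented counts, shows how a team might compute them from its test management and continuous integration data:

```python
def automation_coverage(automated_cases, total_regression_cases):
    """Share of the regression suite that is automated."""
    return automated_cases / total_regression_cases

def commit_test_failure_rate(failed_builds, total_builds):
    """Frequency of commit-test failures; trending this over time
    indicates whether automated tests catch defects during feature
    development."""
    return failed_builds / total_builds

print(automation_coverage(420, 600))      # 0.7
print(commit_test_failure_rate(12, 240))  # 0.05
```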

Using burndown charts creates a proxy for progress. Burndown charts track how fast the team is delivering user stories and tasks from the backlog. When product owners attach value information to the user stories, burnup charts show how the software development team is actually delivering value as it progresses through sprints and releases (as long as the team only measures working software delivered). Burndown and burnup charts are effective visualization tools showing how activities, quality, efficiency, and value are trending as teams get close to delivery dates.
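The data behind these charts is easy to derive. This sketch (hypothetical numbers throughout) produces burndown points from sprint completions and burnup points from the value the product owner attached to delivered stories:

```python
def burndown(total_points, completed_per_sprint):
    """Backlog points remaining at the end of each sprint."""
    remaining, series = total_points, []
    for done in completed_per_sprint:
        remaining -= done
        series.append(remaining)
    return series

def burnup(value_delivered_per_sprint):
    """Cumulative business value delivered sprint by sprint, assuming
    the product owner has attached a value figure to each story."""
    total, series = 0, []
    for value in value_delivered_per_sprint:
        total += value
        series.append(total)
    return series

print(burndown(120, [30, 25, 35]))  # [90, 65, 30]
print(burnup([5, 8, 3]))            # [5, 13, 16]
```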

Tracking technical debt creates a basis for tradeoffs. As projects progress, development teams discover defects, improper designs, new requirements, and places where they can improve the code. Collectively, these create technical debt that the team needs to address. Measuring the amount of accumulated technical debt gives teams a good indication of when they need to refocus and turn from adding new features to refactoring existing ones. An item of technical debt should include the sizing and potential cost of the fix or the potential value of retiring it. If it's placed on the backlog, the product owner will prioritize it just like any other user story or epic, depending on its importance and granularity. Measuring technical debt encourages a more mature conversation with business stakeholders when problems occur.
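A technical-debt register along these lines can be kept as plain data. The sketch below is illustrative only: the items, the sizing in story points, and the refactoring threshold are all invented, not taken from the report:

```python
# Each debt item carries the sizing/cost of the fix, so the product
# owner can prioritize it on the backlog like any other story.
debt_register = [
    {"item": "duplicated pricing logic", "cost_to_fix": 8},
    {"item": "missing tests around checkout", "cost_to_fix": 5},
    {"item": "hard-coded configuration", "cost_to_fix": 3},
]

def accumulated_debt(register):
    return sum(entry["cost_to_fix"] for entry in register)

def should_refactor(register, threshold=13):
    """Signal when to shift effort from new features to refactoring."""
    return accumulated_debt(register) >= threshold

print(accumulated_debt(debt_register))  # 16
print(should_refactor(debt_register))   # True
```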

Measuring business value metrics is an emerging art. Thought leaders in some of the organizations we spoke with are busy tracking business-specific value metrics as part of their efforts to measure value delivery (see Figure 4). While the metrics are specific to the circumstance, the common goal is to define metrics that help understand whether software development is creating value for the business and its customers and, if not, what corrective actions the firm needs to take. "We are now able to measure our projects' contribution to the value of processing an insurance claim, one of our highest digital value streams. We now know the unitary cost of each claim and are focusing on reducing it." (Leader of the project management office (PMO), large insurance company)

One possible source of business value metrics is the cycle time for various business processes, used as a way to measure a team's improvement efforts. Cycle time is a Lean concept that works like a chronometer, registering the time that it takes to get from point A in a value stream or process to point B. An example of a cycle time metric for product development might be from concept to cash. But it's not always easy to do; it might be clear when a business makes the first penny out of its value stream, but not necessarily when the idea was first conceptualized, so an alternative metric in this case might be cycle time from build to deploy (efficiency).
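Computing a cycle time comes down to subtracting two timestamps from the value stream. The sketch below uses build-to-deploy, the fallback metric suggested above, with invented dates:

```python
from datetime import datetime

def cycle_time_days(point_a, point_b):
    """Elapsed days from point A to point B in a value stream."""
    return (point_b - point_a).days

built = datetime(2013, 6, 3)
deployed = datetime(2013, 6, 17)
print(cycle_time_days(built, deployed))  # 14
```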


Figure 3 Agile Teams' Metrics Commonly Span Progress, Quality, And Efficiency, But Rarely Value

Quality: Customer-reported defects; failure rates; customer experience; defect density; number of defects per sprint per release; number of completed, passed tests; modularity (scrap trend); adaptability (rework trend); regression testing automation coverage; number of automation execution failures; number of commit test failures.

Progress: Velocity; capacity; cycle time (plan to test); accumulated flow; burndown; technical debt; build rates; delivery frequency; critical practices employed; uncertainty reduction (reduction of cost-to-complete variance).

Value/benefit realization: ROI, TEI, EBITDA; revenue per employee; sales, market size; customer satisfaction; cycle time (concept to market); number of successful transactions; earned value; repeat visits; cost; business agility; renewal rates; customer usage; validated learning; risk.

Efficiency: Mean time to repair; cycle time (frequency of production deployment per sprint); cumulative flow.

Source: Forrester Research, Inc.


Figure 4 What's Cooking? Value And Business Metrics In Agile

Approach: EVO (Evolutionary Method for Project Management, Tom Gilb and Kai Gilb [www.Gilb.com])
Description: Focuses on delivering real, measurable stakeholder value incrementally; based on quantified objectives, fast frequent iterations, value delivery to stakeholders, measurement, and learning. Core to EVO: Stakeholders are clearly identified and stable, and product values are defined.
Type of metrics: Defines value decision tables where business goals, stakeholder value, and product values are specified, measured, and improved.

Approach: The Lean Startup (Eric Ries [www.theleanstartup.com])
Description: The Lean Startup assumption is that an organization dedicated to creating something new works under conditions of extreme uncertainty (true for a startup but also for a Fortune 500). Validated learning is a rapid scientific experimentation approach (build, measure, learn). The Lean Startup measures progress by how much uncertainty gets reduced as the product is developed and how much you've really learned about what the customer really wants.
Type of metrics: Ries' key economic metric is validated learning. It measures actual progress without resorting to vanity metrics.

Approach: Agility Path (Ken Schwaber [www.scrum.org])
Description: Agility Path is a continuous, two-step feedback loop. The first step is gathering and analyzing the key business and process data needed to assess the current state of a company in each of its critical function areas. The next step is using this data to identify where improvements are most needed to have the most immediate and positive impact on the company's performance. Key to this process is breaking down large, usually systemic problem areas into manageable chunks that can be swiftly, effectively, and quantifiably addressed and tracked using metrics.
Type of metrics: These are broken down into enterprise metrics that reflect the business value a company generates and foundational metrics that measure value achieved by the software development functions and reflect the quality of the software, the presence of waste in the code base, and speed to market. The enterprise metrics include revenue per employee, cost/revenue ratio of relevant domains, employee satisfaction, customer satisfaction, and investment in agility. The foundation metrics include release frequency, release stabilization, functional usage (unused code), and maintenance budget.

Approach: Bayesian analytics (IBM) ("Filling in the blanks: The math behind Nate Silver's The Signal and the Noise" and "Economic Governance of Software Delivery")
Description: IBM is developing an analytic solution to predict the likelihood of an Agile project meeting its desired outcome and the timely delivery of agreed-upon scope and quality. The tool combines a variety of predictive analytic techniques, including Monte Carlo simulation, Bayesian probability, elicitation of expert opinion, and machine learning using project execution data.
Type of metrics: ROI, ROAE, NPV; measure of mass (or use case points); change throughput (work items over time/changes over time); backlog trends.

Source: Forrester Research, Inc.


Figure 4 What's Cooking? Value And Business Metrics In Agile (Cont.)

Approach: Value Engineering (Ci&T)
Description: Value Engineering focuses on understanding and stating the key business drivers that support the value proposition of the product being built and prioritizing the list of features (the backlog) in terms of features' individual contributions to those business drivers. The process creates objectivity in the discussion of the value that project features contribute, helping PMOs optimize the backlog for business value and facilitating convergence in multistakeholder initiatives.
Type of metrics: Planned value/effort ratio (e.g., 30% of the effort delivers 60% of the business value); adaptability: the percentage of features not present in the initial backlog.

Source: Forrester Research, Inc.

Determine Your Metrics According To Project Complexity

Inquiries from our clients and the interviews for this research make it clear that, when it comes to metrics, one size does not fit all. In a very complex, fast-changing problem domain, it will be hard to establish a clear value stream and align all of the project resources with it. In such a situation, determining metrics such as the unitary cost of a business process (e.g., the cost of processing a claim) or the cycle time from concept to cash will be difficult. In contrast, if you work at an organization where transparency and trust predominate and there's a culture of developer discipline, then more ambitious and effective metrics are possible. Before you start determining which metrics matter to you, make sure that you have a clear idea of what goal you want to accomplish.12 To select the right metrics, you should first determine your project's complexity profile and then identify metrics that align with that profile.

Determine Your Project's Complexity Profile . . .

Complexity theories have been around for years, but David Snowden has created a framework that we think works especially well when thinking about software development. The Cynefin framework defines four types of complexity domains: simple, complicated, complex, and chaotic (see Figure 5).13 We've used the domains in the Cynefin framework to help define and select metrics appropriate for each problem domain you're likely to encounter. Our approach takes into consideration three key factors: the level of certainty or uncertainty of requirements; team cohesiveness (how long the team has worked together and whether teams are working in a cross-functional or siloed organization); and the project team's technology capabilities.14 To assess your project's relative level of complexity, answer the questions in our assessment tool and determine (see Figure 6):

How well do you know the requirements? How much uncertainty is there around your business requirements? If you're building a new mobile app or modernizing an existing application, the answer is probably "a lot." If you're adding a new feature to a well-running system or you're emulating a competitor's capability, then your challenge might be more straightforward. If you have an exact idea of what you need to build and best practices to implement your requirements are in place, it's more likely that traditional development metrics and processes will work.

How cohesive is your project team? Here you need to assess whether project team members have worked together on multiple sprints or if the team was just assembled for the purposes of the new project. An additional complexity to consider is whether the team is a cross-functional team fully dedicated to the same project or is instead composed of team members who belong to different organizational silos (e.g., development, testing center of excellence, PMO) and are only virtually assembled for the purposes of this project. Teams that work together over time tend to develop high-trust relationships and establish a track record of successful estimation and delivery. If they don't, they tend to get reassigned to other projects or filtered out of the organization.

How well does the project team understand the technologies it uses? When it comes to assessing technology complexity, there are three fundamental aspects to consider: the level of architecture complexity; the level of integration and dependency the application will need to run; and the newness of the technology itself, including how much exposure to and experience with the technology existing team members have, such as using mobile tools, applying SOA principles, and dealing with complex APIs.

You can extend Forrester's Cynefin-based assessment tool to include additional assessment drivers beyond project complexity. Keep in mind that the metrics you select will depend heavily on the cultural context of your organization. We recommend that you prioritize metrics that promote transparency and trust in the way they are used.


Figure 5 The Cynefin Framework

Complex (unknown): Probe-sense-respond
Complicated (mostly known): Sense-analyze-respond
Chaotic (turbulent, unconnected): Act-sense-respond
Simple (known): Sense-categorize-respond
Disorder (at the center of the four domains)

Source: Cognitive Edge
Source: Forrester Research, Inc.

2013, Forrester Research, Inc. Reproduction Prohibited

September 9, 2013

For Application Development & Delivery Professionals

Agile Metrics That Matter

13

Figure 6 Assess Your Cynefin Complexity


The spreadsheet associated with this figure is interactive.

Requirements uncertainty (true or false?)
- Requirements are totally unknown and keep changing.
- Requirements are mostly unknown.
- Requirements are mostly known.
- Requirements are well-known and repeat themselves.

Project team context (true or false?)
- The team consists largely or completely of new members and is not cross-functional.
- The team consists largely or completely of new members but is cross-functional.
- The team consists partially of new members and is not cross-functional.
- The team consists partially of new members and is cross-functional.
- The team does not have new members (>8 sprints or 2 releases together) but is not cross-functional.
- The team does not have new members (>8 sprints or 2 releases together) and is cross-functional.

Technology capabilities (true or false?)
- The team has not mastered the technology and the technology is complex (major integration issues, complex app landscape).
- The team has not mastered the technology but the technology is simple (web app, little integration, simple app landscape).
- The team has mastered the technology and the technology is complex (major integration issues, complex app landscape).
- The team has mastered the technology and the technology is simple (web app, little integration, simple app landscape).

Note: Only one True response is permitted in each of the three groups; each item within a group is weighted differently. The tool calculates the final score based on which items receive True responses; the final score determines whether the project is simple, complicated, complex, or chaotic.

Source: Forrester Research, Inc.
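The scoring mechanism the figure's note describes can be sketched in code. The weights and domain thresholds below are hypothetical stand-ins, since the report does not publish the interactive tool's actual values; only the structure (one True per group, weighted items, a total score mapped to a Cynefin domain) comes from the figure.

```python
# Hypothetical sketch of Figure 6's scoring logic. Exactly one answer per
# group is marked True; item weights and thresholds are illustrative only.
REQUIREMENTS = {  # higher weight = more requirements uncertainty
    "totally unknown, keep changing": 4,
    "mostly unknown": 3,
    "mostly known": 2,
    "well-known and repeating": 1,
}

TEAM = {  # higher weight = less stable, less cross-functional team
    "all new, not cross-functional": 6,
    "all new, cross-functional": 5,
    "partially new, not cross-functional": 4,
    "partially new, cross-functional": 3,
    "stable, not cross-functional": 2,
    "stable, cross-functional": 1,
}

TECHNOLOGY = {  # higher weight = less mastery, more complex technology
    "not mastered, complex": 4,
    "not mastered, simple": 3,
    "mastered, complex": 2,
    "mastered, simple": 1,
}

def cynefin_domain(requirements: str, team: str, technology: str) -> str:
    """Map one True answer per group to a Cynefin domain (illustrative cutoffs)."""
    score = REQUIREMENTS[requirements] + TEAM[team] + TECHNOLOGY[technology]
    if score <= 4:
        return "simple"
    if score <= 8:
        return "complicated"
    if score <= 11:
        return "complex"
    return "chaotic"
```

A stable, cross-functional team with well-known requirements and mastered, simple technology lands in the simple domain; an all-new team facing unknown requirements and unmastered, complex technology lands in the chaotic one.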

. . . And Then Identify Metrics That Align With The Profile

Once you've determined the complexity of your project and mapped it to the domains of the Cynefin framework, it will be easier to select the best metrics to measure success, progress, and necessary improvements (see Figure 7). As your projects move from simple to complex, the use of value-based and progressive metrics will vary quite significantly. On the other hand, we find that quality and efficiency metrics remain relatively stable across all Cynefin domains:


For simple projects, concentrate on business value metrics. When projects are simple and straightforward, it's not too difficult to compute business value metrics, such as ROI, the number of new customers, or the percentage increase in sales, and use them to measure the project. Simple projects usually have known or stable requirements, and Agile development techniques are not as valuable in simple projects as they are in the complicated, complex, and chaotic domains. In fact, traditional metrics like the number of function points developed or the number of completed features over time will work just fine. Another useful measure is whether the team is completing the right features over time, which you can gauge by analyzing the number of completed features against the number of new clients or the increase in revenue.

Complicated project measures shift from business value to progressive quality and efficiency. Complicated projects are good candidates for Agile delivery tactics, because requirements are less stable and uncertainty is higher. You might still be able to use some value metrics, like cycle time from ideation to cash generation on a specific project or revenue per employee, if your company has a culture of tight financial governance, but it won't be as straightforward and easy as it is with simple projects. In the complicated domain, velocity trends over multiple sprints can be correlated with customer satisfaction or sales increases, but the relationship might not be easy to establish. From a quality perspective, complicated projects should focus on automation coverage while maintaining a focus on the mean time to repair (MTTR) for efficiency metrics.
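One way to probe the velocity-to-satisfaction relationship described above is a simple correlation over sprint history. This is a minimal, hypothetical sketch; the `pearson` helper and the numbers are illustrative, not a method or data the report prescribes.

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    assert n == len(ys) and n > 1
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Velocity (story points) and a satisfaction score captured at each sprint
# review -- illustrative numbers only.
velocity = [21, 24, 23, 27, 30, 29]
csat = [3.8, 3.9, 3.9, 4.1, 4.3, 4.2]
r = pearson(velocity, csat)  # near +1 suggests the two trends move together
```

A value of r near zero over many sprints would be a signal that velocity gains are not translating into anything the customer notices.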

Complex projects should employ strongly progressive metrics. Focus on metrics that measure the health of your projects as well as improvement toward both business and technical goals. While value metrics might be hard to correlate with more technical metrics, you can still offer transparency by showing how predictability is improving and how you are truly listening to business and client requests by measuring validated learning. Because changes are anticipated in due course on Agile projects, you can use the rate of incoming changes and your reaction to them to track and prove business agility. Technical debt is also a useful metric to share with the business for complex projects.
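The rate of incoming changes and the reaction to them can be tracked with a few counters per sprint. A minimal sketch with invented data; the definition of "absorbed" (accepted into the backlog and delivered) is an assumption for illustration, not the report's.

```python
def change_absorption(incoming: list[int], absorbed: list[int]) -> tuple[float, float]:
    """Mean change-request arrivals per sprint, and the share absorbed.

    incoming[i] is the number of change requests raised during sprint i;
    absorbed[i] is how many of them the team accepted and delivered.
    """
    assert len(incoming) == len(absorbed)
    rate = sum(incoming) / len(incoming)           # mean arrivals per sprint
    share = sum(absorbed) / max(sum(incoming), 1)  # fraction actually absorbed
    return rate, share

# Illustrative four-sprint history.
rate, share = change_absorption(incoming=[5, 8, 6, 9], absorbed=[4, 6, 6, 7])
```

A steady arrival rate paired with a consistently high absorbed share is one concrete way to evidence the business agility the report describes.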

Chaotic projects should focus on reducing risk. Projects in the chaotic domain are really experiments whose goal is figuring out potential causes and effects. A focus on validated learning, as described in the Lean Startup approach, is a good way to connect with the business and your end customers through an act-sense-respond approach. Attack the riskiest project items first and focus on feedback-oriented metrics like customer usage profiles, adoption rates, and the results of multivariate testing.
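At its simplest, split or multivariate testing reduces to comparing conversion rates across variants. A minimal sketch with invented numbers; a real program would also test statistical significance before declaring a winner.

```python
def conversion_rates(results: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Conversion rate per variant, given (conversions, visitors) pairs."""
    return {name: conv / visits for name, (conv, visits) in results.items()}

# Hypothetical split-test results for three feature variants.
rates = conversion_rates({"A": (120, 2000), "B": (150, 2000), "C": (90, 2000)})
best = max(rates, key=rates.get)  # the variant to keep iterating on
```

Feeding the winning variant back into the next experiment is the act-sense-respond loop in miniature.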


Figure 7 Select The Metrics That Match Your Profile


Project domain complexity and the types of metrics that count:

Simple
- Value: business value metrics like ROI, percentage increase in sales, percentage increase in customers, unitary costs, etc., in addition to customer satisfaction.
- Quality: number of defects in production; percentage of test cases covered.
- Progressive: sizing (FP, SLoC) over man-hours or days; number of completed features.
- Operational: MTTR.

Complicated
- Value: revenue per employee; cycle time (concept to market); customer satisfaction.
- Quality: number of defects in production; defects in development; percentage of automation; percentage of test cases covered.
- Progressive: velocity over sprints; cycle time (build to deploy).
- Operational: MTTR; risk.

Complex
- Value: predictability; customer satisfaction; customer usage; validated learning.
- Quality: number of defects in production; percentage of automation; percentage of automated regression tests; number of automation execution failures; number of commit test failures.
- Progressive: risk reduction; velocity over sprints; cycle time (build to deploy); technical debt; burndown.
- Operational: MTTR; cycle time (frequency of production deployment per sprint); cumulative flow; risk.

Chaotic
- Value*: customer adoption rates; customer usage profiles; validated learning; sales percentage; number of customers; innovation accounting; multivariate testing.
- Quality*: number of defects in production.
- Progressive*: risk reduction (e.g., rate of change, team cohesiveness).
- Operational: measure efficiency only once risk is manageable.

*The focus should be on emergent, feedback-based metrics.

Source: Forrester Research, Inc.

Recommendations

Define, Adapt, And Combine Your Metrics To Continuously Improve

Metrics and measurement programs cannot be the same for all types of projects and all types of organizations. As the complexity of your development environment evolves:

Shift from iron triangle metrics to value or benefit realization metrics. The introduction of value metrics is where Agile and Lean, and many of the thought leaders interviewed for this research, are heading. Work to connect the dots between business value metrics and IT project efficiency or progressive metrics. If you have simple projects, start there; experiment and learn how to use value metrics to measure how development affects your business bottom line. Consider using multivariate testing to connect business value to development when more traditional approaches don't work.15


Use quality metrics to measure whether you're doing the right things as well as doing things right. You will always have to include this category of metrics for any project, no matter what its complexity. Quality metrics address functional quality, which helps you answer the question: Am I doing the right things for the business? Example metrics include customer experience and adaptability (rework trends). Quality metrics must also address technical quality, so look to measures like defects in production, defect density, and failure rates. Agile teams should track quality metrics like the number of defects per sprint and release and, more importantly, metrics on automation trends (for test cases as well as process automation).

Use progress metrics to keep track of project health. Lean and Agile thinking is all about maximizing value for the business; progress metrics should continuously track the advancement toward that goal. If you are a mature Agile team, focus on cycle time. The cycles you track should focus on throughput in different segments of the life cycle; from build to deploy is a common one when continuous deployment is part of your daily practice. Other useful metrics are the level of technical debt and reduced risk (and/or increased predictability).
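Build-to-deploy cycle time is easy to compute once build and deploy events carry timestamps. A minimal sketch, assuming ISO 8601 timestamps pulled from a hypothetical delivery pipeline log:

```python
from datetime import datetime

def cycle_times_hours(events: list[tuple[str, str]]) -> list[float]:
    """Build-to-deploy cycle time in hours for each (build_ts, deploy_ts) pair.

    Timestamps are ISO 8601 strings; the data below is illustrative.
    """
    out = []
    for build_ts, deploy_ts in events:
        built = datetime.fromisoformat(build_ts)
        deployed = datetime.fromisoformat(deploy_ts)
        out.append((deployed - built).total_seconds() / 3600)
    return out

times = cycle_times_hours([
    ("2013-09-02T10:00", "2013-09-02T16:30"),  # same-day deploy
    ("2013-09-04T09:15", "2013-09-05T09:15"),  # next-day deploy
])
mean_cycle = sum(times) / len(times)
```

Tracking the trend of this mean sprint over sprint, rather than any single value, is what makes it a progress metric.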

Optimize your efficiency tracking metrics. Efficiency metrics are useful when you've reached a stable project state. MTTR is one of the key metrics in this category; it's particularly helpful if you have a strong root-cause analysis process and act quickly to fix identified problems. MTTR is also an indicator of good and bad technical and architectural quality. For more advanced Lean teams, cumulative flow is an effective measure used to track the throughput efficiency of value streams.
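MTTR itself is just an average over incident records. A minimal sketch with illustrative figures; the record shape (detection time, repair time) is an assumption for the example:

```python
def mttr_hours(incidents: list[tuple[float, float]]) -> float:
    """Mean time to repair, in hours, from (detected_h, repaired_h) pairs.

    Each pair holds detection and repair times as hours since some epoch.
    """
    assert incidents, "MTTR is undefined with no incidents"
    return sum(repaired - detected for detected, repaired in incidents) / len(incidents)

# Three incidents repaired after 2 h, 5 h, and 1.5 h respectively.
mttr = mttr_hours([(0.0, 2.0), (10.0, 15.0), (40.0, 41.5)])
```

A rising MTTR trend is the signal to look at, per the report, for root causes in technical and architectural quality.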

What It Means

LINK METRICS TO CLEAR GOALS AND MANAGE THE LIFE CYCLE

Metrics have a life cycle. Once you've defined metrics, you will have to adapt and combine them, because their impact will change as the complexity of your projects changes. Use and extend our Cynefin metrics framework to manage the life cycle of your metrics. Make sure that you always link metrics to clear goals (business and IT) and that they help you drive process improvement and become a better application development team for your business.

Supplemental Material

Online Resource

The online version of Figure 6 is an interactive tool to determine your project complexity.


Companies Interviewed For This Report

407 ETR; a global automotive manufacturer; Ci&T; CollabNet; Dominion Digital; Evolve Beyond; IBM; Rally Software; Scrum.org; Serena Software; Tom Gilb & Kai Gilb; XebiaLabs

Endnotes

1. Systems of engagement are software applications that help decision-makers make decisions on the fly, support clients in executing important personal and business tasks anytime and anywhere, and provide crucial in-the-moment information when needed. See the November 16, 2012, "Great Mobile Experiences Are Built On Systems Of Engagement" report.

2. Tom and Kai Gilb have a rich repertoire of books, blogs, and papers. Source: Tom Gilb & Kai Gilb (http://www.gilb.com).

3. Gabrielle and Robert Benefield are defining the outcome delivery method on the roots of EVO. Source: Evolving Systems (http://www.evolving.com/).

4. Walker Royce has written on the key role of uncertainty in software development and how to manage it. Source: Walker Royce, "Measuring Agility and Architectural Integrity," ISCAS, March 2011 (http://walkerroyce.com/PDF/Measuring_Agility.pdf).

5. Source: Barry W. Boehm, Software Engineering Economics, Prentice Hall, 1981.

6. In the late 1980s, NASA developed a handbook of software development practices. Many of its conclusions hold true today. Source: "Manager's Handbook for Software Development," NASA, November 1990 (http://homepages.inf.ed.ac.uk/dts/pm/Papers/nasa-manage.pdf).

7. A story point is an arbitrary measure used by Scrum teams to gauge the effort required to implement a story. A backlog is a list of features or technical tasks that the team maintains and which, at a given moment, are known to be necessary and sufficient to complete a project or a release. In the Scrum framework, all activities needed to implement entries from the Scrum product backlog are performed within sprints (also called iterations). Sprints are always short: normally about two to four weeks. Source: "What is a story point?" AgileFaq, November 13, 2007 (http://agilefaq.wordpress.com/2007/11/13/what-is-a-story-point/).

8. It is critical for application development professionals to be able to effectively and objectively answer the question "How big is your software project?" in order to provide effective metrics, improve estimation practices, target improvement initiatives, and refine governance and architectural processes. For more, see the July 27, 2009, "Software Size Matters, And You Should Measure It" report.

9. One commonly used method during the estimation process is to play planning poker (also called Scrum poker). When using planning poker, influences between the participants are minimized, producing a more accurate estimate. Source: "Scrum Effort Estimations Planning Poker," International Scrum Institute (http://www.scrum-institute.org/Effort_Estimations_Planning_Poker.php).

10. Baselining and benchmarking at the organizational level might be useful in transformation programs, but you must track a statistically relevant number of projects.

11. Commit tests are a basic set of tests that are run against each code commit to the mainline source trunk.

12. The best way to link metrics to clear objectives and goals is through the goal question metric framework. For more, see the August 26, 2011, "Introducing ADAM: Forrester's Application Development Assessment Methodology" report.

13. For details on complexity theories and the origins of the Cynefin framework, check out the Cognitive Edge website. Source: Cognitive Edge (http://cognitive-edge.com/library/more/articles/summary-article-on-cynefin-origins/).

14. Many of David Snowden's research papers and blogs make reference to complexity drivers mostly influenced by the people factor as well as the uncertainty around requirements. For software projects, we think that how well (or poorly) teams master technology also makes a big difference.

15. When there isn't an easy connection between business metrics and operational metrics, the hypothesis development approach is the best way to try to establish the correlation, and tactics like multivariate testing help to do so. This is a widely used approach in mobile development. Source: Avinash Kaushik, "Experimentation and Testing: A Primer," Occam's Razor, May 22, 2006 (http://www.kaushik.net/avinash/experimentation-and-testing-a-primer/).


About Forrester
A global research and advisory firm, Forrester inspires leaders, informs better decisions, and helps the world's top companies turn the complexity of change into business advantage. Our research-based insight and objective advice enable IT professionals to lead more successfully within IT and extend their impact beyond the traditional IT organization. Tailored to your individual role, our resources allow you to focus on important business issues (margin, speed, growth) first, technology second.
For More Information

To find out how Forrester Research can help you be successful every day, please contact the office nearest you, or visit us at www.forrester.com. For a complete list of worldwide locations, visit www.forrester.com/about.

Client Support

For information on hard-copy or electronic reprints, please contact Client Support at +1 866.367.7378, +1 617.613.5730, or clientsupport@forrester.com. We offer quantity discounts and special pricing for academic and nonprofit institutions.

Forrester Focuses On Application Development & Delivery Professionals


Responsible for leading the development and delivery of applications that support your company's business strategies, you also choose technology and architecture while managing people, skills, practices, and organization to maximize value. Forrester's subject-matter expertise and deep understanding of your role will help you create forward-thinking strategies; weigh opportunity against risk; justify decisions; and optimize your individual, team, and corporate performance.

Andrea Davies, client persona representing Application Development & Delivery Professionals

Forrester Research, Inc. (Nasdaq: FORR) is an independent research company that provides pragmatic and forward-thinking advice to global leaders in business and technology. Forrester works with professionals in 13 key roles at major companies providing proprietary research, customer insight, consulting, events, and peer-to-peer executive programs. For more than 29 years, Forrester has been making IT, marketing, and technology industry leaders successful every day. For more information, visit www.forrester.com.