
Agile 2008 Conference

Two Case Studies of User Experience Design and Agile Development

Maryam Najafi, Manager, User Experience, VeriSign, Inc.
Len Toyoshiba, User Experience Engineer, VeriSign, Inc.

How can user experience design (UED) practices be leveraged in Agile development to improve product usability? UED practices identify the needs and goals of the user through user research and testing. By incorporating UED in Agile development, user research and testing can be used to prioritize features in the product backlog and to iteratively refine designs for better usability. Furthermore, integrating UED and Agile processes can be accomplished with little or no impact on release schedules. The case studies presented in this paper describe two examples of UED and Agile integration at VeriSign.

This paper presents two case studies of Agile projects involving the User Experience team at VeriSign. The first case study, the development of a new consumer-based retail Web site, successfully integrated UED practices into the Agile development process. The second case study, a redesign of an existing Web site, utilized Agile practices but excluded the User Experience team from Sprint planning and Scrums. Features were developed in isolation, with no opportunity to iteratively refine the designs. In both cases, the extent of User Experience involvement, including collaboration with the cross-functional team and integration with Agile development processes, had a significant impact on the success of the projects.

1. Introduction
Can user experience design (UED) and Agile development utilize complementary processes to achieve their goals? In the UED process, the user's objectives and the methods by which they want to accomplish their tasks are discovered through analysis and user research [1]. From this discovery, a set of features the user requires is identified, and initial designs are created for a specific feature and user tested [2]. The designs are then refined, redesigned as needed, re-tested and evaluated comprehensively with other features in an iterative process. Iterations continue until the designs meet defined acceptance criteria [2, 3]. Agile development processes identify features to implement based on a prioritized product backlog [4]. A specific feature is developed in Sprints. Like the UED process, the feature is researched, designed, tested and documented. Subsequent Sprints are used to refine the design based on an evolving set of requirements [5]. Because of these similarities, UED practices can be incorporated into Agile development to help improve product usability with little or no impact on release schedules.

1.1. Background
VeriSign, Inc. operates Internet infrastructure services that enable and protect billions of interactions every day across the world's voice, video and data networks. In addition to domain name registration and resolution, VeriSign is a leading provider of security-related products including SSL certificates, identity and authentication services, and enterprise security management. In 2006, VeriSign's product development teams began adopting Agile development methods and principles to achieve faster and more adaptable execution, emphasizing customer needs while still conforming to established security requirements.

1.2. The User Experience team

The VeriSign User Experience team consists of user interface (UI) designers, Web developers, user research and usability specialists, visual designers and interaction designers. The duties of the team encompass user research, interface design, front-end implementation and usability testing.

978-0-7695-3321-6/08 $25.00 © 2008 IEEE  DOI 10.1109/Agile.2008.67


The User Experience team is a member of a shared services organization within VeriSign and is responsible for managing the user experience of VeriSign's product offerings. Some members of the team are dedicated to specific products. Others divide their time among multiple products depending on the skill set needed. This helps to focus the expertise and efforts of the team, and fosters a more collaborative relationship with product management and engineering development. With the shift to Agile practices, there was an opportunity for the User Experience team to ensure that user-centered design became an integral part of the development methodology. The team's process of performing user research, defining and iterating on interaction and visual design, conducting usability testing, and implementing the front-end was adapted to use a discrete feature-by-feature approach in order to better align with the development Sprints [6]. However, this process also required much greater interaction and continuous communication with the larger cross-functional team. The first case study will describe the details of this process and how it was applied successfully to Agile development.

1.3. The cross-functional team

The Agile cross-functional development team at VeriSign includes a product manager, engineering, QA, User Experience, and a project manager. The project manager supervises the release schedule and helps resolve logistical issues, but otherwise does not participate in the development process. The engineering team, consisting of three to eight developers, implements and delivers to QA a feature or set of features per Sprint based on a prioritized backlog determined by product management. QA tests the features in that Sprint and identifies issues. If any issues are found, engineering adjusts or fixes the features in the current Sprint. At VeriSign, Sprints are two weeks in length, with seven of the ten working days of the Sprint dedicated to implementation by engineering and three to QA for testing.

2. The case studies

Two case studies, the Falcon Project and the Razor Project, will be presented describing the User Experience team's involvement with Agile development. The Falcon Project successfully incorporated the User Experience team and UED practices with positive results. The Razor Project, in contrast, did not involve the User Experience team in the Agile process. This resulted in miscommunication between the User Experience team and the engineering team on several key features.

3. The Falcon Project

The objective of the Falcon Project was to create and deliver a consumer-based Web site that would allow users to procure and manage devices to protect their online accounts. These devices would help prevent phishing and other methods of identity theft. The Falcon Project was one of the first products at VeriSign that was developed using Agile processes. It was also the first consumer-based offering from the identity and authentication services product line. Two months of investigation and research were required to determine if the product was marketable enough to proceed with development. Once research was completed and the decision was made to proceed, five months were scheduled for development based on market conditions, customer interest and available resources.

3.1. The team

The cross-functional development team for the Falcon Project consisted of the product manager, four development engineers, three QA engineers and three members of the User Experience team. One member of the User Experience team was responsible for user research and testing. Two other members were responsible for design and implementation, and regularly participated in Sprint planning and daily Scrums. All members of the team were centrally located in Mountain View, California.

3.2. User Experience participation

Because marketing requirements and target audience were only partially defined, product management engaged the User Experience team to research and gather user requirements. The research results provided enough collateral for product management to approve the project and enabled them to identify the product feature set in preparation for the start of development. As a result of user feedback from the initial user research and testing, and the new challenges of designing and developing a consumer-based product, the User Experience team was asked to participate in the release planning and Sprints. The results of user research and testing were instrumental in prioritizing the product backlog. For example, product management initially decided that mandatory account sign in was a high priority feature. Users would have to sign in to their account before they could perform any other function on the site. However, user testing clearly indicated that users preferred not to sign in to their account to complete their tasks. As a result of these findings and the minimal security risk involved, mandatory sign in was removed from the backlog. User Experience involvement in the Sprints was a key factor in successfully focusing the cross-functional development team's efforts on the requirements of the users.

3.3. User Experience and Agile development

To adapt to Agile development processes, the User Experience team performed user research and testing, and completed designs for a specific feature one Sprint ahead of engineering's implementation of the feature. This staggered approach (Figure 1) [6] allowed the team to keep ahead of engineering's development.

[Figure 1. Staggered development method used for the Falcon Project. In Sprint 0, User Experience creates personas and user stories, performs user research and testing, and produces UI designs for Sprint 1 features, which are handed off to engineering. In each subsequent Sprint, User Experience performs user research and testing, refines earlier designs, and produces UI designs for the next Sprint's features, while engineering implements the current Sprint's features, including the UI.]

During the Sprint, the engineering team and product management were invited to view user testing sessions, discuss findings and review preliminary designs. This helped engage the cross-functional team in the UED process, promoted the sharing of ideas, and enabled the engineering team to prepare for the next Sprint. At the end of the Sprint, final designs and specifications were reviewed with the cross-functional team, and then handed off to engineering for development. User Experience participation in the daily Scrums was a crucial factor in the development process. In addition to communicating progress, the Scrums were a convenient forum to share the results of user research and testing, and to clarify any misinterpretations of the designs.

3.3.1. Sprint zero. Sprint zero is an additional two-week Sprint that occurs prior to the start of development. It is typically used to gather requirements, and to identify and prioritize the product backlog [7]. At VeriSign, Sprint zero is used by the cross-functional team to review requirements and create initial user stories based on the backlog items. The engineering and QA teams additionally select their development environments and system platforms. The User Experience team specifically utilizes Sprint zero to better understand the users, explore their context and identify their goals for the project as a whole. The data obtained from the initial user research effort is used to negotiate the priorities of the first Sprint and to communicate the users' expectations to the cross-functional team. Personas and user scenarios are developed for features in subsequent Sprints. Designs for features engineering plans to implement in the first Sprint are also created and tested.

3.3.2. Learning and refining in subsequent Sprints. The staggered development approach was used throughout the development process. However, the User Experience team, as a result of constant user testing, realized that the Sprints tended to narrow the focus on feature development without regard for the overall user goals. An example of this was identified in the purchase feature of the product. In the purchase process, users created an account on the Web site to check the status of their orders and manage other sections of the site. Due to engineering's interpretation of the design, the original implementation of the purchase feature did not automatically sign in users to their account after they had created it. When the purchase and order status features were tested together, the results showed that users wanted to be able to view the status of their orders after they completed their purchase. Since users were not signed in to their account, they were required to either sign in or provide their order number to view the status of their order. In this example, although the Sprint delivered a functional piece of the product, it did not meet users' expectations. To address these issues, the User Experience team began utilizing a part of each Sprint for learning what users wanted and refining designs to meet their needs.

The first week of each Sprint was spent researching and user testing a new feature in context with other implemented features. Users were asked to complete an entire process flow instead of feature-specific tasks. This allowed the User Experience team to understand what users wanted to achieve as an overall goal, as opposed to verifying whether they could use a particular feature to perform a specific task. The second week was then used to design the new feature and refine the designs of implemented features to address issues identified in the previous week's testing. In the example above, as a result of learning what users expected and refining the purchase process to satisfy those expectations, users were automatically signed in to their account after they completed their purchase and were able to view the status of their orders without providing additional information. Because features were comprehensively tested together in each Sprint, the amount of reimplementation or refactoring required for existing features as a result of testing was minimal. In most cases, the engineering effort could be absorbed at the end of the Sprint during QA testing and engineering bug fixing.

3.3.3. Leveraging user research and testing to prioritize the backlog. By testing new features comprehensively in context with previously implemented features, the Agile development team learned what users expected and wanted, and could refine the designs to meet their needs. The results of user testing also had a significant influence on the prioritization of the product backlog. For the Falcon Project, one of the backlog items was a global help system for users to obtain help with the Web site. Because the Web site was designed to have relatively short and straightforward processes, a comprehensive help system was initially a low priority item.
However, user testing of several features showed that users sometimes required assistance to understand the meaning of specific fields or labels in a form. The help system was then given a higher priority. Further testing revealed that users wanted to view help content in context with the field or label they did not understand. This created an additional, higher priority backlog item: a page-level in-context help system. This feature was assigned to a Sprint and was implemented successfully throughout the Web site, with positive user feedback.
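The kind of test-driven reprioritization described above can be sketched as a minimal model. This is an illustrative sketch only; the `Backlog` helper, the item names and the priority values are hypothetical, not artifacts of the actual Falcon Project backlog:

```python
# Illustrative sketch of test-driven backlog reprioritization.
# All item names and priority values are hypothetical examples.

from dataclasses import dataclass, field


@dataclass
class BacklogItem:
    name: str
    priority: int  # lower number = higher priority


@dataclass
class Backlog:
    items: list = field(default_factory=list)

    def add(self, name, priority):
        self.items.append(BacklogItem(name, priority))

    def reprioritize(self, name, new_priority):
        # User-testing findings can raise (or lower) an item's priority.
        for item in self.items:
            if item.name == name:
                item.priority = new_priority

    def ordered(self):
        return [i.name for i in sorted(self.items, key=lambda i: i.priority)]


backlog = Backlog()
backlog.add("purchase flow", 1)
backlog.add("order status", 2)
backlog.add("global help system", 9)   # initially low priority

# Testing shows users need help in context with specific form fields:
backlog.reprioritize("global help system", 3)
backlog.add("in-context page-level help", 2)  # new, higher-priority item

print(backlog.ordered())
# → ['purchase flow', 'order status', 'in-context page-level help', 'global help system']
```

Here, a user-testing finding both raises the priority of an existing item and introduces a new, higher-priority item, mirroring how the in-context help requirement emerged during the Falcon Project.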

4. The Razor Project

The Razor Project was a redesign of an existing product that allowed customers to purchase and manage services online. The goal was to increase sales and customer retention, and reduce the number of support calls by providing an improved user experience during the purchase and product renewal processes. With the existing product, the user had to perform as many as eight steps to purchase the product and required account credentials to complete the renewal process. Users often forgot or had no direct access to their credentials, and with no online password recovery process, users were forced to call customer support to manually renew their product. Competing products provided a simpler and more consistent user experience for purchasing their products. The existing product had been in use for several years and had a large, established user base. The challenge was to redesign the Web site such that existing users could easily transition to the new interface and flows. The Razor Project was also one of the first products at VeriSign that was developed using Agile processes. While similar in scope to the Falcon Project, the development time for the Razor Project was eleven months due to the number of issues that were identified and corrected during the project.

4.1. The team

The cross-functional development team for the Razor Project consisted of the product manager, eleven development engineers, two front-end Web developers, two QA engineers and four members of the User Experience team. Two members of the User Experience team were fully dedicated to the project and were responsible for user research and testing. The other two members (one full-time and one part-time) were responsible for the design, implementation and support of the user research and testing efforts. The team members were geographically dispersed, with the product manager and the User Experience team located in Mountain View, California, and the rest of the team located overseas. The overseas team was assigned the development task since they had previous experience implementing a similar product.

4.2. User Experience and Agile development

Since the challenge of the project was to transition existing customers to new interfaces and processes while decreasing the number of support calls, product management engaged the User Experience team to perform user studies on a set of proposed improvements. After several rounds of testing and exchanges with product management, a high level design specification and a prototype were generated by the User Experience team. Product management used the design specifications to define the product requirements. However, because several months had passed since the initial engagement, changes in the market and resource issues demanded changes to the requirements. To accommodate these shifts in requirements, the cross-functional team decided to adopt an Agile development approach.

When Agile development started, the User Experience team requested participation in the release planning and Sprints. The high level designs and prototype were provided to engineering with the expectation that detailed mockups for specific features would be delivered in each Sprint. Since the engineering team and the User Experience team were separated geographically by a nine hour time difference, the User Experience team was excluded from participating in the release planning, Sprints and Scrums. Only the overseas team (engineering, Web development and QA) used Agile for development. The User Experience team was not informed of the features being implemented in a particular Sprint, so no user research or testing was performed and no detailed design specifications were created. The front-end Web developers created the UI based on their understanding of similar products that they had developed. They handed off their designs to the engineering team, who implemented the features using the high level design specifications as a reference.

4.2.1. Development in isolation. The engineering team acted as the representative for all of the overseas teams. They managed the Web development and QA teams and sheltered them from the rest of the cross-functional team in order to maintain control over the development efforts and schedule. When the first feature was implemented and approved by QA, the engineering team arranged a demo for product management and the User Experience team. After a review of the feature, several issues were identified. The issues were added to the backlog as enhancement items. Several more features were developed based solely on the high level design specifications, and additional demos were arranged. Issues were again identified and added to the backlog. The engineering team used several Sprints to correct the highest priority issues.

4.2.2. Falling back to waterfall processes. The original scheduled release date could not be met due to a number of key issues on the backlog. To prevent further misinterpretations of the design, the engineering team requested that the User Experience team provide detailed specifications for all planned features, including comprehensive flow diagrams and high fidelity mockups of all screens. Screen shots of all variations of dynamic areas on a page also had to be generated. Despite these efforts, issues continued to be identified and added to the backlog. Figure 2 describes the interactions of the cross-functional team as the development process evolved over time.
[Figure 2. Cross-functional team interactions for the Razor Project. At project start, User Experience performed user research and testing and delivered high level designs and specifications, while engineering, the Web developers and QA implemented features over multiple Sprints. Demos of the implemented features identified issues with the design, which were added to the backlog. User Experience then produced detailed specifications for all features, the Web developers reviewed the specifications with User Experience, and engineering implemented the features in further Sprints. Additional demos again identified design issues. The product was released with an incorrect implementation of the design and recalled to address the issues.]

4.2.3. Feature completion does not result in a shippable product. After nine months, the product was released as a beta to select end customers. Although all planned features were implemented, no user testing had been performed and several of the usability issues identified on the backlog had not been corrected. The beta release showed that some users were not able to complete the purchase process. An analysis of user interaction with the beta release showed that several discrepancies between the flows User Experience originally designed and the flows that engineering implemented were preventing users from completing their task.


The engineering team used two months to correct issues with the purchase flow. The product was eventually released to production after eleven months of development.

5. Analysis
Although the User Experience team was involved in both the Falcon Project and the Razor Project, overall product usability differed significantly. Both projects adopted Agile methods for development in order to adjust to changing market requirements, but the ability to perform user research, understand what users want and expect from the product, and develop features that users need to achieve their goals ultimately affected release dates, development resources and the success of the products.

The Falcon Project was released on schedule and satisfied all of its requirements as a consumer-based Web site to purchase and manage security devices. The project was a true collaborative effort among all of the cross-functional team members. For the User Experience team, participation in the Sprints and Scrums was a key factor in focusing product development on the user experience. The User Experience team was able to adapt its processes to deliver designs as the engineering team began implementing them. As implemented features were user tested, refinements to the design were accommodated in the remaining Sprints. The iterative nature of Agile development and the inclusion of the User Experience team's efforts in the development schedule allowed time in the release planning for this to occur.

In contrast, the Razor Project initially did not satisfy its requirements of increasing sales and improving customer retention due to a lack of communication among the cross-functional team members. Geographical separation and unwillingness by the engineering team to collaborate with non-engineering teams prevented the product manager and the User Experience team from participating in the Sprints and Scrums. Without participation and continuous feedback from the User Experience team, the engineering team interpreted and implemented the designs incorrectly. As development progressed, all efforts of the team were focused on correcting issues with features that were already implemented. No opportunities for user testing or refinement of the designs were allowed, and the Agile process was eventually abandoned. When the product was released as a beta, it did not provide a better user experience than the existing product. However, the product was eventually able to meet its original objectives after key usability issues were analyzed and corrected.

6. Summary
As these two case studies demonstrate, the benefits of involving the User Experience team and applying UED practices to the Agile development process more than offset the risks of potentially impacting product release dates. UED practices naturally complement the iterative nature of Agile development. However, successful integration of the User Experience team requires full cooperation and collaboration with all cross-functional team members. Learning what users want and expect from the product helps to prioritize features in the product backlog. Consistent user testing and refinement of designs ensures that the product is developed to meet the needs of its users.

7. References
[1] Beyer, Hugh, and Karen Holtzblatt, Contextual Design: Defining Customer-Centered Systems, Academic Press, San Diego, California, 1998.

[2] Rubin, Jeffrey, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, John Wiley & Sons, Inc., New York, New York, 1994.

[3] Cooper, Alan, Robert Reimann, and David Cronin, About Face 3: The Essentials of Interaction Design, Wiley Publishing, Inc., Indianapolis, Indiana, 2007.

[4] Schwaber, Ken, and Mike Beedle, Agile Software Development with Scrum, Prentice Hall, Upper Saddle River, New Jersey, 2001.

[5] Shore, James, and Shane Warden, The Art of Agile Development, O'Reilly Media, Inc., Sebastopol, California, 2008.

[6] Sy, Desirée, "Adapting Usability Investigations for Agile User-centered Design", Journal of Usability Studies, Vol. 2, Issue 3, May 2007, pp. 112-132.

[7] Rawsthorne, Dan, "Sprint Zero", May 2007;