
StickyMinds.com and Better Software magazine present…

Avoid Throwaway Test Automation
Sponsored by Cognizant
Non-streaming participants should call 1-866-761-8643 (international: 1-904-596-2362)

Setting the Context
- What we mean by "automated testing"
- Other types of tool-assisted testing
- Principles will apply to other types
- Many topics deserve more attention
- Automated testing is software development

What Typically Goes Wrong
- Creating automated tests that don't run anymore
- Spending too much time babysitting and maintaining the automation
- Automated tests that are too brittle
- Tools that don't work in the environment
- Automated tests that don't provide value

Common Mistakes
- No plan for implementation
- No buy-in from staff or management
- No training for automators
- No time allotted to automate
- No time allotted for maintenance
- No framework for reusability
- Good intentions, poor execution

Why Automate?
- Sounds cool
- Boss said so
- Can't keep up
- Lots of repetitive tests
- Lots of data-driven tests
- Reduce time spent on regression testing

Develop an Automation Plan
- Why you will automate
- What to automate
- When to automate
- Who will automate, execute, and maintain
- How to automate (framework)
- How to report results
- Where to run tests

What to automate
- Smoke tests
- Repetitive tests
  - Should run 3-5 times without changing
- Tests that can run autonomously
- Big risks
- Tests that take less time to automate than to execute manually
- Data-intensive tests
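Repetitive, data-intensive checks are where automation pays off fastest: one routine driven by many data rows, so adding coverage means adding a row, not writing a new script. A minimal data-driven sketch in plain Python (the `discount()` rule and the case table are invented for illustration, not from any real system):

```python
# Data-driven testing sketch: one test routine, many input rows.

def discount(order_total):
    """Toy business rule under test: 10% off orders of 100 or more."""
    return order_total * 0.9 if order_total >= 100 else order_total

# Each row is one test case: (input, expected result).
CASES = [
    (50, 50),          # well below the threshold
    (99.99, 99.99),    # boundary: just below
    (100, 90.0),       # boundary: exactly at threshold
    (200, 180.0),      # well above the threshold
]

def run_cases():
    """Run every row; return the rows that failed (empty list = all passed)."""
    failures = []
    for total, expected in CASES:
        actual = discount(total)
        if abs(actual - expected) > 1e-9:
            failures.append((total, expected, actual))
    return failures

if __name__ == "__main__":
    print(run_cases())
```

The same shape scales to hundreds of rows loaded from a CSV or spreadsheet, which is what makes data-intensive tests such good automation candidates.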

When to automate

- When you're ready; depends on the development lifecycle
  - Waterfall: may wait until the end
  - Agile: may need continuous automation
- Not too early, not too late
  - Need a stable-ish UI
  - Automate before you have to regression test
- Plan time for automation and maintenance
  - Manage it as part of regression testing time

Selecting the Right Tool

- Define your tool requirements
  - What you need it to do
  - Compatibility with your application
  - Compatibility with your skill sets
- Try it out
- Beware of the hype
  - "Record and playback" is rarely that simple

Open Source Tools
- Free to acquire, but not free to use
- More time required for implementation
  - Installation and configuration
  - Learning to use the product
- More technical skills required
- http://opensourcetesting.org

Develop a Framework

- Organization of artifacts
- Aim for reusability
  - Across features, product versions, and products
  - Separate interface from functionality
- Dealing with common activities
  - Object recognition
  - Navigation
  - Data validation
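The framework advice above, separating interface from functionality, is commonly implemented as the page-object pattern: test scripts express business-level actions, while a single page class owns object recognition and navigation. A minimal Python sketch using a stand-in driver instead of any real tool's API (the `FakeDriver`, `LoginPage`, and locator strings are all hypothetical):

```python
class FakeDriver:
    """Stand-in for a real UI driver (e.g. a commercial or open source web driver)."""
    def __init__(self):
        self.fields = {}
        self.submitted = False

    def type(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        # Pretend the login submit succeeds when both fields are filled in.
        self.submitted = bool(self.fields.get("user") and self.fields.get("pass"))


class LoginPage:
    """Page object: the ONLY place that knows this screen's locators.

    When the UI changes, only these locator constants change; every test
    that calls login() keeps working unmodified.
    """
    USER_FIELD = "user"
    PASS_FIELD = "pass"
    LOGIN_BUTTON = "login"

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        # Tests call this business action; they never touch locators directly.
        self.driver.type(self.USER_FIELD, username)
        self.driver.type(self.PASS_FIELD, password)
        self.driver.click(self.LOGIN_BUTTON)


if __name__ == "__main__":
    driver = FakeDriver()
    LoginPage(driver).login("alice", "secret")
    print(driver.submitted)
```

Because the page object is the single point of object recognition, it is also the natural place for reuse across features, product versions, and products.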

Dealing with Change

- Plan for changes in the UI
  - How to respond to test failures
  - Flexible object recognition
- Make tests data independent
  - Reduce dependencies between tests
  - Set up test data in cleanup scripts
  - Script tests to use dynamic data
- Enlist the help of developers to ease automation
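The data-independence bullets above can be sketched in a few lines: each run generates unique, dynamic data and removes it afterward, so tests never depend on leftover records or on each other. A sketch against an in-memory stand-in store (the store and the `create_customer`/`delete_customer` helpers are made up for illustration):

```python
import uuid

# In-memory stand-in for the application's real data store.
STORE = {}

def create_customer(name):
    STORE[name] = {"name": name}

def delete_customer(name):
    STORE.pop(name, None)

def run_test_with_fresh_data():
    # Dynamic data: a unique name per run, so repeated or parallel
    # runs never collide on a shared record.
    name = "cust-" + uuid.uuid4().hex[:8]
    create_customer(name)
    try:
        # The actual check the test cares about.
        assert STORE[name]["name"] == name
        return True
    finally:
        # Cleanup runs even if the check fails, leaving no stale data
        # for the next test to trip over.
        delete_customer(name)

if __name__ == "__main__":
    print(run_test_with_fresh_data(), len(STORE))
```

The try/finally cleanup is the key design choice: a failed assertion still leaves the environment in the state the next test expects.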

Automation Assessment

© 2008, Cognizant Technology Solutions. All Rights Reserved. The information contained herein is subject to change without notice.

Testing Services

Objective
- Background
- Automation Assessment Approach
  - Process
  - Infrastructure
  - Tools
  - Framework
  - Operating Model
  - Best Practices


Assessment Scope
- Automation Prioritization
- Planning & Strategy
- Approach
- Estimation Model
- Documents
- Guidelines
- Review Procedures & Checklists
- Metrics Collection
- Configuration Management
- Audits/Assessments
- Change Management
- Environment Management
- Process
- Defect Management
- Maintenance Approach
- Test Bed Creation and Maintenance
- Architecture & Type
- Test Data Management
- Reporting Mechanism
- Error & Exception Handling
- Folder Structure
- Scalability
- Reusability
- Function Library
- Object Repository
- Database Testing
- Batch Execution

Infrastructure

Focus Areas
- Scripting Standards
- User Guides
- Maintenance Process Handbook
- Dynamic Script Allocation
- Automation Review Tool
- KR portal
- Integration of automation scripts with test management tools

Operating Model
- Roles & Responsibilities
- Governance Model
- Organizational SLAs
- Project Structuring
- Communication

Tools
- Functional Automation
- Test Management
- Configuration Management
- Defect Management
- License Management


Process
- Communication and Collaboration (BAs, Developers, Manual Testers, etc.)
- Identification and Prioritization
- Planning and Estimation
- Change Management
- Maintenance Approach


Tools
- Functional Automation
- Test Management
- Configuration Management
- Defect Management
- Open Source


Framework
- Architecture and Type
- Test Data Management
- Reporting Mechanism
- Reusability
- Maintainability
- Object Repository
- Database Testing


Infrastructure
- Environment Management
- Test Bed Creation and Management


Best Practices
- Scripting Standards
- User Guides
- Maintenance Process Handbook
- Automation Review Tool
- KR portal


Operating Model
- Roles and Responsibilities
- Organizational SLAs
- Project Structuring


Approach - Highlights

Highlights
- Definition of the automation framework
- Customized metrics framework
- Definition of the governance model
- Structured methodology for automation testing
- Assessment of current automation capabilities
- Identification of the ideal automation tool
- Communication model and status reporting set up

Focus Areas
- Organization: organization structure, training
- Methodology: automation approach, functional automation, metrics and reporting

Benefits
- Well-defined organization structure and governance model in place
- Consolidation of automation tools
- Fully customized metrics framework for implementation across applications
- Defined communication and workload processes for onsite-offshore coordination
- Use of reusable automation scripts
- Well-defined independent and peer review procedures in place


Testing Services Practice Overview

Independent Verification & Validation (IV&V) service timeline (chart shows practice growth: 75, 170, 800, 2,400, 5,000, 8,500 (estimated) people):

- 2001 & 2002: Launched to provide specialized functional testing services to existing Cognizant customers; integrated with other value-added services such as performance testing
- 2003: Offered as a distinct service offering to customers; established the onsite-offshore model for testing
- 2004: End-to-end IV&V services provided; brought in domain alignment (Domain Product Testing and BA/QA offering)
- 2005: Engaged with clients to set up Managed Test Centers; commenced new client engagements with test consulting; focused on Automation and Mainframe CoEs
- 2006: Enhanced service offerings such as compliance testing, package testing, and white-box testing, as well as Domain/Product Testing (VisionPLUS, FACETS & POS)
- 2007: Delivery excellence through deployment of innovative methodologies; expanded global footprint

Our Domain Foundation: Integrated BA/QA offering in collaboration with domain practices (BFS, Insurance, Communications, Manlog, Retail, Technology, Healthcare, Life Sciences, IME)

Independence (IV&V): Over 70% of testing performed against code provided by the client or third-party vendors

People: Team of over 5,000 dedicated SQA professionals

Center of Excellence: Invested in focused groups around tools & frameworks to provide client value-adds

Alliances: Established alliances with leading tool vendors such as Mercury, Borland & IBM Rational

Clients: 200+ clients, with 10+ deep client engagements of over 100 people


Value Adds
- CRAFT: Defines the method for scripting business functionalities as reusable libraries for steps that are repeated among test cases
- AHEAD: Bulk-uploads QTP scripts, attachments, and folder structure to Quality Center
- CRAFT 2.0: A tool that streamlines the test execution activity during test automation; it dynamically executes test cases on multiple machines in a distributed environment
- WS Test Professional: SOA testing solutions to test business logic; enables the client to execute data-driven web service testing without any programming knowledge
- DataXpress: An automated test-data generation tool that streamlines the test data preparation activity


Value Adds
- ROI Calculator: Provides return-on-investment details for maximum transparency to the client before an automation engagement
- QC2Bugzilla: Integrates and synchronizes the defect management module of Quality Center with that of Bugzilla
- Selenium Test Manager: An automated functional test tool developed for web automation
- Watir: Web Application Testing in Ruby (WATIR) is an open-source functional testing framework for testing any web application built on ASP, .NET, J2EE, or PHP
- Win2Pro: Automatically converts WinRunner scripts to QTP


Thank you


Q&A
Have a question for the speakers? Ask now.