
Why Test Automation Fails

In Theory and In Practice

May 20th, 2016

Jim Trentadue – Software Quality Consulting Director


jtrentadue@origsoft.com



Agenda

1 Test Automation Industry recap & current trends

2 Adoption challenges with Test Automation

3 Why are Test Automation adoptions failing?

4 Why are Test Automation implementations failing?

5 Correcting the Test Automation approach, concepts and applications

6 Session recap



Test Automation Industry:
Recap & Current Trends



Test Automation Industry – recap

The first Record/Playback tools appeared about 20-25 years ago

The primary objective was to test desktop applications. Some tools could test for broken URL links, but could not dig into the underlying objects

The tester was required to have scripting or programming knowledge to make the tests run effectively, even in Record/Playback

Automation frameworks have evolved through six broad styles:

• Record/Playback
• Structured Testing Model (invokes more conditions)
• Data-Driven
• Keyword-Driven / Action-Based
• Object-Based
• Hybrid (combines two or more of the previous frameworks)
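To make the later styles concrete, here is a minimal, hypothetical Python sketch (all names invented, not any particular tool's API): in the data-driven style one test body consumes externally supplied rows, and in the keyword-driven style a test is a table of named actions dispatched to small functions.

```python
# Minimal sketch of two framework styles beyond Record/Playback.
# All names here (login, LOGIN_DATA, STEPS) are invented examples.

# --- Data-driven: one test body, many externally supplied data rows ---
LOGIN_DATA = [
    ("alice", "secret1", True),   # (user, password, should_succeed)
    ("bob",   "wrong",   False),
]

def login(user, password):
    """Stand-in for driving the application under test."""
    return password != "wrong"

def test_login_data_driven():
    for user, password, should_succeed in LOGIN_DATA:
        assert login(user, password) == should_succeed

# --- Keyword-driven: a test is a table of named actions ---
def open_app(ctx):
    ctx["open"] = True

def enter_user(ctx, name):
    ctx["user"] = name

def verify_open(ctx):
    assert ctx["open"]

KEYWORDS = {"open_app": open_app, "enter_user": enter_user,
            "verify_open": verify_open}

STEPS = [("open_app",), ("enter_user", "alice"), ("verify_open",)]

def run_keyword_test(steps):
    ctx = {}
    for name, *args in steps:
        KEYWORDS[name](ctx, *args)   # dispatch each keyword to its action

test_login_data_driven()
run_keyword_test(STEPS)
```

The hybrid style simply combines these: for example, keyword tables whose arguments come from external data rows.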



Test Automation Industry – current trends

The number of test automation tools has grown to roughly 25-30, many focused on taking the scripting / programming out of the mix for manual testers

Technical complexity has increased considerably, and two trends stand out:

Manual testers are slow to delve into test automation because almost every commercial and open-source solution requires some degree of coding

Products are becoming single-niche tools, sometimes not offering a fully integrated quality solution for the SQA organization



Adoption of Test Automation:
Common myths and perceptions



Adoption challenges
The new challenges below have slowed automation productivity:

Rapid deployment times to production

Shifts from a Waterfall SDLC to an Agile methodology

How Test-Driven Development (TDD) affects when to automate tests (see the sketch after this list)

How component testing (database, data warehouse, web-service or messaging) affects which tests to automate

To ensure success, many organizations have employed a Proof of Concept (POC)
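As a hedged illustration of the TDD bullet above (apply_discount and its values are invented): in test-driven development the automated check is written before the feature code exists, so the question of when to automate moves to the very start of development rather than after manual testing.

```python
# Minimal TDD sketch: the test exists first and fails until the
# feature is implemented. apply_discount is a hypothetical example.

def test_discount_applied():
    # Written before apply_discount exists; it drives the design.
    assert apply_discount(price=200.0, percent=50) == 100.0

# The implementation is added afterwards, just enough to pass.
def apply_discount(price, percent):
    return price * (1 - percent / 100.0)

test_discount_applied()   # passes once the implementation lands
```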



Common myths & perceptions
Taken together, these myths feed the perception that "Test Automation can't be accomplished!":

• Significant increase in time and people
• Implement with a click of a button
• Existing testers will not be needed
• Will serve all testing needs
• Associates' training will have no impact



Why is the Test Automation
adoption failing in organizations?



Case Study 1: Management Struggles
Excerpts from a test engineer's interactions with management

"It Must Be Good, I've Already Advertised It"
Management's intention was to reduce testing time for the system test. There was no calculation of our investment; we had to go with it because manual testing was deemed no longer necessary. The goal was to automate 100% of all test cases.

"Testers Aren't Programmers"
My manager hired testers not to be programmers, so that he did not have more developers in the department. Once told that some tools require heavy scripting / coding, the message was relayed that testers need to do "Advanced Scripting" rather than "Programming".

"Automate Bugs"
One manager's idea was to automate the bugs we received from our customer care center. We were told to read each bug and automate that exact user action. Not knowing the exact function, we hard-coded the user data into our automation. We ended up automating bugs for versions that were no longer in the field.

"Impress Customers (the Wrong Way)"
My boss had a habit of installing untested beta versions of the software for presentations in front of the customer. He would install unstable versions and then call developers at 5:30am to fix them immediately. We introduced automated smoke tests to ensure good builds.

Experiences of Test Automation: Case Studies of Software Test Automation – Graham & Fewster
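The "Automate Bugs" anecdote turns on hard-coded user data. A minimal sketch of the pitfall and the usual fix; process_order, the prices and the bug number are all invented for illustration:

```python
# Sketch of the "Automate Bugs" pitfall and the usual fix.
# process_order and all values here are hypothetical.
import pytest

def process_order(quantity):
    if quantity < 0:
        raise ValueError("quantity must be non-negative")
    return quantity * 9.99

# Pitfall: hard-coding the exact data from one customer report;
# the script only re-proves a single, possibly obsolete case.
def test_bug_1234_hardcoded():
    assert process_order(quantity=3) == pytest.approx(29.97)

# Fix: generalize the reported failure into a parameterized rule
# that stays useful as versions change.
@pytest.mark.parametrize("quantity", [0, 1, 3, 250])
def test_order_total_scales(quantity):
    assert process_order(quantity) == pytest.approx(quantity * 9.99)
```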



Adoption questions
Questions that go unasked or unanswered, grouped by category; each bullet completes its heading

Who
• are the right people to create and maintain test cases and object repository information?
• may execute automated tests instead of creating them?
• will be the dedicated go-to person(s) for the tool?

What
• is our organization's definition of automated testing: what it is and what it is not?
• are the test automation objectives?
• is the expected testing coverage sought with test automation?

When
• are testing resources available for this initiative?
• can training happen in relation to the procurement?
• are there project deadlines that compete against test automation?

Where
• will the test automation take place; is there a dedicated test environment?
• are the associates located? This will decide whether training should be remote or onsite
• will a central repository reside?

Why
• start a test automation initiative – what are the specific issues?
• run a vendor POC program; what do we want from this assessment?
• start with a top-down management approach?

How
• do our test automation objectives differ from our overall testing objectives?
• does manual testing fit into a test automation framework?
• will a fully integrated quality suite function?



Adoption Failures
The not-so-best practices and assumptions behind failed adoptions

Automates manual test cases
• Manual test cases are not written for automation
• Error handling is not considered part of manual testing
• Modularity isn't a key element of manual test case planning

Must cover 100% of the application
• Testers should just automate the entire regression test library
• Application changes are instantly added to the automated library
• Full overnight runs, with every discrepancy logged as an application defect

Testers can automate in a release
• Replace manual testing time with automated testing time
• Manual testers can automate their own tests
• A separate initiative is not required; there is ample time within the project



Why is the Test Automation
implementation failing in organizations?



Case Study 2: Choosing the Wrong Tool

Background for the Case Study
• The goal was to automate testing of major functionality in web development tools for an internet software development company

Pre-existing Automation
• Most functional tests were executed manually. Some dated automation scripts could potentially prove useful if restored

New Tool or Major Maintenance Effort?
• The application's GUI had gone through major changes. The tool had to be flexible with application changes and easy to maintain

Moving Forward with the Existing Tool?
• Consensus was the tool required coding, which was difficult for manual testers. Also, the native object recognition provided did not work

What Was Done After the Tool Was Replaced?
• Understand how automated tools use objects, and recognize that identification by image was not adequate

Conclusion
• Examine solutions that work best with your applications and controls. Maintainability and error handling should be treated as critical

Experiences of Test Automation: Case Studies of Software Test Automation – Graham & Fewster
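The conclusion about image-based identification is easier to see in code. A minimal sketch using the Selenium WebDriver Python bindings (the URL and element id are invented, and an installed Chrome driver is assumed): locating a control by a stable object property survives GUI restyling, where image or coordinate matching does not.

```python
# Sketch: locate a control by a stable object property rather than by
# screen image or pixel coordinates. URL and element id are hypothetical;
# assumes Selenium and a Chrome driver are installed.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.test/checkout")

# An image or coordinate match breaks whenever the GUI is restyled;
# a stable id survives layout and skin changes.
submit = driver.find_element(By.ID, "submit-order")
submit.click()

driver.quit()
```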



Implementation Failures
What can go wrong when trying to use a procured solution

Won't work on my application
• The AUT uses technologies that don't work with the tool
• Attempts to play back a recording don't play back at all
• Multiple tools are needed across varied technologies (Desktop, Web, Mobile)

Automate my AUT if it changes?
• Heavy maintenance in re-recording tests after GUI changes
• Only suited to regression testing of legacy apps, not new apps
• Can't work when the app is deployed incrementally; a key requirement for Agile

Complex for manual testers
• No time within project schedules; testers are over-allocated
• Marketed to testers, but assumes a development skillset
• Assumed not to require cross-department support



Correcting the Test Automation
approach



Adoption corrections
Changing the mindset concerning test automation

Automates manual test cases (objectives aim for the highest ROI)
• Automated test cases differ from manual ones in entry, data and exit criteria
• Error handling is an essential part of automated test cases (see the sketch below)
• Modularity is how automated test cases are developed

Must cover 100% of the application (the test plan includes automated and manual)
• Start with the regression library, but focus on tests with a high ROI
• Application changes are planned and automated in a maintenance window
• Failure analysis: is the defect in the automated test case or in the AUT?

Testers can automate in a release (separate goals & timelines)
• The initial cycle is longer than manual; cycle runtime is reduced afterwards
• Requires training on the tool, the approach and best practices
• A separate initiative, with unique objectives defined apart from projects
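The first correction row calls for explicit entry, data and exit criteria plus built-in error handling. This minimal, self-contained sketch shows one way those pieces fit together; FakeApp, run_transfer_test and all behavior are invented, and no specific tool's API is implied.

```python
# Sketch: a modular automated test with explicit entry criteria,
# externalized data, exit criteria, and error handling.
# FakeApp is a hypothetical stand-in for the application under test.

class FakeApp:
    class AppError(Exception):
        pass

    def __init__(self):
        self.balances = {"test_user": 100}

    def is_running(self):
        return True

    def has_user(self, user):
        return user in self.balances

    def transfer(self, user, amount):
        if amount > self.balances[user]:
            raise FakeApp.AppError("insufficient funds")
        self.balances[user] -= amount

    def balance(self, user):
        return self.balances[user]

    def reset(self, user):
        self.balances[user] = 100

def entry_criteria_met(app):
    # Entry criteria: the AUT is up and the test account exists.
    return app.is_running() and app.has_user("test_user")

def run_transfer_test(app, amount):
    if not entry_criteria_met(app):
        return "BLOCKED"          # setup gap, not an application defect
    try:
        app.transfer("test_user", amount)       # test data supplied per run
    except FakeApp.AppError as exc:
        return f"FAILED: {exc}"   # error handled; the run can continue
    ok = app.balance("test_user") >= 0          # exit criteria: verify state
    app.reset("test_user")                      # ...then restore it
    return "PASSED" if ok else "FAILED"

app = FakeApp()
print(run_transfer_test(app, 40))    # PASSED
print(run_transfer_test(app, 999))   # FAILED: insufficient funds
```

Because the test returns BLOCKED rather than FAILED when its preconditions are unmet, overnight runs do not flood the defect log with environment noise, which addresses the failure-analysis bullet above.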



Case Study 3: Multi-Solution Strategy

Background for the Case Study
• A leading electronic design automation company had products comprised of over 120 programs. The software was GUI-based and ported to all leading industry workstation hardware platforms

Current Testing Activities
• Three distinct test phases: Integration, Evaluation (done by the end user) and System
• Integration and System testing were done manually on every platform and consumed 38 person-weeks of effort for every software release on every platform

Proposed Solution
• Purchase a solution or build internally
• Hire an outside consultant to validate the tool selection prior to purchase, then train the testing staff
• DECISION: build an in-house tool

First trial: Integration test automation
• Completed in under a year
• Manual: 2 person-weeks; Automated: 1 person-week, though not everything could be automated
• Without automation, the same testing took 10 person-weeks of manual effort

System test automation
• Automated in two phases: breadth and depth
• Manual: 8 person-weeks per platform; Automated: 4 person-weeks total
• Total tests automated was approximately 79%

Key Benefits
• Test effort per platform cut in half
• Consistent regression testing
• Less dependence on system SMEs
• Better use of testing resources

Software Test Automation: Effective Use of Test Execution Tools – Fewster & Graham
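A back-of-envelope reading of the figures above shows how a break-even point can be estimated. The one-time build cost of 60 person-weeks is an invented assumption for illustration; only the 38 person-weeks and the "cut in half" result come from the case study.

```python
# Break-even estimate from the case-study figures. The 60 person-week
# build cost is an invented assumption for illustration only.
manual_effort = 38      # person-weeks per release per platform (slide figure)
automated_effort = 19   # roughly half, per the "cut in half" benefit
build_cost = 60         # hypothetical one-time automation investment

savings_per_release = manual_effort - automated_effort      # 19
releases_to_break_even = build_cost / savings_per_release   # ~3.2
print(f"Automation pays off after about {releases_to_break_even:.1f} releases")
```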



Implementation corrections
Changing the approach to put automation into practice

Won't work on my application (a POC will prove the solution)
• Develop criteria for vendors to prove their solution works
• Standardize the object naming structure with Development; this is very important (see the sketch below)
• Research fully integrated suites; vendor management will thank you!

Automate my AUT if it changes? (understand the object landscape)
• Find a solution that does not require re-recording when new controls are added
• Consider starting your testing with automated tests
• Test automation should be a key element in sprint planning

Complex for manual testers (develop a resource plan for all)
• Schedule just-in-time training shortly after procurement
• Lay out the resource plan for how every tester contributes
• Testing management can build the IT cross-functional support plan



Case Study 4: Automating a Government System

Background for the Case Study
• Speed up Department of Defense (DoD) software delivery. Leading-edge technology was in use, but lengthy testing and certification processes had to complete before software could deploy

Requirements for the solution:

Can't Be Intrusive to the System Under Test (SUT)
• The tool has to be independent of the SUT; no modules should be added

Must Be OS Independent
• The solution(s) had to span Windows, Linux, Solaris, Mac, etc.

Must Be Independent of the GUI
• The tool needed to support all languages that the applications were written in

Must Automate Tests for Both Display & Non-Display Interfaces
• The system should be able to handle operations through the GUI and backend processes

Must Work in a Networked Multicomputer Environment
• The tool needs to test a system of systems: multiple servers, monitors and displays interconnected to form one SUT

Non-Developers Should Be Able to Use the Tool
• One of the main requirements from the DoD. Testers were SMEs in the application, not software developers writing scripts. A code-free solution was needed

Must Support an Automated Requirements Traceability Matrix for Test Case & Test Data Management
• Needed a framework that could allow for test management activities, including documenting a test case in the Automated Test and Re-Test (ATRT) solution, along with a hierarchical breakdown of test cases into test steps and sub-steps

Test Automation ROI
• Solutions selected spanned across technologies
• Reduced test cycles from 5 testers to 1
• The testing department worked on automated test cases, both creation and execution
• Additional processes were now enabled

Experiences of Test Automation: Case Studies of Software Test Automation – Graham & Fewster
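The final requirement, a traceability matrix with test cases broken into steps and sub-steps, maps naturally onto a small data structure. This sketch shows only the shape; the dataclass names and requirement IDs are invented and are not ATRT's actual schema.

```python
# Sketch of a requirements-traceability structure: each test case
# links to requirement IDs and decomposes into steps and sub-steps.
# Field names and IDs are invented; this is not ATRT's real model.
from dataclasses import dataclass, field

@dataclass
class Step:
    action: str
    substeps: list["Step"] = field(default_factory=list)

@dataclass
class TestCase:
    name: str
    requirement_ids: list[str]
    steps: list[Step]

tc = TestCase(
    name="Send track data to display",
    requirement_ids=["REQ-042", "REQ-107"],
    steps=[Step("Start SUT services"),
           Step("Send track message",
                substeps=[Step("Verify GUI plot"),
                          Step("Verify backend log")])],
)

# Matrix view: which requirements does each test case cover?
matrix = {tc.name: tc.requirement_ids}
print(matrix)
```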



Session recap



Presentation Recap
Reviewing the main points of the presentation

The automation industry focused on desktop applications initially, but must now support multi-technology stacks across many platforms

The primary skillset for automation was programming, but solutions are now available that offer code-free test automation

Common perceptions and myths will persist; review the list of questions to be asked to drive your strategy

Present the real-life case studies to management and ensure the same failures do not occur during your own research and implementation

Review the common failures in theory and in implementation; assess whether any are problematic in your organization

Take corrective action for each failure accordingly; incorporate test automation as part of your overall test strategy



Jim Trentadue – Software Quality Consulting Director

jtrentadue@origsoft.com

