The Pathologies of Failed Test Automation Projects.v11.STAREast 2013 External
Based, in part, on work done with Alon Linetzki, Best Testing (www.best-testing.com)
On the Menu…
• Automation failure patterns
• What can we do?
• Summary
[Cartoon: Mr. Otto Mate and mates knock out a quick tool (“Piece of cake!”). The manager, Mr. Mann A. Ger, is delighted (“Perfect! Can you add…”) and keeps adding requests, until the team is stuck firefighting (“Fix fix fix”) and the manager explodes: “Otto! The last release fails!”]
© Michael Stahl, 2013, all rights reserved
A common automation story…
[Cartoon: Mr. Otto Mate declares “Let’s REDESIGN!!!” — to the mates’ dismay (“#$&@***!!!”).]
A pattern emerges…
• Simple Tool → Enhanced Tool (Stage 2 – Generalization)
• → Complicated Tool (Stage 3 – Staffing)
• → Test Case Management (Stage 5 – Overload)
Pattern #2: The Competition
[Diagram: Team A and Team B each build a tool of their own; then Team C and Team D join in with tools of their own (“!!!”).]
Pattern #2 – The Competition (ver. 3)
[Diagram: Teams A, B, C and D each keep extending their own tool (“!!! …”).]
Pattern #3: The Night Run Fallacy
[Chart: available night time vs. actual test time]
Corporate Truism: It’s easier to get budget for machines than for more testers.
Pattern #3 – The Night Run Fallacy
“…multiple individuals… will ultimately deplete a shared limited resource… even when it is not in their long-term interest.”
— Garrett Hardin, “The Tragedy of the Commons”, Science, 1968
Robustness is Invisible
• Mushrooming
• The Competition
• The Night Run Fallacy
• Going for the Numbers
• The Magician’s Apprentice
Unavoidable?
Map: Start right and avoid the wrong turns
Alerts:
• Additional features?
• Multiple users?
• Automation web site?
• >25% of the tester’s time?
• Key words: “Use by other testers”, “Common Libraries”
Stage 1 measures:
• Strategy
• Architecture
• Lightweight PM
• Version control
• Scope control
• Bugs & Requests database
Metrics suggestions:
• Automation framework quality: number of false fails; framework’s test results; bug trends
• ROI: number of runs; invested effort by type (new, maintenance, rewrite); number of bugs found by automation (?)
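These metrics can be computed from a plain run log and effort sheet. A minimal Python sketch; the record layout and field names are illustrative assumptions, not from the talk:

```python
# Hypothetical run records; field names are illustrative, not from the slides.
runs = [
    {"test": "login",    "result": "fail", "false_fail": True},
    {"test": "login",    "result": "pass", "false_fail": False},
    {"test": "checkout", "result": "fail", "false_fail": False},
    {"test": "search",   "result": "pass", "false_fail": False},
]

# Invested effort in hours, by type (new, maintenance, rewrite).
effort = {"new": 40, "maintenance": 25, "rewrite": 10}

def false_fail_rate(runs):
    """Fraction of failing runs that were false fails (framework quality)."""
    fails = [r for r in runs if r["result"] == "fail"]
    if not fails:
        return 0.0
    return sum(r["false_fail"] for r in fails) / len(fails)

def effort_share(effort):
    """Share of total effort by type; a growing maintenance share is an alert."""
    total = sum(effort.values())
    return {kind: hours / total for kind, hours in effort.items()}

print(f"false-fail rate: {false_fail_rate(runs):.0%}")   # half the fails are noise
print(f"runs: {len(runs)}, effort share: {effort_share(effort)}")
```

Tracking the false-fail rate over time is the cheapest way to make robustness visible: it quantifies how much test-log wading the framework causes.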
Alerts:
• Generic features?
• Test log wading?
• False fails?
• Key words: “test suite / cycle generation”, “robustness enhancement”, “setup issues”
Build vs. Buy? (hint: Buy)
See: http://www.stickyminds.com/s.asp?F=S17601_COL_2
Loss of credibility
Options:
• Continue…
• Give up problematic areas
• Partial return to Stage 1 (“We value Robustness over New Features”)
• Prepare for re-design – with a new map…
How to use this information?
• Map: Start right and avoid the wrong turns; plan your trip
• GPS: Locate the stage you are at; get directions for the way out
• Identify your stage; analyze your situation
• Be alert for Alerts; implement Counter-Measures
Salvageable up to Stage 2.
Stage 3 and up:
• Merge the teams
• EOL both tools
• Start a 3rd, with a plugin architecture
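A plugin architecture can be as small as a registry the core tool consults, so each team extends the shared third tool instead of forking it. A minimal Python sketch of that registry pattern; all names here are illustrative assumptions, not from the talk:

```python
# Plugin registry: teams contribute suites as plugins instead of forking
# the core tool. All names are illustrative.
PLUGINS = {}

def plugin(name):
    """Decorator that registers a suite-producing callable under a name."""
    def register(func):
        PLUGINS[name] = func
        return func
    return register

@plugin("smoke")          # e.g. contributed by Team A
def smoke_suite():
    return ["boot", "login"]

@plugin("regression")     # e.g. contributed by Team B
def regression_suite():
    return ["boot", "login", "checkout", "logout"]

def run(name):
    """Core tool: look up a suite by name and run its steps."""
    steps = PLUGINS[name]()
    return [f"ran {step}" for step in steps]

print(run("smoke"))
```

The core tool never changes when a team adds a suite; new plugins only register themselves, which is what keeps the merged tool from mushrooming again.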
http://www.youtube.com/watch?v=zlnz1rSJj7Y
http://www.youtube.com/watch?v=qDVYIT85ntg
• Mushrooming
• The Competition
• The Night Run Fallacy
• Going for the Numbers
• The Magician’s Apprentice
Question time…
michael.stahl@intel.com
www.testprincipia.com