Test Automation Frameworks
by Carl Nagle
1.1 Thinking Past "The Project"
    1.1.1 Problems With Test Automation
          A Case Study
    1.1.2 Some Test Strategy Guidelines
    1.1.3 Test Automation is NOT a Sideline
    1.1.4 Test Design and the Test Framework are separate entities
    1.1.5 Test Framework should be Application-Independent
    1.1.6 Test Framework must be easy to expand, maintain, and perpetuate
    1.1.7 Test Strategy/Design Vocabulary should be Framework-Independent
    1.1.8 Test Strategy/Design should remove testers from the complexities of the Test Framework
1.2 Data Driven Automation Frameworks
    1.2.1 Data Driven Scripts
    1.2.2 Keyword/Table Driven Automation
    1.2.3 Hybrid Test Automation
    1.2.4 Commercial Keyword Driven Frameworks
1.3 Keyword Driven Automation Framework Model
    1.3.1 Project Guidelines
    1.3.2 Code and Documentation Standards
    1.3.3 Our Automation Framework
    1.3.4 The Application Map
    1.3.5 Component Functions
    1.3.6 The Test Tables
    1.3.7 Core Data Driven Engine (DDE)
    1.3.8 Support Libraries
1.4 Automation Framework Workflow
    1.4.1 High-Level Test Cycle Tables
    1.4.2 Intermediate-Level Test Suite Tables
    1.4.3 Application Mapping
    1.4.4 Low-Level Test Step Tables
1.5 Summary
Bibliography
"When developing our test strategy, we must minimize the impact caused by changes in
the applications we are testing, and changes in the tools we use to test them."
--Carl J. Nagle
1.1 Thinking Past "The Project"

In today's environment of plummeting cycle times, test automation becomes an
increasingly critical and strategic necessity. Assuming the level of testing in the past was
sufficient (which is rarely the case), how do we keep up with the explosive pace of
web-enabled deployment while retaining satisfactory test coverage and reducing risk?
The answer is either more people for manual testing, or a greater level of test
automation. After all, a reduction in project cycle times generally means a reduction in
the time available for testing.

With the demand for rapidly developed and deployed web clients, test automation is
even more crucial. Add to this the cold, hard reality that we often face more than one
active project at a time. For example, perhaps the team is finishing up Version 1.0,
adding needed new features to Version 1.1, and prototyping new technologies for
Version 2.0!

Better still, maybe our test team is actually a pool of testers supporting many diverse
applications completely unrelated to each other. If each project implements a unique test
strategy, then testers moving among different projects can be more of a hindrance than a
help. The time needed for a tester to become productive in the new environment may
simply not be there, and the ramp-up surely detracts from the productivity of those
bringing the new tester up to speed.

To handle this chaos we have to think past the project. We cannot afford to engineer or
reengineer automation frameworks for each and every new application that comes along.
We must strive to develop a single framework that will grow and continuously improve
with each application and every diverse project that challenges us. We will see the
advantages and disadvantages of the different approaches in Section 1.2.
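The single-framework idea above — one reusable engine that test designers drive with simple table data rather than tool-specific scripts — can be sketched in a few lines. This is a hypothetical illustration only: the function names, the keyword set, and the table layout are invented for this example and do not come from any particular commercial tool.

```python
# Minimal sketch of an application-independent, keyword-driven test engine.
# Component functions encapsulate all tool- and GUI-specific logic:
def click(component, _param=None):
    return f"clicked {component}"

def input_text(component, value):
    return f"typed '{value}' into {component}"

# The engine maps action keywords to those reusable component functions,
# so test designers write only data rows, never automation-tool script code.
KEYWORDS = {
    "Click": click,
    "InputText": input_text,
}

def run_table(rows):
    """Execute a list of (component, action, parameter) test-step records."""
    results = []
    for component, action, parameter in rows:
        handler = KEYWORDS[action]      # dispatch on the action keyword
        results.append(handler(component, parameter))
    return results

# A test-step table for a hypothetical login window:
login_steps = [
    ("UserIDField", "InputText", "carl"),
    ("PasswordField", "InputText", "secret"),
    ("OKButton", "Click", None),
]
```

Because the engine and component functions know nothing about any one application, the same code serves every project; only the tables change from application to application.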

1.1.1 Problems with Test Automation

Historically, test automation has not met with the level of success that it could. Time and
again, test automation efforts are born, stumble, and die. Most often this is the result of
misconceived perceptions of the effort and resources necessary to implement a
successful, long-lasting automation framework. Why is this? There are several reasons.

Foremost among these is that automation tool vendors do not give completely forthright
demonstrations when showcasing the "simplicity" of their tools. We have seen the
vendors' sample applications. We have seen the tools play nice with those applications.
And we try to get the tools to play nice with our applications just as fluently. Invariably,
project after project, we do not achieve the same level of success.

This usually boils down to the fact that our applications often contain elements that are
not compatible with the tools we use to test them. Consequently, we must mastermind
technically creative solutions to make these automation tools work with our
applications. Yet this is rarely mentioned in the literature or the sales pitch.

The commercial automation tools have been chiefly marketed as solutions for testing a
single application. They should instead be viewed as the cornerstone of an enterprise-
wide test automation framework. And while virtually all of the automation tools contain
some scripting language that lets us work around each tool's failings, testers have
typically neither had the development experience nor received the training necessary to
exploit these programming environments.

"For the most part, testers have been testers, not programmers. Consequently, the
'simple' commercial solutions have been far too complex to implement and maintain;
and they become shelfware."

Most unfortunate of all, otherwise fully capable testers are seldom given the time
required to gain the appropriate software development skills. For the most part, testers
have been testers, not programmers. Consequently, the "simple" commercial solutions
have been far too complex to implement and maintain; and they become shelfware.

Test automation must be approached as a full-blown software development effort in its
own right. Without that commitment, it is most likely destined to fail in the long term.
Case Study: Costly Automation Failures

In 1996, one large corporation set out to evaluate the various commercial automation
tools available at the time. They brought in eager technical sales staff from the various
vendors, watched demonstrations, and performed fairly thorough internal evaluations of
each tool.

By 1998, they had chosen one particular vendor and placed an initial order for over
$250,000 worth of product licensing, maintenance contracts, and onsite training. The
