
Testing a Web Application

Harry M. Sneed
Ancon GmbH, Vienna, Austria & Budapest, Hungary
Email: h.sneed@axelero.hu

Abstract: The following paper describes the requirements and constraints of testing a web application system for the Austrian Chamber of Commerce. It separates the testing task into two parts – one for testing the web architecture and one for testing the internet application itself. The high costs of testing an internet application are driven not only by the complexity of the architecture, but also by the complex usage profiles of the application. Non-functional quality characteristics such as usability, performance, security and interoperability take on particular importance in web-based systems. Without a staff of skilled testers equipped with automated tools, it is not possible to ensure the reliability of e-commerce applications. The evolution of such systems requires that the evolved application be tested against the old one. In the project referred to here, a comparison tool was developed to validate both the backend database contents and the frontend web page contents.

Keywords: Web Applications, Evolution, Regression Testing, Assertions, File Comparison

1. The high costs of testing a web application

The more complex an environment is, the greater the effort required to demonstrate the reliability of a software system operating in that environment. Testing has always been difficult and time consuming, even in the days of stable, closed, monolithic mainframe environments. Considering the fact that the operating system was almost totally predictable, that the teleprocessing monitor and database systems were equally reliable and that security was no real issue, it still required up to 30% of the total project effort to test new applications. [1] The test research community has always insisted that testing is underestimated and that it requires equally as much effort to test a function as it does to program that function. [2] This applies to projects where the operating environment is assumed to be correct and not part of the testing problem. It turns out to be an understatement when, as is the case with the web, the environment becomes a part of the testing problem.

This was already the case to a certain extent with distributed systems. The middleware which connects clients and servers became a part of the test domain. On top of that, the user application took over many of the service functions, such as memory allocation and error handling, which were once handled by the operating system or the teleprocessing monitor. Not only that, the number of potential interactions between dispersed functions was doubled, as were their interdependencies. This led Boris Beizer to claim that the testing of object-oriented software in a client/server architecture could cost three times more than the testing of conventional software in a mainframe environment. [3] This claim may be exaggerated, but the fact is that the relation of test effort to development effort has changed significantly with the advent of client/server systems, so that testing may require up to 50% of the total project effort, according to P. Goglia. [4]

The emergence of e-business applications in the internet has added a further dimension of complexity to the testing problem. Not only is the user software distributed between clients and servers, but now there are many potential servers. A client request may be processed by several alternative servers, which may, in turn, route the request to other servers further down the line. If the application provider is cautious, he will also be employing at least two web servers to route his transactions. The operating environment becomes a complex network of client, server and router nodes, exploding the number of interactions between the application software and the support software. Assuming that testing effort is driven by the complexity of the environment, testing web-based applications could be even more expensive than testing client/server systems, which are already more expensive to test than conventional stand-alone systems. The applications themselves may be simple, but the environment in which they operate is not, and that environment is, by nature, part of the testing problem. [5]

That being the case, the significance of the architecture test increases when testing applications on the web. Clients are connected to web servers, which are connected to application servers, which are connected, in turn, to function servers and data servers. Every connection is handled by another kind of middleware. Data can also be passed in various forms such as HTML or XML forms, by means of parameters in a CORBA or DCOM interface, or as a combination of both. [6] Each potential connection and the corresponding interface causes a unique set of circumstances which has to be tested in all its variations.

As a result of the many interactions between remote nodes, there are also more exception conditions which can be triggered. No test is complete without raising all possible exceptions at least once, if not twice. In this respect the rollback, restart and recovery functions of a system have to be tested, as well as the multithreading features, not to mention such conventional database functions as locking, committing and guaranteeing the integrity of the data content.

Another special aspect of the web architecture is that of net security. As opposed to closed systems running in a restricted environment, internet applications run in an open environment where they are exposed to intruders. If they are not protected by means of firewalls or other security features, intruders may break in and intercept messages, access databases or even destroy application programs. These aspects only cover the perimeter security. In reality, many of the security exploits come from outside the perimeter. For example, a buffer overflow will not be prevented by inward-directed measures. Most of the attacks will not be detected by a firewall. Many other forms of attack are based on the principle that the client side is not a controlled environment, and this is where the application producer has the biggest risks. In light of the fragility of the web, security becomes a major issue. [7] There are software techniques available to circumvent unwanted access, but these techniques have to be retested for every environment. This again adds to the total test effort.

As a consequence of the many dependencies in the web, more attention must be given to testing the application software. This additional expense is increased further by the costs of testing for novice, unpredictable users. Business-to-business e-commerce systems have more complex user interactions, more objects to deal with and more business rules to process, but here it is possible to train the users and to oblige them to work within given constraints. As employees, their behaviour can be controlled. They can even be held accountable for irrational actions. In dealing with customers, this is not the case. Customers are kings and can do what they want. Therefore, anything they might want to do has to be tested. This makes the costs of testing business-to-customer applications much greater than those of testing business-to-business systems. In fact, this aspect is critical for the success of such applications. [8]

In the end, web testing costs are determined by the size and complexity of the application, plus the size and complexity of the architecture, plus the complexity of the user interface and the degree of user freedom, relative to the productivity of the testers. The latter is driven by the knowledge, experience, skill level, motivation and, above all, the tools of the testers. Without adequate tools they cannot do their jobs.

2. A Dual Strategy for Testing Web Applications

Testing on the web involves two distinct goals. One is to test the compatibility of the web application with the technical environment. The other is to test the functionality of the web application itself. The effort required to test the functionality of an application is driven by the size and complexity of the application, independent of the environment it is running in. That effort is a factor of the number of data attributes processed (arguments, results and predicates), the number of dependencies between interacting parts (inheritance, collaboration, association, etc.), and the number of decision nodes and branches (selections, iterations, exceptions, etc.). The effort required to test the compatibility of an application with its environment is, on the other hand, driven by the complexity of the environment, independent of the application. It is a factor of the number of interfaces and dependencies between the application and the environment. [9]

In the case of a web application this can be very high, so that even trivial applications with little programming effort require a significant testing effort relative to applications on a host or in classical client/server systems. Web-based systems are highly intertwined with the environment in which they operate, i.e. they have a high number of interdependencies with that environment, and it is this high degree of dependency which makes them so expensive to test. It is mainly for this reason that web applications cost two to three times more to develop than conventional systems. This experience was borne out by the project concerned here. The costs of developing the new web application were assumed to be 50% less than the costs of developing the old mainframe CICS application, or 8 person years as opposed to 16 person years for the old application. As of now, with the new application still evolving, the costs have exceeded 25 person years, or 50% more than what the older mainframe application cost. Most of the cost overrun is due to the additional test effort required.

Managers of web site development projects should be aware of this fact. Using a sophisticated development environment like DotNet or Java Beans Enterprise Edition with a ready-made framework and many standard components may enable one to quickly produce a running system, but it will still require a much longer period of testing before that system becomes a deliverable product, especially if the non-functional requirements such as security, reliability and efficiency are high.

It could even turn out that the first solution will never be accepted and that they will have to start over again. In any case, testing will prove to be a long and tedious task with lots of unpleasant surprises, something which has to be planned for from the beginning.

The fact that many standard components are used in internet applications might save the effort required to produce them, but it will not save the effort required to test them. Kaiser and Perry have demonstrated in a landmark research paper on object-oriented testing that one cannot assume that inherited or reused classes will perform correctly in every context, since they are all in some way context dependent. [10] The same applies to components. They too must be validated in each and every context in which they are used. Referring to a standard reusable method implies that each reference has to be tested. If a reference is embedded in a nested control structure, testing it entails fulfilling all of the conditions leading up to it. This may sound easy, but if the condition predicates are parameters or return values from other remote methods, it can become extremely tedious. The literature on object-oriented testing abounds with proofs of the need to retest reused software. [11] Since web architectures offer a wide range of reusable components, these insights are particularly relevant to web-based applications.

3. Testing a Web Architecture

The architecture of the project in question is DotNet. The programming language is C#. XSL style sheets are used to depict the web pages. These are then converted via XSLT to XML documents, which are forwarded to the application server for processing against the SQL Server database. One could say that this system is a complex structure with many elements and even more interrelationships between them. There are client components, server components and database access components. There are many standard components used, including the MS-Explorer web browser, SOAP as a component connector, XSLT as an interface transformer and the ODBC database access components. There is also the MS-Transaction Monitor. All of these components and their interactions with one another had to be tested prior to starting the actual application development, since there has to be a technical proof of concept and the burden of proof is on the side of the user.

It is true that a web architecture provides a wide range of useful functions, everything from search engines to interface interpreters. One might expect from the architecture that
  long transactions are managed, including rollback, recovery and restart,
  exceptions are intercepted and handled,
  events are caught and reported,
  messages are transferred and converted,
  data is stored, indexed and retrieved,
  memory is allocated and deallocated,
  intermediate states are logged,
  unauthorized entry is recognized and thwarted. [12]

These are only some of the many functions required to run a web application. It cannot be assumed that they will all function as one expects. They have to be tested and validated in the target environment. That entails sending messages, causing events, triggering exceptions, overloading memory, disrupting transactions and attempting unauthorized entry, all of them cases which only experienced testers would think of. Testing the architecture demands an insider's knowledge of the system as a whole. Testers have to plan their test well, design a test concept and specify all of the special cases they intend to test, placing particular emphasis on extreme values and boundary conditions.

Testing a web application architecture can be done from the outside in, over the user interface, i.e. the web page, or from the inside out, starting with the data server. If the outside-in approach is followed, it is necessary to simulate the servers via test stubs. If the inside-out approach is taken, it is necessary to simulate the clients via test drivers. These two basic approaches are definitely not new and are certainly not unique to the web. They have been used both to test host online applications and to test client/server applications. [13] What is new are the techniques for implementing them. In a web environment, drivers must generate messages in a standard web format such as HTML or XML and forward them to the object under test via a message routing service. Stubs must, on the other hand, intercept messages, i.e. prevent them from going to the message router, and generate an artificial response in the form of standard communication protocols. Doing this requires a good tool and expert knowledge of web interfaces. In the project described here, the tool Empirix was used to generate messages to be sent over the web. [14]

Another important aspect of testing the web architecture is the ability to generate extreme transaction loads which will push the system to its limits. The system in question is a system for collecting corporate taxes from companies. Thus, millions of requests have to be generated in a limited period of time. For this purpose special tools are needed which can duplicate and mutate web messages. Empirix was used for this purpose as well, but the data produced was not really representative of the actual production load, probably because the tool was never intended for this purpose.
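To make the driver approach concrete, the following sketch shows the kind of logic such a driver performs: it posts an XML message to the server under test and stores the reply for later comparison. It is only an illustration in the C# of the project; the URL, message content and file names are hypothetical, and the Empirix tool used here works differently and at a much larger scale.

using System;
using System.IO;
using System.Net;
using System.Text;

class XmlTestDriver
{
    // Posts one XML message to the server under test and returns its reply.
    static string SendRequest(string url, string xmlMessage)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.ContentType = "text/xml";
        byte[] body = Encoding.UTF8.GetBytes(xmlMessage);
        request.ContentLength = body.Length;
        using (Stream requestStream = request.GetRequestStream())
            requestStream.Write(body, 0, body.Length);
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            return reader.ReadToEnd();
    }

    static void Main()
    {
        // A load test would loop over thousands of duplicated and mutated messages.
        string reply = SendRequest("http://testserver/charges",
            "<Mitglied><MitgliedsNummer>4711</MitgliedsNummer></Mitglied>");
        File.WriteAllText("reply.xml", reply);   // kept for later result comparison
    }
}

A stub works the other way around: it accepts such a request in place of the real server and returns a canned response in the expected protocol.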

Finally, the transaction management ability of the web architecture has to be tested with selected long transactions, which cause transactions to be raised and then rolled back. Data updates which have occurred up to that point have to be annulled, transient objects deleted, persistent objects restored and allocated memory freed. If there are several servers involved in handling one transaction, as was the case here, this will prove to be difficult. Yet since it will undoubtedly come up in production, it had to be tried.

In testing the web architecture there will always be a certain degree of non-determinism to deal with. This has to do with the evolving nature of the components and the user's lack of control over them. One can never be sure that the browser one tested yesterday is the one one is using today. It might be the responsibility of the service providers to sustain the same interface, but a user can never be sure he is using it correctly. Interactions may appear to be working when in fact they are only a coincidence. The moment an error is corrected in the server, the interface to that server may no longer respond in the same manner. Then one is left wondering why it ever functioned in the first place.

Web testers have to be aware of these architectural inconsistencies and test to ensure that the client software is prepared to handle rejected requests. It needs to be checked that it will react accordingly to any potential failure. Nowhere is Murphy's law more true than in the web. [15]

4. Testing a Web Application

The web application in question here was conceived to interact both with accountants at the various Chamber of Commerce offices at the state level – there are nine states in Austria, each with its own set of business rules – and with the companies and persons being charged. Companies and self-employed persons liable for taxation must fill out web forms giving financial data on the status of their company – the number of employees, the fixed costs, investments, income, profit, etc. This information is checked by the Chamber of Commerce state offices, which then enhance it and send out notices requiring the companies to pay a given corporate tax. If they do not pay by a certain deadline, up to five reminders are sent, after which their data is forwarded to the courts for filing a claim.

Due to the complexity of the system, it could not be developed all at one time. Besides, most of the users had never worked on the internet, so they had no idea of what they wanted. They first had to be shown working web pages before they could make up their minds as to how they wanted them. The project turned out to be a long learning process, both for the users and the developers. Here one can really speak of evolutionary development. The web site of the Chamber of Commerce has slowly evolved to become what it is now, and it is still evolving at an average rate of 18 new requirements per month. This amounts to a change rate of about 3% per month.

4.1 Validating the XML Outputs

This presented a problem to the testers of the web site. Since new versions of the software were being released every month, there had to be a way to repeat previous tests with a minimum of effort. The solution chosen was to record the dialog tests with a capture/replay tool, Empirix, and to play them back. However, to check the results another method was used. The XML documents produced by the server, which were to be converted by the XSL stylesheets and then displayed to the user, were intercepted upon leaving the server and stored in a sequential file. Afterwards, the XML file generated by the test of the current version is compared with that produced by the previous version.

Figure 1: Comparing XML Output Files
(The figure shows the reports of the old GU system being converted by REPTOXML into XML files; XMLComp compares these with the XML files produced by the test of the new GU system and writes a comparison protocol listing missing objects, matching objects and non-matching results.)

The tool for making the comparison does so at the element level, since the XML structures may vary. First the old documents are read in and stored in a direct-access XML database. Then the new file is read and processed document by document. From the new document the key is taken to access the old document. If it is not found, then the old document is considered to be missing. If it is found, the elements marked for comparison are compared. Those which do not match are reported. The old documents that are found are marked, so that at the end of the run the old file is read and all unmarked old documents are reported as missing in the new file. (see Example 1)

A test is always a test against something else. To determine whether data is correct or not, the user must formulate assertions about the relationship between one set of data and another.

One such relationship is that a record with a certain key also exists in the other file with the same key. Another relationship is that, if the keys match, then certain fields, or attributes, must also match. Thus, a simple first-order predicate logic is required to describe the relationships in a script file. This file has to be read and checked. If the statements are not syntactically correct, the execution of this particular file comparison is terminated. If they are correct, the assertions with keys and compare values are stored in a table to be interpreted at compare time. The assertion interpretation is a preprocessor and a prerequisite to the file comparison. It is automatically invoked for each assertion procedure selected.

Assertions, as used here in this context, are claims about the relationship between data items. [16] One can claim that the value of an attribute in one object should be =, < or > the value of the same or another attribute in another object, e.g. assert Object_A.Attribute_1 = Object_B.Attribute_2. One can also claim that the attribute of a given object is equal to a constant, e.g. Object_C.Attribute_3 = "10". The values must be ASCII character strings. There is no comparison of internal numeric data types, i.e. integer, decimal, double or floating point, only character. It is assumed that the values of a database are converted by the SQL query to character format when they are extracted from the database. The assertion script is a text file. It must start with the name of the file to be compared, e.g.:

file: This_File;

The file name is preceded by the word 'file:' in lower case, followed by at least one blank and then the name of the file to be processed. This is done to ensure that the correct file is being compared. The file name must be the same as the name of the assertion script, since it is the name that connects the two.

After the file assignment declaration, it is optional to write when statements to replace the value of fields in the files to be compared. As in all assertion statements, the keywords are in lower case and are separated from the variables by spaces. All field names must be qualified by either new. or old. in order to identify to which of the files they belong. The when statement identifies the key field by value and replaces the value of that key field, or of another one, by a given value, e.g.

when ( new.Field1 = " " ) then replace new.Field2 by "GT ";
when ( old.Year = "00" ) then replace old.Year by "99";

There are some restrictions to this replacement statement. The field whose value is queried and the field to be replaced must both be in the same file, either in the old file or the new file. They may also be the same field. The replacement statement allows key values in either the old or the new file to be overridden so that the keys will match.

The statement which connects the old XML objects to the new ones is the if statement. The syntax is:

if ( <Condition> &
     <Condition> &
     <Condition> )

There can be up to 20 conditions. The conditions identify the keys. A condition is actually a comparison of a new key field with an old key field:

new.KeyField1 = old.KeyField1

The comparator can be =, > or <. The key field names must correspond exactly to the titles of the SQL table columns or the tags of the XML file. The field names need not be the same in both files; one can compare new.Field_B with old.Field_C. It is also possible to match substrings of the fields. In this case the beginning position and the length of the substring must be given in brackets, separated by a colon, immediately after the field name, e.g. Field_A[2:8] means the key begins with the second character of Field_A and is 8 characters long.

There is also an OR clause in the condition. By inserting the character '!' followed by a space and a constant, it is possible to say that a field in the new file must either match the field in the old file or match the constant, e.g.

if ( new.Field_A = old.Field_B ! "0" & ...

That means either the new Field_A is equal to the old Field_B or it has the value 0. The conditions are enclosed in parentheses. The closing parenthesis terminates the conditions.

For comparing XML files, the if statement may also refer to an object tag to identify which object is to be compared. This must be the first condition, e.g.

if ( object = "Object_1" & new.Attr_A = old.Attr_A )

When comparing XML files, one object type is compared at a time. This means that the file will be parsed as many times as there are object types in it. This rather inefficient approach was taken to avoid the problem of dealing with nested objects. The prerequisite is that each object has a unique identifier. In this way, even if objects are embedded at a deeper level inside another object, they will still be stored as individual objects in the database, albeit with a concatenated key which includes the key of the higher-level object, e.g.:

Object1_ID.Object12_ID.Object121_ID
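As an illustration of how such concatenated keys can be built, the following sketch walks an XML document and stores every nested object under the chain of its identifiers. It is only a sketch in the C# of the project, with hypothetical element and attribute names; the actual tool parses one object type per pass, as described above.

using System;
using System.Collections.Generic;
using System.Xml;

class XmlFlattener
{
    // Stores each object element under a concatenated key such as
    // "Object1_ID.Object12_ID.Object121_ID".
    static void Flatten(XmlElement parent, string parentKey,
                        IDictionary<string, XmlElement> store)
    {
        foreach (XmlNode child in parent.ChildNodes)
        {
            XmlElement obj = child as XmlElement;
            if (obj == null) continue;
            string id = obj.GetAttribute("id");          // unique identifier assumed
            string key = id.Length == 0 ? parentKey
                       : (parentKey.Length == 0 ? id : parentKey + "." + id);
            if (id.Length > 0) store[key] = obj;         // embedded objects stored individually
            Flatten(obj, key, store);                    // descend into nested objects
        }
    }

    static void Main()
    {
        XmlDocument doc = new XmlDocument();
        doc.Load("old_output.xml");                      // hypothetical file name
        Dictionary<string, XmlElement> objects = new Dictionary<string, XmlElement>();
        Flatten(doc.DocumentElement, "", objects);
        Console.WriteLine(objects.Count + " objects stored for comparison");
    }
}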

The assert statement is dependent on the if statement, i.e. it will only be executed when the if statement is true. Since the if statement is what actually links the new object to the old object, there can be no assert without an if:

if ( <Condition> & <Condition> )
assert <Comparison>,
       <Comparison>;

The assert statement begins with the keyword assert followed by a sequence of comparisons. A comparison compares two file attributes with each other, or an attribute with a constant, e.g.

assert new.Field_A = old.Field_A,
       new.Field_B = old.Field_C;

If other assertions are to follow, an assertion is ended with a comma. If it is the last assertion, it is ended by a semicolon. There can be any number of assertions in a sequence. The operator can be either =, >, >=, <= or <. Here too it is possible to compare substrings, e.g.

assert new.Field_1[3:5] <= old.Field_1[10:5]

The assertion script is terminated by an end; statement in the last line. This signals to the interpreter that the script file is ended. If there are several if statements, then each if statement is also terminated by an end;. This is necessary because in the case of XML there can be many objects in any one file.

The following script was used to test the charges file:

file: Vorschreibung;
if ( object = "Mitglied" &
     new.MitgliedsNummer = old.MitgliedsNummer )
assert new.Name[1:8] = old.Name[1:8],
       new.Datum[1:10] = old.Datum[1:10],
       new.RechnungsBetrag = old.RechnungsBetrag,
       new.Ocr_Zeile[1:20] = old.Ocr_Zeile[1:20],
       new.Kundendatenfeld = old.Kundendatenfeld;
end;
if ( object = "Forderung" &
     new.ForderungsId[1:4] = old.ForderungsId[1:4] )
assert new.Jahr = old.Jahr,
       new.Betrag = old.Betrag,
       new.Fachgruppe = old.Fachgruppe;
end;
end;

Example of an assertion script for an XML output

Comparing the XML messages returned from the server revealed some 72 errors in the output to the end user. This accounts for 14% of the total errors reported. Altogether some 520 errors have been registered in the error tracking system "SCARAB" as of now. Some 221 errors have been reported by the users, who are testing in parallel to the regular testers. That makes up 42% of the total errors reported. The other 58% of the errors have been reported by the testers. Of these 296 errors, 117 errors, or 45%, were revealed by the tools described in this paper. The remaining errors were found mainly in the dialogue test.

4.2 Validating the SQL Databases

The same tool used to validate the XML outputs is also used to validate the status of the SQL databases. In this case there are standard SQL select statements which extract the same views from both the previous and the latest database tables. These views are then stored as comma-separated value (CSV) files. It is important to note that this preselection is done so as not to compare the entire content of the database tables, but only those attributes which are considered to be crucial. The old CSV file is compared with the new CSV file using the assertions. Tuples of the database can be missing in the old database, missing in the new database, or not matching. (see Example 2)

The file comparison is made here in four steps. In the first step, the records of the base file, i.e. the old data, are read sequentially and stored in a random-access file. While reading, key values may be altered to match the new keys. For this purpose the replace statement is used. The keys are extracted and stored in an index table. The data fields of the old records are stored in the random-access file. In the second step, the index table is sorted so that the keys are in ascending order and can be accessed directly. In the third step, the file to be compared is read sequentially, and with the key of each record an access is made to the index table. Here too, key values may be altered to match the old keys. If the key is not found, not even by altering the keys, this record is assumed to be missing in the base file. If the key is found, the base record is retrieved from the random-access file and its selected contents are compared with the contents of the current record. Field values that do not match are documented in the compare report. In the fourth step, the index table is searched through for all keys which were not selected. Those that were not selected are reported as being missing in the compare file.

As with the XML files, the user gets a report on missing records and non-matching attributes, as well as some statistics as to the degree to which the two files coincide. This is equivalent to a correctness metric.
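A minimal sketch of this four-step comparison is shown below, assuming semicolon-separated CSV files whose first column is the key. The file names and field layout are hypothetical, and the real tool takes its keys, compare fields and replacement rules from the assertion script rather than hard-coding them.

using System;
using System.Collections.Generic;
using System.IO;

class TableComparator
{
    static void Main()
    {
        // Step 1: read the old (base) file and store its records by key.
        SortedDictionary<string, string[]> oldRecords = new SortedDictionary<string, string[]>();
        foreach (string line in File.ReadAllLines("old.csv"))
        {
            string[] fields = line.Split(';');
            oldRecords[fields[0]] = fields;                  // first column is the key
        }
        // Step 2 is implicit here: SortedDictionary keeps the keys in ascending order.

        // Step 3: read the new file, look up each key and compare the selected fields.
        HashSet<string> matched = new HashSet<string>();
        foreach (string line in File.ReadAllLines("new.csv"))
        {
            string[] newFields = line.Split(';');
            string key = newFields[0];
            string[] oldFields;
            if (!oldRecords.TryGetValue(key, out oldFields))
            {
                Console.WriteLine("RecKey:" + key + " missing from the old file");
                continue;
            }
            matched.Add(key);
            int n = Math.Min(oldFields.Length, newFields.Length);
            for (int i = 1; i < n; i++)
                if (oldFields[i] != newFields[i])
                    Console.WriteLine("RecKey:" + key + " field " + i +
                                      " new=" + newFields[i] + " old=" + oldFields[i]);
        }

        // Step 4: every old key that was never matched is missing from the new file.
        foreach (string key in oldRecords.Keys)
            if (!matched.Contains(key))
                Console.WriteLine("RecKey:" + key + " missing from the new file");
    }
}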

Comparing the database tables of the last release with the database tables of the new release proved to be an effective means of detecting erroneous data in the database. The database comparison revealed more than 45 serious errors, or 9% of the total number of errors reported. The severity of these errors was, however, much greater than that of the errors discovered in the user interface.

Figure 2: Comparing SQL Tables
(The figure shows the SQL database of the new GUN system test being extracted by SQL queries into CSV files whose fields map to the fields of the old database, while the VSAM files of the old GU system are converted by FileTran into CSV files; TabComp compares the two and writes a comparison protocol listing missing records, matching records and non-matching fields.)

5. Lessons learned

The main lesson learned from this project is that web site evolution is a test-driven activity. It is important to have a precise and practical test process and to enforce that process rigidly. The process must state explicitly the test steps to be taken and it should be supported by tools. The two key tools are
  a capture/replay tool and
  a comparison tool.

The capture/replay tool serves to record and play back the user interface test cases. The comparison tool should be able to compare both the system outputs and the system databases. Both tools have to be built into a well-defined test process. [17]

The managers of the project have complained about the fact that 42% of the errors have been reported by the users. They claim the testers should have found more. This may be true – there were only three testers, as compared to 11 testing users – but without the testing tools, even fewer errors would have been found by the testers. It is very difficult to quantify the value of errors found, so this will always be a bone of contention. As a tester, one is always left with the frustrating feeling that one could have done more.

Nonetheless, in the project described here it would have been impossible to retest every new release without the availability of the tools. Therefore, since web site evolution is a test-driven activity and the test is dependent upon the availability of tools, test tools are the key to a successful web evolution.

References:
[1] Poore, J. and Trammel, C.: "Bringing Respect to Testing through Statistical Science", American Programmer, Vol. 10, No. 8, August 1997, p. 15-22
[2] Gelperin, D. and Hetzel, W.: "The Growth of Software Testing", Comm. of the ACM, Vol. 31, No. 6, June 1988, p. 687-695
[3] Beizer, B.: "Testing Technology – The Growing Gap", American Programmer, Vol. 7, No. 4, April 1994, p. 9
[4] Goglia, P.: Testing Client/Server Applications, QED Publishing Group, Boston, 1993, p. 25
[5] Downs, T.: "Extensions to an Approach to the Modeling of Software Testing with some Performance Comparisons", IEEE Trans. on S.E., Vol. 12, No. 9, Sept. 1986, p. 979-987
[6] Bazzana, G./Fagnoni, E.: "Testing Web Applications", Proc. of ICSTEST – Int. Conf. on Software Testing, Software Quality Systems, Cologne, April 2000, p. 65-74
[7] McDonald, M.: "From manual Commerce to e-Commerce", Cutter IT Journal, Vol. 13, No. 4, April 2000, p. 12-24
[8] Becker, S./Berkemeyer, A./Zou, B.: "A Goal-Driven Approach to Assessing the Usability of an e-Commerce System", Cutter IT Journal, Vol. 13, No. 4, April 2000, p. 25-34
[9] Sneed, H.: "Testing Software for Internet Applications", Software Focus, Vol. 1, No. 1, Sept. 2000, p. 15-22
[10] Perry, D./Kaiser, G.: "Adequate Testing and object-oriented Programming", Vol. 5, No. 2, Jan. 1990, p. 13-19
[11] Kung, D./Hsia, P./Gao, J.: Testing object-oriented Software, IEEE Tutorial, Los Alamitos, CA, 1998, p. 89
[12] Nguyen, H.Q.: "Testing Web-based Applications", Software Testing & Quality Engineering, Vol. 2, No. 3, May 2000, p. 23-30
[13] Siegel, S.: Object-oriented Software Testing – A Hierarchical Approach, John Wiley & Sons, New York, 1996, p. 235
[14] Fewster, M./Graham, D.: Software Test Automation, Addison-Wesley, New York, 1999, p. 248
[15] Anderson, M.: "The top 13 Mistakes in Load Testing Web Applications", Software Testing & Quality Engineering, SQE Pub., Vol. 1, No. 5, Sept. 1999, p. 30-41
[16] Binder, R.: Testing Object-oriented Systems, Addison-Wesley Longman, Reading, Mass., 1999, p. 810-832
[17] Kaner, C., Falk, J., Nguyen, H.Q.: Testing Computer Software, Wiley & Sons, New York, 1999, p. 177

Samples:
+------------------------------------------------------------------------+
| Object: Report XML Comparison Report Date: 04.04.04 |
| Type : XML System: REPORTS |
| Key Fields of Record(new,old) |
+------------------------------------------------------------------------+
| Non-Matching Fields | Non-Matching Values |
+-----------------------------------+------------------------------------+
| RecKey:4711 | |
| New: Zip | 39508 |
| Old: Zip | 39507 |
+-----------------------------------+------------------------------------+
| RecKey:4711 | |
| New: Total_Costs | 364.25 |
| Old: Total_Costs | 363.75 |
+-----------------------------------+------------------------------------+
| Total Number of old Records checked: 10 |
| Total Number of new Records checked: 10 |
| Total Number of Fields checked: 15 |
| Total Number of non-Matching Fields: 10 |
| Percentage of matching Fields: 67 % |
| Percentage of matching Records: 100 % |
+------------------------------------------------------------------------+
Example 1: XML Comparison Report
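
For illustration, the element-level check behind a report like Example 1 could look roughly as follows. The object, key and element names are taken from the sample above but are otherwise hypothetical; the actual XMLComp tool reads them from the assertion script and also gathers the statistics shown in the report.

using System;
using System.Xml;

class XmlComparer
{
    // Compares one element of each new document with the old document that has
    // the same key and reports missing keys and non-matching values.
    static void Compare(XmlDocument oldDoc, XmlDocument newDoc,
                        string objectTag, string keyTag, string compareTag)
    {
        foreach (XmlNode newObj in newDoc.SelectNodes("//" + objectTag))
        {
            string key = newObj.SelectSingleNode(keyTag).InnerText;
            XmlNode oldObj = oldDoc.SelectSingleNode(
                "//" + objectTag + "[" + keyTag + "='" + key + "']");
            if (oldObj == null)
            {
                Console.WriteLine("RecKey:" + key + " missing from the old file");
                continue;
            }
            string newValue = newObj.SelectSingleNode(compareTag).InnerText;
            string oldValue = oldObj.SelectSingleNode(compareTag).InnerText;
            if (newValue != oldValue)
                Console.WriteLine("RecKey:" + key + "  New: " + compareTag + " = "
                                  + newValue + "  Old: " + compareTag + " = " + oldValue);
        }
    }

    static void Main()
    {
        XmlDocument oldDoc = new XmlDocument(); oldDoc.Load("reports_old.xml");
        XmlDocument newDoc = new XmlDocument(); newDoc.Load("reports_new.xml");
        Compare(oldDoc, newDoc, "Report", "RecKey", "Zip");   // hypothetical tags
    }
}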
+------------------------------------------------------------------------+
| Object: Bewegung File/Table Comparison Report Date: 21.06.04 |
| Type : CSV System: TEST |
| Key Fields of Record(new,old) |
+------------------------------------------------------------------------+
| New:MGNR BNR TYP JAHR |
| Old:MitgliedNr BerechtigungNr Type Jahr |
+---------------------------------+--------------------------------------+
| Non-Matching Fields | Non-Matching Values |
+---------------------------------+--------------------------------------+
| RecKey:238132 1 GU 0 | alternate key in new File/Table |
+---------------------------------+--------------------------------------+
| RecKey:254036 1 GU 2000 | missing from the old File/Table |
+---------------------------------+--------------------------------------+
| RecKey:238132 0 MK 2003 | |
| New: VS | 0000000000001.45 |
| Old: VS | 0000000000002.00 |
+---------------------------------+--------------------------------------+
| RecKey:238132 0 MK 2003 | |
| New: OP | 0000000000001.00 |
| Old: OP | 0000000000000.00 |
+---------------------------------+--------------------------------------+
| Total Number of old Records checked: 10 |
| Number of old Records found in new File: 02 |
| Number of old Records not in new Table: 07 |
| Total Number of new Records checked: 13 |
| Number of new Records found in old File: 03 |
| Number of new Records not in old File: 08 |
| Total Number of Fields checked: 09 |
| Total Number of non-Matching Fields: 02 |
| Percentage of matching Fields: 78 % |
| Percentage of matching Records: 38 % |
+------------------------------------------------------------------------+
Example 2: SQL Comparison Report
