
Quantitative Evaluation of Frameworks for Web Applications

Abstract:
This paper presents an empirical study of web applications built with software frameworks. The analysis is
based on two approaches. In the first, developers using such frameworks assign weights to parameters such as
database connection, based on their experience. In the second approach, a performance testing tool, OpenSTA, is
used to compute start time and other such measures. From this analysis, the authors conclude that open source
software is superior to proprietary software. The motivation of the research is to examine ways in which a
quantitative approach can be taken to evaluate software in general and frameworks in particular. The paper also
discusses concepts such as metrics and architectural styles, and reviews previously published research in this area.
Keywords: Metrics, Frameworks, Performance Testing, Web Applications, Open Source

INTRODUCTION:
Measurement is fundamental to any branch of engineering. Software engineering does provide various
techniques for this purpose. However, standards have been slow to emerge. The main reason for this is the rapid
evolution that has taken place in producing software. The building paradigm of yesteryear was based on
creating customized code for each application. We have now moved, at least partially, to the assembly paradigm.
We reuse existing components where available, and create new ones where they do not exist. The old measurement
techniques cannot therefore be readily applied to this new paradigm. Measurement takes place at various stages in
the software development life cycle. Terms such as harvesting time are used to denote this aspect of measurement.
Reuse of architectural styles (data flow, call & return, repository, layers, etc.) is common. In the new assembly
paradigm, partial applications called frameworks have become very popular in order to cut down development effort
as well as to improve quality. Open Source software is also widely used for the same reason. These are further
elaborated in this introduction. Subsequent sections describe metrics in more detail, discuss a particular tool,
openSTA, for performance testing, the empirical study conducted by the authors and its analysis, and conclusions.

1.1. Problem and Solution Oriented Metrics:


Requirements engineering precedes design, coding and testing of software. If measurements can be done at this
early stage, planning is greatly enhanced.
Albrecht proposed the Function Point in 1979 [1], and there is an International Function Point Users Group
(IFPUG) to promote metrics based on Function Points. IFPUG also certifies professionals to carry out this task. Use
Cases are used to decompose customer requirements. They too can be used to compute metrics at early stages of
software development. Since we harvest these metrics at the time of describing the problem, we call them problem
oriented metrics.
Lines of Code (LOC) has been popular since the inception of software metrics, when procedural programming
languages dominated software development. Halstead proposed Software Science metrics in 1977 [2] based on
LOC. The disadvantage with such metrics is the harvesting time: we have to wait until implementation is complete
to derive the numerical figures. Nevertheless, software companies use them to reflect on the past in order to project
the future in a more professional manner. Most software these days follows the Object Oriented (OO) paradigm.
Chidamber and Kemerer proposed metrics for OO in 1994 [3], and these are still widely followed. For calculating
complexity, they used the Cyclomatic Complexity proposed by McCabe in 1976 [4].

1.2 Architectural Styles and Frameworks:


Baseline architecture is an essential starting point for software development, once the requirements have
been established. The Call & Return architecture was appropriate in the days when mainframe computers and
procedural languages dominated the computing scene. With UNIX, a new style came to be used, namely the Data
Flow Architecture. Filters are small programs written in the 'C' language, and these are put together using pipes in
shell programming to make larger programs. Image processing applications typically use this architectural style.
Certain applications were dominated by a repository, and clients either retrieved or manipulated data in the
repository. OSI came out with the seven layer architecture, and soon such a layered approach became popular in web
applications. Typically, there is a back end database layer that interfaces with the application layer. The customer
uses a web browser (called a thin client) and is linked to the application layer via a web server layer. The essential
thing about these architectural styles is that they are abstractions. Measurement becomes tricky with such
abstractions. Frameworks on the other hand are concrete; they follow some architectural style and incorporate some
design patterns, and are customized through parameters or through hook methods. They allow for components to be
added or replaced. So the new software development paradigm is to use these frameworks to put together
components just like we assemble automobiles. Operating Systems, Database Management Systems, etc. can also be

1
thought of as frameworks. Microsoft uses Active X components and Distributed Component Object Model
(DCOM). It has also come out with Active Server Pages (ASP) framework for web applications.
The most fundamental component of CORBA is the Object Request Broker (ORB) whose task is to
facilitate communication between objects. Given an Interoperable Object Reference (IOR), the ORB is able to locate
target objects and transmit data to and from remote method invocations. The interface to a CORBA object is
specified using the CORBA Interface Definition Language (IDL). An IDL compiler translates the IDL definition
into an application programming language, such as C++, Java or Tcl/Tk, generating IDL stubs and skeletons that
respectively provide the framework for client-side and server-side proxy code. PHP is a popular scripting language
for web applications, and several frameworks such as JOOMLA, XAMPP, TYPO3 and MOODLE are built using
PHP for Content Management, Course Management, and a variety of applications. Measurement is facilitated by
writing PHP code to collect data such as the number of pages (static) or response time (dynamic).
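The static count mentioned above can be sketched in a few lines. The example below is written in Python rather than PHP purely for brevity, and the set of page extensions is an assumption for illustration:

```python
import os

def count_static_pages(root: str, extensions=(".html", ".php")) -> int:
    """Count files under `root` whose extension marks them as pages.

    The extension list is illustrative; a real measurement script would
    match whatever page types the framework actually serves.
    """
    total = 0
    for _dirpath, _dirnames, filenames in os.walk(root):
        total += sum(1 for name in filenames if name.endswith(extensions))
    return total
```

A dynamic measure such as response time would instead time a request/response round trip, which the same kind of script can record per page.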

1.3. Open Source:


For a variety of reasons, open source has penetrated every aspect of computing, and the trend is expected
to continue. Since source code is available for open source software, we can have glass box components, and these
can be extended by creating subclasses. We can also incorporate code in these components for measurement
purposes.
2. METRICS:
The Function Point (FP) metric uses the requirements document for computation. Inputs, outputs, inquiries,
interfaces and files are weighted based on their complexity. An adjustment factor based on reuse, distribution, etc.
is applied to the raw figure to arrive at the final numerical value for the software. As mentioned already, Use Cases
that describe the requirements now form the basis for problem oriented metrics.
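As an illustration of this computation, here is a minimal sketch using the standard IFPUG "average" complexity weights; a real count grades each item as simple, average or complex before weighting:

```python
# IFPUG "average" complexity weights per counted item type.
AVERAGE_WEIGHTS = {
    "inputs": 4,
    "outputs": 5,
    "inquiries": 4,
    "files": 10,       # internal logical files
    "interfaces": 7,   # external interface files
}

def unadjusted_function_points(counts: dict) -> int:
    """Sum each counted item multiplied by its complexity weight."""
    return sum(counts[kind] * weight for kind, weight in AVERAGE_WEIGHTS.items())

def adjusted_function_points(ufp: int, adjustment_factor: float) -> float:
    """Apply the value adjustment factor (0.65 to 1.35 in IFPUG practice)."""
    return ufp * adjustment_factor
```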
2.1. Object Oriented Metrics:
Chidamber and Kemerer (CK) came out with a set of 6 metrics, and they are:
• Weighted Methods per Class (WMC)
• Response for a Class (RFC)
• Lack of Cohesion in Methods (LCOM)
• Coupling between Object Classes (CBO)
• Depth of Inheritance Tree (DIT)
• Number of Children (NOC)
For weighting, CK suggest McCabe's [4] complexity calculation. There is also a proposal to grade relationships
between classes, such as association, dependency, aggregation, composition and generalization/specialization,
rather than treating them all equally. Since components are basically a set of collaborating classes together with a
set of interfaces to the component (which are themselves classes), extending CK metrics to compute values for
Component Based Software Development is quite straightforward. However, some authors feel that additional
metrics such as reuse, packing density and criticality are needed to complete the picture.
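Two of the CK metrics above can be sketched directly. This is a simplified illustration: WMC takes the per-method complexities as given (a real tool derives them from control-flow graphs), and the DIT sketch uses Python's own class hierarchy:

```python
def weighted_methods_per_class(method_complexities):
    """WMC: sum of the cyclomatic complexity of each method in a class."""
    return sum(method_complexities)

def depth_of_inheritance(cls) -> int:
    """DIT sketch for Python classes: longest path up to the root `object`."""
    if cls is object:
        return 0
    return 1 + max(depth_of_inheritance(base) for base in cls.__bases__)
```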

2.2 Component Based Metrics:

Many authors have argued that Component Based Software Development (CBSD) is not a simple extension of
Object Oriented Software Development, although some of the concepts can be used. Dolado [5] analyzed 46
projects and used Neural Networks for computing the figures. However, the technology used in software
applications at that time was fourth generation languages (Application Language Liberators), and Dolado used the
Mark II version of Function Point. Today's CBSD is too sophisticated for this approach; for example,
frameworks are used in conjunction with components. Cho and Kim [6] use a banking case study to show how we
can calculate static and dynamic complexities of components. They too use McCabe's Cyclomatic Complexity for
assigning weights. They assign priorities of 2, 4, 6, 8 and 10 for Dependency, Association, Generalization,
Aggregation and Composition respectively. They also propose new measures for customizability and reusability.
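The relationship weighting quoted above reduces to a simple weighted sum; the sketch below uses exactly those priorities, though Cho and Kim's full measure involves more than this:

```python
# Priorities Cho and Kim assign to class relationships (from the text above).
RELATIONSHIP_WEIGHTS = {
    "dependency": 2,
    "association": 4,
    "generalization": 6,
    "aggregation": 8,
    "composition": 10,
}

def static_complexity(relationships):
    """Sum the weight of each relationship in a component's design.

    `relationships` is an iterable of relationship-kind strings.
    """
    return sum(RELATIONSHIP_WEIGHTS[kind] for kind in relationships)
```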

2.3 Web Metrics:


Pioneering work, using empirical methods, has been done by Emilia Mendes [7] [8] [9] after analyzing
several web hypermedia projects. They use three techniques, namely Expert Judgment, Algorithmic Models and
Machine Learning. Essentially, we use the first technique and assign weights to pages, links, database connections,
multimedia, and so on. That is, we depend on experts to make a subjective assessment of the numeric values to be
assigned to various factors (as is the case with Albrecht's Function Point method). We presented this approach at
the Functional Sizing Summit 2007 [10]. This work was further extended to include frameworks and published in
IJWSP [11]. However, performance issues were not considered by us at that time.

3. PERFORMANCE TESTING:
The goal of performance testing is not to find bugs, but to eliminate bottlenecks and establish a baseline for
future regression testing. To conduct performance testing is to engage in a carefully controlled process of
measurement and analysis. Ideally, the software under test is already stable enough so that this process can proceed
smoothly.
Web applications depend on quick response to a visitor's request. Often, the software has to be loaded on
the server before a service can begin. Since there will be repeat requests for the same information, caching and
other techniques (e.g., a proxy) are used to speed up the process. Rarely do we find such information provided by
available frameworks. We therefore use tools to intercept requests so that we can obtain the relevant information.
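The caching idea can be sketched with a minimal in-memory cache; production servers use dedicated caches and proxies rather than anything this simple, so this is only an illustration of why repeat requests are cheap:

```python
class ResponseCache:
    """Serve repeat requests from memory instead of regenerating the page."""

    def __init__(self, loader):
        self._loader = loader   # function that produces a page body for a URL
        self._cache = {}
        self.hits = 0
        self.misses = 0

    def get(self, url):
        if url in self._cache:
            self.hits += 1       # repeat request: served without recomputation
        else:
            self.misses += 1     # first request: generate and remember the page
            self._cache[url] = self._loader(url)
        return self._cache[url]
```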

3.1. Tools for Performance Measurement:


Two screen shots of the tool OpenSTA are given in the appendix (Figures 4 and 5) to give a flavor of the
tool. OpenSTA is an open source tool for performance measurement. After developing our web applications, we
ran them and measured their performance using OpenSTA, and thus compared the various projects. The tool
creates virtual users to simulate a live environment that loads the server. OpenSTA has the following components:
• Test Commander – The central control application for testing using OpenSTA.
• Name Server – CORBA background process that lets OpenSTA components find each other and
communicate.
• Script Modeler – Application in which scripts are recorded and developed.
• HTTP Gateway – Proxy-like background process that performs the recording.
• Test Executer – Background process which actually executes the test.
• Web Relay Daemon – Uses XML-RPC to get over CORBA limitations on the internet.
• Repository – Where all test scripts, configurations and results are stored.
• Test Manager – Background process which manages the test run.
• Task Group Executer – Process which runs a task group.

4. MOODLE FRAMEWORK:
We chose this framework because it is very popular in educational institutions. It is an eLearning
framework, written in PHP, and it is open source software.
4.1. PHP (PHP: Hypertext Preprocessor):
Today, PHP competes with Perl for building high performance dynamic web sites. It is a server side
scripting language, and uses a parser to dynamically interpret scripts containing both HTML and PHP, as shown in
Appendix Fig. 1. The Zend engine enhances the performance of PHP based web sites. Software can be developed
using the Object Oriented paradigm, including SOAP. Exception handling enables developers to handle errors
during operation. Excellent support is provided for the MySQL database management system, as well as SQLite.

4.2. MOODLE:
MOODLE stands for Modular Object Oriented Dynamic Learning Environment. It is open source software
for course management. It has an excellent database organization, supported by the ADOdb library. The
components of Moodle are called 'activity modules', and these could usefully inform the expansion of eLearning
frameworks that are currently less activity-based. The popularity of Moodle is partly because of the way it draws
students in and partly because it is relatively easy to install and run, requiring no Java or .NET skills. Its
functionality has captured the imagination of many teachers and learners. With the code at our disposal, we can
develop new functions to match our needs and the needs of educational institutions. The database structure is
shown in Appendix Fig. 2, and the software architecture in Fig. 3. MOODLE can be hosted on several sites linked
to each other. It supports Linux, Windows and Mac OS X.
A software house specializing in this framework, Tenth Planet Technologies Limited, Chennai, generously
supplied us with data for our empirical analysis. We also had graduate students carry out projects in house, and we
made use of the URLs given in the references.

5. EMPIRICAL STUDY:
The study uses two approaches. In the first approach, we gave the same project to several developers. They
built the software, and quantified the effort involved. In the second approach, we ran the software with OpenSTA
intercepting the requests. Our conclusions are based on the outcome of these two approaches.
5.1 Web Metrics:
For each project, we asked the experts who developed it using ASP, PHP or MOODLE to give weighting to
the following factors:
• Platform Neutral
• Creating Record Set
• Database Connection
• Email Objects
• Cascading Style Sheet
• Content (multimedia)
• Scripting Language
• Audio and Video Files
We then computed the project size by multiplying the number of occurrences of each item by the weight for that
item and totaling them. The experts' view of the weights is given below.
[Chart: Metrics for Various Frameworks — expert weightage (scale 0–12) assigned to each of the above factors for ASP, PHP and MOODLE.]
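The size computation described in this section can be sketched as follows. The weights here are hypothetical placeholders for illustration, not the experts' actual figures from the chart:

```python
# Hypothetical expert weights for a few of the factors listed above.
EXPERT_WEIGHTS = {
    "database_connection": 8,
    "record_set": 6,
    "email_objects": 4,
    "css": 2,
    "multimedia_content": 5,
}

def project_size(counts, weights=EXPERT_WEIGHTS):
    """Size = sum over factors of (number of items x expert weight)."""
    return sum(counts[factor] * weights[factor] for factor in counts)
```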

5.2 Measuring LOC/FP metrics using COSMIC FP:


The Common Software Measurement International Consortium (COSMIC) has come out with a
measurement method for functional size based on some assumptions. Firstly, a layered architectural style is the
basis for component assembly; no component can straddle two layers. The metric is based on data movement, and
it ignores data manipulation. In addition to 'Entries' and 'Exits' of data to and from components, there are also
'Reads' and 'Writes' from and to persistent storage. We simply sum the Entries, Exits, Reads and Writes to arrive at
the size. An event-driven programming paradigm is assumed. An event triggering a functional process is counted as
an Entry, and may carry only one data group. If the input to a functional process comprises more than one data
group, each unique data group is identified as one Entry; likewise for Exits, Reads and Writes. Any message from a
functional process to the user retrieving data is not counted as an Exit. A requirement to delete a data group from
persistent storage is measured as a single Write.
We collected metrics data from one web application developed by students in both proprietary and
non-proprietary server programs. The sections below list the problem and solution oriented metrics and their
attributes for the proprietary and non-proprietary software.
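The COSMIC counting rule above reduces to a simple sum of data movements, one point per movement:

```python
def cosmic_size(entries: int, exits: int, reads: int, writes: int) -> int:
    """COSMIC functional size: one point per data movement, summed
    across the four movement types (Entry, Exit, Read, Write)."""
    return entries + exits + reads + writes
```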

The chart for LOC/FP values for proprietary and non-proprietary server program is given below.

[Chart: LOC/FP values of ASP and PHP applications.]

5.3. Performance Metrics:

When testing all these applications, including open source software (PHP, Perl) and proprietary software
(ASP, JSP, JavaScript), with OpenSTA, the elapsed time for the open source software is lower than for the
proprietary software. OpenSTA itself is open source, and it supports all the application software regardless of
whether that software is installed on the system. It automatically generates test results in the form of a chart, from
which the user can easily see which software performs best.

The test properties used for all applications in this paper are: the scheduled task group is set to 10 seconds, the stop
task group after fixed time is set to 5 seconds, and the number of virtual users for the elapsed time with respect to
timer values is set to 2.

Starting Timer Values for All Applications

Serial Number   Application   Starting Timer Value (sec)
1               JSP           10
2               ASP           14
3               JavaScript    16
4               Perl          5
5               PHP           5


Chart showing Elapsed time & timer values for PHP

6. ENHANCED FRAMEWORK FOR WEB APPLICATIONS:

The enhanced framework for web applications has the following properties: Rich Internet Application
(RIA), Event Driven Programming (EDP), and an Auth Module.
A Rich Internet Application (RIA) is a fully featured software package that runs in a browser. Early generations of
Internet-hosted, browser-based applications were notoriously basic compared to equivalent software that ran on a
Windows or Mac desktop. This led to the evolution of RIA platforms (also known as rich client platforms), which
boost the core functionality of the basic browser by temporarily downloading extra software to the client. This
makes it possible to develop applications with the look and feel of a full-fledged Windows or Mac application,
making them faster and more convenient to use.
Event Driven Programming (EDP):
An event-driven application responds to input from the user (mouse movements, keystrokes, menu choices, etc.) or
to messages from other applications. Instead of going through a prearranged series of actions, the program waits
for events to occur and responds to them.
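The event-driven style described above can be sketched with a small dispatcher. This is a generic illustration of the paradigm, not the actual module from the RIA PHP framework:

```python
class EventDispatcher:
    """Register handlers per event name and invoke them as events arrive,
    instead of running a prearranged series of actions."""

    def __init__(self):
        self._handlers = {}

    def on(self, event_name, handler):
        """Register a handler for an event."""
        self._handlers.setdefault(event_name, []).append(handler)

    def emit(self, event_name, payload=None):
        """Deliver an event to every registered handler; collect results."""
        return [handler(payload) for handler in self._handlers.get(event_name, [])]
```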

Auth Module:

It indicates whether the framework has an inbuilt module for handling user authentication.

7. CONCLUSION:
This paper presents the results of the quantitative measurement of traditional and component based
framework metrics for developing web applications. Moreover, it offers fresh motivation for empirical and analytical
research that may lead to more accurate models and metrics for component-based systems.
We have made a comprehensive analysis of using architectural styles and patterns in component
based software engineering through various frameworks such as DCOM, CORBA and TYPO3. Architecture based
measures have been introduced for proprietary and non-proprietary frameworks with the help of a web analyzer
tool. Results show that component based architecture measures offer better optimization than a traditional
architecture framework.
We have focused attention on evaluating the size metrics for proprietary and non-proprietary
server programs using the data set. Various frameworks are compared with the help of the metrics, and the results
show that the non-proprietary framework is better than the proprietary framework for developing dynamic web
applications. Time and cost are reduced when using open source frameworks.
Function point complexity weights have been calculated for both proprietary and non-proprietary data sets.
The results obtained in the experiments show that solution oriented measures are better than problem oriented
measures for evaluation.
This paper emphasized the following facts through the quantitative evaluation of solution-based metrics.
Experimental results reveal that the traditional framework is platform dependent, whereas the component-based
framework is platform independent and user-friendly. It also has advantages such as plug-ins, templates,
performance, and the time taken to create a dynamic web application. Moreover, traditional software metrics are
inappropriate for component-based systems.
A new attempt has been made to analyze open source software and proprietary software using a
performance testing tool. The website application was developed in various software environments, including open
source and proprietary software, and all of these applications were tested using the OpenSTA performance testing
tool. From the results, one can see the benefits of open source software compared to proprietary software:
performance measures such as elapsed time and starting timer values are lower for the open source software than
for the proprietary software. After running the open source website using scripts and tests, the test report is
generated automatically for the same number of virtual users.
A new framework, the Rich Internet Application (RIA PHP) framework, was used for developing the web
application. The features of existing PHP frameworks are compared with those of the RIA PHP framework.
Components of RIA PHP, such as Modules and Event Driven Programming (EDP), are evaluated against the
existing PHP frameworks. RIA PHP introduces a client-side rendering engine which enables asynchronous
processing independent of communication with the back-end server. This significantly improves the user
experience for many different types of user tasks. Web applications can be developed in a full-fledged manner
using the enhanced RIA PHP framework.

Quantitative evaluation of such frameworks using empirical studies is the main mechanism for comparing
design alternatives. Applying such a quantitative framework before embarking on an application framework is, in
the authors' view, novel.

8. ACKNOWLEDGEMENT:

We sincerely thank the company Tenth Planet Technologies Limited, Chennai, and its staff for providing us
with information on software projects.

9. REFERENCES:

a. Publications
[1] A. J. Albrecht and J. E. Gaffney Jr., “Software Function, Source Lines of Code and Development Effort
Prediction”, IEEE Transactions on Software Engineering, Vol. 9, No. 6, November 1983.
[2] M. H. Halstead, “Elements of Software Science”, Elsevier, New York, 1977.
[3] S. Chidamber and C. Kemerer, “A Metrics Suite for Object Oriented Design”, IEEE Transactions on Software
Engineering, Vol. 20, No. 6, June 1994.
[4] T. McCabe, “A Software Complexity Measure”, IEEE Transactions on Software Engineering, Vol. 2, No. 12,
December 1976.
[5] J. J. Dolado, “A Validation of the Component Based Method for Software Size Estimation”, IEEE
Transactions on Software Engineering, Vol. 26, No. 10, October 2000.
[6] E. S. Cho, M. S. Kim and S. D. Kim, “Component Metrics to Measure Component Quality”, the 8th Asia Pacific
Software Engineering Conference (Macau), 2001.
[7] E. Mendes, N. Mosley and S. Counsell, “Web Metrics – Estimating Design and Authoring Effort”, IEEE
Multimedia Special Issue on Web Engineering, 2001.
[8] E. Mendes, N. Mosley and I. Watson, “A Comparison of Case-Based Reasoning Approaches to Web
Hypermedia Project Cost Estimation”, Proceedings of the 11th International World Wide Web Conference
(Hawaii), 2002.

[9] E. Mendes, N. Mosley and S. Counsell, “Investigating Early Web Size Measures for Web Cost Estimation”,
Proceedings of the EASE Conference (Keele University), 2003.
[10] R. Thirumalai Selvi, “Metrics in Component Based Software Engineering”, Second International Functional
Sizing Summit (IFPUG), Vancouver, Canada, April 2007.
[11] R. Thirumalai Selvi, N. V. Balasubramanian and P. Sheik Abdul Khader, “Framework and Architectural Style
Metrics for Component Based Software Engineering”, International Journal of Web Services Practices, Vol. 3, No.
12, 2008, ISSN 1738-6535.
[12] Aaron Don M. Africa, “Quantitative Evaluation of Open Source Content Management Systems”, IEEE
Multidisciplinary Engineering Education Magazine, Vol. 3, No. 2, June 2008.
[13] http://www.openSTA.org
[14] http://www.opensourcetesting.org/resources.php
[15] http://en.wikipedia.org/wiki/Proprietary_software

b. URLs

meetinguniverse.com
nambco.com
showcase.rhytha.org/vyabrcdmainteractive.com
thejoengg.com
denvik.inchipkidz.com
10. APPENDIX:
Fig. 1 Interaction Between Web Server & PHP Parser

[Figure: the web server passes scripts containing HTML + PHP to the PHP parser; the parser consults the database
and returns pure HTML.]
Fig. 2 Diagram For MOODLE Database

Fig. 3 Architecture Diagram for Moodle

Fig. 4 The Commander Screen of OpenSTA

Fig. 5 Single Stepping in OpenSTA