
International Journal of Software Engineering
and Knowledge Engineering
Vol. 24, No. 9 (2014) 1255–1272
© World Scientific Publishing Company
DOI: 10.1142/S0218194014400117

Continuous CMMI Assessment Using Non-Invasive


Measurement and Process Mining

Saulius Astromskis*, Andrea Janes†, Alberto Sillitti‡ and Giancarlo Succi§


Center for Applied Software Engineering
Free University of Bozen/Bolzano, Piazza Domenicani 3
Bolzano 39100, Italy
*saulius.astromskis@stud-inf.unibz.it
†andrea.janes@unibz.it
‡alberto.sillitti@unibz.it
§giancarlo.succi@unibz.it

The reputation of lightweight software development processes such as Agile and Lean is
damaged by practitioners who claim benefits of such processes that do not exist.
Teams that want to demonstrate their seriousness could benefit from matching their
processes to the CMMI model, a model recognized by industry and the public administration.
CMMI stands for Capability Maturity Model Integration; it provides a reference model to
improve and evaluate processes according to their maturity, based on best practices.
On the other hand, particularly in a lightweight software development process, the costs of a
CMMI appraisal are hard to justify, since its advantages are not directly related to the creation
of value for the customer.
This paper presents Jidoka4CMMI, a tool that, once a CMMI appraisal has been
conducted, allows documenting the assessment criteria in the form of executable test cases.
The test cases, and so the CMMI appraisal, can be repeated anytime, without additional costs.
The use of Jidoka4CMMI increases the benefits of conducting a CMMI appraisal. We hope
that this encourages practitioners using lightweight software development processes to assess
their processes using a CMMI model.

Keywords: CMMI; non-invasive measurement; process mining.

1. Introduction
Promoters of non-traditional software development processes (e.g. Extreme Programming [1],
Scrum [2], Lean software development [3], etc.) often face the problem
that their success attracts dubious consultants that use this success to attract new
clients [4]. Such consultants claim that you do not need to document anything,
can solve every problem, can produce bug-free software, turn your customers into
friends, meet deadlines easily, and will have a 9-to-5 job. Then, when such claims


cannot be met, such behavior damages the reputation of the new method and of those
who use it.
Moreover, some companies call themselves Agile, Lean, etc., not because they
pursue agility, but because it is fashionable to do so [5]. The picture of a young,
dynamic, flexible, and lean team is what customers expect and, as a consequence,
what companies convey within their marketing material.
In summary, the described development damages the reputation of lightweight
software development processes. The conscientious part of the community needs to
develop methods and tools to objectively assess its way of producing software
and hence to understand in which contexts it works and in which it does not [6]. The
availability of such methods and tools will help to distinguish the serious practitioner
from the quacksalver.
Agile and Lean teams could benefit from having a reference model, accepted by
industry and the public administration, that defines what it means to develop
"good" software. Teams that adhere to such a model could claim that what they do
corresponds to the best practices of software development.
The CMMI for Development is such a model: it describes typical elements of
effective processes [7]. Therefore, a proposed solution to increase the reputation of
Agile and Lean teams is to embrace both: CMMI and Agile [7].
The comparison of the development process of a given organization with the
recommendations of the CMMI helps to evaluate the maturity of the analyzed
process, find gaps between the performed activities and the suggested ones, identify
improvement possibilities, and demonstrate maturity to critics using
a model recognized throughout industry and the public administration.
According to Hillel Glazer, one of the authors of the report "CMMI or Agile: Why
Not Embrace Both!" [7], a CMMI appraisal for a small business with less than 100
developers that is completely ready to be appraised can cost from $36,000 to $60,000;
the exact price depends on a variety of factors. If the team needs help to prepare
everything to get appraised, clients typically spend about $125,000 over about a
year's time or less [8].
Agile and Lean practitioners are attentive to the elimination of any unnecessary
activities to improve their efficiency. Hence, there is a certain level of skepticism
towards the CMMI, since its value is indirect: a CMMI appraisal does not directly
contribute to the creation of value for the client. It pays off only if the team is able
to use the results of the appraisal to optimize the software development process and,
subsequently, the software it produces. For many, a CMMI appraisal
represents an investment that, considering the costs, is too risky.
We want to encourage practitioners using lightweight software development
processes to assess their processes from the CMMI perspective. Therefore, we propose
a tool that, once configured, continuously evaluates the CMMI compliance of the
ongoing processes without generating additional costs.

The goal of our tool, called Jidoka4CMMI, is (similar to JUnit, http://junit.org, or
Jenkins, http://jenkins-ci.org) to provide a framework to define test cases that verify
if a process conforms to the CMMI recommendations.
Like JUnit or Jenkins, our tool is based on the Jidoka principle (see Sec. 3.1), i.e.
once configured, it does not require any further effort to assess a process.
This article is structured as follows: Sec. 2 gives an overview of the state of the art,
Sec. 3 briefly explains the concepts Jidoka, CMMI, non-invasive data collection, and
process mining. Section 4 provides an overview of Jidoka4CMMI, Sec. 5 describes an
example of its usage, and Sec. 6 concludes this article.

2. State of the Art


From a bird's-eye perspective, a CMMI appraisal consists of two steps: collecting
data and interpreting it according to the CMMI model [9]. Past research can be
classified into these two groups, i.e. methods and tools that support (a) the data
collection or (b) the data interpretation.
Data collection tools alleviate the burden of manual data collection through forms
and templates, extract data generated by already existing systems, or infer properties
about the executed processes using techniques based on data mining (e.g. [10–13]).
Data interpretation tools help to infer the achieved maturity level by interpreting the
collected data (e.g. [14–17]).
The idea of applying process mining techniques to facilitate software process
improvement is not new. In 2006, M. Huo et al. performed a case study in a medium-sized
Australian software house, where the team used the ISO/IEC 12207 standard [18]
as their process. The authors collected data about the developers' activities,
mined the process, and checked for discrepancies between the real and the assumed
process. They used the difference between the processes as an input to a software process
improvement method, in their case the Plan Do Check Act (PDCA) [19] cycle
[20, 21].
A. Lemos et al. published a study in which they performed an experiment in a large
Mexican software house. They also checked for discrepancies between the assumed and
mined processes, and used them as an input to a software process improvement
framework [22], in their case the CMMI [23].
Table 1 summarizes the identified papers on software process improvement using
software process mining.
All of these software process improvement methodologies share a common
prerequisite: the process and the product have to be continuously measured. To do
so, the data about the development process has to be collected and analyzed
(e.g. as in [27, 28]). Manual collection of this data requires additional effort from
the developers, and the analysis of data collected in this way takes more time than
it would if the data were collected automatically and stored in a database.

Table 1. Studies about software process improvement using software process mining.

- "An exploratory study of process enactment as input to software process improvement" [20], 2006; process improvement framework: PDCA [24]; process mining algorithm: customized Alpha.
- "A systematic approach to process enactment analysis as input to software process improvement or tailoring" [21], 2006; process improvement framework: not specified; process mining algorithm: Alpha [25].
- "Using process mining in software development process management: A case study" [22], 2011; process improvement framework: CMMI [23]; process mining algorithm: Markov chain [26].

Our work differs from previous works in that it combines non-invasive data
collection and process mining to support the maturity assessment. In using non-invasive
measurement techniques to collect data about the software development process, we
want to encourage practitioners using lightweight software development approaches
to analyze their processes from a CMMI perspective.

3. Background
In the following sections we briefly present the concepts we used to develop
Jidoka4CMMI.

3.1. Jidoka
One of the pillars of Lean Management is "Jidoka", the introduction of quality inside
the production process and product.
Jidoka is often translated as "autonomation" or "automation with a human
mind" and is usually illustrated with the example of a machine that can detect a
problem with the produced output and interrupt production automatically rather
than continue to run and produce bad output [29].
Some authors translate Jidoka as "quality-at-the-source" [30], meaning that
quality is inherent in the production system, which is in contrast to the traditional
approach that checks quality after the process. In essence, Jidoka consists of two
parts: a mechanism to detect problems, i.e. abnormalities or defects, and a
mechanism to notify when a problem occurs [31].

Fig. 1. An illustration of the Jidoka principle.

Figure 1 illustrates the idea: produced parts are delivered into a box by an
assembly line. On the way, a switch (a mechanism to detect a problem) ensures
that the parts are not too big. If the switch is activated, the assembly line is stopped and
a lamp (a mechanism to notify) informs the operator that something is wrong.
In this work we propose to use the concept of Jidoka to ensure that the processes
in place correspond to the recommendations of the CMMI.

3.2. CMMI
The CMMI for Development is a model that describes typical elements of effective
processes [7]. It is used to assess the maturity of a process. A CMMI model describes
typical work products and typical practices that, if performed, indicate a
certain maturity level.
CMMI models are not precise process descriptions. The actual processes used in
an organization depend on many factors, including application domains and
organization structure and size. The process areas of a CMMI model typically do not map
one-to-one with the processes used in the organization [32].
To automate the assessment of processes against the recommendations of the
CMMI, we adopt non-invasive data collection.

3.3. Non-invasive data collection


The term "non-invasiveness" originates from the medical field, in which it describes a
treatment that does not require "the entry into the living body (as by incision or by
insertion of an instrument)" [33].

Fig. 2. An illustration of measurement probes that collect data of a process [35].

We use this term to describe a measurement methodology that does not require
any participation by the software developer. As depicted in Fig. 2, non-invasive data
collection relies on a set of sensors or probes that extract the data from other systems
(active) or that get triggered by the observed events (passive) [34].
The measurement probes log events and report them to a message queue over the
Internet using REST [36].
We use a message queue to minimize the workload of the client computers. Developers
do not want their machines to slow down because some measurement program
is running in the background. By using a message queue (we use Apache ActiveMQ,
http://activemq.apache.org), we can minimize the time the client machine is busy
uploading data. The data are then read from the message queue, processed, and
inserted into the data warehouse. Due to the large amount of data we collect, we use
a NoSQL database, Apache Cassandra (http://cassandra.apache.org).
The data are then extracted from the data warehouse and fed to the
process mining algorithm, whose results are visualized on a dashboard (see Fig. 3).
Figure 3 contains the same elements as Fig. 1: a mechanism to detect a problem
and a mechanism to notify everybody that a problem exists. In Fig. 1, the switch
under the lamp detects the problem and the lamp notifies everybody; in Fig. 3, the
measurement probes report problems to the data warehouse and the dashboard
informs everybody about the current status of the system.

Fig. 3. Data flow in the Jidoka4CMMI tool.
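To make this flow concrete, the following sketch shows how a probe might report a single event; the broker URL, queue name, and event fields are illustrative assumptions, not the tool's actual configuration (we assume an ActiveMQ broker reachable through its HTTP/REST interface):

// Minimal sketch of a probe reporting one event (assumed endpoint and fields).
var QUEUE_URL = "http://broker.example.org:8161/api/message/events?type=queue";

async function reportEvent(event) {
    // Fire-and-forget POST: the probe is busy only for the upload itself;
    // processing and storage happen later, behind the message queue.
    await fetch(QUEUE_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(event)
    });
}

// Example: an IDE probe reporting that a developer opened a file.
reportEvent({
    probe: "eclipse",
    user: "developer-42",
    action: "file.opened",
    artifact: "src/Main.java",
    timestamp: new Date().toISOString()
});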

3.4. Process mining


Process mining provides the methods and tools that allow the discovery and
enhancement of processes, and the checking of their conformance with defined process models, in

an organization [37]. The process mining algorithms analyze raw data without using
any a-priori information about the underlying process and aim to discover the actual
process model.
Software systems that support and/or control real-world processes typically have
logging mechanisms in place so that one can analyze why something is not working or
how the systems can be improved (e.g. see [38–40]). Such data is stored in an event
log database, which is the main information source for process mining. The process
mining community distinguishes three main process mining activities:

(1) Process discovery: the core of process mining. Process discovery methods analyze
the event logs and extract process models that model the actual real-world
events.
(2) Process conformance: if the organization has a formal model describing how a
process has to be executed, it is possible to compare the defined model with the
mined one and verify if they conform.
(3) Process enhancement: the aim is to improve the actual process. One group of
process enhancement methods aims to modify the existing process model by
comparing it with the model defined in the organization. Another group aims to
extend the defined model by cross-correlating it with the activity log,
enriching the existing process model with additional perspectives, e.g.
throughput times, service levels, bottlenecks, and so on [37].

There is a common misconception that the process models that can be mined are
only related to the control-flow of activities [41]. The control-flow is the core of mined
process models, but other perspectives can be mined too; some of them can be
combined with the control-flow perspective. Possible process mining perspectives are
summarized in Table 2.

Table 2. Possible process mining perspectives [37].

Control-flow: Describe the discovered models as an ordered flow of activities, modeled and visualized as Petri nets [42], EPCs [43], BPMN [44], YAWL [45], UML Activity Diagrams [46], etc., to identify a "good" characterization of all possible paths, where "good" means that the characterization conforms to the business goals (see Sec. 4.1).

Social network: Study the communication between the team members during the life cycle of the task.

Organizational: Cluster team members into roles according to the tasks that they are usually working on.

Information and resource: Map activities to the typical information or material resources that are needed to execute the task.

Case: Study the properties of particular cases. Cases can be characterized by their path in the process or by the originators working on them.

Time: Study the timing and the frequency of events. If the events in the initial log have timestamps, then it is possible to discover process bottlenecks, observe the resource utilization, and predict the time that the process needs to finish.

The possibilities of mining different perspectives are bounded by the amount of
data available. The simplest model that we can mine is the control-flow model. It
requires each event record to have a timestamp, a case identifier, and the activity.
The case identifier "glues" together all activities that belong to one case. A case is
some event (in which different observable activities take place) that the software
development team considers worth analyzing.
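As an illustration (the field names are our own), three such event records describing one requirement could look as follows:

// Hypothetical minimal event records for control-flow mining: a case
// identifier, an activity, and a timestamp.
var eventLog = [
    { caseId: "REQ-17", activity: "Backlog",     timestamp: "2014-07-16 22:20:56" },
    { caseId: "REQ-17", activity: "Selected",    timestamp: "2014-07-16 22:21:33" },
    { caseId: "REQ-17", activity: "Development", timestamp: "2014-07-16 22:21:52" }
];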
Typical use cases of process mining techniques are [47]:

- Discover the structure of the process, to reveal the structure of an unknown process
or discover what a process looks like in practice.
- Discover the most frequent path in the process, so the user is able to get a clearer
understanding of their process by knowing the path of events followed by the
majority of cases. This information can be used for redesign purposes, since
optimizing the most frequent path in the process might bring benefits to the overall
performance.
- Visualize the distribution of cases over paths: by visualizing the percentage of
process instances following each possible path, the user can decide to focus on the
most frequent behavior. This also allows determining whether there are a lot of
variations in the process, possibly providing a starting point for a redesign
meant to reduce these variations.
- Check for compliance with the explicit model: by executing this use case scenario, the
user can detect possible inconsistencies between the actual process and the one
that is prescribed. This analysis may either lead to the enforcement of the
documented process or to an update of the documentation.

The research area of process mining is still rather young, and these use cases
are just examples that do not reveal the full potential of this discipline. We
propose to apply process mining techniques to support the CMMI appraisal.

4. The Jidoka4CMMI Tool


The software engineering challenge we propose to address is to allow practitioners to
define test cases for their processes that are monitored by Jidoka4CMMI. Such a test
case defines criteria to detect a problem and a mechanism to notify stakeholders
when a problem occurs.
An analogy to our tool is to set up a mouse trap: we want to set it up so that it
snaps when there is a problem.
The architecture of our system is shown in Fig. 4. The measurement probes
continuously extract data from the development process about the employed
resources, the produced output, and the carried out activities.
We implemented measurement probes that record all interactions developers have
with popular integrated development environments such as Eclipse (http://www.eclipse.org)
or Microsoft Visual Studio (http://www.microsoft.com/visualstudio), office automation
suites such as Apache OpenOffice (http://www.openoffice.org) or Microsoft Office
(http://office.microsoft.com), and other tools developers frequently use (e.g. web browsers
or chat applications). Developers interested in adding additional data to the system can
develop a plugin implementing a defined interface, as sketched below.
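The paper does not publish this interface, so the following is only a sketch of what such a plugin could look like; names and signatures are assumptions:

// Hypothetical probe plugin: the framework supplies an emit() callback and
// controls the probe's life cycle; the probe only translates tool events
// into measurement events.
function BrowserProbe(emit) {
    this.start = function () {
        // Subscribe to the observed tool here (IDE, office suite, browser, ...)
        // and call emit({ probe, user, action, artifact, timestamp })
        // for every event of interest.
    };
    this.stop = function () {
        // Unsubscribe and release resources.
    };
}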
The collected data (a log of all detected events) are stored in the Data Storage.
The Rules Engine retrieves the data from the data storage, analyses them, and
generates warnings if a rule is violated. The Notifier executes the actions to perform if
a rule is violated.

Fig. 4. Component diagram of the Jidoka4CMMI tool.


To cover the assessment of as many specific practices as possible, we also collect
the configuration and utilization of hardware and software items as well as the access
permissions to artifacts and resources.

4.1. How to use Jidoka4CMMI


Typically, a CMMI appraisal team identifies practice implementation indicators, i.e.
evidence that provides a basis for verification of the activity or practice
implementation.
These practice implementation indicators (since version 1.3 called "objective
evidence") indicate how well a practice was implemented [7]. The concept of
"objective evidence" relies on the idea that the implementation of a practice results
in "footprints" that can be observed and that prove the implementation of the
practice.
A user that wants to write a test case for a process using Jidoka4CMMI has to:

(1) Specify a Goal Question Metric (GQM) model [48] that defines what to collect,
how to collect it, and why to collect it;
(2) Specify the conditions that the collected "footprints" have to fulfill using
JavaScript (http://www.ecmascript.org) or Java (http://www.java.com);
(3) Define how a detected non-conformance has to be notified.

In step (1) we require the user not only to specify what data to collect, but also to
document which question this data answers and how to interpret the answer to the
question. The result of the test case is then linked back to the measurement goal to
understand what the result means.
The CMMI model for Development specifies requirements for processes.
Jidoka4CMMI contains process mining support to allow the user to evaluate if a process
conforms to the defined process or, if we find out that the defined process is not
followed, to discover the real process.
Process mining is a research discipline that sits between machine learning and
data mining on the one hand, and process modeling and analysis on the other
hand [37]. Process mining supports:

- Process discovery, in which an algorithm generates a process model based on the
data coming from event logs.
- Process conformance, in which an algorithm locates discrepancies between the
actual and the ideal process. Process conformance requires the definition of the
ideal model in addition to the event logs.


- Process enhancement, in which the current process is analyzed to find opportunities
to enrich the model, e.g. to make it reflect the reality better or to align it to
the strategy of the organization.

A detected non-conformance can be notified via e-mail or in a user-configurable
dashboard. The dashboard summarizes the results of the test cases.
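For illustration, a complete test case could then be described by a structure like the following; the format is our own sketch, not the tool's actual configuration syntax:

// Hypothetical test case definition binding a GQM question, a rule, and the
// notification actions to execute when the rule reports a violation.
var testCase = {
    goal: "Requirements follow the defined process",
    question: "Do requirements follow the defined process?",
    rule: "process_conformance_rule",  // a JavaScript rule run by the Rules Engine
    notify: [
        { type: "email", to: "qa-team@example.org" },
        { type: "dashboard", tile: "Process conformance" }
    ]
};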

5. Proof of Concept
The initial validation of our approach occurred through the implementation of proofs
of concept; in this paper we present one. Each proof of concept consists of five
parts: the description of the context, the GQM model that describes why, how, and
which data is collected, examples of the collected data, the description of how the data
is processed, and examples of possible results of the analysis.

5.1. Context description


The following proof of concept demonstrates how to write a test case for the
specific practice 1.1, "Objectively Evaluate Processes", that is part of the "Process
and Product Quality Assurance" process area of the CMMI for Development,
Version 1.3 [32].
In this proof of concept, the process to evaluate is the requirements management
process. Each requirement has to go through the states "Backlog", "Selected",
"Development", "Done", "Deployment", and "Live".
Initially, the product owner adds new requirements to the requirement backlog.
Requirements that should be picked up by developers are set to the status
"Selected". When a developer begins implementing a requirement, he or she sets the
status of the requirement to "Development" and, after finishing, to "Done". When a
feature gets deployed to the live system, the status of the requirement is changed to
"Deployment" and finally, after finishing the deployment, to "Live".

5.2. The GQM model


Figure 5 shows the Goal Question Metric model that specifies how Jidoka4CMMI
should verify if the requirements management process fulfills the specific practice 1.1
of the "Process and Product Quality Assurance" process area.
The model verifies through process mining if requirements follow the defined
process. The log of status changes is compared with the defined process model using
process conformance checking. Moreover, using process discovery, the number of
loops that requirements perform is evaluated. If requirements are moved from
"Deployment" back to "Development" too often, it might indicate that the team has
difficulties with some type of requirements, e.g. that the requirements are not precise
enough.
Fig. 5. An excerpt of a Goal Question Metric model describing how a given specific practice is evaluated. The goal is: analyze requirements management (object) for the purpose of conformance checking (why) with respect to the defined process (focus) from the point of view of the project manager (who) in the context of the software development department (where). Its questions are "Do requirements follow the defined process?" and "Do we have requirements that take longer than expected to implement?". The corresponding metrics are the process conformance between the defined process model and the change log of the requirements, and the number of loops a requirement performs between the status "Deployment" and "Development".

5.3. Example of collected data


The data used to evaluate this GQM model comes from a probe that logs all status
changes of the requirements. We track the task ID, the name of the task, the status,
and the timestamp of the status change. In our proof of concept, the probe connected
to the requirements management system reports data as in Table 3.
The data is passed to the process mining engine that generates a process that
reflects the collected data. The generated process model uses the task ID as process
instance ID, the status as activity, and the timestamp. Additional filtering can be done
according to the type of the task.
To extract the visual process model from the event log, we use an algorithm called
the Heuristics miner, because this algorithm is suited for real-world data [49]. The
Heuristics miner algorithm is able to deal with noisy data [50]; it ignores rare events
and activities and therefore generates simple models representing only the most
common behavior. An example visualization is shown in Fig. 7.
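To illustrate why the algorithm tolerates noise, the following sketch computes the dependency measure at the core of the Heuristics miner, dep(a, b) = (|a>b| - |b>a|) / (|a>b| + |b>a| + 1), where |a>b| counts how often activity a is directly followed by activity b [49]; rare or contradictory successions score close to zero and are left out of the model. This is a simplified illustration, not the implementation used in the tool:

// Count how often each activity is directly followed by another.
function directlyFollows(cases) {
    var counts = {};
    cases.forEach(function (events) {   // events: activities of one case, in order
        for (var i = 1; i < events.length; i++) {
            var key = events[i - 1] + ">" + events[i];
            counts[key] = (counts[key] || 0) + 1;
        }
    });
    return counts;
}

// Dependency measure for the candidate edge a -> b.
function dependency(counts, a, b) {
    var ab = counts[a + ">" + b] || 0;
    var ba = counts[b + ">" + a] || 0;
    return (ab - ba) / (ab + ba + 1);
}

// Example: a consistent succession scores high, a noisy back-loop does not.
var counts = directlyFollows([
    ["Backlog", "Selected", "Development", "Done"],
    ["Backlog", "Selected", "Development", "Done"],
    ["Backlog", "Selected", "Development", "Deployment", "Development", "Done"]
]);
dependency(counts, "Selected", "Development");    // 0.75: a strong edge
dependency(counts, "Deployment", "Development");  // 0: cancelled by the reverse succession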

5.4. System configuration and example results


In our proof of concept, to make the analysis automatic, one has to encode the GQM
model in our system.

Table 3. Example data.

Id                                    Name    Type  Status       Timestamp
a1e21737-6dff-4bed-b5a7-e7419439fd0a  Task 1  3     Backlog      2014-07-16 22:20:56
1f08e6f0-e11c-46f9-a672-8009d58c81a3  Task 2  2     Backlog      2014-07-16 22:21:10
1cd48893-de7c-4890-8bf6-39d440a3d39f  Task 3  1     Backlog      2014-07-16 22:21:28
a1e21737-6dff-4bed-b5a7-e7419439fd0a  Task 1  3     Selected     2014-07-16 22:21:33
1f08e6f0-e11c-46f9-a672-8009d58c81a3  Task 2  2     Selected     2014-07-16 22:21:40
1cd48893-de7c-4890-8bf6-39d440a3d39f  Task 3  1     Selected     2014-07-16 22:21:46
a1e21737-6dff-4bed-b5a7-e7419439fd0a  Task 1  3     Development  2014-07-16 22:21:52
a1e21737-6dff-4bed-b5a7-e7419439fd0a  Task 1  3     Done         2014-07-16 22:22:02
1f08e6f0-e11c-46f9-a672-8009d58c81a3  Task 2  2     Development  2014-07-16 22:22:08
1f08e6f0-e11c-46f9-a672-8009d58c81a3  Task 2  2     Done         2014-07-16 22:22:12
1cd48893-de7c-4890-8bf6-39d440a3d39f  Task 3  1     Development  2014-07-16 22:22:15
1cd48893-de7c-4890-8bf6-39d440a3d39f  Task 3  1     Done         2014-07-16 22:22:20
a1e21737-6dff-4bed-b5a7-e7419439fd0a  Task 1  3     Deployment   2014-07-16 22:22:24

At the metric level, the measurement probes collect process data which are stored
in our database.
At the question level, it is necessary to link the collected data to the question it
answers. To define how this link occurs, we allow the user to define rules written in
JavaScript that specify how the incoming data is processed and interpreted to
determine the answer to the question.
To ensure that the system can be easily extended, every measurement probe can
expose objects that allow the JavaScript rules to access the collected data and thus to
construct rules that interpret them.
These objects are accessible while configuring the data, and we developed an
autocomplete feature that helps the user to define the rules. An example of a rule
that identifies cases where a task was moved from the "Deployment" phase back to
the "Development" phase is reported below.


function live_to_dev_checker(processMetrics) {
    var matches = [];
    // Iterate over all mined process instances (one per requirement).
    for (var i = 0; i < processMetrics.instances.length; i++) {
        var eventFlow = processMetrics.instances[i].events;
        var instanceName = processMetrics.instances[i].instanceName;
        // Look for a "Deployment" status directly followed by "Development".
        for (var j = 1; j < eventFlow.length; j++) {
            if (eventFlow[j - 1].eventName === "Deployment" &&
                    eventFlow[j].eventName === "Development") {
                matches.push(instanceName);
            }
        }
    }
    return matches;
}

This function loops over the different process instances and looks for matches
of a task sequence. The results of the rule are used to interpret the connected GQM
question and to visualize the result of this interpretation on a dashboard.
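For example, fed with a single process instance whose status history loops back from deployment (the shape of processMetrics follows the rule above), the rule reports that instance:

var processMetrics = {
    instances: [{
        instanceName: "Task 1",
        events: [
            { eventName: "Backlog" }, { eventName: "Selected" },
            { eventName: "Development" }, { eventName: "Done" },
            { eventName: "Deployment" }, { eventName: "Development" }
        ]
    }]
};
live_to_dev_checker(processMetrics); // returns ["Task 1"]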
We visualize the results of the data collection and interpretation in a dashboard,
which is based on tiles (see [51] for a complete description of our dashboarding
approach). Each tile can be configured by defining a caption, a color, and a value (see
Fig. 6).

Fig. 6. Dashboard visualizing the outcome of two measurements in form of tiles.
In our case, for each question, a user can define rules to control how the interpreted
data, i.e. the answer to the corresponding GQM question, is visualized by a
specific color, caption, and value to display on the dashboard.
For example, if for the question "How many times did a requirement move between
the deployment and development phase?" we want to assign a green color if
the result is below 5, orange if it is between 5 and 15, and red if it is over 15, the rule is
as follows.

function tile_background(noOfCases) {
    // Map the number of detected cases to a tile status.
    var status;
    if (noOfCases < 5) {
        status = "GOOD";
    } else if (noOfCases < 15) {
        status = "WARNING";
    } else {
        status = "BAD";
    }
    return status;
}


An example of two such tiles is shown in Fig. 6: one tile is colored red, because the
number of cases where the requirement was moved from deployment to development
is higher than 15.
In the same way as JavaScript rules define the interpretation of data to answer a
question, we allow the user to define JavaScript rules that use the answers to a set of
questions to evaluate the outcome of a measurement goal.
In this way, the user of our dashboard can immediately understand which specific
practices and process areas do not need any attention (green), which have warnings
(yellow), and where immediate attention is needed (red). Each tile can also display a
numeric value and a message to help the user understand which practice requires
improvement.
As recommended by [52], to increase the trust of the user in our visualization, we
give the user the possibility to see the original data that caused the state of the tile. In
our example, if a user clicks on the tile describing the process conformance, a
visualization of the identified process is displayed, as depicted in Fig. 7.

Fig. 7. Heuristics Net produced by the Heuristics miner.

6. Conclusion
In this work, we present a novel scenario in the assessment of lightweight software
development processes.
We show how quality assurance of the practices recommended by the CMMI
can be "built into the process", as proposed by Lean Thinking [53].
We give an example of how to integrate the continuous monitoring of produced
artifacts, ongoing development activities, and consumed resources with the
production process, together with the knowledge about unacceptable values of the
monitored data.
Once modeled in a tool as described in this paper, also low-ceremony processes
such as Agile and Lean processes can be continuously assessed using the
CMMI framework.

References
1. K. Beck and C. Andres, Extreme Programming Explained: Embrace Change, 2nd ed.
(Addison-Wesley Professional, 2004).
2. K. Schwaber, Agile Project Management with Scrum (Microsoft Press, Redmond, WA,
2004).

3. M. Poppendieck and T. Poppendieck, Lean Software Development: An Agile Toolkit
(Addison-Wesley Longman, Boston, 2003).
4. A. A. Janes and G. Succi, The dark side of agile software development, in Proc. of the
ACM International Symposium on New Ideas, New Paradigms, and Reflections on
Programming and Software, 2012, pp. 215–228.
5. M. Ceschi, A. Sillitti, G. Succi and S. D. Panfilis, Project management in plan-based and
agile companies, IEEE Software (2005) 21–27.
6. R. Moser, P. Abrahamsson, W. Pedrycz, A. Sillitti and G. Succi, A case study on the
impact of refactoring on quality and productivity in an agile team, in Balancing Agility
and Formalism in Software Engineering, Lecture Notes in Computer Science Vol. 5082,
2008, pp. 252–266.
7. H. Glazer, J. Dalton, D. Anderson, M. Konrad and S. Shrum, CMMI or agile: Why not
embrace both! Software Engineering Institute, Carnegie Mellon University, Tech. Rep.,
2008.
8. H. Glazer, Roughly how much does an CMMI evaluation cost for a small business (less
than 100 developers)? http://www.quora.com/Roughly-how-much-does-an-CMMI-evaluation-cost-for-a-small-business-less-than-100-developers, Feb 2011, online, accessed
November 15, 2013.
9. SCAMPI Upgrade Team, Standard CMMI Appraisal Method for Process Improvement
(SCAMPI) A, Version 1.3: Method Definition Document, http://www.sei.cmu.edu/library/abstracts/reports/11hb001.cfm, 2011, online, accessed November 15, 2013.
10. T. Sunetnanta, N.-O. Nobprapai and O. Gotel, Quantitative CMMI assessment for
offshoring through the analysis of project management repositories, in Software Engineering
Approaches for Offshore and Outsourced Development (Springer, Berlin, Heidelberg,
2009), Vol. 35, pp. 32–44.
11. I. Garcia and C. Pacheco, Toward automated support for software process improvement
initiatives in small and medium size enterprises, in Software Engineering Research,
Management and Applications 2009 (Springer, Berlin, Heidelberg, 2009), pp. 51–58.
12. N. Chen, S. C. H. Hoi and X. Xiao, Software process evaluation: A machine learning
approach, in Proc. 26th IEEE/ACM International Conference on Automated Software
Engineering, 2011, pp. 333–342.
13. J. R. Curanas, Process Mining Opportunities for CMMI Assessments, http://www.tue.nl/en/publication/ep/p/d/ep-uid/276454, 2010, online, accessed November 15, 2013.
14. Accenture, Measuring and Managing the CMMI Journey Using GQM, http://www.sei.cmu.edu/library/abstracts/presentations/Prasad-SEPG2006.cfm, online, accessed
November 15, 2013.
15. L. Buglione, Strengthening CMMI maturity levels with a quantitative approach to root-
cause analysis, in Proc. of 5th Software Measurement European Forum, 2008.
16. M. West, Real Process Improvement Using the CMMI (Auerbach Publications, 2004).
17. M. Khraiwesh, Validation measures in CMMI, World of Computer Science and Infor-
mation Technology Journal 1(2) (2011) 26–33.
18. ISO, Systems and software engineering — software life cycle processes, 2008, Geneva,
Switzerland.
19. W. E. Deming, Out of the Crisis (MIT Center for Advanced Engineering,
Cambridge, 1986).
20. M. Huo, H. Zhang and R. Jeffery, An exploratory study of process enactment as input to
software process improvement, in Proc. of 2006 International Workshop on Software
Quality, 2006, pp. 39–44.
21. M. Huo, H. Zhang and R. Jeffery, A systematic approach to process enactment
analysis as input to software process improvement or tailoring, in Software
Engineering Conference, 2006, pp. 401–410.
22. A. Lemos, C. Sabino, R. Lima and C. Oliveira, Using process mining in software devel-
opment process management: A case study, in Proc. of IEEE Int. Conf. on Systems, Man
and Cybernetics, 2011, pp. 1181–1186.
23. CMMI Product Team, CMMI for development, version 1.3, Software Engineering Insti-
tute, Carnegie Mellon University, Pittsburgh, Pennsylvania, Technical Report CMU/
SEI-2010-TR-033, 2010.
24. J. Ning, Z. Chen and G. Liu, PDCA process application in the continuous improvement of
software quality, in International Conference on Computer, Mechatronics, Control and
Electronic Engineering, Vol. 1, 2010, pp. 61–65.
25. W. van der Aalst, T. Weijters and L. Maruster, Workflow mining: Discovering process
models from event logs, IEEE Trans. Knowledge and Data Engineering 16(9) (2004)
1128–1142.
26. A. Rozinat, M. Veloso and W. M. van der Aalst, Evaluating the quality of discovered
process models, in 2nd Intl. Workshop on the Induction of Process Models, 2008, pp. 45–52.
27. E. di Bella, I. Fronza, N. Phaphoom, A. Sillitti, G. Succi and J. Vlasenko, Pair
programming and software defects: A large, industrial case study, IEEE Trans. Software
Engineering 39(7) (2013) 930–953.
28. I. D. Coman and G. Succi, An exploratory study of developers' toolbox in an agile team,
in XP, Lecture Notes in Business Information Processing Vol. 31, pp. 43–52.
29. T. Ohno, Toyota Production System: Beyond Large-Scale Production (Productivity
Press, 1988).
30. C. Standard and D. Davis, Running Today's Factory: A Proven Strategy for Lean
Manufacturing (Hanser Gardner Publications, 1999).
31. Y. Monden, Toyota Production System, 2nd ed. (Industrial Engineering Press, 1993).
32. CMMI Product Team, CMMI for Development, Version 1.3, Software Engineering
Institute, Carnegie Mellon University, Tech. Rep., 2010.
33. Merriam-Webster, Definition of invasive, http://www.merriam-webster.com/dictionary/
invasive, 2010, online, accessed November 15, 2013.
34. A. Janes, M. Scotto, A. Sillitti and G. Succi, A perspective on non invasive software
management, in Proc. IEEE Instrumentation and Measurement Technology Conference,
2006, pp. 1104–1106.
35. S. Astromskis, A. Janes, A. Sillitti and G. Succi, Implementing organization-wide gemba
using noninvasive process mining, Cutter IT Journal, April 2013.
36. R. T. Fielding and R. N. Taylor, Principled design of the modern web architecture, ACM
Trans. Internet Technol. 2(2) (2002) 115–150.
37. W. Aalst, A. Adriansyah, A. K. Medeiros, F. Arcieri, T. Baier, T. Blickle, J. C. Bose, P.
Brand, R. Brandtjen, J. Buijs et al., Process mining manifesto, in Business Process
Management Workshops, 2012, pp. 169–194.
38. A. Mahdiraji, B. Rossi, A. Sillitti and G. Succi, Knowledge extraction from events flows,
in Methodologies and Technologies for Networked Enterprises, Lecture Notes in Com-
puter Science Vol. 7200, 2012, pp. 221–236.
39. A. Sillitti, G. Succi and J. Vlasenko, Understanding the impact of pair programming on
developers attention: A case study on a large industrial experimentation, in 34th Inter-
national Conference on Software Engineering, 2012, pp. 1094–1101.
40. A. Sillitti, G. Succi and J. Vlasenko, Toward a better understanding of tool usage: NIER
track, in 33rd International Conference on Software Engineering, 2011, pp. 832–835.

41. V. Rubin, C. Gunther, W. M. P. van der Aalst, E. Kindler, B. F. van Dongen and W.
Schaefer, Process mining framework for software processes, in Proceedings of Software
Process Dynamics and Agility, 2007, pp. 169–181.
42. C. Petri, Kommunikation mit Automaten, Ph.D. dissertation, Rheinisch-Westfälisches
Institut für instrumentelle Mathematik an der Universität Bonn, 1962.
43. A. Scheer, ARIS: Vom Geschäftsprozess zum Anwendungssystem (Springer, 1999).
44. Object Management Group (OMG), Business Process Model and Notation (BPMN)
Version 2.0, Tech. Rep., 2011.
45. W. M. P. van der Aalst and A. H. M. ter Hofstede, YAWL: Yet another workflow language,
Inf. Syst. 30(4) (2005) 245–275.
46. J. Rumbaugh, I. Jacobson and G. Booch, Unified Modeling Language Reference Manual,
2nd ed. (Pearson Higher Education).
47. I. Ailenei, A. Rozinat, A. Eckert and W. M. Aalst, Definition and validation of process
mining use cases, in Business Process Management Workshops, 2012, pp. 75–86.
48. V. R. Basili, G. Caldiera and H. D. Rombach, The goal question metric approach, in
Encyclopedia of Software Engineering (Wiley, 1994).
49. W. M. P. van der Aalst, Process Mining: Discovery, Conformance and Enhancement of
Business Processes, 1st ed. (Springer, 2011).
50. J. D. Weerdt, M. D. Backer, J. Vanthienen and B. Baesens, A multi-dimensional quality
assessment of state-of-the-art process discovery algorithms using real-life event logs, In-
formation Systems 37(7) (2012) 654–676.
51. A. Janes, A. Sillitti and G. Succi, Effective dashboard design, Cutter IT Journal 26(1)
(2013).
52. M. Robillard, W. Maalej, R. J. Walker and T. Zimmermann, Eds., Recommendation
Systems in Software Engineering (Springer, 2014).
53. J. P. Womack and D. T. Jones, Lean Thinking: Banish Waste and Create Wealth in Your
Corporation, 2nd ed. (Free Press, 2003).
