Structure of the ISO/IEC 15504-5:2012 Model
ISO/IEC 15504 is a set of standards that captures best practices for the software development process and provides approaches for assessing process capability and improving it where necessary (El-Emam and Garro, 1999). The structure draws on the software life cycle processes as well as maturity models such as Bootstrap, Trillium and the CMM (El Emam and Birk, 2000). These process assessment standards are mainly intended to evaluate process strengths and weaknesses as the basis for process improvement (Mesquida, Mas, Amengual and Calvo-Manzano, 2012).
The ISO/IEC 15504-5:2012 structure uses the definitions given in ISO/IEC 12207:2008 to enumerate the Process Reference Model (Anacleto et al., 2004). The Process Reference Model describes each process in terms of its purpose and outcomes, and the processes are grouped into three process categories (Peldzius and Ragaisis, 2011). The Process Assessment Model builds on the Process Reference Model descriptions by adding a set of process performance indicators, namely Base Practices and work products, to each process; these indicators are used to judge the extent to which process outcomes are achieved by associating work products with every process.
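As a minimal sketch of this relationship, assuming a simple Python representation, the process below is described first as a Process Reference Model entry (purpose and outcomes) and then extended with base practices and work products that act as performance indicators; the example process and practice names are hypothetical, not taken from the standard.

from dataclasses import dataclass, field

@dataclass
class ReferenceProcess:
    """A Process Reference Model entry: a purpose statement plus expected outcomes."""
    name: str
    purpose: str
    outcomes: list[str]

@dataclass
class AssessableProcess(ReferenceProcess):
    """The Process Assessment Model view: the same process, extended with base
    practices and work products that serve as process performance indicators."""
    base_practices: list[str] = field(default_factory=list)
    work_products: list[str] = field(default_factory=list)

# Hypothetical example for illustration only.
construction = AssessableProcess(
    name="Software construction",
    purpose="Produce executable software units that reflect the software design",
    outcomes=["Software units are produced", "Units are verified against the design"],
    base_practices=["Develop software units", "Verify software units"],
    work_products=["Source code", "Unit test results"],
)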
Primarily, the ISO/IEC 15504-5:2012 model shows how Process Assessment and Process Capability Determination lead to process improvement, as illustrated in the figure below (Barafort et al., 2002).
Figure 1. ISO/IEC 15504-5:2012 Model (Peldzius and Ragaisis, 2011).
[Figure: a process is examined by a Process Assessment; the assessment leads to Capability Determination, which identifies the capabilities and risks of the process, and to Process Improvement, which identifies changes to the process; Capability Determination in turn motivates Process Improvement.]
The figure above gives the skeleton structure of the standard model (Peldzius and Ragaisis, 2011). As shown, a process is examined by an assessment, which in turn leads to capability determination and to process improvement (Peldzius and Ragaisis, 2011). Capability determination identifies the capability of a process and the risks it is exposed to, while process improvement identifies the changes that should be made to the process (Peldzius and Ragaisis, 2011). Determining software process capability in this way motivates organisations to take up process improvements (Anacleto et al., 2004).
At the process assessment stage, all the evidence gathered about the process is evaluated; this stage determines the final output of the assessment (Peldzius and Ragaisis, 2011). The input to this stage comes from a variety of sources, such as the users of the process, including top managers as well as the direct users of the system (Peldzius and Ragaisis, 2011).
Software process assessment and improvement using the ISO/IEC 15504 Information technology - Process Assessment Model
The assessment process refers to the collection of data and information for evaluating the current state of an organisation's process capability (Peldzius and Ragaisis, 2011). An assessment usually begins when there is a need to determine, control or improve the capability of these processes. The SPICE (Software Process Improvement and Capability dEtermination) document lays down how the requirements for setting up an assessment are to be interpreted, which significantly assists the assessment team. These guidelines are directed at leading a team-based assessment, so that the principles of the evaluation process can be applied in a consistent manner; in a continuous assessment, however, the data collection differs from the rest.
The SPICE document aims, first, to provide steps for preparing the assessment team before embarking on the assessment (Peldzius and Ragaisis, 2011). Second, it provides guidance to those interpreting the results of the assessment, and to the participants in the assessment procedure, on carrying out the whole assessment. Third, it promotes understanding of the assessment and of its importance among all the staff in the organisation (Anacleto et al., 2004). Finally, it gives developers the guidance needed to build the assessment tools and methods that support the Process Assessment Model (Barafort, Di Renzo and Merlan, 2002).
Process Assessment
A process assessment can be initiated by the need to improve the organisation's processes, or by the need to evaluate the organisation's process capability (Barafort et al., 2002). The input for the assessment is collected with the aid of assessment instruments, and the process model is used during the assessment. The final output is then used for process improvement or for the evaluation of process capability (Barafort et al., 2002).
Figure 2. Inputs and outputs of a process assessment (Barafort et al., 2002).
[Figure: the assessment input, arising from process improvement or from a capability evaluation, comprises the assessment purpose, scope, constraints, responsibilities, the assessment instruments, extended process definitions and any additional information to be collected; together with the process model and its process and process management indicators, it feeds the process assessment; the assessment output comprises the generic practice adequacy ratings, the process capability level ratings and the assessment record, which are used for improvement or for determination of the process capabilities.]
When the assessment is carried out for improvement, its output provides the capability level ratings of the selected processes and the basis on which specific improvement actions are planned, prepared, implemented and monitored (Barafort et al., 2002). When the assessment is carried out for capability determination, its output provides the information needed to identify, analyse and quantify the organisation's risks, weaknesses and strengths (Peldzius and Ragaisis, 2011).
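To make the flow in Figure 2 concrete, the sketch below models the assessment input and output as simple Python records; the field names are illustrative assumptions based on the elements named in the figure, not definitions taken from the standard.

from dataclasses import dataclass, field

@dataclass
class AssessmentInput:
    """Elements that frame an assessment (field names are illustrative)."""
    purpose: str                      # e.g. "process improvement" or "capability determination"
    scope: list[str]                  # processes selected for assessment
    constraints: list[str]            # e.g. schedule or confidentiality limits
    responsibilities: dict[str, str]  # role -> person
    additional_information: list[str] = field(default_factory=list)

@dataclass
class AssessmentOutput:
    """Results produced by the assessment."""
    capability_level_ratings: dict[str, int]  # process -> capability level (0..5)
    practice_adequacy: dict[str, str]         # practice -> adequacy judgement
    assessment_record: list[str]              # evidence and notes collected

# A capability-determination run reads the ratings to locate risks and weaknesses;
# an improvement run uses them to plan, implement and monitor improvement actions.
example = AssessmentOutput(
    capability_level_ratings={"Software construction": 2},
    practice_adequacy={"Define coding standards": "largely adequate"},
    assessment_record=["Interview notes", "Work product review"],
)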
Assessment guide
The assessment procedure comprises eight stages: review of the assessment input, selection of the process instances, preparation of the assessment, collection and verification of information, determination of the actual ratings, determination of the derived ratings, validation of the ratings and, finally, presentation of the assessment output (McCaffery et al., 2010). These are the stages of an assessment in accordance with the ISO/IEC 15504-5:2012 model, as illustrated in Figure 3 below (Mesquida et al., 2012).
Figure 3. Assessment stages (McCaffery et al., 2010).
[Figure: the eight stages shown in sequence - review of the assessment input, selection of the process instances, preparation of the assessment, collection and verification of information, determination of the actual ratings, determination of the derived ratings, validation of the ratings and presentation of the assessment output.]
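As a minimal sketch of how the eight stages form a fixed sequence, the snippet below encodes them as an ordered Python enumeration; the stage names follow the list above, while the driver function is purely illustrative.

from enum import IntEnum

class AssessmentStage(IntEnum):
    """The eight assessment stages, in the order they are performed."""
    REVIEW_ASSESSMENT_INPUT = 1
    SELECT_PROCESS_INSTANCES = 2
    PREPARE_ASSESSMENT = 3
    COLLECT_AND_VERIFY_INFORMATION = 4
    DETERMINE_ACTUAL_RATINGS = 5
    DETERMINE_DERIVED_RATINGS = 6
    VALIDATE_RATINGS = 7
    PRESENT_ASSESSMENT_OUTPUT = 8

def run_assessment() -> None:
    # Walk the stages strictly in order; in a real assessment method each stage
    # would invoke the corresponding team activity rather than print its name.
    for stage in sorted(AssessmentStage):
        print(f"Stage {stage.value}: {stage.name.replace('_', ' ').title()}")

run_assessment()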
This part of the guidance first sets out the minimum requirements to be met in establishing an assessment instrument. Second, it defines the set of indicators to be used in an assessment instrument. Finally, it provides guidance on selecting, constructing and using assessment instruments (Peldzius and Ragaisis, 2011).
The SPICE document establishes the requirements for constructing an assessment instrument (Paulk, 1999). It also outlines guidance on the selection and usability characteristics associated with the range of assessment tools. Assessment tools are used to evaluate the existence or completeness of an activity. An assessment tool is required to provide a consistent set of indicators that act as discriminators, helping the assessor judge how well a practice has been implemented. The tool also provides a mechanism for recording the information that is collected (Mesquida et al., 2012).
These SPICE guidelines are used above all by those responsible for evaluating, designing and constructing assessment instruments, such as assessors, tool suppliers and methodology providers. Assessors and the assessment team are responsible for selecting, rating and procuring assessment tools (Paulk, 1999), while assessors, sponsors and other affiliated parties are responsible for evaluating the conformance of an assessment tool to those requirements.
Construction of an Assessment Instrument
The standard does not stipulate the format or design the team should adopt for the assessment instrument. For example, the team may choose paper-based instruments such as forms, questionnaires or checklists (Barafort et al., 2002), or computer-based ones built on platforms such as spreadsheets, databases or integrated CASE tools.
Regardless of the format used, the main aim of an assessment instrument is to help the team of assessors perform the assessment in a consistent and repeatable manner, thereby reducing the assessors' subjectivity and ensuring the objectivity, usability and comparability of the final results (Paulk, 1999). Moreover, every indicator incorporated into an assessment tool shall be clear to the team and traceable to the corresponding process, generic practice or work product it is used to validate (Barafort, Di Renzo and Merlan, 2002).
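As a minimal sketch of the kind of computer-based instrument described above, the snippet below records a checklist of indicators and the assessors' judgements against them; the indicator names and the four-point adequacy scale are illustrative assumptions, not items defined by the standard.

from dataclasses import dataclass, field
from typing import Optional

# Illustrative adequacy scale for judging how well a practice is implemented.
RATINGS = ("not adequate", "partially adequate", "largely adequate", "fully adequate")

@dataclass
class ChecklistItem:
    indicator: str                # e.g. a base practice or work product to look for
    rating: Optional[str] = None  # one of RATINGS once judged
    evidence: list[str] = field(default_factory=list)

@dataclass
class AssessmentInstrument:
    process: str
    items: list[ChecklistItem]

    def record(self, indicator: str, rating: str, evidence: str) -> None:
        """Record a judgement and the evidence behind it for one indicator."""
        if rating not in RATINGS:
            raise ValueError(f"Unknown rating: {rating}")
        for item in self.items:
            if item.indicator == indicator:
                item.rating = rating
                item.evidence.append(evidence)
                return
        raise KeyError(f"Indicator not in checklist: {indicator}")

# Hypothetical usage for a single process.
instrument = AssessmentInstrument(
    process="Software construction",
    items=[ChecklistItem("Coding standards are defined"),
           ChecklistItem("Unit tests exist for new code")],
)
instrument.record("Coding standards are defined", "largely adequate", "Reviewed team wiki")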
Process improvement
The standard also outlines how to manage the improvement of an organisation's software processes (Barafort et al., 2002). The document gives guidance on using process assessment to understand the current state of a process and on creating and prioritising improvement strategies (Paulk, 1999). It is aimed primarily at the management of an organisation considering a software improvement programme, the parties involved in the improvement, software developers and designers, and all consultants, both external and internal (Mesquida, Mas, Amengual and Calvo-Manzano, 2012).
Moreover, the document also gives guidance on: first, an overview of the improvement process; second, applying the improvement methodology; third, evaluating and examining the cultural issues of the organisation; and finally, supporting software process improvement from a management viewpoint, including the eventual structure for process enhancement (Anacleto, von Wangenheim, Salviano and Savi, 2004).
Capability Dimension and Process Dimension in the context of the ISO/IEC 15504 Information technology - Process Assessment Model
In accordance with ISO/IEC 15504-2, a Process Assessment Model involves two dimensions: the process dimension and the capability dimension.
The process dimension relates to the concept of the Process Reference Model. The Process Reference Model defines each process by means of a purpose statement and one or more outcomes that should be satisfied when the process is performed (Rout and Tuffley, 2007). These outcomes are crucial for achieving the purpose of the process and for establishing the significant capabilities of the process (Barafort et al., 2002). The fifth part of ISO/IEC 15504 provides a software Process Assessment Model (the SPICE model), with the amended version of ISO/IEC 12207 acting as its Process Reference Model.
The capability dimension relates to the measurement framework for assessing process capability through process attributes and the associated capability levels (Rout, 1998). The Process Assessment Model also contains indicators that are used during the assessment to determine the rating of each process attribute for every process (Rout and Tuffley, 2007). Each attribute is rated according to the extent to which it is achieved, from not achieved through partially and largely achieved to fully achieved.
The process dimension of the SPICE model is its Process Reference Model, which was later replaced by the amended ISO/IEC 12207 (Rout, 1998). The Process Reference Model includes three core classes of processes: Primary Life Cycle Processes, Organisational Life Cycle Processes and Support Life Cycle Processes. According to Barafort et al. (2002), the capability dimension and the attributes of each process in the SPICE model are assessed against the following capability levels:
Level 0: Incomplete; the process is not implemented or fails to achieve its purpose. It has no attributes.
Level 1: Performed; the process is implemented and achieves its purpose. It has one attribute, Process Performance.
Level 2: Managed; the performed process now achieves its purpose and its execution is controlled. It has two attributes, Performance Management and Work Product Management.
Level 3: Established; the managed process of level 2 is now implemented as a defined and documented process that is capable of achieving its outcomes. It has two attributes, Process Definition and Process Deployment.
Level 4: Predictable; the established process of level 3 now achieves its outcomes within defined control limits, so that its behaviour is predictable. Its attributes are Process Measurement and Process Control.
Level 5: Optimising; the predictable process of level 4 is continually improved in order to meet the business goals of the organisation. It has two attributes, Process Innovation and Process Optimisation.
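As a minimal sketch of how these levels and attributes fit together, the snippet below maps each capability level to its process attributes and derives a level from a set of achieved attributes; the simple rule used here (a level counts only when its attributes and those of all lower levels are achieved) is an illustrative assumption rather than the standard's full rating rules.

# Capability levels and their process attributes (per the list above).
LEVEL_ATTRIBUTES = {
    0: [],                                                     # Incomplete
    1: ["Process Performance"],                                # Performed
    2: ["Performance Management", "Work Product Management"],  # Managed
    3: ["Process Definition", "Process Deployment"],           # Established
    4: ["Process Measurement", "Process Control"],             # Predictable
    5: ["Process Innovation", "Process Optimisation"],         # Optimising
}

def capability_level(achieved: set[str]) -> int:
    """Return the highest level whose attributes, and all lower levels'
    attributes, are in the achieved set."""
    level = 0
    for lvl in range(1, 6):
        if all(attr in achieved for attr in LEVEL_ATTRIBUTES[lvl]):
            level = lvl
        else:
            break
    return level

# Hypothetical example: performance plus both management attributes -> level 2.
print(capability_level({"Process Performance",
                        "Performance Management",
                        "Work Product Management"}))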
In conclusion, many organisations involved in software development still produce software of questionable quality. The strategic goal of software process standards is therefore to improve the quality of production through the application of a variety of standards, methodologies and tools that support software development based on the most effective and efficient practices, together with processes for assessing the maturity of those activities. As a consequence, it becomes possible to determine the changes needed in software development.
References
Anacleto, A., von Wangenheim, C.G., Salviano, C.F. and Savi, R., 2004, April. Experiences
gained from applying ISO/IEC 15504 to small software companies in Brazil. In 4th International
SPICE Conference on Process Assessment and Improvement, Lisbon, Portugal (pp. 33-37).
Barafort, B., Di Renzo, B. and Merlan, O., 2002, December. Benefits resulting from the
combined use of ISO/IEC 15504 with the Information Technology Infrastructure Library (ITIL).
In International Conference on Product Focused Software Process Improvement (pp. 314-325).
Springer, Berlin, Heidelberg.
El-Emam, K. and Garro, I., 1999. ISO/IEC 15504. International Organization for
Standardization.
El Emam, K. and Birk, A., 2000. Validating the ISO/IEC 15504 measure of software requirements analysis process capability. IEEE Transactions on Software Engineering, 26(6), pp.541-566.
McCaffery, F., Dorling, A. and Casey, V., 2010. Medi SPICE: an update. [Online]. Available at: http://eprints.dkit.ie/48/1/Medi_SPICE_An_Update.pdf
Mesquida, A.L., Mas, A., Amengual, E. and Calvo-Manzano, J.A., 2012. IT Service
Management Process Improvement based on ISO/IEC 15504: A systematic review. Information
and Software Technology, 54(3), pp.239-247.
Paulk, M.C., 1999, October. Analyzing the conceptual relationship between ISO/IEC 15504
(software process assessment) and the capability maturity model for software. In 1999
International Conference on Software Quality.
Peldzius, S. and Ragaisis, S., 2011. Comparison of maturity levels in CMMI-DEV and ISO/IEC
15504. Applications of Mathematics and Computer Engineering, pp.117-122.
Rout, T.P. and Tuffley, A., 2007. Harmonizing ISO/IEC 15504 and CMMI. Software Process: Improvement and Practice, 12(4), pp.361-371.
Rout, T., 1998. SPICE and the CMM: is the CMM compatible with ISO/IEC 15504. AquIS’98,
p.12.