
Benchmarking Standard

Draft

09/02/2014

International Software Benchmarking Standards Group (ISBSG)


Author: Anthony L Rollo, Pam Morris, Ewa Wasylkowski, Carol Dekkers, Pekka Forselius
Date: 9 February 2014
Version: 1.0
Reference: 209995461.doc

Commercial In Confidence

Ref: ISBSG Standard v1.0

Copyright ISBSG

Page 1 of 40 www.ISBSG.org


Contents

1. Purpose .... 8
2. Scope? NOT SURE ABOUT THIS SECTION .... 8
1.1 Tailoring This Standard. NOT SURE THIS SHOULD BE HERE? .... 8
1.2 Conformance .... 9
1.3 Limitations .... 9
2. Definitions .... 9
2.1 Benchmark (noun) .... 9
2.2 Benchmark (verb) .... 9
2.3 Benchmark Analyst .... 9
2.4 Benchmark Experience Base .... 10
2.5 Benchmark Librarian .... 10
2.6 Benchmark Method .... 10
2.7 Benchmark Procedure .... 10
2.8 Benchmark Process .... 10
2.9 Benchmark Process Owner .... 10
2.10 Benchmark User .... 10
2.11 Performance .... 10
2.12 Repository .... 10
2.13 Stakeholder .... 10
2.14 Type of Benchmark method .... 11
2.15 External Benchmarking .... 11
2.16 Internal Benchmarking .... 12
3. Application of this International Standard .... 12
3.1 Purpose and Outcomes of the software benchmarking process .... 12
3.2 Overview of this Standard .... 14
3.3 Overview of this Standard .... 15
4. Description of the Activities .... 17
4.1 Establish and sustain benchmark commitment .... 17
4.1.1 Accept Requirements .... 17
4.1.2 Maintain Requirements .... 17
4.1.3 Assign Responsibility .... 17
4.1.4 Assign Resources .... 18
4.1.5 Management Commitment .... 18
4.1.6 Communicate Commitment .... 18
4.2 Identify Information Needs .... 18
4.2.1 Benchmark information needs .... 18
4.2.2 Prioritise Information Needs .... 19
4.2.3 Select Information needs .... 20
4.3 Determine Questions .... 20
4.4 Establish Benchmark parameters .... 20
4.4.1 Benchmark Type .... 20
4.4.2 Benchmark Scope .... 21
4.4.3 Benchmark Frequency .... 22
4.5 Plan the Benchmark Process .... 23
4.5.1 Describe organisation .... 23
4.5.2 Select Measures .... 24
4.5.3 Document Measures .... 24
4.5.4 Select Benchmark Supplier .... 24
4.5.5 Select Benchmark Dataset .... 25
4.5.6 Define Procedures .... 26
4.5.7 Configuration Management .... 27
4.5.8 Evaluating Information Products .... 27
4.5.9 Evaluating Benchmark Process .... 27
4.5.10 Approving the Benchmark Process .... 27
4.5.11 Approval of Planning .... 28
4.5.12 Acquire support technologies .... 29
4.6 Perform the Benchmark Process .... 29
4.6.1 Integrate Procedures .... 29
4.6.2 Collect Data .... 30
4.6.3 Analyse Data .... 32
4.6.4 Communicate Information Products .... 32
4.7 Evaluate the Benchmark .... 33
4.7.1 Evaluate Measures .... 33
4.7.2 Evaluate the benchmark process .... 33
4.7.3 Identify potential improvements .... 33
5. Informative References .... 34
Annex A: Examples (informative) .... 35
A.1 Productivity example .... 35
A.2 Schedule adherence .... 36
Annex B Process Work Products (informative) .... 37
C.1 Timeliness .... 38
C.2 Efficiency .... 38
C.3 Defect containment .... 38
C.4 Stakeholder satisfaction .... 38
C.5 Process conformance .... 38



Foreword

ISBSG (the International Software Benchmarking Standards Group) is a not-for-profit organisation made up from member organisations. These member organisations are the various national software measurement organisations, including:

- Australian Software Metrics Association & Software Quality Association (ASMA-SQA)
- Chinese Software Benchmarking Standards Group (CSBSG)
- Deutschsprachige Anwendergruppe für Software-Metrik und Aufwandschätzung (DASMA)
- Finnish Software Measurement Association (FiSMA)
- Gruppo Utenti Function Point Italia - Italian Software Metrics Association (GUFPI-ISMA)
- International Function Point Users Group (IFPUG)
- Japanese Function Point Users Group (JFPUG)
- Korean Software Measurement Association (KOSMA)
- National Association of Software and Service Companies (NASSCOM - India)
- Netherlands Software Metrics Users Association (NESMA)
- UK Software Metrics Association (UKSMA)


Introduction
Since the benchmarking process is a specific instance of a measurement process, the process described here is a tailoring of the ISO/IEC 15939 Measurement Process standard. It has therefore adopted the structure and major activities described within ISO/IEC 15939, adapted to the specific needs of a benchmarking process.
The benchmarking of software and software-related activities could take one of several forms:

External Benchmarking: the process of continuously comparing and measuring an organisation with business leaders anywhere in the world, to gain information to help the organisation take action to improve its performance. Peer group benchmarks may also be used within an organisation, to allow comparisons between divisions or sites within that organisation.

Periodic, or Internal, Benchmarking: the process of determining a metric baseline for an organisational or functional unit for the purposes of comparison.

Continual improvement requires change within the organisation. Evaluation of change requires benchmarking of performance and comparison. A benchmark can be used as the vehicle for the performance measurement and comparison, as well as to provide the impetus to initiate a process improvement initiative. Benchmarks should lead to action, and not be employed purely to accumulate information. Benchmarks should have a clearly defined purpose.

This Benchmarking Standard defines a software benchmark process applicable to all software-related engineering and management disciplines. The process is described through a model that defines the activities of the benchmark process, which are required to adequately specify what information is required, how the measures and analysis results are to be applied, and how to determine if the analysis results are valid. The software benchmark process is flexible, tailorable, and adaptable to address the needs of different users.

Benchmarking can be regarded as a special application of software measurement, in that a benchmark requires some measurement of some aspect(s) of performance. Benchmarking extends the needs of measurement by the requirement to perform comparisons against some repository, either internal or external. For these reasons the ISO standard 15939:2002 has been utilised in the derivation of this ISBSG standard.


The ISBSG group has developed this first version of the document specifically for the needs of the ISBSG community. The longer-term strategy is to refine the document into a generic framework standard via contribution from the wider community (e.g. benchmarking companies). It is anticipated that the final document may become the basis of a future ISO standard. This benchmarking standard is applicable to benchmarking any aspect of Information Technology; however, in order to assist in understanding, we have provided informative guidance on how to use this standard to benchmark software development productivity, using Functional Size Measurement as the product measure.


1. Purpose
This Benchmarking Standard identifies the required activities and tasks that are necessary to successfully identify, define, select, apply and improve benchmarking for IT Service Management. It also provides standard definitions for benchmarking terms within the IT industry. This Benchmarking Standard is intended to provide guidance to organizations about issues and considerations for data selection and comparison in IT benchmarking. It will also assist them in being able to interpret the output from a benchmark. This Benchmarking Standard does not provide an exhaustive catalogue of all possible benchmark types, nor does it provide a recommended set of benchmarks for IT service management. It provides a generic framework to define the most suitable set of benchmark requirements that address specific information needs.

2. Scope. NOT SURE ABOUT THIS SECTION

This Benchmarking Standard is intended to be used by software suppliers and acquirers. Software suppliers include personnel performing management, technical and quality management functions in software development, maintenance, integration and product support organisations. Software acquirers include personnel performing management, technical and quality management functions in software procurement and user organisations. The following are examples of how this Benchmarking Standard can be used:
- By a supplier, to implement a benchmarking process to address specific project or organisational information requirements.
- By an acquirer (or third-party agents), for evaluating the performance of the supplier's processes and services in the context of a contract to supply software or software-related services.
- By an organisation, internally, to answer specific information needs.
This ISBSG Standard contains a set of activities and tasks that result in a benchmarking process that meets the specific needs of software organisations and projects. The tailoring process consists of modifying the non-normative descriptions of the tasks to achieve the purpose of the benchmarking process and to produce the required outcomes in a specific organisational context. The purpose and outcomes specified for the benchmarking process in this Standard must all be satisfied, and all the normative descriptions of the tasks must be satisfied. New activities and tasks not defined in this Benchmarking Standard may be added as part of tailoring. Throughout this Standard, "shall" is used to express a provision that is binding on the party that is applying this International Standard, "should" to express a recommendation among other possibilities, and "may" to indicate a course of action permissible within the limits of the Standard.

1.1 Tailoring This Standard. NOT SURE THIS SHOULD BE HERE.


1.2 Conformance

Conformance to this Standard is defined as satisfying the purpose and all the normative clauses within the tasks in clause 4. Any organisation imposing this Standard as a condition of trade is responsible for specifying and making public all task-specific criteria to be imposed in conjunction with this Standard.

1.3 Limitations

This Benchmarking Standard does not assume or prescribe an organisational model for benchmarking. The user of this Standard should decide, for example, whether a separate benchmark function is necessary within the organisation, or whether the benchmark function should be integrated within an existing function such as software metrics or software quality. Alternatively, as in many organisations where a benchmark process is invoked only periodically (e.g. annually or biannually), it may be more economic to rely upon an external data collection and/or benchmark agency.

This Standard is not intended to prescribe the name, format or explicit content of the documentation to be produced from the benchmarking process. The Standard does not imply that documents be packaged or combined in some fashion; these decisions are left to the user of the Standard. The benchmarking process should be appropriately integrated with the organisational quality system. Internal audits and non-conformance reporting are not explicitly covered in this International Standard, as they are assumed to be in the domain of the quality system. This Standard is not intended to conflict with any organisational policies, standards or procedures that are already in place. However, any conflict needs to be resolved, and any overriding conditions and situations need to be cited in writing as exceptions to the application of the International Standard.

2. Definitions

This standard uses many of the terms used in software measurement in general; only those terms associated with benchmarking have been defined here. The reader is referred to the ISO standard 15939 for definitions of measurement terms.

2.1 Benchmark (noun)

A value of some measure or derived measure which is indicative of the relationship between an organisational attribute and the values of that attribute maintained in a benchmarking repository.

2.2 Benchmark (verb)

Carrying out the set of processes undertaken to establish the relative value of some organisational attribute with respect to the data repository to be used for comparison purposes.

2.3 Benchmark Analyst

An individual or organisation that is responsible for the planning, performance, evaluation and improvement of a benchmark.


2.4 Benchmark Experience Base

A data store that contains the evaluation of the information products and of the benchmark process, as well as any lessons learned during benchmarking and analysis.

2.5 Benchmark Librarian

An individual or organisation that is responsible for managing the benchmark data store(s).

2.6 Benchmark Method

A logical sequence of operations, described generically, used in quantifying an attribute with respect to a specified scale (based on the definition in [International Vocabulary of Basic and General Terms in Metrology, 1993]).

2.7 Benchmark Procedure

A set of operations, described specifically, used in the performance of a particular benchmark according to a given method [International Vocabulary of Basic and General Terms in Metrology, 1993].

2.8 Benchmark Process

The process for identifying, defining, selecting, applying and improving software benchmarking within an overall project or organisational benchmark structure.

2.9 Benchmark Process Owner

An individual or organisation responsible for the benchmark process.

2.10 Benchmark User

An individual or organisation that uses the information products.

2.11 Performance

A derived measure which gives an indication of some attribute associated with how well, how quickly or how efficiently a product performs or a process is carried out. A typical performance attribute of a product may be quality, where a typical process attribute might be productivity.

2.12 Repository

An organised and persistent collection of multiple data sets that allows for their retrieval, and which is designated for use as the source of comparative measures for the purpose of benchmarking.

2.13 Stakeholder

An individual or organisation that sponsors benchmarking, provides data, is a user of the benchmark results, or otherwise participates in the benchmarking process.
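To make the definitions of "performance" and "benchmark (noun)" concrete, the sketch below derives a productivity measure (functional size delivered per unit of effort) and expresses it as a value relative to a benchmarking repository. The figures, and the choice of the repository median as the point of comparison, are purely illustrative and not prescribed by this Standard.

```python
from statistics import median

def productivity(size_fp: float, effort_hours: float) -> float:
    """Derived performance measure: functional size delivered per hour."""
    return size_fp / effort_hours

# Hypothetical repository of peer-project productivity values (FP/hour).
repository = [0.08, 0.11, 0.14, 0.09, 0.12, 0.16, 0.10]

# The organisational unit's own project: 400 FP delivered in 3,200 hours.
own = productivity(400, 3200)          # 0.125 FP/hour

# The benchmark (noun): a value indicative of the relationship between
# the organisational attribute and the values held in the repository.
benchmark = own / median(repository)   # ratio to the repository median
print(f"productivity = {own:.3f} FP/hour, {benchmark:.2f}x repository median")
```

Other points of comparison (mean, quartiles) would serve equally well; the essential point is that the benchmark is a relative value, not the raw measure itself.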


2.14 Type of Benchmark method

The type of benchmark method depends on the nature of the operations used to quantify an attribute. Two types may be distinguished:
- qualitative: quantification involving human judgement or comparison. An example would be rating some aspect in line with a model. Whilst often expressed numerically, these will be ordinal measures arrived at by subjective judgement.
- quantitative: based on numerical rules; may be subjected to arithmetic processes.
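The distinction matters in practice because only quantitative (ratio-scale) results support arithmetic such as averaging. A minimal illustration, using a hypothetical ordinal maturity scale and hypothetical values:

```python
# Qualitative: an ordinal rating arrived at by subjective judgement.
# Order is meaningful; arithmetic (e.g. averaging ratings) is not.
ORDINAL_SCALE = ["initial", "managed", "defined", "measured", "optimising"]
site_a, site_b = "managed", "defined"     # hypothetical ratings of two sites
b_rates_higher = ORDINAL_SCALE.index(site_b) > ORDINAL_SCALE.index(site_a)

# Quantitative: ratio-scale measures, safe for arithmetic processes.
defect_densities = [0.8, 1.2, 0.5]        # defects per function point
mean_density = sum(defect_densities) / len(defect_densities)
print(f"site_b rates higher: {b_rates_higher}, "
      f"mean defect density: {mean_density:.2f} defects/FP")
```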

2.15 External Benchmarking

The process of continuously comparing and measuring organisational units within an organisation with business leaders anywhere in the world, to gain information to help the organisation take action to improve its performance.

Note: External benchmarks are benchmarks where comparison is drawn between an organisation, or part of an organisation, and either industry average performances or competitor organisations within the same industry. These types of benchmark may be conducted for a variety of reasons, for example, to answer the following questions:
- How does the organisation compare to industry "standards"?
- Are we more or less effective in comparison to our competitors?
- Are our processes effective, or do we need to launch an improvement initiative?
- Can we reduce the costs of maintaining our existing portfolio and still provide a sufficient service?
- Is the outsource contract achieving the service level agreed in the contract?
- Are we getting value for money internally or from our suppliers?
- What is the likely ROI if we undertake a process improvement project?


2.16 Internal Benchmarking

The process of determining a metric baseline for an organisational or functional unit for the purposes of comparison.

Note: Internal benchmarks may be conducted for a variety of reasons; it is important to be clear as to the reason for conducting the benchmark, as this will be a major factor in determining the type of benchmark to be undertaken. Questions which may be addressed by an internal benchmark are:
- Is our process improvement initiative proving effective? (baselining)
- Is the outsource contract meeting the levels agreed in the contract?
- Are all the divisions and sites in our organisation performing at the same level?
- Has the introduction of a new technology achieved the benefits expected?
- What evidence is there to support the estimates that we are using?
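The first question above ("Is our process improvement initiative proving effective?") is typically answered by comparing each period's performance against a stored baseline. A minimal sketch, with hypothetical years and productivity figures:

```python
# Hypothetical internal baseline comparison: productivity (FP/hour) per year.
baseline_year = 2011
rates = {2011: 0.09, 2012: 0.10, 2013: 0.12}

baseline = rates[baseline_year]
for year, rate in sorted(rates.items()):
    # Percentage change of each period relative to the baseline period.
    change = 100.0 * (rate - baseline) / baseline
    print(f"{year}: {rate:.2f} FP/hour ({change:+.1f}% vs {baseline_year} baseline)")
```

A sustained positive trend against the baseline is evidence (though not proof, given the comparability caveats in clause 4) that the improvement initiative is effective.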

3. Application of this International Standard

This clause presents an overview of the software benchmark process. The objective is to orient the users of this Benchmarking Standard so that they can apply it properly within context.

3.1 Purpose and Outcomes of the software benchmarking process


The purpose of the software benchmarking process defined in this Standard is to collect, analyse and report data relating to the products developed and processes implemented within the organisational unit, to support effective management of the processes, and to objectively demonstrate the comparative performance of these processes.

As a result of successful implementation of the benchmarking process:
- organisational commitment for benchmarking will be established and sustained;
- the information objectives of technical and management processes will be identified;
- an appropriate set of questions, driven by the information needs, will be identified and/or developed;
- the benchmark scope will be identified;
- the required performance data will be identified;
- the required performance data will be measured, stored and presented in a form suitable for the benchmark;
- the benchmark outcomes will be used to support decisions and provide an objective basis for communication;
- benchmark activities will be planned;
- opportunities for process improvements will be identified and communicated to the relevant process owner;
- the benchmark process and measures will be evaluated.

The performance measures defined and utilised during the benchmark process should be integrated with the organisation's existing measurement process, which should comply with ISO/IEC 15939:2002.

Note: The purposes for doing the comparison may include:
- comparing other divisions or sites within your organisation;
- comparison with your closest competitors;
- benchmarking against industry performance averages;
- year-on-year comparisons of the organisation's performance for process improvement;
- obtaining performance measures from completed projects for input into project estimates.
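As one illustration of turning collected performance data into an information product for comparison against a repository, the sketch below reports the organisational unit's percentile position. The values, and the simple percentile definition used, are hypothetical and not mandated by this Standard.

```python
def percentile_rank(value: float, repository: list[float]) -> float:
    """Percentage of repository observations at or below the given value."""
    at_or_below = sum(1 for v in repository if v <= value)
    return 100.0 * at_or_below / len(repository)

# Hypothetical repository of peer productivity rates (FP/hour).
repository = [0.06, 0.08, 0.09, 0.10, 0.11, 0.12, 0.14, 0.16]

# Collected data for the organisational unit; analyse, then report.
own_rate = 0.12
rank = percentile_rank(own_rate, repository)
print(f"Unit sits at the {rank:.0f}th percentile of the repository")
```

The printed statement is the "information product" in the sense used above: a result that answers a stated information need and can support a decision.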


3.2 Overview of this Standard

This Benchmarking Standard defines the activities and tasks necessary to implement a benchmarking process. An activity is a set of related tasks that contributes towards achieving the purpose and outcomes of the benchmarking process (see clause 4.1). A task is a well-defined segment of work. Each activity is comprised of one or more normative tasks. This Standard does not specify the details of how to perform the tasks included in the activities.

The activities of the benchmarking process are illustrated in the process model in Figure 1. The activities are sequenced in an iterative cycle, allowing for continuous feedback and improvement of the benchmark process. The process model in Figure 1 is based upon the measurement process in ISO/IEC 15939-1:2001. Within the activities, the tasks are in practice also iterative.

Three activities are considered to be the Core Benchmark Process: Plan the Benchmark Process, Perform the Benchmark Process, and Evaluate and present the benchmark results. These activities mainly address the concerns of the benchmark user. The other activities provide a foundation for the Core Benchmark Process, and provide feedback to help sustain benchmark commitment and evaluate benchmark results. Benchmarks should be evaluated in terms of the added value they provide for the organisation, and only deployed where the benefit can be identified. These latter activities address the concerns of the benchmark process owner.

Figure 1 shows that the Core Benchmark Process is driven by the information needs of the organisation. For each information need, the Core Benchmark Process produces an information product that satisfies that need. The information product is presented to the organisation as a basis for decision-making. The link between benchmarks and an information need is described as the benchmark information model in Annex A and illustrated with examples.

The process defined in this Benchmarking Standard includes an evaluation activity (as shown in Figure 1). The intent is to emphasise that evaluation and feedback are an essential component of the benchmark process and should lead to improvements of the benchmark process. Evaluation can be simple and performed in an ad-hoc manner (when capability is low), or it can be quantitative, with sophisticated statistical techniques used to evaluate the quality of the benchmark process and its outputs (when capability is high).


At the centre of the cycle is the "benchmark experience base". This is intended to capture information needs from past iterations of the cycle, previous evaluations of information products, and evaluations of previous iterations of the benchmark process. This would include the measures that have been found to be useful in the organisational unit. The cycle also includes a benchmark repository; this may be incorporated in the benchmark experience base, or may be maintained externally, often by external organisations which provide access to their proprietary benchmark repository and analysis. No assumptions are made about the nature or technology of the benchmark repository or the benchmark experience base, only that they provide persistent storage. Information products stored in the benchmark experience base are expected to be reused in future iterations of the benchmark process.

Since the process model is cyclical, subsequent iterations may only update benchmark products and practices. This Standard does not imply that benchmark products and practices need to be developed and implemented for each instantiation of the process. The wording used in this International Standard adopts the convention that one is implementing the benchmark process for the first time (i.e. the first instantiation). During subsequent instantiations, this wording should be interpreted as updating or changing documentation and current practices.

The typical functional roles mentioned in this Standard are: stakeholder, sponsor, benchmark user, benchmark analyst, benchmark librarian, data provider and benchmark process owner. These are defined in the "Definitions" section of this International Standard. A number of work products are produced during the performance of the benchmark process. The work products are described in Annex B and mapped to the tasks that produce them.
3.3 Overview of this Standard

In this International Standard, clauses 4.n denote an activity, and clauses 4.n.n a task within an activity. Non-normative text is italicised. In addition, informative notes are headed "Note:".


4. Description of the Activities

In implementing a benchmark process in line with this International Standard, the organisational unit shall perform the activities described below. The "Requirements for Benchmark" from the technical and management processes trigger the benchmark process.

Note: Benchmarking is an imprecise tool, as it is not possible to find directly comparable:
- organisations
- contracts
- years
Clients, suppliers and the benchmarker should understand these limitations, to ensure that the process scope is designed to provide the most comparable benchmark data possible, and that the method and underlying assumptions are transparent and auditable. As technology changes, the original benchmark measurements may no longer provide suitable comparisons, and there may be a need to re-establish the baseline measurements, or the comparison group against which the benchmark is being conducted. It may even be necessary to reconsider the terms of an outsourcing contract, as new technologies may render older agreements unfair to one side or the other. Over time, the 'normal' balance of project types undertaken may alter significantly, with results similar to those described for changing technologies. Effective benchmarking begins at the contract stage. Due to a 'lag time' factor, benchmark results can be anywhere from six months to a year old.
4.1 Establish and sustain benchmark commitment

This activity consists of the following tasks:
- Accept the requirements for benchmark
- Assign resources
- Establish management commitment
- Communicate to the organisational unit

4.1.1 Accept Requirements

Requirements for the benchmark should be gathered and agreed between all the stakeholders for the benchmark.

4.1.2 Maintain Requirements

The requirements will be recorded in some suitable format and, as changes to these requirements emerge over time, they shall be maintained. It will be necessary to ensure that the impact of any changes to requirements is properly understood by the stakeholders before they can be accepted.

Note: As the benchmark process proceeds over a number of years there may well be changes to the requirements, caused by changes in business information needs or due to advances in technology. These changes may invalidate some of the data and conclusions of preceding benchmarks, and these effects will need to be evaluated to assess their impact.

4.1.3 Assign Responsibility

The sponsor of benchmark should assign this responsibility. It should be ensured that competent individuals are assigned this responsibility. Competent individuals may be acquired through transfer, coaching, training, sub-contracting and/or hiring professional benchmarking organisations. Competence includes knowledge of the principles of benchmark, how to collect data, perform data analysis and communicate the information products. At a minimum, competent individuals should be assigned the responsibility for the following typical roles: the benchmark user; the benchmark analyst; the benchmark librarian. The number of roles shown above does not imply the specific number of people needed to perform the roles. The number of people is dependent on the size and structure of the organisational unit. These roles could be performed by as few as one person for a small project.

4.1.4 Assign Resources

The sponsor of benchmark should be responsible for ensuring that resources are provided. Resources include funding and staff. Resource allocations may be updated in the course of activity 4.2.

4.1.5 Management Commitment

Commitment should be established when a "Requirement for Benchmark" is defined (see Figure 1). This includes the commitment of resources to the benchmark process and the willingness to maintain this commitment. The organisational unit should demonstrate its commitment through, for example, a benchmark policy for the organisational unit, allocation of responsibility and duties, training, and the allocation of budget and other resources. Commitment may also come in the form of a contract with a customer stipulating that certain measures be used.

4.1.6 Communicate Commitment

This can be achieved, for example, through organisational unit-wide announcements or newsletters.

4.2 Identify Information Needs

4.2.1 Benchmark information needs

Information needs are obtained from the stakeholders in benchmark. Information needs are based on: benchmark goals; constraints; risks; and organisational and/or project problems. The information needs may be derived from the business, organisational, regulatory (such as legal or governmental), product and/or project objectives. Information needs may address questions such as: "how do I predict the productivity of my planned project?"; "how do I evaluate the quality of the software product compared to industry norms?"; and "how do I know the cost effectiveness and efficiency of my supplier compared to industry norms?"

Note: In measuring performance in order to make comparisons, it is important to establish what aspects of performance are of interest to the organisation wishing to conduct a benchmark. The implication of the choice of a range of aspects to be measured is also important when utilising the results of a performance benchmark or measurement. Performance measurement may include:
- measures of productivity
- measures of time to market (time to deliver)
- measures of quality
- measures of cost
- measures of rework
- measures of user (customer) satisfaction

Before approaching benchmarking organisations, it is important that an organisation not only decides which aspects of performance are of importance, but also defines what is meant by the various terms described above. For example, what is meant by quality: is it the number of defects discovered in a delivered system in the first few months of operation, or should measures of usability, reliability, maintainability and so on be included? When measuring cost, is it simply the cost to develop a system, or should the cost to maintain the system be included?
4.2.2 Prioritise Information Needs

This prioritisation is normally accomplished by, or in conjunction with, the stakeholders. Only a subset of the initial information needs may be pursued further. This is particularly relevant if benchmark is being tried for the first time within an organisational unit, where it is preferable to start small. An example of a simple and concrete prioritisation approach is to ask a group of stakeholders to rank the information needs. For each information need, calculate the average rank. Then order the average ranks. This ordering would provide a prioritisation of the information needs.

Note: The purpose for which a benchmark is undertaken relates directly to the types of questions set out above, for which answers are sought. However, it must be recognised that the list of questions is not exhaustive and the answers to many other questions may be needed; it is nevertheless important to decide exactly what questions need to be addressed before undertaking a benchmark exercise, and hence defining the purpose of the benchmark. Possible reasons for undertaking a benchmark are:
- Set competitive range for metrics baseline
- Demonstrate ongoing competitiveness and continuous improvement in pricing and service levels
- Identify process improvement opportunities
- Identify best practices
- Decision making re: outsourcing
- Establish market position
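The average-rank prioritisation described above can be sketched in a few lines. This is a non-normative illustration: the information needs and stakeholder rankings are invented for the example, not drawn from this Standard.

```python
# Illustrative sketch of the average-rank prioritisation in 4.2.2.
# Rank 1 = highest priority. Names and rankings are invented examples.

def prioritise(rankings):
    """rankings: one dict per stakeholder, mapping information need -> rank.
    Returns the needs ordered by ascending average rank (highest priority first)."""
    needs = rankings[0].keys()
    averages = {need: sum(r[need] for r in rankings) / len(rankings) for need in needs}
    return sorted(averages, key=averages.get)

# Three stakeholders rank three candidate information needs.
stakeholder_ranks = [
    {"productivity": 1, "quality": 2, "supplier cost": 3},
    {"productivity": 2, "quality": 1, "supplier cost": 3},
    {"productivity": 1, "quality": 3, "supplier cost": 2},
]

print(prioritise(stakeholder_ranks))  # ['productivity', 'quality', 'supplier cost']
```

Here "productivity" ranks first because its average rank (1.33) is the lowest of the three.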
4.2.3 Select Information needs

No assumptions are made about the type of documentation; it can be on paper or electronic, for example. It is only necessary that the documentation is retrievable. The selected information needs should be communicated to all stakeholders. This is to ensure that they understand why certain data are to be collected and how they are to be used.

Examples of the definition of a measure derived from information needs can be found in Annex A to this document. The reader is also referred to the ISO/IEC 15939:2002 standard for further information on defining measures.

4.3 Determine Questions

The information needs previously identified shall be used in determining the questions which need to be answered. For example, if the information need is to establish the relative productivity of an organisational unit, the questions which need to be answered would be:
1. What is the productivity of the unit?
2. How does it compare with other organisational units?

4.4 Establish Benchmark parameters

4.4.1 Type of Benchmark

Note: Business Need. What are the questions that the business needs to have answered by the benchmark exercise?

Benchmarking Type:

Internal. Is an internal benchmark sufficient to answer the questions posed? If so, is it to be undertaken as a year on year comparison, in which case is sufficient data likely to be available in any single year to meet the business objectives? It should be remembered that a sample of one or two performance measurements is unlikely to be a sound basis for comparison; you may require several years of data before a suitable basis for comparison is available, especially if a range of project types or technologies are to be measured. Is the internal benchmark being conducted to allow comparison between divisions or sites in the organisation? If that is the case, do the disparate divisions or sites develop the same type of software, utilising the same technologies and approaches? A comparison between modern e-business development and the more traditional legacy based system is unlikely to provide a useful basis for comparison, as very different performance is to be expected.

External. If an external benchmark is to be conducted, then it is essential to ensure that the scope of the systems being measured is representative of the long-term development profile of the work being measured. Comparing the performance of a help desk in the first year of introduction of a radically new system against industry "standards", where most systems will be mature, is unlikely to reveal worthwhile insights. It is important that the period over which measurements are taken for the benchmark is comparable to the period of work which forms the bulk of the external benchmark data repository. Thus, to compare three months' maintenance and support effort against a benchmark database which consists of measurements reflecting a whole year's work is liable to lead to misleading comparisons. It must be established whether the benchmark period is representative of past and future years.
4.4.2 Benchmark Scope

The scope of benchmark is an organisational unit. This may be a single project, a functional area, the whole enterprise, a single site or a multi-site organisation. This may consist of software projects or supporting processes, or both. All subsequent benchmark tasks should be within the defined scope. In addition, all stakeholders should be identified. For example, these may be project managers, the Information Systems manager or the head of Quality Management. The stakeholders may be internal or external to the organisational unit. The scope of the organisational unit can be identified through interviews and the inspection of documentation such as organisational charts.

Note: For example, an organisational unit may be the Applications Development Function. Benchmarking this unit is often referred to as an AD/M benchmark; the applications development function usually includes enhancement projects over a certain size, while maintenance activity will carry out minor enhancements, usually of small duration (typically less than ten days or smaller than 50 function points). Probably the most useful benchmark for this area is a project based benchmark, and several benchmark providers will undertake this type of benchmark. Many providers, however, have a standard product which they will recommend; this may well be acceptable, as it may include a project-based element. However, if this is the type of benchmark you believe will answer the business questions you wish to answer, then check carefully that you will get the benefits you need from a wider benchmark, which will of course provide you with information on other aspects of your organisation's performance, but may increase your monitoring costs for little real benefit.

4.4.3 Benchmark Frequency

Before undertaking a benchmarking exercise, it is important to decide at what frequency it will be necessary to carry out further benchmarks. Benchmark providers may suggest standard frequencies, usually annually or biannually. However, you need to decide what intervals meet your needs. This will relate to the questions decided upon during the initial planning stage. Various strategies are available:

Annually throughout the contract. If you have entered an outsourcing agreement, then it might be wise to have an annual benchmark at the end of each year of the contract. However, remember that the outsourcer will probably not achieve all the benefits in a linear manner. The outsourcer will have to absorb some or all of your staff, so they will need to make some cultural change; in addition, user departments will now be working in a different relationship to the IT supplier than previously. There will be several part-complete projects during the first year, so improvement will be difficult on these projects. Many contracts do not demand a demonstrated improvement in the first year, and some allow the first and second years to form the baseline for improvement, though this can act as a disincentive to making improvements in year one of the contract.

Prior to contract renewal. It may well be that you are satisfied to have the performance of the outsourcer demonstrated just prior to the end of the contract, when you are considering its renewal. This is probably not a sufficient frequency if you have a contract longer than a couple of years, since if there is no demonstrable improvement you need to understand this well before the end of, say, a five year contract.

After a pre-defined period (e.g. second/third year). This is a compromise between the first and second options, and will mean that in a long contract performance is being monitored at several intermediate stages. This strategy is probably adequate for contracts over 6 years in length. It obviates the problem of non-linear improvement, which can create problems in the contract relationship, whilst still allowing an organisation to have demonstrable progress towards the expected benefits.

Biennial. If you are conducting internal process improvement then you may well wish to monitor the progress at a frequency greater than annually. A benchmark allows you to identify where improvement effort should be focussed. A benchmark carried out at this frequency is likely to be an internal benchmark monitoring your own progress. This can be supplemented by an external benchmark at some lower frequency, say every two years.

4.5 Plan the Benchmark Process

This activity consists of the following tasks:
- Characterize Organisational unit
- Select measures
- Select the Benchmarking supplier
- Select Benchmark Dataset

4.5.1 Describe organisation

The Organizational unit shall be explicitly described. The organizational unit provides the context for benchmark, and therefore it is important to make explicit this context, the assumptions that it embodies and the constraints that it imposes. Attributes of the organizational unit that are relevant to the interpretation of the information products should be identified. Characterisation can be in terms of organizational processes, interfaces amongst divisions/departments and organizational structure. Processes may be characterized in the form of a descriptive process model. The organisational unit characterization should be taken into account in all subsequent activities and tasks.


4.5.2 Select Measures

Candidate measures that allow one or more questions arising from the selected information needs to be answered shall be identified. There should be a clear link between the information needs, the questions and the candidate measures. Such a link can be made using the Benchmark Information Model described in Annex A. New measures should be defined in sufficient detail to allow for a selection decision (task 4.4.3). Other International Standards, e.g. ISO/IEC 9126:1991 and ISO/IEC 14143:1998, describe some commonly used software measures and requirements for their definition. A new measure does not have to be defined anew, but may involve an adaptation of an existing measure. The selected measures should reflect the priority of the information needs. Further example criteria that may be used for the selection of measures may be found in ISO/IEC 15939:2002. Measures that have been selected but that have not been fully specified should be developed. This may involve the definition of objective measures, for example a product coupling measure, or subjective measures, such as a user satisfaction questionnaire, to meet new information needs. It should be noted that context information may need to be considered as well. For example, in some cases measures may need to be normalised. This may require the selection and definition of new measures, such as a nominal measure identifying the programming language used.
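A selected measure can be specified unambiguously in terms of its base measures. The following non-normative sketch assumes an "hours per function point" delivery-rate measure; the function name, values and validity rule are illustrative assumptions, not definitions from this Standard.

```python
# Illustrative sketch: a derived measure computed from two base measures.
# The values are invented; the validity rule is an assumption for the sketch.

def hours_per_function_point(effort_hours, functional_size_fp):
    """Delivery rate: total project effort divided by functional size."""
    if functional_size_fp <= 0:
        raise ValueError("functional size must be positive")
    return effort_hours / functional_size_fp

effort = 1300.0  # base measure: person-hours recorded for the project
size = 260.0     # base measure: functional size in function points
print(hours_per_function_point(effort, size))  # 5.0 hours per function point
```

Writing the computation down like this makes the link between base measures and the derived measure explicit and auditable.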

4.5.3 Document Measures

An example of a unit of benchmark is "hours per function point" or "dollars per function point". The formal definition describes exactly how the values are to be computed, including input measures and constants for derived measures. Note that such definitions may already exist in the "Benchmark Experience Base", which should be consulted. The method of data collection may be, for example, a static analyser, a data collection form or a standard questionnaire. Annex A provides guidelines for linking the measures to the information needs through the Benchmark Information Model.

4.5.4 Select Benchmark Supplier

Criteria for selecting the benchmarking supplier shall be defined.

Note: the decision needs to be made whether to use a benchmark provider. Do they have relevant experience:
- in the benchmarking area?
- within the target region?
- within the industry sector?

The benchmarking supplier should be assessed for the methodology they propose in the following areas:
- Sampling techniques (if proposed)
- Analysis techniques
- Extent to which a customized solution is possible
- Sample findings reports and conclusions
- Logistics
- Required resource commitments
- Ability to meet project timetable
- On-site data collection
- Cost

Note: While it is typically a client organisation that negotiates a benchmark provision in an outsourcing contract, costs tend to be borne by both the client and supplier organisations; hence both should be actively involved in the planning activities. Capabilities and costs of benchmarking organisations can differ widely. Client and supplier need to agree on the evaluation criteria between them.

4.5.5 Select Benchmark Dataset

Criteria for selecting the benchmarking dataset shall be defined.

Note: It is recommended that the dataset is from organisations of comparable size. Coverage of selected metrics within the dataset needs to be comparable with the measures previously defined. The following attributes of the measures should be evaluated:
- Segmentation by industry sector, application type and/or business environment
- Process maturity levels (e.g. CMMI, CMM or SPICE assessment level)
- Project profiles (e.g. project size, project volumes, development or enhancement or migration, project duration)
- Delivery mechanisms (for example package customization, open source, bespoke)
- Currency of data
- Technology platforms (e.g. mainframe, client server, PC, web based, multi-tiered etc.)

The benchmark dataset should be evaluated to ensure it satisfies data integrity requirements.

Note: The benchmark dataset should be checked to ensure that adequate data validation procedures are performed before data is accepted.

The remaining tasks of this activity are:
- Define data collection, analysis, and reporting procedures
- Define criteria for evaluating the information products and the benchmark process
- Review, approve, and staff benchmark tasks
- Acquire and deploy supporting technologies

Information products and evaluation results in the "Benchmark Experience Base" should be consulted during the performance of this activity. Examples of the benchmark planning details that need to be addressed during this activity are described in the tasks below.
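The dataset selection criteria above can be applied as a simple filter over candidate benchmark data. This is a non-normative sketch; the attribute names and criteria values are invented for illustration.

```python
# Illustrative sketch: filtering a candidate benchmark dataset on attributes
# such as industry sector and technology platform. Field names are invented.

def comparable(project, criteria):
    """True when the project matches every required attribute value."""
    return all(project.get(field) == wanted for field, wanted in criteria.items())

dataset = [
    {"sector": "banking", "platform": "mainframe", "type": "enhancement"},
    {"sector": "banking", "platform": "web", "type": "development"},
    {"sector": "retail", "platform": "web", "type": "development"},
]
criteria = {"sector": "banking", "platform": "web"}

selected = [p for p in dataset if comparable(p, criteria)]
print(len(selected))  # 1 comparable project
```

In practice each criterion (maturity level, delivery mechanism, currency of data, and so on) becomes one more field in the filter, and the size of the surviving subset indicates whether a meaningful comparison is possible.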
4.5.6 Define Procedures

Procedures for data collection, storage and verification shall be defined. The procedures should specify how data are to be collected, as well as how and where they will be stored. Data verification may be accomplished through an audit.

Note: Consideration needs to be given to the scope of the data collection, i.e. whether a sample set of projects or applications is selected for benchmarking, versus the entire population within the organisational unit. If a sample set is chosen then consideration needs to be given to the following: size of the sample set; sample selection technique (e.g. random, representative profiling).

Verify that the profile of the data collection set adequately matches the profile for the benchmark data collection set.

Note: For example, if the majority of your development work is small projects, then you should choose a benchmark data set that has sufficient small projects to enable meaningful comparison.

Expecting a single result that is supposed to 'magically' factor in the complexities of the systems development and business environments is naive, and a waste of time and money. Some degree of profiling must be undertaken. Inclusion/exclusion rules for development projects that span the boundaries of the benchmark period must be clearly defined and equitable. (Quantify what is likely to be excluded based upon the rules. Consider assessing % complete.) You need to decide whether reworked functional size will be included in the product output measure, or just the delivered functional size, and ensure that the benchmark dataset has comparable measures.

Verify that the definitions for the base measures and derived measures of the data collection correspond to, or can be derived from, those used for the benchmark data set.

Note: if functional size is used as the base measure of the product delivered, ensure that it has been measured using an equivalent technique. E.g. if the data collection set has had its functional size measured in accordance with an ISO conformant FSM Method, it is not valid to compare it with a set which has functional size derived from lines of code or estimated from other parameters.

Procedures for data analysis and reporting of information products shall be defined. The procedures should specify the data analysis method(s) and format, and methods for reporting the information products.

Note: Prior to the benchmark proceeding, all parties need to agree on the content, degree of detail, layout, review and approval process for reporting the benchmark information products.

The range of possible statistical tools that would be needed to perform the data analysis should be identified.
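The profile-matching check recommended in the note above can be sketched as follows. The size band, sample data and tolerance are illustrative assumptions, not requirements of this Standard.

```python
# Illustrative sketch: does the collected project set have a size profile
# broadly matching the benchmark data set? Threshold and tolerance invented.

def size_profile(sizes_fp, small_threshold=100):
    """Fraction of projects below the 'small' threshold (function points)."""
    return sum(1 for s in sizes_fp if s < small_threshold) / len(sizes_fp)

collected = [40, 60, 80, 90, 400]          # mostly small projects
benchmark = [50, 70, 120, 300, 500, 900]   # fewer small projects

gap = abs(size_profile(collected) - size_profile(benchmark))
if gap > 0.25:  # arbitrary tolerance for the sketch
    print("profiles differ: comparison may be misleading")
```

A real procedure would compare several profile dimensions (size bands, project types, platforms), but even a crude check like this flags a mismatch before analysis starts.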
4.5.7 Configuration Management

Items such as the raw data, information products and selected information needs should be placed under configuration management. This may be the same configuration management procedure used in other parts of the organisational unit.

4.5.8 Evaluating Information Products

Criteria for the evaluation of the information products shall be defined. These criteria would allow one to determine whether the data that are needed have been collected and analysed with sufficient quality to satisfy the information needs. The criteria need to be defined at the beginning and act as success criteria. The criteria need to be defined within the context of the technical and business objectives of the organisational unit. Example criteria for the evaluation of information products are the accuracy of a benchmark procedure and the reliability of a benchmark method. However, it may be necessary to define new criteria and measures for evaluating the information products.

4.5.9 Evaluating Benchmark Process

Criteria for the evaluation of the benchmark process shall also be defined, within the context of the technical and business objectives of the organisational unit. Examples of such criteria are timeliness and efficiency of the benchmark process. However, it may be necessary to define new criteria and measures for evaluating the benchmark process.

4.5.10 Approving the Benchmark Process

The criteria for approval will be based upon meeting those criteria defined for the evaluation. They should also include criteria such as the process for dispute resolution and the basis upon which any benchmark report will be accepted.

The following are example elements that may be included in a benchmark plan:
- characterization of the organisational unit
- business and project objectives
- prioritized information needs and how they link to the business, organisational, regulatory, product and/or project objectives
- definition of the measures and how they relate to the information needs
- responsibility for data collection and sources of data
- when the data will be collected (e.g. at the end of each inspection) and at what frequency
- tools and procedures for data collection (e.g. instructions for executing a static analyser)


- data storage
- requirements on data verification
- data entry and verification procedures
- data analysis plan, including frequency of analysis and reporting
- any necessary organisational and/or software process changes to implement the benchmark plan
- criteria for the evaluation of the information products
- criteria for the evaluation of the benchmark process
- confidentiality constraints on the data and information products, and actions/precautions necessary to ensure confidentiality
- schedule and responsibilities for the implementation of the benchmark plan, including pilots and organisational unit wide implementation
- procedures for configuration management of data, benchmark experience base and data definitions
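As a non-normative illustration, the example plan elements can be captured as a structured record. The field names below are an invented subset, not a schema required by this Standard.

```python
# Illustrative sketch: a benchmark plan as a structured record.
from dataclasses import dataclass, field

@dataclass
class BenchmarkPlan:
    organisational_unit: str
    objectives: list
    information_needs: list        # in priority order
    measures: dict                 # measure name -> formal definition
    data_collection_responsible: str
    collection_frequency: str
    evaluation_criteria: list
    confidentiality_constraints: list = field(default_factory=list)

plan = BenchmarkPlan(
    organisational_unit="Applications Development",
    objectives=["demonstrate continuous improvement"],
    information_needs=["productivity", "quality"],
    measures={"PDR": "effort hours / functional size (FP)"},
    data_collection_responsible="benchmark librarian",
    collection_frequency="at project closure",
    evaluation_criteria=["accuracy", "timeliness"],
)
print(plan.information_needs[0])  # productivity
```

Keeping the plan in one structured artefact makes it straightforward to place under the configuration management called for in 4.5.7.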

4.5.11 Approval of Planning

The benchmark planning tasks constitute all tasks from clause 4.5.1 through 4.5.12. The results of benchmark planning include the data collection procedures, storage, analysis and reporting procedures, evaluation criteria, schedules and responsibilities. Benchmark planning should take into consideration improvements and updates proposed from previous benchmark cycles ("Improvement Actions" in Figure 1), as well as relevant experiences in the "Benchmark Experience Base". Criteria such as the feasibility of making changes to existing plans in the short-term, the availability of resources and tools for the realisation of changes, and any potential disruptions to projects from which data is collected should be considered when selecting proposed improvements to implement. If benchmark planning information already exists, for example from a previous benchmark cycle, then it may only need to be updated as opposed to being "developed". Stakeholders must review and comment on the benchmark planning information. The sponsor of benchmark will then approve the benchmark planning information. Approval demonstrates commitment to benchmark.

Note: The benchmark planning information should be agreed to by the management of the organisational unit, and resources allocated. For approval, the planning information may undergo a number of iterations. Note that benchmark may be piloted on individual projects before committing to organisation-wide use. Therefore, resource availability may be staged.
4.5.12 Acquire support technologies

Available supporting technologies shall be evaluated and appropriate ones selected. Supporting technology may consist of, for example, automated tools and training courses. The types of automated tools that may be needed include graphical presentation tools, data analysis tools and databases. Tools for collecting data, for example static analysers for the collection of product data, may also be required. This may involve the modification and/or extension of existing tools, and the calibration and testing of the tools. Based on the evaluation and selection of supporting technologies, the benchmark planning information may have to be updated. The selected supporting technologies shall be acquired and deployed. If the supporting technologies concern the infrastructure for data management, then access rights to the data should be implemented in accordance with organisational security policies and any additional confidentiality constraints.

4.6 Perform the Benchmark Process

This activity consists of the following tasks:
- Integrate procedures
- Collect data
- Present data to the benchmark
- Analyse data
- Evaluate benchmark results

Performing the benchmark process should be done in accordance with the planning information described in clause 4.5. Information products and evaluation results in the "Benchmark Experience Base" should be consulted during the performance of this activity.

4.6.1 Integrate Procedures

Data generation and collection shall be integrated into the relevant processes. Integration may involve changing current processes to accommodate data generation and collection activities. For example, the inspection process may be changed to require that the moderator of an inspection hand over the preparation effort sheets and defect logs to the benchmark librarian at the closure of every inspection. This would then necessitate modifying inspection procedures accordingly. Integration involves a trade-off between the extent of impact on existing

processes that can be tolerated and the needs of the benchmark process. The required changes to collect data should be minimized. The extent of integration varies depending on the type of measures and the information needs. For example, a one-of-a-kind staff morale survey requires little integration, whereas filling in time sheets at the end of every week requires the staff to keep track of their effort during the week. The data that need to be collected may include extra measures defined specifically to evaluate the information products, or performance measures to evaluate the benchmark process.

The integrated data collection procedures shall be communicated to the data providers. This communication may be accomplished during, for example, staff training, an orientation session, or via a company newsletter. The objective of communicating the data collection procedures is to ensure that the data providers are competent in the required data collection. Competence may be achieved, for example, through training in the data collection procedures. This increases confidence that data providers understand exactly the type of data that are required, the format that is required, the tools to use, when to provide data, and how frequently. For example, the data providers may be trained on how to complete a defect data form, to ensure that they understand the defect classification scheme and the meanings of different types of effort (such as isolation and correction effort).

Data analysis and reporting shall be integrated into the relevant processes. Data analysis and reporting may need to be performed on a regular basis. This requires that data analysis and reporting be integrated into the current organisational and project processes as well.

4.6.2 Collect Data

Data shall be generated and collected. Data may be generated, for example, by a static code analyser that calculates values for product measures every time a module is checked in. Data may be collected, for example, by completing a defect data form and sending it to the benchmark librarian. The collected data shall be verified.


Ensure that all of the contributing base measures collected cover the same scope.

Note: if collecting functional size and effort or cost, you need to ensure that:
- all effort collected has been expended in functional size generating activities, e.g. exclude effort related to user training when measuring development productivity;
- the scope of the work effort breakdown of the data collected corresponds to the scope of the work effort breakdown of the benchmark data set, e.g. if a supplier only expends development effort post design, then they need to be benchmarked against projects which have collected effort for the same development activities;
- the scope of the effort collected is for the same set of project related activities, e.g. is the effort for administrative work, user participation, quality assurance etc. included in both the collected and benchmarked datasets;
- the effort and costs allocated to a particular organisational unit (e.g. project) have actually been expended on the organisational unit for which they have been recorded.

Data verification may be performed by inspecting against a checklist. The checklist should be constructed to verify that missing data are minimal, and that the values make sense. Examples of the latter include checking that a defect classification is valid, or that the size of a component is not ten times greater than all previously entered components. In case of anomalies, the data provider(s) should be consulted and corrections made to the raw data where necessary. Automated range and type checks may be used. Data verification may also be performed after data have been stored, since errors (for example, data entry errors) may be introduced during data storage. Data verification should be the responsibility of the benchmark librarian in conjunction with the data provider(s).
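The checklist-style verification described above can be sketched as follows. The classification scheme and field names are invented for illustration; the ten-times size check follows the example in the text.

```python
# Illustrative sketch: checklist verification of a collected data record.
# Classification scheme and field names are invented for the sketch.

VALID_DEFECT_CLASSES = {"requirements", "design", "code", "documentation"}

def verify_record(record, previous_sizes):
    """Return a list of problems found; an empty list means the record passes."""
    problems = []
    if record.get("defect_class") not in VALID_DEFECT_CLASSES:
        problems.append("invalid defect classification")
    size = record.get("size_fp")
    if not isinstance(size, (int, float)) or size <= 0:
        problems.append("missing or non-positive size")
    elif previous_sizes and size > 10 * max(previous_sizes):
        problems.append("size more than ten times any previous component")
    return problems

suspect = {"defect_class": "coding", "size_fp": 5000}
print(verify_record(suspect, previous_sizes=[120, 300, 450]))
```

Anomalies flagged in this way would then go back to the data provider(s) for correction, as described above.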
Data shall be stored, including any context information necessary to verify, understand or evaluate the data. Note that the data store does not have to be an automated tool. It is possible to have a paper-based data store, for example, in the situation where only a handful of measures are collected for a short period of time in a small organisation.


4.6.3 Analyse Data

The collected data shall be analysed. Data may be aggregated, transformed, or re-coded prior to analysis. Guidance for performing statistical analysis may be found in ISO TR 10017:1999. The data analysis results shall be interpreted by the benchmark analyst(s). The benchmark analyst(s) would be able to draw some initial conclusions based on their results. However, since the analyst(s) may not be directly involved in the technical and management processes, such conclusions need to be reviewed by other stakeholders as well (see 4.6.4). All interpretations should take into account the context of the measures. The data analysis results make up one or more indicators. The collected data should be normalised if appropriate. Note: data measures need to be normalised prior to benchmarking to ensure comparability. Costs should be normalised to account for variability caused by factors such as salary fluctuations, inflation and currency exchange rates. If the collected data set was recorded in adjusted functional size and the benchmark data set was in unadjusted function points, then the collected data set would need to have the adjustment reversed. The data analysis results shall be reviewed. The review is intended to ensure that the analysis was performed, interpreted, and reported properly. It may be an informal "self review", or a more formal inspection process.
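The normalisation steps noted above can be illustrated with a small sketch. It assumes the conventional function point relationship adjusted size = unadjusted size × value adjustment factor (VAF), and uses invented cost and inflation figures.

```python
# Illustrative normalisation before benchmarking. The VAF and the
# inflation and cost figures below are assumed example inputs.

def reverse_adjustment(adjusted_fp, vaf):
    """Recover unadjusted function points when the benchmark data set
    uses unadjusted FP (conventionally, adjusted = unadjusted * VAF)."""
    return adjusted_fp / vaf

def normalise_cost(cost, year, base_year, annual_inflation):
    """Express a historical cost in base-year money for comparability."""
    return cost * (1 + annual_inflation) ** (base_year - year)

print(round(reverse_adjustment(460, vaf=1.15)))          # → 400
print(round(normalise_cost(100_000, 2010, 2013, 0.03)))  # → 109273
```

Currency conversion and salary normalisation would follow the same pattern: apply a documented factor so that collected and benchmark data sets are expressed in comparable units.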

4.6.4 Communicate Information Products

The information products shall be reviewed with the data providers and the benchmark users. This is to ensure that they are meaningful and, if possible, actionable. Qualitative information should be considered as a support to interpreting quantitative results. The information products shall be documented. The information products shall be communicated to the data providers and the benchmark users. Feedback should be provided to the stakeholders as well as being sought from the stakeholders. This ensures useful input for evaluating the information products and the benchmark process.


4.7 Evaluate the Benchmark

This activity consists of the following tasks:
- Evaluate measures and benchmark process
- Identify potential improvements to the benchmark process
4.7.1 Evaluate Measures

The information products shall be evaluated against the specified evaluation criteria, and conclusions on strengths and weaknesses of the information products drawn. The evaluation of information products may be accomplished through an internal or independent audit. Suitable example criteria for the evaluation of information products can be found in ISO/IEC 15939. The inputs to this evaluation are the performance measures, the information products, and the benchmark user feedback. The evaluation of information products may conclude that some measures ought to be removed, for example, if they no longer meet a current information need.

4.7.2 Evaluate the Benchmark Process

The benchmark process shall be evaluated against the specified evaluation criteria, and conclusions on strengths and weaknesses of the benchmark process drawn. The evaluation of the benchmark process may be accomplished through an internal or independent audit. The quality of the benchmark process influences the quality of the information products. The inputs to this evaluation are the performance measures, the information products, and the benchmark user feedback. Lessons learned from the evaluation shall be stored in the "Benchmark Experience Base". Note: if the benchmark had been performed to assess a supplier in an outsourcing contract, then the Benchmark Report may be considered for input into re-calibration of performance targets. Lessons learned may take the form of strengths and weaknesses of the information products, of the benchmark process, of the evaluation criteria themselves, and/or experiences in benchmark planning (for example, "there was great resistance by the data providers in collecting a specific measure at a specific frequency").

4.7.3 Identify Potential Improvements

Potential improvements to the benchmark process shall be identified. Such "Improvement Actions" should be used in future instances of the "Plan the Benchmark Process" activity. The costs and benefits of potential improvements should be
considered when selecting the "Improvement Actions" to implement. It should be noted that making a particular improvement may not be cost effective, or the benchmark process may be good as it is, and therefore no potential improvements may be identified. Potential improvements to the benchmark process shall be communicated to the benchmark process owner and other stakeholders. This would allow the benchmark process owner to make decisions about potential improvements to the benchmark process. If no potential improvements are identified in this clause, then it should be communicated that there were no potential improvements.

5. Informative References

ISO/IEC 2382-1:1993 Data Processing – Vocabulary – Part 1: Fundamental Terms.
ISO/IEC 2382-20:1990 Information Technology – Vocabulary.
ISO 8402:1994 Quality Management and Quality Assurance – Vocabulary.
ISO 9001:1994 Quality Systems – Models for Quality Assurance in Design/Development, Production, Installation and Servicing.
ISO/IEC 12207:1995 Information Technology – Software Life Cycle Processes.
ISO/IEC 9126:1991 Information Technology – Software Product Evaluation – Quality Characteristics and Guidelines for their Use.
ISO/IEC 14143-1:1998 Information Technology – Software Measurement – Functional Size Measurement – Definition of Concepts.
ISO/IEC 14598-1:1996 Information Technology – Software Product Evaluation – Part 1: General Overview.
ISO/IEC TR 15504-2:1998 Information Technology – Software Process Assessment – Part 2: A Reference Model for Processes and Process Capability.
ISO/IEC TR 15504-9:1998 Information Technology – Software Process Assessment – Part 9: Vocabulary.
ISO, International Vocabulary of Basic and General Terms in Metrology, 1993.
ISO TR 10017:1999 Guidance on Statistical Techniques for ISO 9001:1994.
F. Roberts, Measurement Theory with Applications to Decisionmaking, Utility, and the Social Sciences, Addison-Wesley, 1979.


Annex A: Examples (informative)

The following subsections provide examples of instantiations of the model that address specific information needs. These examples are not designed to recommend best benchmark practices, but rather to show the applicability of the benchmark information model in a variety of common situations.
A.1 Productivity example

The decision-maker in this example needs to select a specific productivity level as the basis for project planning. The measurable concept is that productivity is related to effort expended and amount of software produced. Thus effort and code are the measurable entities of concern. This example assumes that the productivity is estimated based on past performance. Thus data for the base measures (numbered entries in the table below) must be collected and the derived measure computed for each project in the data store. Regardless of how the productivity number is arrived at, the uncertainty inherent in software engineering means that there is a considerable probability that the estimated productivity won't be realised exactly. Estimating productivity based on historical data enables the computation of confidence limits that help to assess how close actual results are likely to come to the estimated value.

Information Need: Establish average productivity of development projects undertaken in the last year
Measurable Concept: Project productivity
Relevant Entities: Functional size of projects; Effort expended by projects
(Measurable) Attributes: Functional size; Timecard entries (recording effort)
Base Measures: 1. Project X functional size (e.g. CFSU, FPs); 2. Project X hours of effort
Measurement Method: ISO/IEC 14143-1 conformant FSM method; Add timecard entries together for Project X
Type of Method: Objective; Objective
Scale: Integers from the minimum for the FSM method to infinity; Real numbers from zero to infinity
Type of Scale: Ordinal; Ratio
Unit of Benchmark: FSM method unit; Hour
Derived Measure: Project X productivity
Function: Divide Project X FSM units by Project X hours of effort
Indicator: Average productivity
Model: Compute mean and standard deviation of all project productivity values
Decision Criteria: Computed confidence limits based on the standard deviation indicate the likelihood that an actual result close to the average productivity will be achieved. Very wide confidence limits suggest a potentially large departure and the need for investigation to determine the reasons for the departure. This should be followed by a plan for action to improve the productivity of projects.

A.2 Schedule adherence

The decision-maker in this example needs to evaluate whether or not the rate of progress on a project is sufficient. The measurable concept is that progress is related to the amount of work planned and the amount of work completed. Thus planned work items are the entities of concern. This example assumes that the status (degree of completion) of each unit is reported by the developer assigned to it. Thus data for the base measures (numbered entries in the table below) must be collected and the derived measure computed for each work item in the plan. Since the status of units is a subjective assessment, a simple numerical threshold is used as a decision criterion rather than statistical limits.

Information Need: Determine to what extent projects have adhered to their planned schedules
Measurable Concept: Project schedule lead/lag
Relevant Entities: Planned project schedule; Achieved project turnout
(Measurable) Attributes: Time (months); Time (months)
Base Measures: 1. Project X planned scheduled time; 2. Project X actually achieved time to deliver (note: planned time is in months)
Measurement Method: Subtract start date from end date; result in elapsed months
Type of Method: Objective; Objective
Scale: Positive real numbers to infinity; Positive real numbers to infinity
Type of Scale: Ratio; Ratio
Unit of Benchmark: Month; Month
Derived Measure: Schedule lag/lead for Project X
Function: Subtract Project X schedule months from Project X actual elapsed months
Indicator: Average schedule deviation
Model: Compute mean and standard deviation of all absolute schedule leads
Decision Criteria: Computed confidence limits based on the standard deviation indicate the likelihood that an actual result close to the average schedule deviation will be achieved. Very wide confidence limits suggest a potentially large schedule deviation and the need for investigation to determine the reasons. This should be followed by a plan for action to reduce the schedule deviation of projects.
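Both example models reduce to the same statistical machinery: a derived measure per project, then a mean, a standard deviation, and confidence limits as the decision criterion. A minimal sketch, using invented project figures (not data from this standard), might look like:

```python
from statistics import mean, stdev

# Invented per-project data: (functional size in FSM units, effort hours,
# planned months, actual elapsed months).
projects = [
    (400, 3200, 6.0, 7.5),
    (250, 2100, 12.0, 11.0),
    (610, 4500, 9.0, 9.5),
    (130, 1300, 4.0, 6.0),
]

# A.1 derived measure and indicator: productivity = size / effort.
productivity = [size / hours for size, hours, _, _ in projects]
p_avg, p_sd = mean(productivity), stdev(productivity)

# A.2 derived measure and indicator: schedule lead/lag = actual - planned.
abs_deviation = [abs(actual - planned) for _, _, planned, actual in projects]
s_avg, s_sd = mean(abs_deviation), stdev(abs_deviation)

# Decision criteria: approximate 95% confidence limits from the standard
# deviation; very wide limits would trigger investigation.
print(f"productivity {p_avg:.3f} +/- {1.96 * p_sd:.3f} FSM units/hour")
print(f"|schedule deviation| {s_avg:.2f} +/- {1.96 * s_sd:.2f} months")
```

With only four projects the limits are wide; in practice a larger historical data set, as held in the data store, would narrow them.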

Annex B: Process Work Products (informative)

This annex contains a mapping between the work products (WPs) mentioned in this Benchmarking Standard and the activities or tasks that produce them. Note that this annex only presents the final WPs, not all of the intermediate WPs that may need to be produced during the performance of the activities and tasks. All of the tasks/activities that produce these work products are normative. However, only the work products with an asterisk (*) next to them have normative requirements that they be documented. This Benchmarking Standard is not intended to prescribe the name, format or explicit content of the documentation to be produced. The International Standard does not imply that documents be stored, packaged or combined in some fashion. These decisions are left to the user of this International Standard.

Work Product: Activity/Task Producing WP

Work products produced externally:
- Requirements for Benchmark: Technical and Management Processes
- Information Needs: Technical and Management Processes
- Benchmark Users Feedback: Technical and Management Processes

Work products produced by the "Plan the Benchmark" process:
- Characterisation of the Organisational Unit: 4.5.1 Describe Organisational Unit
- Selected Information Needs: 4.2.1 Identify benchmark information needs
- Definition of selected measures: 4.5.2 Select Measures
- Procedures for data collection and storage: 4.5.6 Define data collection and reporting
- Configuration Management procedures: 4.5.7 Configuration management
- Criteria for the evaluation of information products: 4.5.8 Evaluating information products
- Criteria for evaluating the benchmark process: 4.5.9 Evaluating the benchmark process
- Approved results of benchmark planning: 4.5.11 Approving the benchmark plan
- Selected supporting technologies: 4.5.12 Acquire support technologies

Work products produced by the "Perform the Benchmark" process:
- Integrated data collection and storage procedures: 4.6.1 Integrate Procedures
- Stored Data: 4.6.2 Collect Data
- Data analysis and interpretations: 4.6.3 Analyse Data
- Information products: 4.6.4 Communicate Information Products

Work products produced by the "Evaluate Benchmark" process:
- Benchmark Experience Base (update); Evaluation Results: 4.7 Evaluate Measures / Evaluate the Benchmark Process


- Improvement actions: 4.7.3 Identify Potential Improvements

Annex C Criteria for evaluating the Benchmark Process


The goodness of a process may be judged by assessing its capability (as described in ISO/IEC TR 15504) or by measuring and evaluating its performance. While this International Standard as a whole may be used as a reference model for assessing the capability of a benchmark process, this section only addresses the evaluation of the performance of the benchmark process. Below is a set of example criteria that may be used for evaluating the performance of the benchmark process. In some cases the criteria can be used for a quantitative evaluation, and in other situations a qualitative evaluation may be appropriate. The following criteria may be regarded as potential information needs of the benchmark process owner. The benchmark process described in this Benchmarking Standard may be applied to produce information products that address the information needs identified by the benchmark process owner.

C.1 Timeliness

The benchmark process should provide information products in time to support the needs of the benchmark user. Timeliness may well be critical in providing information products to meet the needs of the sponsors and stakeholders. Untimely information may lead to the benchmark process being viewed as unhelpful or even misleading.

C.2 Efficiency

The benchmark process should not cost more to perform than the value of the information that it provides. The more efficient the process, the lower its cost and the greater the cost/benefit.

C.3 Defect containment

The benchmark process should minimise the introduction of erroneous data and results, while removing any that do get introduced as thoroughly and as soon as possible. Analysis of benchmark data should always take into account and make explicit the amount of any unavoidable variation in the data from which the information products are derived.

C.4 Stakeholder satisfaction

The users of information products (stakeholders and sponsors) should be satisfied with the quality of the information products and the performance of the benchmark process in terms of timeliness, efficiency and defect containment. Satisfaction may be affected by the user's expectation of the level of quality and performance to be provided. A high degree of satisfaction is vital if commitment to the benchmark process is to be maintained.

C.5 Process conformance

The execution of benchmark activities should conform to any plans and procedures developed to describe the intended benchmark process. This may be judged by quality management audits or
process capability assessments.


Document Control
Change History

Version | Author/Date | QC/Date | Authorised By/Date | Comments
0a | Tony Rollo, 1 Nov 2004 | | | First DRAFT for review
0.1 | Pam Morris, Jan 200 | | | Intermediate draft
0.0 | Tony Rollo, 20 September 200 | | | Final Draft
0.22 | Carol Dekkers, September 2000 | | | Final, incorporating suggested improvements from IFPUG
1.0 | Tony Rollo | | | Re-draft for ISO submission

Copy Control

Electronic copies: ISBSG Admin Office; Tony Rollo, Pam Morris, Ewa Wasylkowski, Carol Dekkers, Pekka Forselius; ISBSG Members
Master: Authors

File reference: 209995461.doc
