See discussions, stats, and author profiles for this publication at: https://www.researchgate.net/publication/261253131

Measures that matters: Service Quality in IT Service Management

Article in International Journal of Quality and Service Sciences · January 2014

Authors: Stefan Cronholm and Nicklas Salomonson, Högskolan i Borås

All content following this page was uploaded by Stefan Cronholm on 13 October 2016.


Measures that Matters: Service Quality in IT Service Management

Journal: International Journal of Quality and Service Sciences
Manuscript ID: Draft
Manuscript Type: Research Paper
Keywords: IT Service Management, SERVQUAL, Measures

Measures that Matters: Service Quality in IT Service Management

Abstract

Purpose - IT Service Management (ITSM) is a discipline for the management and maintenance of IT-systems and is claimed to play a critical role in supporting and satisfying business requirements. However, from a customer perspective, ITSM is considered costly and the outcome is not always satisfactory. Measurements used to monitor and evaluate ITSM-processes are mainly suggested from a service provider perspective. There is a need for measures that are more customer-oriented. The aim of this paper is to suggest quality measurements for IT Service Management (ITSM) from a customer perspective.

Design/methodology/approach - The SERVQUAL scale has been used for suggesting customer-oriented measurements for the ITSM-field. Based on customer inquiries conducted by five IT service providers in Sweden (car construction, forest management, IT consultants, public sector and logistics), the SERVQUAL scale has been modified according to ITSM-specific customer requirements. The collected customer experiences have 1) confirmed, modified or developed new attributes of the original SERVQUAL, and 2) been a base for the development of a new conceptual structure.

Findings - The paper demonstrates three types of findings: 1) confirmation of SERVQUAL determinants that could be reused in the ITSM-field, 2) modification of attributes of the determinants to better fit the ITSM-field, and 3) development of new categories and new attributes.

Originality/value - The knowledge contribution consists of a developed SERVQUAL, adjusted to fit the ITSM-field, and a suggested new structure of SERVQUAL consisting of three concepts: determinant, category and attribute.

Keywords: IT Service Management, SERVQUAL, Measures

Article Classification: Research paper

1. Introduction

IT Service Management (ITSM) is a discipline for the management and maintenance of IT-systems. ITSM is a subset of Services Science and plays a critical role in supporting and satisfying business requirements (Galup et al., 2007; Galup et al., 2009; Bardhan, 2010). One purpose of ITSM is to fulfill these business requirements by maintaining and operating the IT information infrastructure. Another purpose of ITSM is to increase the focus on customer needs. This perspective is a reaction against the traditional view of systems maintenance, which often takes the perspective of a service provider. van Bon (2002) claims that providers of IT services can no longer afford to focus on technology and their internal organization; they now have to consider the quality of the services they provide and focus on the relationship with customers.

The ITSM-field is a widespread area where both the private sector and the public sector have to manage and maintain IT-systems and processes. Pollard et al. (2009) define ITSM as a strategy by which information systems are 1) offered under contract to customers and 2) managed as a service whose performance is measured. Winniford et al. (2009) add that ITSM focuses on defining, managing, and delivering IT services to support business goals and customer needs. ITSM includes several types of processes in organizations such as service system

development, incident resolution and prevention, and service system transition. The broad nature of ITSM implies that it has relations to other fields such as software maintenance, business processes, project management and IT governance (Galup et al., 2007). Examples of frameworks (partly borrowed from other fields) used within the ITSM-field are: Six Sigma (e.g. Pande et al., 2000), Total Quality Management (TQM) (e.g. Dahlgaard et al., 2002), Business Process Reengineering (BPR) (Davenport, 1993; Hammer and Champy, 1993), Capability Maturity Model Integration (CMMI) (Paulk, 1995), Project Management Body of Knowledge (PMBOK) (PMI Standards Committee, 2000), and Control Objectives for Information and Related Technology (COBIT) (IT Governance Institute, 2007). Another well-known framework is the Information Technology Infrastructure Library (ITIL) (Office of Government Commerce, 2008). ITIL is the most popular and influential framework for applying ITSM and is usually referred to as a “best practice” (Galup et al., 2009; McNaughton et al., 2010). The growth of frameworks such as ITIL, COBIT, and CMMI has caused IT organizations to begin to develop specific service processes for their organizations.

Recent studies also show that ITSM is costly (e.g. Flemming, 2005; Orlov, 2005; Addy, 2007; Haverblad, 2007; Galup et al., 2009). These scholars estimate that ITSM accounts for as much as 60-90% of the total expenditure of an IT organization. Addy (2007) argues that ITSM-related costs annually add up to over 300 billion dollars, which is more than Norway's GNP. Furthermore, Pigoski (1997) argues that the cost of maintenance is too high, that the speed of maintenance service is too slow and that it is difficult to manage the priority of change requests. Results from a survey within the IT sector in Sweden conducted by Brandt (2008) point in the same direction. Brandt (2008) reports that tasks related to systems development comprise only 20% of the total work; that is, tasks related to ITSM comprise 80%. The costs are illustrated in the magazine Computer Sweden (Nov 9, 2008), which describes how the city of Stockholm has prolonged its general agreement concerning ITSM with its service providers. The agreement includes systems maintenance, service desk, and telephone services for two years and is worth more than 55 million USD. This illustrates the size and enormous cost of ITSM. The high costs have motivated improved efficiency in ITSM-processes, and there is a vast number of measurements used to monitor and evaluate the processes (e.g. Brooks, 2006; Smith, 2008; McNaughton et al., 2010). Some examples of measurements are: percentage of incidents resolved by 1st line support, average time to resolve incidents, percentage of releases on time, percentage of IT staff turnover, number of reported bugs fixed, number of SLA targets missed, and average number of open problems (Brooks, 2006).
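Provider-side measurements of this kind are simple aggregates over incident records. A minimal sketch in Python of two of them (the incident records and field names below are hypothetical illustrations, not taken from Brooks, 2006):

```python
# Illustrative computation of two provider-side ITSM measurements.
# The incident records and field names are made up for the example.
from statistics import mean

incidents = [
    {"first_line": True,  "hours": 1.5},  # resolved by 1st line support
    {"first_line": True,  "hours": 0.5},
    {"first_line": False, "hours": 8.0},  # escalated beyond 1st line
    {"first_line": False, "hours": 6.0},
]

# Percentage of incidents resolved by 1st line support
pct_first_line = 100 * sum(i["first_line"] for i in incidents) / len(incidents)

# Average time to resolve incidents
avg_resolution_hours = mean(i["hours"] for i in incidents)

print(f"Resolved by 1st line support: {pct_first_line:.0f}%")    # 50%
print(f"Average time to resolve: {avg_resolution_hours:.1f} h")  # 4.0 h
```

Note that both numbers describe the provider's process only; nothing in them says whether the customer was satisfied with the outcome, which is precisely the gap the paper addresses.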
We acknowledge that all the frameworks discussed above (e.g. ITIL, COBIT, and CMMI) can be useful for most ITSM-companies, and the growth in popularity of these frameworks has caused service providers to develop specific customer-oriented service processes for their organizations. However, we claim that these frameworks are not sufficiently customer-oriented and, especially, that there is a need for complementary measurements based on the customers' interests. The existing ITSM-measures are still mainly suggested from a service provider perspective and focus on the performance of the suppliers' processes. They are not primarily derived from a customer perspective and do not focus on customer value. One possible reason for this one-sided focus is the prevailing goods-dominant logic in businesses, where value is seen as created by the firm and distributed in the market, usually through exchange of goods and money (Vargo et al., 2008). This logic creates a separation between producer and consumer and their value systems (Vargo et al., 2008). Customer value has traditionally been defined as a trade-off between benefits (”what you get”) and sacrifices (”what you give”) in the exchange (Zeithaml, 1988; Ulaga, 2003). According to the emerging service-dominant logic, it is the customer who in the end determines the value of the service

(e.g. Vargo and Lusch, 2004a, b). It is also possible to argue that a one-sided focus on 1) shortening the lead-time of processes and 2) reducing costs will not necessarily lead to improvements in the service from a customer perspective.

We acknowledge that measurements suggested from the service provider perspective are important, but we argue that not enough attention has been paid to what customers find important and how customers perceive service quality in the context of maintaining IT-systems and processes. There is also a lack of research demonstrating the benefits of ITSM from a customer perspective. The aim of this paper is to suggest measurements for ITSM based on a customer perspective. The suggested measurements are to be seen as a complement to existing measurements. The following section consists of a theoretical framing aimed at defining and describing the most central concepts we use. Section 3 describes how we have conducted this study, and in section 4 we present the findings. Finally, in section 5 conclusions are drawn.

2. Theoretical framing

As claimed in section 1, there is a need for an increased focus on customer value in ITSM-related work. It is not sufficient for service providers to apply an intra-organizational perspective that focuses on economizing their own procedures. Service providers that work with management and maintenance of IT-systems need to include customers' own value-creating activities in order to fully understand how to plan and implement their ITSM-processes. According to Grönroos (2007), value is the result of a process of co-creation where the service provider, through its offerings, enables the customer to create value. Value is perceived and determined by the customer, i.e. the customer decides the value based on the perception of actual usefulness (Vargo and Lusch, 2004a). Vargo and Lusch (2004a, b) call this value-in-use, meaning that value can only be perceived when the service is in "use". Therefore, a service provider needs to understand what a customer perceives as valuable in order to define and develop market offerings (Payne and Holt, 1999). The service providers' offerings need to support value creation in customers' daily activities and processes (Grönroos, 2007). Customer-perceived value should therefore be considered a key determinant of customer satisfaction (Eggert and Ulaga, 2002). These claims by Vargo and Lusch (2004a, b), Payne and Holt (1999), and Eggert and Ulaga (2002) strengthen our observation that there is a need for measurements suggested from a customer perspective.

One key component in the customers' perceived value of a service is service quality (Storbacka et al., 1994). Edvardsson (1998) defines service quality as the correspondence between the service and the customer's needs and requirements, as perceived by the customer. Grönroos (1984, p. 37) defines perceived service quality as: "[...] the outcome of an evaluation process, [whereby] the consumer compares his expectations with the service he perceives he has received, i.e. he puts the perceived service against the expected service”. Service quality can thus be conceptualized as the “gap” between the customers' expectations and their perceptions of the service received (Parasuraman et al., 1988). The message sent by these scholars is that it is important to focus on the customers' perceived quality of the service.

One popular instrument for measuring service quality is the SERVQUAL scale, which was developed and successively refined by Parasuraman et al. (1985, 1988, 1991, 1994). This instrument is perhaps the most well-known and most commonly used instrument for measuring service quality (Buttle, 1996; Ladhari, 2009). SERVQUAL has been used to measure service quality in many different service industries, including retail, banking, fast food and health care (for an extensive review of SERVQUAL applications and debates, see Ladhari, 2009). SERVQUAL is based on the difference between customer expectations and

perceptions, and originally consisted of ten determinants of service quality: tangibles, reliability, responsiveness, communication, credibility, security, competence, courtesy, understanding/knowing customers, and access. These determinants were later condensed into five (Parasuraman et al., 1988): tangibles (physical facilities, equipment, and appearance of personnel), reliability (ability to perform the promised service dependably and accurately), responsiveness (willingness to help customers and provide prompt service), assurance (knowledge and courtesy of employees and their ability to inspire trust and confidence), and empathy (caring, individualized attention the firm provides its customers).
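SERVQUAL operationalizes the expectation-perception gap by scoring each questionnaire item twice, once for expectation (E) and once for perception (P); the per-item quality score is P − E, typically averaged per determinant. A minimal sketch, with ratings on a 7-point scale (the item wordings and values are illustrative, not the instrument's actual items or the paper's data):

```python
# Illustrative SERVQUAL gap-score computation (hypothetical ratings
# on a 1-7 Likert scale; items and values are examples only).
from statistics import mean

# Each response: (determinant, expectation E, perception P)
responses = [
    ("reliability",    7, 5),   # e.g. "performs the service at the designated time"
    ("reliability",    6, 6),   # e.g. "accuracy in billing"
    ("responsiveness", 7, 4),   # e.g. "calls the customer back quickly"
    ("responsiveness", 6, 5),   # e.g. "gives prompt service"
]

# Per-item gap: perception minus expectation (negative = below expectations)
gaps = {}
for determinant, expectation, perception in responses:
    gaps.setdefault(determinant, []).append(perception - expectation)

# Unweighted average gap per determinant
for determinant, item_gaps in gaps.items():
    print(f"{determinant}: {mean(item_gaps):+.2f}")
```

A negative average for a determinant signals that the service falls short of expectations on that dimension; the paper's contribution concerns which items such an instrument should contain in an ITSM setting, not the scoring arithmetic itself.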
Although previous research has demonstrated shortcomings of the SERVQUAL scale, it remains a useful tool for measuring and managing service quality (Ladhari, 2009). However, as Ladhari (2009) concludes, researchers should either adapt the SERVQUAL methodology to develop their own instrument for a specific industry or study context, or validate the instrument after data collection through reliability and validity analysis. In this paper we choose the former, by suggesting modifications in accordance with ITSM-specific customer requirements.

3. Research Method

As described in section 1, we argue that there is a lack of measurements originating from a customer perspective in the ITSM-field. In order to suggest customer-oriented measurements, we have analyzed questionnaires used by five IT service providers in Sweden and their customers. The service providers represent the following sectors: car construction, forest management, IT consultants, public sector and logistics. The customers consist of companies that need support for their IT-systems. The questionnaires comprised 10 to 30 questions concerning the customers' experiences of using the provided services. In order to deepen the understanding of what customers see as especially important, we chose to focus on questions that enabled the customer to answer in a free format, i.e. questions that did not require an answer in a pre-categorized multiple-choice format.

SERVQUAL's original ten determinants (Parasuraman et al., 1985, 1988) have been used to analyze the questionnaires (see table 1). Our arguments for choosing SERVQUAL are: 1) the purpose of SERVQUAL is to suggest service quality measures, 2) it is developed from a customer perspective, and 3) it is the most well-known instrument for analyzing service quality and has been proven useful (e.g. Ladhari, 2009). We chose to use the original ten determinants instead of the condensed version consisting of five determinants (Parasuraman et al., 1991, 1994). The reason for this choice is that the ten determinants allowed us to relate the customers' answers in the questionnaires in a more precise way.

INSERT ABOUT HERE: Table 1. SERVQUAL (Parasuraman et al., 1985, 1988)

We do not take these ten determinants for granted or view them as a panacea. We have taken a critical stance towards SERVQUAL since, for obvious reasons, its authors (Parasuraman et al., 1985, 1988, 1991, 1994) did not have the ITSM-field at hand when suggesting the determinants. A benefit of using the ten determinants is that they are formulated on a general level and thus not restricted to a specific sector or area. Therefore, we believe it should be possible to transfer them to the ITSM-field. In order to adapt SERVQUAL to the ITSM-field we have developed, modified and confirmed SERVQUAL. That is, the ten determinants have been used to analyze the questionnaires, and the questionnaires have been used to improve the determinants.

We have developed the determinants when we found statements in the questionnaires that did not correspond to the SERVQUAL determinants. This means that we have 1) added categories and attributes, and 2) improved the conceptual structure. All the added attributes are based on customer experiences as described by customers in the questionnaires. The determinants consist of a title and a description (see table 1). The description can be viewed as a decomposition of the determinant consisting of one or several attributes. In this way, the determinant (the title) and the attributes (the description) represent a two-level conceptual structure. We have found that two conceptual levels are not satisfactory, and thus we have added one conceptual level. We call this level category. That is, a determinant consists of one or more categories, and a category consists of one or more attributes. In order to further clarify our reasoning we give an example. Take the determinant “Credibility” as it is described in the original SERVQUAL (see table 1). The description of “Credibility” reads: “Involves trustworthiness, believability, honesty. It involves having the customer's best interests at heart. Contributing to credibility are: company name, company reputation, personal characteristics of the contact personnel, the degree of hard sell involved in interactions with the customer”. The description starts with “Involves trustworthiness, believability, honesty…”. Clearly, these three attributes are abstract in character, as is the claim “…having the customer's best interests at heart”. The rest of the description is rather concrete: “Contributing to credibility are: company name, company reputation, personal characteristics of the contact personnel…”. This latter part of the description constitutes examples of the former part. That is, there is a need for a third conceptual level.
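The proposed three-level structure can be represented directly as nested data. A minimal sketch, populated with the “Credibility” decomposition discussed above (the attribute lists are abbreviated to those named in the text):

```python
# Minimal representation of the proposed three-level conceptual structure:
# a determinant consists of one or more categories, and a category
# consists of one or more attributes.
from dataclasses import dataclass

@dataclass
class Category:
    name: str
    attributes: list[str]

@dataclass
class Determinant:
    name: str
    categories: list[Category]

credibility = Determinant(
    name="Credibility",
    categories=[
        # Abstract concepts from the original description become categories...
        Category("Trustworthiness", ["Transparency in billing"]),
        # ...while the concrete examples become attributes under them.
        Category("Believability", ["Having the customer's best interests at heart"]),
    ],
)

# Walking the hierarchy keeps each abstraction level explicit.
for category in credibility.categories:
    for attribute in category.attributes:
        print(f"{credibility.name} > {category.name} > {attribute}")
```

The point of the extra level is visible in the traversal: an attribute is always read relative to the category it belongs to, rather than being mixed into one flat list with items of a different abstraction level.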
Several scholars have argued for the advantages and role of a multilevel hierarchy. Rasmussen et al. (1994) compare a multilevel hierarchy with a means-end hierarchy and claim that a multilevel abstraction hierarchy is often used in practical problem-solving processes. According to Cronholm and Bruno (2008), a structured and categorized set of criteria supports conceptual understanding. Our aim in suggesting a three-level hierarchy is to support the understanding and use of the ITSM-adapted SERVQUAL determinants. The sum of the attributes suggested in the original structure together with the customer experiences would otherwise constitute a too long and unstructured list of attributes representing different kinds of phenomena.

We modified determinants when there was a need to slightly refine or modify an attribute. We confirmed determinants by relating corresponding customer statements to them. Confirmation means that we kept the original formulation of a determinant or attribute.

4. Findings

In section 3, we described that our findings in relation to the SERVQUAL scale can be of three types: confirmed, modified or developed. The aim of this section is to present our analysis, including examples of these three types. For each determinant, we present a conceptual structure consisting of categories and attributes. All the confirmed, modified or added attributes are supported by a quote from the questionnaires. It is important to understand that we are not interested in whether the content of a quote is positive or negative. What is important is the quote in itself. The quote constitutes a message from a customer, and it informs the service provider about what the customer perceives as valuable. The determinants are analyzed one by one. The sum of all developed determinants is presented in attachment A.

The first determinant, “Reliability”, is suggested to consist of the two categories “Dependability” and “Performance” (see table 2). These terms exist in the original description of “Reliability” but are not explicitly expressed as categories (see table 1). “Dependability” consists of the confirmed attributes “Honor the promises” and “Keeping records correctly” and the original attribute “Accuracy in billing”. “Performance” consists of

the confirmed attributes “Perform the service right” and “Perform the service at the designated time”. “Performance” also consists of the added attribute “Perform the service according to terms in the agreement”. The last attribute is added since there is usually a service level agreement (SLA) between a service provider and a customer. One purpose of the SLA is to regulate what services the service provider is expected to carry out.

INSERT ABOUT HERE: Table 2. Reliability: developed structure and content

The new conceptual structure for “Responsiveness” consists of three categories: “Speed”, “Attitude” and “Distribution of responsibility” (see table 3). The category “Speed” includes the attribute “Mailing a transaction slip immediately”, the confirmed attribute “Calling the customer back quickly” and the modified attribute “Giving service according to the agreed service level”. The reason for modifying the latter attribute is that the service level is based on an SLA (see above). The formulation of the original SERVQUAL attribute “Calling the customer back quickly” is not exactly correct since it does not measure what has been agreed upon. Of course, the customer has the opportunity to get a faster response, but the faster the service, the more expensive it is. Thus, we suggest a modification of the original attribute to a formulation that measures what both parties have agreed upon.

The category “Attitude” includes the confirmed attribute “Willingness” and the new attribute “Interest”. Finally, the category “Distribution of responsibility” includes the two new attributes “Assignment of incidents” and “Matching the customer problem to the right competence”.

INSERT ABOUT HERE: Table 3. Responsiveness: developed structure and content

The determinant “Competence” consists of the categories “Knowledge and skill of contact personnel”, “Knowledge and skill of operational support personnel” and “Research capability” (see table 4). This structure is largely in line with the original SERVQUAL. Both “Knowledge and skill of contact personnel” and “Knowledge and skill of operational support personnel” contain the confirmed attribute “Possession”. We have not been able to relate the customer experiences to the original attribute “Research capability”.

INSERT ABOUT HERE: Table 4. Competence: developed structure and content

In the original description of the determinant “Access” we can read that it “… involves approachability and ease-of-contact”. The original description does not explain how approachability and ease-of-contact differ. Thus, we suggest that the determinant “Access” consists of the category “Approachability” and that “Approachability” consists of the confirmed attribute “Convenient hours of operation”, the original attribute “Convenient location of service facility” and the added attribute “Contact media” (see table 5). In the original description the attribute “Waiting time to receive service” is listed. According to our understanding this attribute is closer to “Responsiveness”, and consequently we have removed it from this determinant. We realize that there might be an unavoidable overlap between the determinants, but in order to support understanding this overlap should be minimized as much as possible.

INSERT ABOUT HERE: Table 5. Access: developed structure and content

The suggested structure and content for the determinant “Courtesy” is in line with the original SERVQUAL. “Courtesy” consists of two categories: “Behavior of contact personnel” and “Consideration for the consumer's property” (see table 6). “Behavior of contact personnel” is suggested to consist of the attributes “Politeness”, “Respect”, “Consideration” and “Friendliness”, which are described in the original SERVQUAL description. The attributes “Politeness”, “Respect” and “Friendliness” are confirmed in this study.

INSERT ABOUT HERE: Table 6. Courtesy: developed structure and content

Our suggested structure for “Communication” consists of four categories: “Keeping customers informed about the progress of solving an incident”, “Keeping customers informed about changes in the service”, “Listening to the customers” and “Gathering of customer viewpoints” (see table 7). The original SERVQUAL does not explicitly differentiate between “Keeping customers informed about the progress of solving an incident” and “Keeping customers informed about changes in the service”. We suggest that these two categories are measured separately since they target two different objects. “Listening to the customers” is derived from SERVQUAL's description of the determinant. We have also introduced the new category “Gathering of customer viewpoints”. The reason for adding this category is that it is more proactive compared to “Listening to the customers”.

The category “Keeping customers informed about the progress of solving an incident” includes the two new attributes “Continuously sending status reports” and “Documentation of the progress”. It also includes the confirmed attribute “Assuring the consumer that a problem will be handled”. The category “Keeping customers informed about changes in the service” includes the original attribute “Explaining the service itself, explaining how much the service will cost, explaining the trade-offs between service and cost” and the newly suggested attribute “Involve the customer in change analysis”. The category “Gathering of customer viewpoints” includes the new attributes “Pro-activity”, “Face-to-face interaction”, and “Questionnaire”. Finally, “Listening to the customers” includes the added attribute “Qualitative dialog”. Many of the added categories and attributes can be viewed as recommendations for how to improve customer communication, how to make a change and what measures to take.

INSERT ABOUT HERE: Table 7. Communication: developed structure and content

The determinant “Credibility” is defined as “Trustworthiness”, “Believability” and “Honesty” in the original description of SERVQUAL. We view these concepts as rather abstract and thus prefer to treat them as categories instead of attributes (see table 8). For the category “Trustworthiness”, we suggest the added attribute “Transparency in billing”. The reason for this suggestion is that customers often do not understand the invoices they receive and consequently do not know what they are paying for. For the category “Believability” we have related the confirmed attribute “Having the customer's best interests at heart”, which can be found in the original description.

INSERT ABOUT HERE: Table 8. Credibility: developed structure and content

1
2
3
4
We have not been able to relate any customer experiences to the determinant “Security” (see
5
6 table 9). One part of “Security” has to with physical safety and freedom of risk. One reason
7 for not being able to relate customer experiences to “Security” is that we believe that physical
8 safety is more relevant to situations where people are exposed to some form of danger.
9 However, based on the original description of the determinant we suggest the following
10 categories: “Physical safety”, “Financial security” and “Confidentiality”.
11
12
13 INSERT ABOUT HERE: Table 9. Security: developed structure and content
14
15
16
The determinant “Understanding the customer” consists of the two categories “Understanding
17
18
needs” and “Learning individualized requirements” (see table 10). These categories are based
on the original description. The category “Understanding needs” consists of the added
Fo
19
20 attributes “Functionality” and “User-friendliness”. Both these new attributes mirrors that the
21 service includes an IT-system which is something that is not specifically mentioned in the
22 original description of SERVQUAL. The category “Learning individualized requirements”
r

23 consists of the attributes “Individualized attention” and “Recognizing the regular customer”.
24 Both these attributes are derived from the original description of SERVQUAL and the former
Re

25 attribute is confirmed in this study.


26
27
28
INSERT ABOUT HERE: Table 10. Understanding the customer: developed structure and
vi

29
30 content
31
ew

32
33 The last determinant is “Tangibles” (see table 11). We suggest that “Tangibles” consist of the
34 category “Physical evidence of the service” and that this category consists of the five
35 attributes “Physical facilities”, “Appearance of personnel”, “Tools or equipment used to
36 provide the service”, Physical representations of the service”, and “Documentation of the
On

37 service”. The first four attributes is derived from the original description of SERVQUAL and
38 we have been successful in confirming three of these with customer experiences. The last
39
attribute is new and added by us.
40
41
ly

42 INSERT ABOUT HERE: Table 11.Tangibles: developed structure and content


43
44
45
46 5. Conclusions
47
Our overall conclusion is that the original SERVQUAL scale has been a good base for transferring and adapting determinants to the ITSM-field. The knowledge contribution consists of a developed SERVQUAL scale adjusted to fit the ITSM-field. Concretely, the adjustments have consisted of 1) confirmation of SERVQUAL determinants that could be reused in the ITSM-field, 2) modification of attributes of the determinants to better fit the ITSM-field, and 3) development of new categories and new attributes. We have not suggested any new determinants. This implies that the SERVQUAL determinants are formulated at a general level, which has simplified the transfer to the ITSM-field.
Besides the adjustments to the ITSM-field, we have developed the original conceptual structure of SERVQUAL, which consisted of a title of the determinant and a description of the determinant. We have suggested a new structure consisting of three concepts: determinant, category and attribute (see section 3). The reason for this development is that in the original description several attributes are presented on the same abstraction level when they, in fact, belong to different abstraction levels (see section 3). Consequently, we claim that attributes that are examples of other attributes should be presented on different abstraction levels. Presenting attributes that belong to different abstraction levels on the same level obstructs the conceptual understanding (Cronholm and Bruno, 2008) and does not support practical problem-solving (Rasmussen et al., 1994).
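The three-level conceptual structure can be illustrated as a simple data model in which each level contains only elements of the level directly below it. The sketch below is our own illustration (the class names and the example instance, taken from Table 2, are not part of the SERVQUAL instrument itself):

```python
from dataclasses import dataclass, field

# Illustrative data model for the three abstraction levels:
# a determinant contains categories, a category contains attributes.
@dataclass
class Category:
    name: str
    attributes: list[str] = field(default_factory=list)

@dataclass
class Determinant:
    name: str
    categories: list[Category] = field(default_factory=list)

# Example instance based on Table 2 (Reliability).
reliability = Determinant(
    name="Reliability",
    categories=[
        Category("Dependability",
                 ["Honor the promises",
                  "Accuracy in billing",
                  "Keeping records correctly"]),
        Category("Performance",
                 ["Perform the service right",
                  "Perform the service at the designated time",
                  "Perform the right services according to the terms in the agreement"]),
    ],
)

# Attributes always sit one level below categories, so an attribute can
# never appear on the same level as the concept it exemplifies.
for category in reliability.categories:
    for attribute in category.attributes:
        print(f"{reliability.name} > {category.name} > {attribute}")
```

The nesting makes the abstraction levels explicit: traversing the structure always goes determinant, then category, then attribute.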
Our analysis has demonstrated that most of the determinants are also valid for the ITSM-field. However, there are a couple of categories/attributes that we were not able to relate to the customer experiences. Examples of such categories/attributes are: "Consideration for the consumer's property", "Honesty", "Financial security" and "Confidentiality". One possible conclusion is that they are simply not relevant for the ITSM-field. However, a closer look at these categories/attributes reveals that they are more general in nature, and we cannot find any arguments for why they should not be valid in the ITSM-field.
Our main argument in this paper is that ITSM-providers can further improve their business by taking the customers' interests into account. The adaptation of SERVQUAL to the ITSM-field means that we have provided a tool for service providers that emphasizes the customers' interests. One purpose of this tool is to support the dialog between the service provider and the customer. The tool points out important areas to discuss, and the outcome of these discussions should reduce the gap between the customers' expected service and the customers' perceived service. As discussed in section 2, the customers' interests are based on customer value, and it is the customer who decides what constitutes customer value (Vargo and Lusch, 2004ab). A review of the determinants reveals that eight out of ten are oriented towards value-in-use (reliability, responsiveness, competence, access, courtesy, communication, security, and understanding/knowing the customer) while two (tangibles and credibility) represent something that the customer can be informed of before using the service.
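The gap the tool targets is the classic SERVQUAL comparison of perceived and expected service: for each determinant, the gap score is the mean perception rating minus the mean expectation rating, so a negative value signals unmet expectations. A minimal sketch of that computation, with hypothetical determinants and ratings chosen only for illustration:

```python
# Hypothetical questionnaire ratings (e.g. on a 1-7 scale), grouped per
# determinant: what customers expected vs. what they perceived.
expectations = {"Reliability": [7, 6, 7], "Responsiveness": [6, 7, 6]}
perceptions = {"Reliability": [5, 6, 5], "Responsiveness": [6, 6, 7]}

def mean(values):
    return sum(values) / len(values)

def gap_scores(expected, perceived):
    """Per-determinant gap: mean(perception) - mean(expectation).
    Negative gaps indicate that the service falls short of expectations."""
    return {d: mean(perceived[d]) - mean(expected[d]) for d in expected}

gaps = gap_scores(expectations, perceptions)
for determinant, gap in gaps.items():
    print(f"{determinant}: gap = {gap:+.2f}")
```

In this fabricated example the reliability gap is negative (expectations not met) while responsiveness comes out even; in practice the ratings would come from the adapted instrument in Appendix A.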
ITSM, previously referred to as Systems Maintenance (e.g. Brandt, 2008), represents more than a change of name; it constitutes a shift in perspective. ITSM applies, per se, a service-oriented perspective. However, it is important to understand that a service-oriented perspective cannot guarantee that a customer will perceive the service as value-creating. A service-oriented perspective can only create good conditions for enabling value creation together with the customer. Thus, we believe that improved service quality can only be achieved through an improved customer dialog, which requires interaction. Interaction is the key in value creation between service providers and customers (Grönroos, 2008; Grönroos, 2011; Salomonson et al., 2012), and the objective of interaction is to "jointly work towards shared goals" (Cronholm et al., 2011). In this study we have analyzed questionnaires, which represent one way to interact with customers. As further research, we would like our findings to be confirmed, refined, and/or criticized by other studies on ITSM and service quality, preferably based on other forms of dialog such as interviews, focus groups or workshops.
References

Addy, R. (2007), Effective IT Service Management: To ITIL and Beyond, Springer-Verlag, New York.

Bardhan, I.R., Demirkan, H., Kannan, P.K., Kauffman, R.J. and Sougstad, R. (2010), "An Interdisciplinary Perspective on IT Services Management and Service Science", Journal of Management Information Systems, Vol. 26 No. 4, pp. 13-64.

Brandt, P. (2008), In Swedish: Hur bedriver man systemförvaltning och IT Service Management?, Reports in Applied Information Technology, Report no. 2008:01, ISSN: 1651-4769, University of Gothenburg.

Brooks, P. (2006), Metrics for IT Service Management, 1st ed., Van Haren Publishing, Zaltbommel.

Buttle, F. (1996), "SERVQUAL: review, critique, research agenda", European Journal of Marketing, Vol. 30 No. 1, pp. 8-32.

Cronholm, S. and Bruno, V. (2008), "Do you Need General Principles or Concrete Heuristics? A Model for Categorizing Usability Criteria", in Proceedings of the Australasian Computer-Human Interaction Conference (OZCHI), Cairns, Australia, Dec 10-12.

Cronholm, S., Göbel, H., Haraldsson, S., Lind, M., Salomonson, N. and Seigerroth, U. (2011), "Collaborative Practice: An Action Research Approach to Efficient ITSM", paper presented at the 1st International & Inter-disciplinary Workshop on Practice Research, June 8, Helsinki, Finland.

Dahlgaard, J.J., Kristensen, K. and Kanji, G.K. (2002), Fundamentals of Total Quality Management: Process Analysis and Improvement, Nelson Thornes, Cheltenham.

Davenport, T. (1993), Process Innovation: Reengineering Work through Information Technology, Harvard Business School Press, Boston.

Edvardsson, B. (1998), "Service quality improvement", Managing Service Quality, Vol. 8 No. 2, pp. 142-149.

Eggert, A. and Ulaga, W. (2002), "Customer perceived value: a substitute for satisfaction in business markets?", Journal of Business & Industrial Marketing, Vol. 17 No. 2/3, pp. 107-118.

Fleming, W. (2005), "Using Cost of Service to Align IT", presentation at itSMF, Chicago, IL, September 2005.

Galup, S., Quan, J.J., Dattero, R. and Conger, S. (2007), "Information technology service management: an emerging area for academic research and pedagogical development", in Proceedings of the 2007 ACM SIGMIS CPR Conference on Computer Personnel Research: The Global Information Technology Workforce, pp. 46-52.

Galup, S.D., Dattero, R., Quan, J.J. and Conger, S. (2009), "An overview of IT service management", Communications of the ACM, Vol. 52 No. 5, pp. 124-127.

Grönroos, C. (1984), "A Service Quality Model and its Marketing Implications", European Journal of Marketing, Vol. 18 No. 4, pp. 36-44.

Grönroos, C. (2007), Service Management and Marketing: Customer Management in Service Competition, 3rd ed., Wiley, Chichester.

Grönroos, C. (2008), "Service logic revisited: Who creates value? And who co-creates?", European Business Review, Vol. 20 No. 4, pp. 298-314.

Grönroos, C. (2011), "A service perspective on business relationships: The value creation, interaction and marketing interface", Industrial Marketing Management, Vol. 40 No. 2, pp. 240-247.

Hammer, M. and Champy, J. (1993), Reengineering the Corporation: A Manifesto for Business Revolution, Harper Business, New York.

Haverblad, A. (2007), In Swedish: IT Service Management i praktiken, Studentlitteratur, Lund.

Ladhari, R. (2009), "A review of twenty years of SERVQUAL research", International Journal of Quality and Service Sciences, Vol. 1 No. 2, pp. 172-198.

McNaughton, B., Ray, P. and Lewis, L. (2010), "Designing an evaluation framework for IT service management", Information & Management, Vol. 47 No. 4, pp. 219-225.

Office of Government Commerce (2008), "Best Management Practice: ITIL V3 and ISO/IEC 20000", available at: http://www.best-management-practice.com/gempdf/ITIL_and_ISO_20000_March08.pdf (accessed 19 November 2012).

Orlov, L.M. (2005), Make IT Matter for Business Innovation, Forrester.

Pande, P.S., Neuman, R.P. and Cavanagh, R.R. (2000), The Six Sigma Way: How GE, Motorola, and Other Top Companies are Honing Their Performance, McGraw-Hill, New York.

Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1985), "A conceptual model of service quality and its implications for future research", Journal of Marketing, Vol. 49 No. 4, pp. 41-50.

Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1988), "SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality", Journal of Retailing, Vol. 64 No. 1, pp. 12-40.

Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1991), "Refinement and reassessment of the SERVQUAL scale", Journal of Retailing, Vol. 67 No. 4, pp. 420-450.

Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1994), "Alternative scales for measuring service quality: a comparative assessment based on psychometric and diagnostic criteria", Journal of Retailing, Vol. 70 No. 3, pp. 201-230.

Paulk, M.C. (Ed.) (1995), The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley, Reading, Mass.

Payne, A. and Holt, S. (1999), "A Review of the 'Value' Literature and Implications for Relationship Marketing", Australasian Marketing Journal, Vol. 7 No. 1, pp. 41-51.

Pigoski, T.M. (1997), Practical Software Maintenance: Best Practice for Managing Your Software Investment, John Wiley & Sons, New York, NY.

PMI Standards Committee (2000), A Guide to the Project Management Body of Knowledge (PMBOK Guide), 2000 ed., Project Management Institute, Newtown Square, Pa.

Pollard, C. and Cater-Steel, A. (2009), "Justifications, Strategies, and Critical Success Factors in Successful ITIL Implementations in U.S. and Australian Companies: An Exploratory Study", Information Systems Management, Vol. 26, pp. 164-175.

Rasmussen, J., Pejtersen, A.M. and Goodstein, L.P. (1994), Cognitive Systems Engineering, Wiley & Sons Inc, New York.

Salomonson, N., Åberg, A. and Allwood, J. (2012), "Communicative skills that support value creation: A study of B2B interactions between customers and customer service representatives", Industrial Marketing Management, Vol. 41 No. 1, pp. 145-155.

Smith, D.A. (2008), Implementing Metrics for IT Service Management: A Measurement Framework that Helps Align IT with the Business Objectives and Create Value through Continual Improvements, 1st ed., Van Haren Publishing, Zaltbommel.

Storbacka, K., Strandvik, T. and Grönroos, C. (1994), "Managing Customer Relationships for Profit: The Dynamics of Relationship Quality", International Journal of Service Industry Management, Vol. 5 No. 5, pp. 21-38.

The IT Governance Institute (2007), "CobiT 4.1 Excerpt", available at: http://www.isaca.org/Knowledge-Center/cobit/Documents/COBIT4.pdf (accessed 19 November 2012).

Ulaga, W. (2003), "Capturing value creation in business relationships: A customer perspective", Industrial Marketing Management, Vol. 32, pp. 677-693.

Van Bon, J. (2002), IT Service Management: An Introduction, IT Service Management Forum, Van Haren Publishing, UK.

Vargo, S.L. and Lusch, R.F. (2004a), "Evolving to a New Dominant Logic for Marketing", Journal of Marketing, Vol. 68, January, pp. 1-17.

Vargo, S.L. and Lusch, R.F. (2004b), "The Four Service Marketing Myths: Remnants of a Goods-Based, Manufacturing Model", Journal of Service Research, Vol. 6 No. 4, pp. 324-335.

Vargo, S.L., Maglio, P.P. and Akaka, M.A. (2008), "On value and value-creation: A service systems and service logic perspective", European Management Journal, Vol. 26 No. 3, pp. 145-152.

Winniford, M., Conger, S. and Erickson-Harris, L. (2009), "Confusion in the Ranks: IT Service Management Practice and Terminology", Information Systems Management, Vol. 26 No. 2, pp. 153-163.

Zeithaml, V.A. (1988), "Consumer perceptions of price, quality, and value: A means-end model and synthesis of evidence", Journal of Marketing, Vol. 52, July, pp. 2-22.
INSERT ABOUT HERE: Appendix A. SERVQUAL for ITSM
Table 1. SERVQUAL (Parasuraman et al., 1985, 1988)

Determinant | Description
Reliability | Involves consistency of performance and dependability. It means that the firm performs the service right the first time. It also means that the firm honors its promises. Specifically, it involves: accuracy in billing, keeping records correctly, performing the service at the designated time.
Responsiveness | Concerns the willingness or readiness of employees to provide service. It involves timeliness of service: mailing a transaction slip immediately, calling the customer back quickly, giving prompt service (e.g., setting up appointments quickly).
Competence | Means possession of the required skills and knowledge to perform the service. It involves: knowledge and skill of the contact personnel, knowledge and skill of operational support personnel, research capability of the organization.
Access | Involves approachability and ease of contact. It means: the service is easily accessible by telephone (lines are not busy and they don't put you on hold), waiting time to receive service is not extensive, convenient hours of operation, convenient location of service facility.
Courtesy | Involves politeness, respect, consideration, and friendliness of contact personnel (including receptionists, telephone operators, etc.). It includes: consideration for the consumer's property, clean and neat appearance of public contact personnel.
Communication | Means keeping customers informed in language they can understand and listening to them. It may mean that the company has to adjust its language for different consumers: increasing the level of sophistication with a well-educated customer and speaking simply and plainly with a novice. It involves: explaining the service itself, explaining how much the service will cost, explaining the trade-offs between service and cost, assuring the consumer that a problem will be handled.
Credibility | Involves trustworthiness, believability, honesty. It involves having the customer's best interests at heart. Contributing to credibility are: company name, company reputation, personal characteristics of the contact personnel, the degree of hard sell involved in interactions with the customer.
Security | Is the freedom from danger, risk, or doubt. It involves: physical safety (Will I get mugged at the automatic teller machine?), financial security (Does the company know where my stock certificate is?), confidentiality (Are my dealings with the company private?).
Understanding the customer | Involves making the effort to understand the customer's needs. It involves: learning the customer's specific requirements, providing individualized attention, recognizing the regular customer.
Tangibles | Include the physical evidence of the service: physical facilities, appearance of personnel, tools or equipment used to provide the service, physical representations of the service, such as a plastic credit card or a bank statement, other customers in the service facility.
Table 2. Reliability: developed structure and content

Category | Attribute | Customer experience
Dependability | Honor the promises | "Agreed service levels are fulfilled"
Dependability | Accuracy in billing | -
Dependability | Keeping records correctly | "The response time is sometimes bad and my feeling is that you do not check the status of the errands"
Performance | Perform the service right | "It is important that the quality control is carried out according to ISO9001"
Performance | Perform the service at the designated time | "The service should be delivered in right time"
Performance | Perform the right services according to the terms in the agreement | "The right service should be delivered according to the service level agreement"
Table 3. Responsiveness: developed structure and content

Category | Attribute | Customer experience
Speed | Mailing a transaction slip immediately | -
Speed | Calling the customer back quickly | "It takes too long to get an answer"
Speed | Giving service according to the agreed service level | "The problem is always solved eventually, but not always as fast as I would like it to be solved"; "The simple incidents are always solved quickly"
Attitude | Willingness | "Some people I talked to were not service-minded at all"
Attitude | Interest | "My contact took a big interest in our problems"
Distribution of responsibility | Assignment of incidents | "Reported incidents are not assigned to a specific team or individual"
Distribution of responsibility | Matching the customer problem to the right competence | "Hard to get in contact with the person who can solve my problem"
Table 4. Competence: developed structure and content

Category | Attribute | Customer experience
Knowledge and skill of contact personnel | Possession | "Broad knowledge and skill of the contact personnel are required"
Knowledge and skill of operational support personnel | Possession | "They should be more informed about their products"; "We do not need project leaders, we need developers!"; "Hard to find a person with the right skill"
Research capability | - | -
Table 5. Access: developed structure and content

Category | Attribute | Customer experience
Approachability (ease of contact) | Convenient hours of operation | "It is easy to get in contact with the right people"; "Fast support in the evening hours"
Approachability (ease of contact) | Convenient location of service facility | -
Approachability (ease of contact) | Contact media (e-mail, telephone, electronic forms, social media, FAQ) | "In general, the offered media to get in contact with the service provider is satisfactory"
Table 6. Courtesy: developed structure and content

Category | Attribute | Customer experience
Behavior of contact personnel | Politeness | "We have always been treated nicely by the first line support"
Behavior of contact personnel | Respect | "I have a feeling that I am a troublemaker when I suggest new ideas"
Behavior of contact personnel | Consideration | -
Behavior of contact personnel | Friendliness | "Very nice people that deserves appreciation"
Consideration for the consumer's property | - | -
Table 7. Communication: developed structure and content

Category | Attribute | Customer experience
Keeping customers informed about the progress of solving an incident | Continuously sending status reports | "We are aware of that all incidents cannot be solved immediately. But, we would like to see that the service provider is working with a solution"; "A reasonable goal is that the status of all incidents should be reported to us once a week"
Keeping customers informed about the progress of solving an incident | Documentation of the progress | "Provide a history of conducted changes"
Keeping customers informed about the progress of solving an incident | Assuring the consumer that a problem will be handled | "Incidents that are not viewed as routine incidents are often delayed and there is no feedback to us"
Keeping customers informed about changes in the service | Explaining: the service itself, how much the service will cost, the trade-offs between service and cost | -
Keeping customers informed about changes in the service | Involve the customer in the change work | "As a customer, I am interested in taking part of the processes for development and testing"
Keeping customers informed about changes in the service | Inform about future changes well in advance | "We need forward planning in order to plan our business"
Gathering of customers' viewpoints | Pro-activity | "Be proactive. Suggest improvements based on customer needs. Not just react on customers who are reporting incidents."
Gathering of customers' viewpoints | Face-to-face interaction | "Become more visible and visit the customer. Analyze the customers' daily work."
Gathering of customers' viewpoints | Questionnaires | "Ask for what we need"
Listening to the customer | Qualitative dialogs | "The dialog with customer needs to be improved"; "They always listen and understand my problem"
Table 8. Credibility: developed structure and content

Category | Attribute | Customer experience
Trustworthiness | Transparency in billing | "It is not obvious what we are paying for"
Believability | Having the customer's best interests at heart | "You have to customize your own organization in order to reduce costs. Reduced costs in your organization will reduce costs for us."
Honesty | - | -
Table 9. Security: developed structure and content

Category | Attribute | Customer experience
Physical safety | Freedom from danger, risk, or doubt | -
Financial security | Does the company know where my stock certificate is? | -
Confidentiality | Are my dealings with the company private? | -
Table 10. Understanding the customer: developed structure and content

Category | Attribute | Customer experience
Understanding needs | Functionality | "The systems consist of functionality we do not need"
Understanding needs | User-friendliness | "We need more user-friendly systems and we need fewer systems that can work together"
Learning individualized requirements | Individualized attention | "Offers have to be adapted for every individual customer"
Learning individualized requirements | Recognizing the regular customer | -
Table 11. Tangibles: developed structure and content

Category | Attribute | Customer experience
Physical evidence of the service | Physical facilities | "We need an updated price list"
Physical evidence of the service | Appearance of personnel | "Work closer to the customer"; "Come and visit us and ask for our needs"
Physical evidence of the service | Tools or equipment used to provide the service | "The distance between the maintenance model used and the business is too far"
Physical evidence of the service | Physical representations of the service, such as a plastic credit card or a bank statement, other customers in the service facility | -
Physical evidence of the service | Documentation of the service | "We need better information and clearer instructions when our systems are updated"
Appendix A. SERVQUAL for ITSM

Determinant | Category | Attributes
Reliability | Dependability | Honor the promises; Accuracy in billing; Keeping records correctly
Reliability | Performance | Perform the service right; Perform the service at the designated time; Perform the right services according to the terms in the agreement
Responsiveness | Speed | Mailing a transaction slip immediately; Calling the customer back quickly; Giving service according to the agreed service level
Responsiveness | Attitude | Willingness; Interest
Responsiveness | Distribution of responsibility | Assignment of incidents; Matching the customer problem to the right competence
Competence | Knowledge and skill of contact personnel | Possession
Competence | Knowledge and skill of operational support personnel | Possession
Competence | Research capability | -
Access | Approachability (ease of contact) | Convenient hours of operation; Convenient location of service facility; Contact media (e-mail, telephone, electronic forms, social media, FAQ)
Courtesy | Behavior of contact personnel | Politeness; Respect; Consideration; Friendliness
Courtesy | Consideration for the consumer's property | -
Communication | Keeping customers informed about the progress of solving an incident | Continuously sending status reports; Documentation of the progress; Assuring the consumer that a problem will be handled
Communication | Keeping customers informed about changes in the service | Explaining the service itself, how much the service will cost and the trade-offs between service and cost; Involve the customer in the change work; Inform about future changes well in advance
Communication | Gathering of customers' viewpoints | Pro-activity; Face-to-face interaction; Questionnaires
Communication | Listening to the customer | Qualitative dialogs
Credibility | Trustworthiness | Transparency in billing
Credibility | Believability | Having the customer's best interests at heart
Credibility | Honesty | -
Security | Physical safety | Freedom from danger, risk, or doubt
Security | Financial security | Does the company know where my stock certificate is?
Security | Confidentiality | Are my dealings with the company private?
Understanding the customer | Understanding needs | Functionality; User-friendliness
Understanding the customer | Learning individualized requirements | Individualized attention; Recognizing the regular customer
Tangibles | Physical evidence of the service | Physical facilities; Appearance of personnel; Tools or equipment used to provide the service; Physical representations of the service (such as a plastic credit card or a bank statement, other customers in the service facility); Documentation of the service
