
New Ways of Listening to Library Users: New Tools for Measuring Service Quality

A. Parasuraman
University of Miami

Washington, DC
November 4, 2005
Defining, Assessing, and Measuring Service Quality: A Conceptual Overview

© A. Parasuraman, University of Miami; not to be reproduced or disseminated without the author’s permission
Multi-Phase, Multi-Sector, Multi-Year
Program of Research to Address the
Following Issues
• How do customers perceive and evaluate service
quality?
• What are managers’ perceptions about service
quality?
• Do discrepancies exist between the perceptions
of customers and those of managers?
• Can customers’ and managers’ perceptions be
combined into a general model of service quality?
• How can service organizations improve customer
service and achieve excellence?

Determinants of Perceived Service Quality

[Diagram: Word of Mouth, Personal Needs, Past Experience, and External Communications to Customers shape Expected Service; the comparison of Expected Service with Perceived Service (the Service Quality Gap) determines Perceived Service Quality.]
A “GAPS” MODEL OF SERVICE QUALITY

[Diagram: customer side vs. service organization side.
• GAP 1 (Market Information Gap): customers’ service expectations vs. the organization’s understanding of those expectations.
• GAP 2 (Service Standards Gap): the organization’s understanding of customer expectations vs. its service standards.
• GAP 3 (Service Performance Gap): the organization’s service standards vs. its actual service performance.
• GAP 4 (Internal Communication Gap): the organization’s service performance vs. its communications to customers.
• GAP 5 (Service Quality Gap): customers’ service expectations vs. customers’ service perceptions.]

POTENTIAL CAUSES OF
INTERNAL SERVICE GAPS
[GAPS 1 - 4]

GAP 1: Customer Expectations vs. Management Perceptions of Customer Expectations

Key Factors (reflecting a lack of “upward communication”):
• Insufficient marketing research
• Inadequate use of marketing research
• Lack of interaction between management and customers
• Insufficient communication between contact employees and managers

GAP 2: Management Perceptions of Customer Expectations vs. Service Quality Specifications

Key Factors:
• Inadequate management commitment to service quality
• Absence of formal process for setting service quality goals
• Inadequate standardization of tasks
• Perception of infeasibility -- that customer expectations cannot be met

GAP 3: Service Quality Specifications vs. Service Delivery

Key Factors:
• Lack of teamwork
• Poor employee-job fit
• Poor technology-job fit
• Lack of perceived control (contact personnel)
• Inappropriate evaluation/compensation system
• Role conflict among contact employees
• Role ambiguity among contact employees

GAP 4: Service Delivery vs. External Communications to Customers (lack of “horizontal communication”)

Key Factors:
• Inadequate communication between salespeople and operations
• Inadequate communication between advertising and operations
• Differences in policies and procedures across branches or departments
• Puffery in advertising & personal selling

SUGGESTIONS FOR CLOSING
INTERNAL SERVICE GAPS
[GAPS 1 - 4]

Suggestions for Closing the Market Information Gap

• Conduct systematic marketing research
• Make senior managers interact with customers
• Make senior managers occasionally perform customer-contact roles
• Encourage upward communication from customer-contact employees

Suggestions for Closing the Service Standards Gap

• Make a blueprint of the service and standardize as many components of it as possible
• Institute a formal, ongoing process for setting service specifications
• Eliminate the “perception of infeasibility” on the part of senior managers
• Make a true commitment to improving service quality

Suggestions for Closing the Service Performance Gap

• Invest in ongoing employee training
• Support employees with appropriate technology and information systems
• Give customer-contact employees sufficient flexibility
• Reduce role conflict and role ambiguity among customer-contact employees
• Recognize and reward employees who deliver superior service

Suggestions for Closing the Internal Communication Gap

• Facilitate effective horizontal communication across functional areas (e.g., marketing and operations)
• Have consistent customer-related policies and procedures across branches or departments
• Resist the temptation to promise more than the organization can deliver

Process Model for Continuous Measurement and Improvement of Service Quality

[Flowchart, rendered as a decision sequence:]
1. Do your customers perceive your offerings as meeting or exceeding their expectations? If YES, continue to monitor customers’ expectations and perceptions; if NO, go to step 2.
2. Do you have an accurate understanding of customers’ expectations? If NO, take corrective action; if YES, go to step 3.
3. Are there specific standards in place to meet customers’ expectations? If NO, take corrective action; if YES, go to step 4.
4. Do your offerings meet or exceed the standards? If NO, take corrective action; if YES, go to step 5.
5. Is the information communicated to customers about your offerings accurate? If NO, take corrective action; if YES, return to step 1.

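The internal checks in steps 2-5 correspond to GAPS 1-4 discussed earlier. A minimal sketch in Python (hypothetical field names, not part of the author’s model) of how a library might run these checks against data gathered from its listening tools:

def diagnose_service_quality(a: dict) -> str:
    """Walk the decision sequence above. Each key of `a` holds a yes/no
    answer distilled from listening tools such as surveys and service
    reviews; the key names are illustrative only."""
    if a["perceptions_meet_or_exceed_expectations"]:
        return "Continue to monitor customers' expectations and perceptions"
    if not a["expectations_accurately_understood"]:
        return "Corrective action: close the market information gap (GAP 1)"
    if not a["specific_standards_in_place"]:
        return "Corrective action: close the service standards gap (GAP 2)"
    if not a["offerings_meet_standards"]:
        return "Corrective action: close the service performance gap (GAP 3)"
    if not a["external_communication_accurate"]:
        return "Corrective action: close the internal communication gap (GAP 4)"
    return "All internal checks pass; re-survey and repeat the cycle"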
SERVQUAL: Development, Refinement, and
Empirical Findings

Determinants of Perceived Service Quality

Dimensions of Service Quality:
1. Access
2. Communication
3. Competence
4. Courtesy
5. Credibility
6. Reliability
7. Responsiveness
8. Security
9. Tangibles
10. Understanding/Knowing the Customer

[Diagram: customers evaluate service along these ten dimensions; Word of Mouth, Personal Needs, Past Experience, and External Communications to Customers shape Expected Service, and the comparison of Expected Service with Perceived Service determines Perceived Service Quality.]

Correspondence between SERVQUAL Dimensions and Original Ten Dimensions for Evaluating Service Quality

Original dimension(s) → SERVQUAL dimension:
• Tangibles → Tangibles
• Reliability → Reliability
• Responsiveness → Responsiveness
• Competence, Courtesy, Credibility, Security → Assurance
• Access, Communication, Understanding/Knowing the Customer → Empathy

Definitions of the SERVQUAL Dimensions

• Tangibles: Appearance of physical facilities, equipment, personnel, and communication materials.
• Reliability: Ability to perform the promised service dependably and accurately.
• Responsiveness: Willingness to help customers and provide prompt service.
• Assurance: Knowledge and courtesy of employees and their ability to inspire trust and confidence.
• Empathy: Caring, individualized attention the firm provides its customers.

Relative Importance of Service Dimensions When Respondents Allocate 100 Points [Study 1]

[Pie chart: Reliability 32%, Responsiveness 22%, Assurance 19%, Empathy 16%, Tangibles 11%]
Relative Importance of Service Quality Dimensions [Study 2]
Mean Number of Points Allocated out of 100 Points

[Bar charts for the Computer Manufacturer, All Companies, Retail Chain, Auto Insurer, and Life Insurer samples showing the points allocated to Reliability, Responsiveness, Assurance, Empathy, and Tangibles; the pattern mirrors Study 1, with Reliability weighted most heavily (roughly 28-37 points) and Tangibles least (roughly 9-14 points) in every sample.]
Mean SERVQUAL Scores by Service Dimension [Study 1]

[Bar chart: mean SERVQUAL (perception-minus-expectation) scores for Tangibles, Reliability, Responsiveness, Assurance, and Empathy, plotted on a scale from -2.00 to +1.00.]

Nature of Service Expectations

[Diagram: Desired Service (the level customers believe can and should be delivered) sits above Adequate Service (the minimum level customers are willing to accept); the region between the two is the Zone of Tolerance.]

The Two Levels of Expectations Imply Two Corresponding Measures of GAP 5:

• Measure of Service Adequacy (MSA) = Perceived Service - Adequate Service
• Measure of Service Superiority (MSS) = Perceived Service - Desired Service

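As a worked illustration (hypothetical ratings on the 9-point scales shown in the questionnaire formats that follow), the two measures can be computed item by item:

def msa(perceived: float, adequate: float) -> float:
    """Measure of Service Adequacy: perceived minus adequate (minimum) service."""
    return perceived - adequate

def mss(perceived: float, desired: float) -> float:
    """Measure of Service Superiority: perceived minus desired service."""
    return perceived - desired

# Hypothetical ratings for one item on a 1-9 scale
adequate, desired, perceived = 6, 8, 7
print(msa(perceived, adequate))  # 1  -> above the minimum acceptable level
print(mss(perceived, desired))   # -1 -> short of the desired level
# MSA >= 0 together with MSS <= 0 places the perception inside the zone of tolerance.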
TWO APPROACHES FOR MEASURING MSA AND MSS

• Two-Column Format Questionnaire
  – Direct measures of MSA and MSS
• Three-Column Format Questionnaire
  – Difference-score measures of MSA and MSS

TWO-COLUMN FORMAT

Please think about the quality of service ________ offers compared to the two different levels of service defined below:

MINIMUM SERVICE LEVEL - the minimum level of service performance you consider adequate.

DESIRED SERVICE LEVEL - the level of service performance you desire.

For each of the following statements, please indicate: (a) how ______’s performance compares with your minimum service level by circling one of the numbers in the first column; and (b) how ______’s performance compares with your desired service level by circling one of the numbers in the second column.

Each column uses a 1-9 scale anchored Lower / The Same / Higher, with an “N” (No Opinion) option:
Column 1: Compared to My Minimum Service Level, ____’s Service Performance is ...
Column 2: Compared to My Desired Service Level, ____’s Service Performance is ...

Sample items (“When it comes to ...”):
1. Prompt service to policyholders
2. Employees who are consistently courteous

THREE-COLUMN FORMAT

We would like your impressions about ________’s service performance relative to your expectations. Please think about the two different levels of expectations defined below:

MINIMUM SERVICE LEVEL - the minimum level of service performance you consider adequate.

DESIRED SERVICE LEVEL - the level of service performance you desire.

For each of the following statements, please indicate: (a) your minimum service level by circling one of the numbers in the first column; (b) your desired service level by circling one of the numbers in the second column; and (c) your perception of ___________’s service by circling one of the numbers in the third column.

Each column uses a 1-9 scale anchored Low / High; the perception column also offers an “N” (No Opinion) option:
Column 1: My Minimum Service Level is ...
Column 2: My Desired Service Level is ...
Column 3: My Perception of ____’s Service Performance is ...

Sample items (“When it comes to ...”):
1. Prompt service to policyholders
2. Employees who are consistently courteous

Measurement Error: Percent of Respondents Answering Incorrectly

Type of Company          Two-Column Format    Three-Column Format
Computer Manufacturer         8.6%                  0.6%
Retail Chain                 18.2%                  1.8%
Auto Insurer                 12.2%                  1.6%
Life Insurer                  9.9%                  2.7%

Mean Service Quality Scores (Combined Across All Companies)

                       Two-Column Format         Three-Column Format
SERVQUAL               Questionnaire             Questionnaire
Dimensions             MSA        MSS            MSA        MSS
Reliability            6.8        5.9            0.2       -1.0
Responsiveness         6.7        5.7            0.3       -1.1
Assurance              6.8        5.9            0.4       -0.9
Empathy                6.5        5.6            0.2       -1.2
Tangibles              7.1        6.4            1.1       -0.2

Revised SERVQUAL Items

Reliability
1. Providing services as promised
2. Dependability in handling customers' service problems
3. Performing services right the first time
4. Providing services at the promised time
5. Keeping customers informed about when services will be performed

Responsiveness
6. Prompt service to customers
7. Willingness to help customers
8. Readiness to respond to customers' requests

Assurance
9. Employees who instill confidence in customers
10. Making customers feel safe in their transactions
11. Employees who are consistently courteous
12. Employees who have the knowledge to answer customer questions

Empathy
13. Giving customers individual attention
14. Employees who deal with customers in a caring fashion
15. Having the customer's best interest at heart
16. Employees who understand the needs of their customers

Tangibles
17. Modern equipment
18. Visually appealing facilities
19. Employees who have a neat, professional appearance
20. Visually appealing materials associated with the service
21. Convenient business hours

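Given per-item scores (for example, the MSS difference scores from the three-column format), dimension-level scores are simply the means of the items in each group above. A minimal sketch, with the item-to-dimension grouping taken from the list and everything else assumed for illustration:

ITEM_DIMENSIONS = {
    "Reliability":    range(1, 6),    # items 1-5
    "Responsiveness": range(6, 9),    # items 6-8
    "Assurance":      range(9, 13),   # items 9-12
    "Empathy":        range(13, 17),  # items 13-16
    "Tangibles":      range(17, 22),  # items 17-21
}

def dimension_scores(item_scores: dict[int, float]) -> dict[str, float]:
    """item_scores maps an item number (1-21) to a respondent's score
    for that item; returns the mean score per SERVQUAL dimension."""
    return {
        dim: sum(item_scores[i] for i in items) / len(items)
        for dim, items in ITEM_DIMENSIONS.items()
    }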
Service Quality Perceptions Relative to Zones of Tolerance by Dimension

[Chart: Computer Manufacturer. For each dimension (Reliability, Responsiveness, Assurance, Empathy, Tangibles), the zone of tolerance and the mean service quality (S.Q.) perception are plotted on a 0-9 scale.]

Service Quality Perceptions Relative to Zones of Tolerance by Dimension

[Chart: On-Line Services. Zone of tolerance and mean S.Q. perception for Reliability, Responsiveness, Assurance, Empathy, and Tangibles on a 0-9 scale; plotted values range from 5.7 to 8.4.]

Service Quality Perceptions Relative to Zones of Tolerance by Dimension

[Chart: Tech-Support Services. Zone of tolerance and mean S.Q. perception for Reliability, Responsiveness, Assurance, and Empathy on a 0-9 scale; plotted values range from 6.1 to 8.5.]


LIBQUAL+: An Adaptation of SERVQUAL

© Association of Research Libraries, Washington DC (2003)

MULTIPLE METHODS OF LISTENING TO
CUSTOMERS
• Transactional surveys*
• Mystery shopping
• New, declining, and lost-customer surveys
• Focus group interviews
• Customer advisory panels
• Service reviews
• Customer complaint, comment, and inquiry
capture
• Total market surveys*
• Employee field reporting
• Employee surveys
• Service operating data capture

*A SERVQUAL-type instrument is most suitable for these methods
The Role Of Technology In Service Delivery:
Electronic Service Quality (e-SQ) and Technology
Readiness (TR)

Technology’s Growing Role in Marketing to and Serving Customers: Pyramid Model

[Diagram (pyramid model): Company, Employees, and Customers form a triangle, with Technology as a fourth node linking all three. Internal marketing runs between the company and its employees, external marketing between the company and its customers, and interactive marketing between employees and customers.]
Ongoing Research on e-Service
Quality: Conceptual Framework and
Preliminary Findings

Research Phases and Questions
PHASE 1:
• What is good service on the Web?
• What are the underlying dimensions of superior electronic service quality (e-SQ)?
• How can e-SQ be conceptualized?
PHASE 2:
• How do these dimensions compare to those of
traditional service quality?
• How can e-SQ be measured and assessed?

Definition of e-Service
Quality (e-SQ)
e-SQ is the extent to which a Web site facilitates efficient and effective shopping, purchasing, and delivery of products and services.

Dimensions of e-Service Quality from Focus Groups

• Access
• Ease of Navigation
• Efficiency
• Customization/Personalization
• Security/Privacy
• Responsiveness
• Assurance/Trust
• Price Knowledge
• Site Aesthetics
• Reliability
• Flexibility

Reliability

DEFINITION: Correct technical functioning of the site and the accuracy of service promises, billing, and product information.

SAMPLE ATTRIBUTES:
• Site does not crash
• Accurate billing
• Accuracy of order
• Accuracy of account information
• Having items in stock
• Truthful information
• Merchandise arrives on time

Efficiency

DEFINITION: The site is simple to use, structured properly, and requires a minimum of information to be input by the customer.

SAMPLE ATTRIBUTES:
• Site is well organized
• Site is simple to use
• Site provides information in reasonable chunks
• Site allows me to click for more information if I need it

Means-End Model

[Diagram: Concrete Cues → Perceptual Attributes → Dimensions → Higher-Level Abstractions, moving from the specific/concrete to the abstract.]

Means-End Model of e-Service Quality

[Diagram example: concrete cues such as tab structuring, a site map, a search engine, and one-click ordering map onto perceptual attributes (“easy to maneuver through the site,” “easy to find what I need,” “speed of checkout”), which together define the dimension Ease of Navigation.]

[Diagram: Concrete Cues → Perceptual Attributes → Dimensions → Higher-Level Abstractions. The focus-group dimensions (Access, Ease of Navigation, Efficiency, Flexibility, Reliability, Personalization, Security/Privacy, Responsiveness, Assurance/Trust, Site Aesthetics, Price Knowledge) feed the higher-level abstraction Perceived e-Service Quality.]
Means-End Model of e-Service Quality

[Diagram: Concrete Cues → Perceptual Attributes → Dimensions → Higher-Level Abstractions → Behaviors. Perceived e-Service Quality, together with perceived convenience, perceived control, and perceived price, shapes Perceived Value, which in turn drives purchase, loyalty, and word of mouth (W.O.M.).]

Conceptual Model for Understanding and Improving e-Service Quality

[Diagram: On the customer side, customer Web site requirements and customer Web site experiences jointly determine perceived e-SQ and perceived value, which drive purchase/repurchase; the mismatch between requirements and experiences is the Fulfillment Gap. On the company side, management’s beliefs about customer requirements (Information Gap), the design and operation of the Web site (Design Gap), and the marketing of the Web site (Communication Gap) shape what customers experience and expect.]

Dimensions of e-SQ

Core Dimensions [E-S-QUAL]:
• Efficiency
• Fulfillment
• System Availability
• Privacy

Recovery Dimensions [E-RecS-QUAL]:
• Responsiveness
• Compensation
• Contact
Source: Parasuraman, Zeithaml, and Malhotra, “E-S-QUAL: A Multiple-Item Scale for Assessing Electronic Service Quality,”
Journal of Service Research, February 2005.
Definitions of e-SQ Dimensions
E-S-QUAL Dimensions
Efficiency: The ease and speed of accessing and using the site.
Fulfillment: The extent to which the site’s promises about order delivery and
item availability are fulfilled.
System Availability: The correct technical functioning of the site.

Privacy: The degree to which the site is safe and protects customer
information.

E-RecS-QUAL Dimensions
Responsiveness: Effective handling of problems and returns through the site.
Compensation: The degree to which the site compensates customers for
problems.
Contact: The availability of assistance through telephone and online
representatives.

Source: Parasuraman, Zeithaml, and Malhotra, “E-S-QUAL: A Multiple-Item Scale for Assessing Electronic Service Quality,”
Journal of Service Research, February 2005.
An Important Implication of the
Pyramid Model
An organization’s ability to use
technology effectively in
marketing to and serving
customers critically depends on
the technology readiness of its
customers and employees

What is Technology
Readiness [TR]?

TR refers to “people’s
propensity to embrace
and use new
technologies for
accomplishing goals in
home life and at work”

Multinational Research Studies on
Technology Readiness

• Began in 1997 in the USA and still ongoing


• Being conducted in collaboration with Charles Colby,
President, Rockbridge Associates
• Have thus far involved several qualitative and
quantitative studies
• Completed studies include five “National Technology Readiness Surveys” in the USA [NTRS 1999, 2000, 2001, 2002 and 2004]
• National studies also have been done or are underway
in Austria, Chile, Germany, Singapore and Sweden

Key Insights from Qualitative
Research Studies
• TR doesn’t just refer to possessing technical
skills; TR is much more a function of people’s
beliefs and feelings about technology
• People’s beliefs can be positive about some
aspects of technology but negative about
other aspects
• The relative strengths of the positive and negative beliefs determine a person’s receptivity to technology

Technology-Beliefs Continuum

[Diagram: a continuum running from Resistant to Technology, through Neutral, to Receptive to Technology.]

Link between Technology Beliefs and Technology Readiness

[Chart: technology readiness (low, medium, high) increases as a person moves along the technology-beliefs continuum from Resistant to Technology, through Neutral, to Receptive to Technology.]
Quantitative Survey Methodology
• Each NTRS in the U.S. included a random
sample of adults:
– 1,000 respondents in 1999 & 2000 and 500 respondents in 2001, 2002 & 2004
• Data collected via computer-assisted
telephone interviewing
• Survey included questions about
technology beliefs, demographics,
psychographics, and technology-related
behaviors and preferences

Key Insights from Quantitative
Research Studies
• TR consists of four facets or dimensions that are
fairly independent of one another
• People’s ratings on a set of belief statements
about technology can be combined to create a
reliable and valid measure of TR -- i.e., a
“Technology Readiness Index” [TRI]
• The TRI is a good predictor of people’s
technology-related behaviors and preferences
• A meaningful typology of customers can be
created based on their TR scores on the four
dimensions
Drivers of Technology Readiness

[Diagram: Optimism and Innovativeness are contributors to technology readiness; Discomfort and Insecurity are inhibitors.]

Definitions of the TR Drivers
• Optimism: Positive view of technology;
belief that it offers increased control,
flexibility and efficiency
• Innovativeness: Tendency to be a
technology pioneer and thought leader
• Discomfort: Perceived lack of control over
technology and a feeling of being
overwhelmed by it
• Insecurity: Distrust of technology and
skepticism about its working properly

The TRI: A 36-Item, 4-Dimensional
Scale to Measure TR
• Optimism 10 items

• Innovativeness 7 items

• Discomfort 10 items

• Insecurity 9 items

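One way to see how the four dimensions combine into an overall index is sketched below. This is only an illustration: it assumes a 1-5 agreement scale and a simple reverse-score-and-average rule, not the published TRI scoring key.

def mean(values):
    return sum(values) / len(values)

def tri_score(optimism, innovativeness, discomfort, insecurity):
    """Each argument is a list of item ratings on an assumed 1-5 scale.
    Contributor dimensions count as-is; inhibitor dimensions are
    reverse-scored so that higher always means greater readiness."""
    contributors = [mean(optimism), mean(innovativeness)]
    inhibitors = [6 - mean(discomfort), 6 - mean(insecurity)]  # reverse-score on 1-5
    return mean(contributors + inhibitors)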
Customer Beliefs About Technology

• Example of Optimism: “Technology gives people more control over their daily lives”
  % of respondents agreeing: 61% in 1999, 68% in 2000, 65% in 2001, 65% in 2002, 67% in 2004

• Example of Innovativeness: “You keep up with the latest technological developments in your areas of interest”
  % of respondents agreeing: 68% in 1999, 69% in 2000, 65% in 2001, 59% in 2002, 60% in 2004

Customer Beliefs About Technology

• Example of Discomfort: “It is embarrassing when you have trouble with a high-tech gadget while people are watching”
  % of respondents agreeing: 52% in 1999, 54% in 2000, 55% in 2001, 51% in 2002, 46% in 2004

• Example of Insecurity: “Any business transaction you do electronically should be confirmed later with something in writing”
  % of respondents agreeing: 87% in 1999, 88% in 2000, 82% in 2001, 82% in 2002, 78% in 2004

TR Scores by Dimension and Overall TRI

[Chart: mean scores for Optimism (OPT.), Innovativeness (INN.), Discomfort (DIS.), Insecurity (INS.), and the overall TRI for each survey year (1999, 2000, 2001, 2002, 2004), plotted on a 0-4.5 scale.]

Online Activities of High- and Low-TR Customers (NTRS 2004)

[Chart: percentage of high-TR versus low-TR customers who, online, read a newspaper, checked bank account information, booked travel, bought items over US$100, did business with the government, applied for a credit card, or bought stocks.]

TRI Scores by Demographics (NTRS 2004)

• Income: less than $40K, 2.83; $40K to $75K, 2.88; $75K or more, 3.14
• Education: high school or less, 2.77; some college, 2.96; college grad or more, 3.03
• Age: 60-88 years, 2.68; 48-59 years, 2.85; 34-47 years, 3.01; 18-33 years, 3.13
• Gender: female, 2.83; male, 3.03

Predicted Change in TR of Age Cohorts over Time

[Chart: TR plotted against time (Year 1-5 through Year 26-30) for successive age cohorts, showing how cohorts move through the age range covered in the TR surveys as time passes.]
Five TR-Based Customer Segments

Segment      Optimism   Innovativeness   Discomfort   Insecurity
Explorers    High       High             Low          Low
Pioneers     High       High             High         High
Skeptics     Low        Low              Low          Low
Paranoids    High       Low              High         High
Laggards     Low        Low              High         High

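The High/Low profiles in the table suggest a straightforward lookup once each respondent's four dimension scores have been dichotomized. A rough sketch, assuming a median split per dimension (the deck does not specify the actual cutoff rules):

SEGMENT_PROFILES = {
    # (optimism, innovativeness, discomfort, insecurity)
    ("High", "High", "Low", "Low"):   "Explorers",
    ("High", "High", "High", "High"): "Pioneers",
    ("Low",  "Low",  "Low",  "Low"):  "Skeptics",
    ("High", "Low",  "High", "High"): "Paranoids",
    ("Low",  "Low",  "High", "High"): "Laggards",
}

def classify(scores: dict, medians: dict) -> str:
    """scores and medians map each dimension name to a mean score."""
    profile = tuple(
        "High" if scores[d] >= medians[d] else "Low"
        for d in ("optimism", "innovativeness", "discomfort", "insecurity")
    )
    return SEGMENT_PROFILES.get(profile, "No exact segment match")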
Typology of Technology Customers: Percent of Population in Each Segment

[Chart: percentage of the population falling into each of the five TR segments in 1999, 2000, 2001, 2002, and 2004; segment shares range up to roughly 30%.]

TR Segments and Technology Adoption

[Chart: the five segments plotted by technology readiness against time of adoption of new technologies; Explorers (highest TR) adopt earliest, followed by Pioneers, Skeptics, and Paranoids, with Laggards (lowest TR) adopting last.]

New Customer Composition by Age of Techno-Based Product/Service

[Chart: the mix of first-time users across the five TR segments (Explorers, Pioneers, Skeptics, Paranoids, Laggards) as a techno-based product/service moves from early to late in its life.]

High-Tech versus High-Touch Customer Service

[Chart: the five TR segments positioned by the appeal of high-tech service channels versus the appeal of high-touch service channels; high-tech channels appeal most to Explorers and least to Laggards.]
In Conclusion, to Deliver Superior Service in Library
Environments:

• Understand customers’ service expectations and how well those expectations are
being met
• Work systematically to remove organizational barriers that lead to poor customer
service -- offline and online
• Recognize and capitalize on the increasing role of technology in serving
customers, but …
• Be cognizant of customers’ and employees’ readiness to embrace technology-
based services
• Recognize that e-service quality as perceived by customers involves much more
than having a state-of-the-art website
• Put in place a solid behind-the-scenes infrastructure -- information systems,
logistics, and human resources -- to deliver what a website’s façade promises.
• Continuously monitor customers’ and employees’ reactions to and experiences
with your electronic interfaces

Sources of Information about Customer Service and
Technology Readiness

www.technoreadymarketing.com
Thank You!

