

Quality Assurance in Education
Measuring service quality in higher education: Development of a hierarchical model (HESQUAL)
Viraiyan Teeroovengadum, T.J. Kamalanabhan and Ashley Keshwar Seebaluck

To cite this document:
Viraiyan Teeroovengadum, T.J. Kamalanabhan and Ashley Keshwar Seebaluck, (2016), "Measuring service quality in higher education", Quality Assurance in Education, Vol. 24 Iss 2, pp. 244-258
Permanent link to this document: http://dx.doi.org/10.1108/QAE-06-2014-0028
Downloaded by UNIVERSITY OF MAURITIUS, Mr Viraiyan Teeroovengadum, at 23:09, 28 March 2016 (PT)



Measuring service quality in higher education: Development of a hierarchical model (HESQUAL)

Viraiyan Teeroovengadum
Department of Management, University of Mauritius, Reduit, Mauritius
T.J. Kamalanabhan
Department of Management Studies, Indian Institute of Technology, Madras, Chennai, India
Ashley Keshwar Seebaluck
Department of Management, University of Mauritius, Reduit, Mauritius

Received 9 June 2014; Revised 5 January 2015; Accepted 3 July 2015

Abstract
Purpose – This paper aims to develop and empirically test a hierarchical model for measuring service
quality in higher education.
Design/methodology/approach – The first phase of the study consisted of qualitative research
methods and a comprehensive literature review, which allowed the development of a conceptual model
comprising 53 service quality attributes. Quantitative methods were used for the second phase so as to
test the dimensionality of the measurement instrument and assess its validity and reliability. A sample
of 207 students was surveyed, and data were analysed using exploratory factor analysis and
Cronbach’s alpha test.
Findings – The results of the factor analysis revealed the presence of sub-dimensions. A hierarchical
model was therefore considered most appropriate. The final model consisted of five primary
dimensions, which are administrative quality, physical environment quality, core educational quality,
support facilities quality and transformative quality. The instrument contained a total of 48 items. Each
scale was satisfactorily tested for validity and reliability.
Research limitations/implications – This empirical study made use of data collected only in Mauritius, a developing country.
Practical implications – This study provides a framework and an instrument that can be used by
higher education institutions to continuously improve educational quality.
Originality/value – The study adopted a holistic approach by considering both the functional and
technical aspects of service quality in higher education. Moreover, technical quality was operationalised
by considering the notion of transformative quality.
Keywords Quality management, Service quality, Higher education, Universities,
Hierarchical model, Transformative quality
Paper type Research paper

1. Introduction
Sustaining and improving service quality is now a sine qua non for higher educational institutions. The prevailing higher education landscape is a dynamic and increasingly competitive one (Cheung et al., 2011; Dehghan et al., 2014), where
universities need to maximise their efforts so as to continuously improve their services (Clemes et al., 2013). Among the factors leading to this current state of affairs are internationalisation of higher education (Harvey and Williams, 2010; Sultan and Wong, 2010), the increase in the number of private universities (Halai, 2013) and the decrease in state funding for public universities (Quinn et al., 2009). Consequently, akin to business organisations that are under the stringent obligation to constantly satisfy their customers to thrive (Calvo-Porral et al., 2013), universities need to satisfy their students (Lagrosen et al., 2004; Srikanthan and Dalrymple, 2007), considered as being their main customers (Telford and Masson, 2005), by providing high quality education. Crucially, to
manage and improve the quality of services they provide, universities need to regularly
measure service quality (Abdullah, 2006; Kwek et al., 2010; Chong and Ahmed, 2012). In
the words of Nadiri et al. (2009, p. 525), “if service quality is to be improved, it must be
reliably assessed and measured”. Hence, a reliable and valid instrument for measuring
service quality of higher education from the students’ perspective is indispensable.
It remains a major challenge to develop an adequate model for measuring service
quality in higher education (Chong and Ahmed, 2012). Most relevant studies (Cuthbert,
1996; Soutar and McNeil, 1996; Pariseau and McDaniel, 1997; Arambewela and Hall,
2006; Wong et al., 2012) sought to adapt the prominent SERVQUAL model of
Parasuraman et al. (1988), while some studies went a step further and attempted to
identify service quality dimensions and attributes from the perspective of students
(LeBlanc and Nguyen, 1997; Ford et al., 1999; Hill et al., 2003; Lagrosen et al., 2004;
Abdullah, 2006). Arguably, only a few of these previous studies attempting to measure
service quality in higher education adopted what could be termed as a holistic approach,
and to the best knowledge of the researchers, none of them integrated the notion of
transformative quality in the development of service quality models. A holistic
approach here refers to the consideration of both functional (process) and technical
(outcome) quality as advocated by Gronroos (1978, cited in Kang, 2006). On the other
hand, the transformative view of educational quality implies that technical (outcome)
quality is to be operationalised in terms of the widely accepted definition (Lomas, 2007;
Zachariah, 2007; Cheng, 2011) of educational quality imparted by Harvey and Green
(1993), that of “transformative quality”. In view of bridging this gap, this study sought
to develop a hierarchical model using both a holistic and transformative perspective,
thus seeking to make a useful contribution in the quest of finding appropriate
comprehensive models for measuring service quality in the higher education context.
Accordingly, the research objectives are:
• to identify service quality dimensions and sub-dimensions in higher education; and
• to test the measurement scales for reliability and validity.

2. Literature review
2.1 Operationalisation of service quality in higher education
The extant literature suggests that there are primarily two key issues to be addressed
when attempting to develop an instrument for measuring service quality: the
operationalisation of the service quality construct (Brady et al., 2002; Abdullah, 2006)
and the identification of appropriate service quality dimensions (Kang and James, 2004).
The most acknowledged and extensively used instrument for measuring service quality
is the SERVQUAL scale developed by Parasuraman et al. (1985, 1988) (Calvo-Porral et al., 2013). As noted by Narang (2012, p. 359), the SERVQUAL scale and its adaptations have been widely used in various services such as “banking, retail, wholesale, health and education”.
Parasuraman et al. (1988, p. 15) describe service quality “as a form of attitude, related
but not equivalent to satisfaction, and results from comparison of expectations with
perceptions of performance”. Applied to the higher educational context, service quality can thus be defined as “the difference between what a student expects to receive and his/her perceptions of actual delivery” (O’Neill and Palmer, 2004, p. 42). This notion has
been extensively used in various studies aiming to measure service quality in higher
education (Cuthbert, 1996; Soutar and McNeil, 1996; Tan and Kek, 2004; Arambewela
and Hall, 2006; Barnes, 2007; Wong et al., 2012). Despite its wide acceptance and
popularity, operationalisation of service quality “has been under extensive criticism”
(Trivellas and Dargenidou, 2009, p. 383). Cronin and Taylor (1992) proposed an alternative, termed SERVPERF, an instrument focusing on the performance level of the various attributes, and it has even been suggested that the SERVPERF scale
possesses better psychometric properties (Brady et al., 2002). Moreover, it has also been
argued that in the context of higher education, the performance-only approach is more
suitable (Oldfield and Baron, 2000; Abdullah, 2006; Begum, 2009). This is mainly due to
problems related to attempting to capture the expectations of students. Researchers
have argued that students may not be able to have clear expectations of the higher
educational service (Joseph and Joseph, 1997; Ford et al., 1999; Angell et al., 2008). The present study concurs with the latter view and therefore adopts a performance-only operationalisation of service quality.
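The contrast between the two operationalisations can be illustrated with a short sketch. The ratings and item counts below are hypothetical, not taken from the study: SERVQUAL scores each attribute as a perception-minus-expectation gap, whereas the performance-only (SERVPERF) approach adopted here uses perception ratings alone.

```python
# Illustrative sketch (hypothetical ratings, not study data):
# SERVQUAL scores each item as the gap P - E between perception and
# expectation; SERVPERF uses the perception ratings alone.

def servqual_score(perceptions, expectations):
    """Mean gap (performance minus expectation) across items."""
    gaps = [p - e for p, e in zip(perceptions, expectations)]
    return sum(gaps) / len(gaps)

def servperf_score(perceptions):
    """Mean performance-only score across items."""
    return sum(perceptions) / len(perceptions)

# Hypothetical 5-point Likert ratings for four items from one student
perceptions = [4, 5, 3, 4]    # perceived performance
expectations = [5, 4, 4, 4]   # expectations, which students may struggle to state

print(servqual_score(perceptions, expectations))  # -0.25
print(servperf_score(perceptions))                # 4.0
```

The performance-only score sidesteps the expectations vector entirely, which is the practical advantage discussed above.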

2.2 Higher educational service quality dimensions and the need for a holistic approach
As stated by Iacovidou et al. (2009), the literature suggests that quality in higher
education is not a unidimensional concept and is in fact best described as a set of
dimensions. A number of studies seeking to measure service quality in the context of
higher education (Cuthbert, 1996; Soutar and McNeil, 1996; Pariseau and McDaniel,
1997; Wong et al., 2012) elected to adopt the dimensions of the SERVQUAL model, which
proposes five dimensions, namely, responsiveness, assurance, tangibles, empathy and
reliability as originally proposed by Parasuraman et al. (1988).
Instead of building on the generic SERVQUAL framework, there are a number of
studies that sought to identify higher educational service quality (HESQUAL)
dimensions by using an exploratory phase consisting of qualitative research methods.
One such pioneering study is that of Leblanc and Nguyen (1997), who identified 38 service quality attributes. These were grouped into seven dimensions: contact personnel, reputation, physical evidence, administration, curriculum, responsiveness and access to facilities. In a subsequent study conducted in three countries (the UK, Australia and Sweden), Lagrosen et al. (2004) identified 11 factors comprising 31 items.
The service quality dimensions were corporate collaboration, information and
responsiveness, courses offered, campus facilities, teaching practices, internal
evaluations, external evaluations, computer facilities, collaboration and comparisons,
post-study factors and library resources. Similar studies were carried out in various
settings. All these studies were thoroughly examined so as to develop the conceptual
framework proposed in this present study.
An important issue to be considered is that most models, including the SERVQUAL model, take into account only the element of functional quality and neglect the technical quality aspect (Kang, 2006). The extant literature suggests that this is largely the case in the higher education context as well, though there are a few notable exceptions, such as Holdford and Reinders (2001), Chong and Ahmed (2012) and Clemes et al. (2013). Such a holistic approach is adopted in this research.

2.3 Integration of the notion of transformative quality


Harvey and Green (1993) argued that education is not the delivery of a service to a customer but rather a continuous process of transformation of the participant (student). This view has been strongly supported by empirical studies, such as those conducted by Lomas (2007), Watty (2005) and Zachariah (2007), who all reported that this was the preferred view of educational leaders, academics, employers and students. Transformative quality comprises two components: enhancement and empowerment of the participant.
Enhancing students involves adding value to them in terms of knowledge and skills (Harvey and Green, 1993). “Value added is thus a measure of quality in terms of the extent to which the educational experience” is able to do so (Harvey and Knight, 1996, p. 8). Harvey and Knight (1996) further advocated that an education of quality is one that enables transformation in students and thus improves them. The second element of transformative quality is the empowerment of the participant, which itself includes two fundamental ideas. First, it involves bestowing some decision-making authority on participants over their own transformation process. This need arises from the second idea, that of catering for the self-transformation or personal growth of the student. As Harvey and Green (1993) point out, the result of such an educational transformation is “increased awareness and confidence”, which makes students capable of taking charge of their own self-development. This study seeks to
integrate this notion of quality in the measurement of service quality by developing a
new scale for measuring the students’ perception of transformative quality.

3. Conceptual framework
The measurement/factor model has been developed based on a review of the extant
literature and qualitative data collection and analysis. Five primary dimensions of
HESQUAL have been identified from the extensive literature review conducted and
qualitative data collection in the form of interviews and focus groups with students and
academics. These are administrative quality, physical environment quality, core
educational quality, support facilities quality and transformative quality.
The dimensions identified from the extensive review of the literature were found to be consistent with the findings from the in-depth interviews and focus group discussions with academics and students. This was expected since, as demonstrated by Ford et al. (1999), service quality attributes tend to be essentially the same across cultures. Table I provides examples of statements from the interviewees. These findings further supported the appropriateness of a hierarchical model. The next phase of the research was to test for potential sub-dimensions through exploratory factor analysis (EFA) and to test the validity and reliability of the measurement scales.
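The hierarchical structure just described (items nested within sub-dimensions, which are nested within primary dimensions) can be sketched as a simple aggregation. The dimensions shown and all ratings below are illustrative placeholders, not the study's scale or data.

```python
# Sketch of hierarchical aggregation: item ratings roll up into
# sub-dimension means, then primary-dimension means, then an overall
# score. Structure and ratings are illustrative only.

def mean(values):
    return sum(values) / len(values)

hesqual = {  # primary dimension -> sub-dimension -> item ratings
    "administrative quality": {
        "attitude and behaviour": [4, 5, 4, 4],
        "administrative processes": [3, 3, 4],
    },
    "transformative quality": {
        "transformative quality": [5, 4, 4, 5, 4, 4, 5, 4],
    },
}

sub_scores = {dim: {sub: mean(items) for sub, items in subs.items()}
              for dim, subs in hesqual.items()}
dim_scores = {dim: mean(list(subs.values()))
              for dim, subs in sub_scores.items()}
overall = mean(list(dim_scores.values()))

print(round(overall, 2))  # 4.08
```

Equal weighting of sub-dimensions is an assumption made here for simplicity; in practice an institution could weight dimensions by importance.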
Table I. Dimensions identified, literature sources and sample statements with respect to service quality dimensions

Administrative quality
Literature sources: Leblanc and Nguyen (1997), Kwan and Ng (1999), Oldfield and Baron (2000), Holdford and Reinders (2001), Joseph et al. (2005), Abdullah (2006), Zachariah (2007), Trivellas and Dargenidou (2009), Kwek et al. (2010), Sultan and Wong (2011), Narang (2012)
Sample statements: “There should not be much bureaucracy”; “Procedures should be clearly stated”; “Good communication between administration and students”; “Appropriate system in place”; “Administrative staff should always be willing to help students out”; “Administrative staff should be good to students”

Physical environment quality
Literature sources: Cuthbert (1996), Owlia and Aspinwall (1996), Soutar and McNeil (1996), Leblanc and Nguyen (1997), Joseph and Joseph (1997), Kwan and Ng (1999), Hill et al. (2003), Lagrosen et al. (2004), O’Neill and Palmer (2004), Sohail and Shaikh (2004), Telford and Masson (2005), Angell et al. (2008), Narang (2012), Wong et al. (2012)
Sample statements: “Having a peaceful environment”; “The campus should have good looking infrastructure”; “Physical resources like library”; “A nice sports complex is important”; “Clean toilets”; “Lecture rooms must be well equipped”; “High level of security on campus”

Core educational quality
Literature sources: Owlia and Aspinwall (1996), Leblanc and Nguyen (1997), Kwan and Ng (1999), Holdford and Reinders (2001), Telford and Masson (2005), Abdullah (2006), Zachariah (2007), Trivellas and Dargenidou (2009), Kwek et al. (2010), Shekarchizadeh et al. (2011), Sultan and Wong (2011), Narang (2012)
Sample statements: “Having academics who are smart”; “Knowledge of academics”; “Ability of lecturers to transmit enthusiasm”; “Up-to-date in their field”; “Faculty members are approachable”; “Using different teaching styles”; “Makes subject look easy”; “Encourage participation of students in their learning process”; “Courses being offered”; “A culture of sharing and collaboration”; “Faculty members are willing to help”; “Expertise of faculty”; “Course designed based on requirements of the students”; “Lecturers should have some industry experience”; “Having lot of interaction with students”; “Research work of academics”; “Amount of published materials”; “Relevant curriculum content”

Support facilities quality
Literature sources: Leblanc and Nguyen (1997), Kwan and Ng (1999), Hill et al. (2003), Lagrosen et al. (2004), Sohail and Shaikh (2004), Tan and Kek (2004), Joseph et al. (2005), Kwek et al. (2010), Sultan and Wong (2011)
Sample statements: “Good transport facilities”; “Adequate cafeteria”; “Opportunities for extra-curricular activities”; “Computer laboratories available”; “Photocopy and printing services”

Transformative quality
Literature sources: Harvey and Green (1993), Harvey and Knight (1996), Watty (2005), Lomas (2007), Srikanthan and Dalrymple (2003, 2005, 2007)
Sample statements: “Become more emotionally stable”; “Learning how to deal with emotions”; “Developing critical thinking”; “Acquiring job-related skills and knowledge”; “Become open-minded”; “Be independent thinkers”; “Learn to be responsible citizens”; “Learning how to learn”; “Conduct research”
4. Research methodology
The study used a mixed methods approach, adhering to a pragmatic research philosophy (Morgan, 2007). The first phase of the research undertook to identify service quality attributes and ensure content validity of the scales developed. Content validity is usually ensured through subjective expert reviews and pilot testing (Hair et al., 2010) and by a qualitative phase such as focus group discussions (Narang, 2012). In line with related studies (Telford and Masson, 2005; O’Neill and Palmer, 2004; Angell et al., 2008), qualitative research methods were used to identify service quality dimensions in the context of higher education. In-depth interviews were conducted with
a sample of five academics and educational leaders. Three focus group discussions
comprising seven students each were carried out. The qualitative data obtained were
analysed using a thematic analysis technique. Moreover, the service quality attributes
were also identified based on an extensive review of the extant literature that ensured
high content validity (Sureshchandar et al., 2001, Wong et al., 2012). The findings were
used to build the conceptual model.
The second phase consisted of quantitative research methods. A questionnaire was
developed based on the items identified and a pilot study was conducted. The
questionnaire was refined, and a survey was then conducted among a larger sample of
students at the University of Mauritius. A five-point Likert scale was used, and a
performance-only operationalisation of service quality was adopted. The sample size for
the survey was largely determined by the requirements of EFA. In that respect, Hair et al. (2006) point out that a satisfactory sample size would have a ratio of at least 10 observations per variable. Accordingly, it was estimated that a satisfactory sample size should consist of about 200 students. The questionnaire was personally administered to a total of 250 students using a non-probability sampling technique, and a total of 207 usable completed questionnaires were obtained. This approach has been used by the majority of studies in the field (Lagrosen et al., 2004; O’Neill and Palmer, 2004; Abdullah, 2006), given that the primary goal at this stage was to test the proposed scale rather than to generalise the findings to the population.
EFA allowed the identification of sub-dimensions of service quality with a view to
empirically testing the hierarchical model. Moreover, the quantitative phase of this
study also aimed at testing the reliability and validity of the scales. With respect to
reliability assessment, Cronbach’s alpha was used. Each of the individual scales was
thus tested for internal consistency. Cronbach’s alpha values greater than 0.70 were considered acceptable (Nunnally, 1978, cited in Hair et al., 2006). Construct validity of
the scales was tested through EFA, whereby only items having factor loadings greater
than 0.40 were retained (Hair et al., 2006).
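The two criteria just described (Cronbach's alpha above 0.70 and factor loadings above 0.40) can be sketched as follows. The data and loadings are synthetic, and the code is a minimal illustration rather than the authors' analysis scripts, which would typically use a statistics package.

```python
# Minimal sketch of the two psychometric checks (synthetic data).

def cronbach_alpha(rows):
    """Cronbach's alpha for rows of respondent item scores:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(rows[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in rows]) for i in range(k)]
    total_var = var([sum(row) for row in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

def retain_items(loadings, cutoff=0.40):
    """Keep items whose absolute factor loading exceeds the cut-off."""
    return [item for item, value in loadings.items() if abs(value) > cutoff]

scores = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]  # 4 respondents, 3 items
print(round(cronbach_alpha(scores), 2))  # 0.98 -> internally consistent scale

loadings = {"item_a": 0.83, "item_b": 0.68, "item_c": 0.35}  # hypothetical
print(retain_items(loadings))  # ['item_a', 'item_b']
```

Alpha is invariant to whether sample or population variances are used, provided the choice is consistent across items and totals.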

5. Analysis and discussion


5.1 Results of tests for determining appropriateness of data for EFA
For each set of observable variables, relevant tests were carried out so as to determine
whether the use of EFA was appropriate. For all five dimensions, from the visual
examination of the correlation matrix, it was observed that a considerable number of
correlations were above the cut-off point of 0.30. With respect to the anti-image matrix,
inspection showed that most of the partial correlations were small. Bartlett’s test of sphericity rejected (p < 0.05), for all five dimensions, the null hypothesis that the population correlation matrix is an identity matrix. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy had values greater than 0.80, ranging from 0.81 to 0.89, considered meritorious (Hair et al., 2006). Thus, all the tests provided strong support for the appropriateness of conducting an EFA with respect to the five sets of variables under investigation.
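These two adequacy checks can be sketched with NumPy. The equicorrelated matrix below is a toy example, not the study's data; the formulas are the standard ones for Bartlett's chi-square statistic and the KMO index built from partial (anti-image) correlations.

```python
import numpy as np

def bartlett_sphericity(R, n):
    """Bartlett's test statistic for H0: the population correlation matrix
    is an identity matrix; chi-square with p(p-1)/2 degrees of freedom."""
    p = R.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return chi2, df

def kmo(R):
    """Kaiser-Meyer-Olkin measure of sampling adequacy: squared
    correlations relative to squared correlations plus squared
    partial (anti-image) correlations."""
    inv_R = np.linalg.inv(R)
    scale = np.sqrt(np.outer(np.diag(inv_R), np.diag(inv_R)))
    partial = -inv_R / scale                # partial correlations
    off = ~np.eye(R.shape[0], dtype=bool)   # off-diagonal mask
    r2 = (R[off] ** 2).sum()
    q2 = (partial[off] ** 2).sum()
    return r2 / (r2 + q2)

# Toy correlation matrix: three items, all pairwise correlations 0.5
R = np.full((3, 3), 0.5)
np.fill_diagonal(R, 1.0)

chi2, df = bartlett_sphericity(R, n=200)
print(round(kmo(R), 3), round(chi2, 1), df)  # 0.692 136.7 3
```

A KMO near 0.69 would be considered only mediocre on Kaiser's scale; the study's values of 0.81 to 0.89 fall in the meritorious band.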
The five sets of variables were subjected to factor analysis corresponding to the following higher education service quality dimensions (Table II):
• administrative quality (7 variables);
• physical environment quality (12 variables);
• core educational quality (20 variables);
• support facilities quality (6 variables); and
• transformative quality (8 variables).

5.2 Exploratory factor analysis and reliability analysis for administrative quality
EFA suggested the existence of two factors with eigenvalues greater than 1, explaining 66.2 per cent of variance. As suggested by Hair et al. (2006), two criteria were used for item retention: first, removal of items loading highly on more than one factor, and second, retention of the items with the highest loading with respect to each factor. The first factor was termed “Attitude and Behaviour” and contained four items with factor loadings ranging from 0.68 to 0.83, well above the suggested cut-off point of 0.40 for the sample size (Hair et al., 2006). The second factor, “Administrative Processes”, relating to the non-human aspects of administration, had three items retained with factor loadings ranging from 0.83 to 0.85. Both scales were found to be reliable, with Cronbach’s alpha values of 0.72 and 0.87, respectively.
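The latent root (Kaiser) criterion used throughout this section can be sketched with NumPy. The two-block correlation matrix below is synthetic and only illustrates how factors with eigenvalues above 1 are counted and their explained variance computed; rotation and item retention are omitted.

```python
import numpy as np

def kaiser_extraction(R):
    """Latent root criterion: count eigenvalues of the correlation
    matrix greater than 1 and the share of variance they explain."""
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending order
    n_factors = int((eigvals > 1.0).sum())
    explained = eigvals[:n_factors].sum() / eigvals.sum()
    return n_factors, explained

# Synthetic correlation matrix for 7 items: two blocks (4 + 3 items),
# correlated 0.6 within a block and 0.1 across blocks.
R = np.full((7, 7), 0.1)
R[:4, :4] = 0.6
R[4:, 4:] = 0.6
np.fill_diagonal(R, 1.0)

n_factors, explained = kaiser_extraction(R)
print(n_factors, round(explained, 3))  # 2 0.714
```

Two eigenvalues exceed 1 here, mirroring the two-factor result reported for administrative quality, though with a different variance split.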

5.3 Exploratory factor analysis and reliability analysis for physical environment quality
Two items were deleted following the results of EFA and Cronbach’s alpha tests. The
final scale for physical environment quality consisted of three dimensions comprising a
total of ten quality attributes explaining 70.6 per cent of variance. These were support
infrastructure (four items), learning setting (three items) and general infrastructure
(three items). The various sub-scales showed appropriate reliability with Cronbach’s
alpha values all being above the 0.70 threshold value (Hair et al., 2006) for the three
scales (0.81, 0.72 and 0.71).

5.4 Exploratory factor analysis and reliability analysis for core educational quality
Four factors including 17 variables were extracted based on the latent root criterion and scree test, accounting for 60.4 per cent of total variance. From the rotated factor solution, it was observed that two items had loadings below the 0.40 cut-off point and were therefore omitted. Except for the dimension “Pedagogy”, all the sub-dimensions within the “Core Educational Quality” dimension had Cronbach’s alpha values greater than the threshold value of 0.70, thus demonstrating an acceptable level of reliability. The “Pedagogy” dimension was nevertheless retained, given that its alpha was still above 0.60, considered acceptable for new scale development (Churchill, 1979, cited in Clemes et al., 2013). The final factor structure for core educational quality
comprised four factors: “Attitude and Behaviour” (six items), “Curriculum” (four items),
“Pedagogy” (four items) and “Competence” (three items).
Table II. Results of exploratory factor analysis and reliability analysis
(For each factor: percentage of variance explained and eigenvalue in parentheses, Cronbach's alpha as α; item factor loadings in parentheses.)

Administrative quality
Attitude and behaviour (47.6%; 3.34), α = 0.775
Willingness of administrative staff members to help students (0.825)
Ability of administrative staff members to solve students’ problems (0.764)
Politeness of administrative staff (0.744)
Behaviour of administrative staff members imparting confidence in students (0.676)
Administrative processes (18.6%; 1.30), α = 0.801
Well standardised administrative processes so that there is not much bureaucracy and useless difficulties (0.845)
Administrative procedures are clear and well structured so that service delivery times are minimum (0.843)
Transparency of official procedures and regulations (0.829)

Physical environment quality
Support infrastructure (41.2%; 4.95), α = 0.811
Availability of adequate cafeteria infrastructure (0.926)
Availability of adequate library infrastructure (0.923)
Availability of adequate recreational infrastructure (0.813)
Availability of adequate sports infrastructure (0.795)
Learning setting (18.5%; 2.22), α = 0.719
Having adequate lecture rooms (0.857)
Having quiet places to study within campus (0.809)
Availability of adequate teaching tools and equipment (e.g. projectors, whiteboards) (0.781)
General infrastructure (10.9%; 1.31), α = 0.706
Favourable ambient conditions (ventilation, noise, odour, etc.) prevailing within the campus (0.906)
Safety on campus (0.842)
Appearance of buildings and grounds (0.824)

Core educational quality
Attitude and behaviour (40.2%; 7.19), α = 0.822
Lecturers understanding students’ needs (0.823)
Lecturers giving personal attention to students (0.791)
Availability of lecturers to guide and advise students (0.761)
Prevalence of a culture of sharing and collaboration among lecturers (0.746)
Behaviour of lecturers instilling confidence in students (0.675)
Lecturers appearing to have students’ best interest at heart (0.637)
Curriculum (11.3%; 1.99), α = 0.761
Clearly defined course content and course objectives (0.816)
Usefulness of module content and design to cater for the personal needs of students (0.810)
Challenging academic standards of programmes to ensure students’ overall development (0.808)
Relevance of course content to the future/current job of students (0.789)
Pedagogy (8.2%; 1.59), α = 0.625
Use of multimedia in teaching (e.g. use of overhead projector, PowerPoint presentations) (0.738)
Active participation of students in their learning process (0.718)
Provision of regular feedback to students with respect to their academic performance (0.713)
Well designed examinations and continuous assessment to promote the enhancement of knowledge and skills (0.685)
Competence (7.6%; 1.31), α = 0.789
Theoretical knowledge, qualifications and practical knowledge of lecturers (0.833)
Communication skills of lecturers (0.787)
Lecturers are up-to-date in their area of expertise (0.781)

Support facilities quality
Support facilities (62.2%; 3.73), α = 0.723
Reasonable pricing and quality of food and refreshments on campus (0.883)
Availability of adequate IT facilities (0.860)
Availability and adequacy of photocopy and printing facilities (0.829)
Availability of transport facilities (0.782)
Amount of opportunity for sports and recreational facilities (0.767)
Availability and adequacy of extra-curricular activities including those through clubs and societies (0.572)

Transformative quality
Transformative quality (54.9%; 4.39), α = 0.874
Enabling students to be emotionally stable (0.809)
Increase in self-confidence of students (0.796)
Development in students’ critical thinking (0.796)
Increase in self-awareness of students (0.791)
Development of problem-solving skills with respect to their field of study (0.748)
Enabling students to transcend their prejudices (0.714)
Acquiring adequate knowledge and skills to perform future job (0.672)
Increase in knowledge, abilities and skills of students (0.569)
5.5 Exploratory factor analysis and reliability analysis for support facilities
The results of the latent root criterion analysis and the scree test both provided sufficient indication that this dimension did not consist of sub-dimensions. The percentage of variance explained was 62.2 per cent. The EFA result suggested that the "Support Facilities" dimension comprised six items. Cronbach's alpha value was above 0.70.
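As a concrete illustration of the latent root (Kaiser) criterion and scree-test logic applied above, the following Python sketch counts the eigenvalues of an item correlation matrix that exceed 1. The data are simulated purely for illustration (a single underlying factor driving six items); they are not the study's survey responses, and the variable names are the author's own.

```python
# Sketch of the latent root criterion: retain as many factors as there
# are eigenvalues of the items' correlation matrix greater than 1.
# Simulated data only; not the study's actual survey responses.
import numpy as np

rng = np.random.default_rng(0)

# Simulate responses to six "support facilities" items driven by a
# single underlying factor, so one dominant eigenvalue is expected.
n_students, n_items = 300, 6
factor = rng.normal(size=(n_students, 1))
items = factor @ np.ones((1, n_items)) + rng.normal(scale=0.8, size=(n_students, n_items))

# Eigenvalues of the correlation matrix, sorted in descending order
# (these are also the values plotted in a scree test).
eigenvalues = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
n_factors = int(np.sum(eigenvalues > 1))  # latent root criterion
print(n_factors)  # a single retained factor suggests a unidimensional scale
```

A count of one retained factor here corresponds to the paper's conclusion that the support facilities dimension has no sub-dimensions.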
5.6 Exploratory factor analysis and reliability analysis for transformative quality
EFA revealed a unidimensional structure. All eight quality attributes (items) loaded significantly, with loadings ranging from 0.57 to 0.81. While the literature suggested that the
transformative quality concept might be broken down into two factors, namely,
empowerment and enhancement, the qualitative phase did not provide any evidence for
it. The result of the EFA lends credence to the latter, and therefore, transformative
quality is best measured using a unidimensional scale. The measurement scale for
transformative quality obtained a satisfactory reliability score, with a Cronbach’s alpha
value of 0.87, well above the 0.70 threshold.
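For readers unfamiliar with the reliability statistic used throughout this study, the following Python sketch computes Cronbach's alpha from a respondents-by-items score matrix. The data are simulated for illustration only and do not reproduce the study's responses; the function name is the author's own.

```python
# Minimal sketch of Cronbach's alpha, the internal-consistency
# statistic reported in this study (conventional threshold: 0.70).
# Simulated data only; not the study's actual item scores.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (respondents x items) matrix of scale scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
# Eight correlated items driven by one common factor plus noise,
# mimicking a unidimensional scale such as transformative quality.
common = rng.normal(size=(300, 1))
items = common + rng.normal(scale=0.7, size=(300, 8))

alpha = cronbach_alpha(items)
print(round(alpha, 2))  # comfortably above the 0.70 threshold
```

Alpha compares the sum of the individual item variances with the variance of the total scale score: the more strongly the items co-vary, the closer alpha approaches 1.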

5.7 Hierarchical model of higher educational service quality


To summarise, EFA suggested that three primary dimensions consisted of
sub-dimensions, namely, administrative quality, physical environment quality and core
educational quality. The administrative quality dimension was further divided into two
factors: “Attitude and Behaviour of Administrative Staff” and “Administrative
Processes”. Physical environment quality consisted of three sub-dimensions: support
infrastructure, learning setting and general infrastructure. For core educational quality,
the proposed sub-dimensions were attitude and behaviour, curriculum, competence and
pedagogy. The two other primary dimensions, support facilities quality and
transformative quality, were found to be unidimensional (Figure 1).

6. Limitations and future research


Despite the scientific approach adopted, the findings of this empirical study should still
be viewed in the light of a number of limitations that also provide opportunities for
further research. First, the data were collected from students in a developing country, namely, Mauritius. While Mauritius remains a very interesting case, given its highly diverse socio-cultural background and a tertiary education sector moving towards internationalisation, the findings cannot be generalised to other countries without caution. Previous research has shown that HESQUAL attributes are, to a large extent, similar across cultures, but differences still exist. The study could be replicated in other cultural settings, especially in developed countries, to see whether the same service quality dimensions emerge. Another limitation of this study is that it considered only the
perspective of students, taken as the main customers of higher educational service.
Future research may take into account the perspectives of other stakeholders of higher
education by adapting the model. Empirically testing the model among academic staff
in particular might be useful with a view to further confirming the validity of the model.

7. Key implications and conclusion


This study consisted of scale development procedures with a view to developing a valid
and reliable measuring instrument for assessing service quality in higher education
using a holistic and transformative approach. A comprehensive instrument was
developed while maintaining a high level of rigour at each stage of its construction.

[Figure 1. The HESQUAL model: Service Quality comprises five primary dimensions, namely Administrative Quality (sub-dimensions: Attitude and Behaviour; Administrative Processes), Support Facilities Quality, Core Educational Quality (sub-dimensions: Curriculum; Attitude and Behaviour; Competence; Pedagogy), Transformative Quality, and Physical Environment Quality (sub-dimensions: Support Infrastructure; Learning Setting; General Infrastructure).]

The final model was a hierarchical model, termed HESQUAL, consisting of five primary dimensions and nine sub-dimensions, with a total of 48 items. Moreover, as
discussed throughout the paper, a holistic approach was adopted whereby both the
functional and technical aspects of service quality were considered. The notion of
transformative quality has also been successfully integrated in the model and a valid
and reliable scale developed to measure this important and novel construct, which had
not been operationalised and used by previous studies.
As the phenomenon of internationalisation and marketisation of higher education
continues to evolve, higher education institutions are bound to constantly improve the
quality of service they provide. The instrument developed provides a useful tool for
university management in their quest for continuously improving service quality. The
level of service quality can be measured holistically and remedial actions taken. At a theoretical level, the findings of this study offer researchers additional insight into the fine detail of the concept of service quality as applied to the higher education context. Furthermore, replicating and testing the model in various higher education systems around the world would provide valuable knowledge on higher educational quality at both local and international levels.

About the authors


Viraiyan Teeroovengadum is currently a Lecturer in the Department of Management at the University of Mauritius. He completed his MPhil in 2013 and is presently on course to submit his doctoral thesis in the field of service quality in higher education. He holds a BA (Hons) in law and management and an MA in educational leadership and management, for which he obtained a best student award. His research and teaching interests are in service quality, quality management and research methodology in management. He has published in international conference proceedings and in journals such as the European Business Review and the International Journal of Learning. Viraiyan Teeroovengadum is the corresponding author and can be contacted at: viraiyan@gmail.com
T.J. Kamalanabhan is a Professor in the Department of Management at the Indian Institute of Technology, Madras. He holds a PhD and an MPhil in organizational psychology and a Post Graduate Diploma in Business Management, and has around 20 years of experience in academics and consultancy projects. He obtained a Fulbright Fellowship for research and teaching in the Department of International Business, Washington State University, Pullman, USA, in 2002, and was awarded a DAAD Fellowship under the German Academic Exchange Programme to conduct research in entrepreneurship at the Department of Management, University of West Saxony, Zwickau, Germany, in 1998. He is a regular visiting faculty at the Department of Management and Law, Multimedia University, Melaka, Malaysia; the Faculty of Management, Multimedia University, Cyberjaya, Malaysia; and the University College of Technology and Management Malaysia (KUTPM), Shah Alam, Selangor, Malaysia; and a visiting research fellow at KNU University, Daegu, South Korea.
Dr Ashley Keshwar Seebaluck is a Senior Lecturer of quality management at the Faculty of
Law and Management of the University of Mauritius. His research interests are in quality
management and business excellence. He holds a BA (Hons) in management science and
industrial system studies from Trinity College, University of Dublin, and an MBA from the
University of Mauritius. He also has a PhD from the University of Mauritius and was supervised
by Professor David Weir, former Director of Bradford Management Centre and Dean of Newcastle
Business School. His industrial experience was as an industrial engineer in a textile company, and he
was an assessor for the Mauritius National Quality Award competition on a number of occasions.
He is a member of the Emerald Literati Network and has published in the European Business
Review and the International Journal of Educational Management.
