Quality Assurance in Education (QAE), Vol. 24 No. 2
Measuring service quality in higher education: development of a hierarchical model (HESQUAL)
Viraiyan Teeroovengadum, Department of Management, University of Mauritius, Reduit, Mauritius
T.J. Kamalanabhan, Department of Management Studies, Indian Institute of Technology Madras, Chennai, India
Ashley Keshwar Seebaluck, Department of Management, University of Mauritius, Reduit, Mauritius
Received 9 June 2014; revised 5 January 2015; accepted 3 July 2015
Abstract
Purpose – This paper aims to develop and empirically test a hierarchical model for measuring service
quality in higher education.
Design/methodology/approach – The first phase of the study consisted of qualitative research
methods and a comprehensive literature review, which allowed the development of a conceptual model
comprising 53 service quality attributes. Quantitative methods were used for the second phase so as to
test the dimensionality of the measurement instrument and assess its validity and reliability. A sample
of 207 students was surveyed, and data were analysed using exploratory factor analysis and
Cronbach’s alpha test.
Findings – The results of the factor analysis revealed the presence of sub-dimensions. A hierarchical
model was therefore considered most appropriate. The final model consisted of five primary
dimensions, which are administrative quality, physical environment quality, core educational quality,
support facilities quality and transformative quality. The instrument contained a total of 48 items. Each
scale was satisfactorily tested for validity and reliability.
Research limitations/implications – This empirical study used data collected only in Mauritius, a developing country.
Practical implications – This study provides a framework and an instrument that higher education institutions can use to continuously improve educational quality.
Originality/value – The study adopted a holistic approach by considering both the functional and technical aspects of service quality in higher education. Moreover, technical quality was operationalised by considering the notion of transformative quality.
Keywords Quality management, Service quality, Higher education, Universities,
Hierarchical model, Transformative quality
Paper type Research paper
1. Introduction
(Lagrosen et al., 2004; Srikanthan and Dalrymple, 2007), who are considered their main customers (Telford and Masson, 2005), by providing high-quality education. Crucially, to
manage and improve the quality of services they provide, universities need to regularly
measure service quality (Abdullah, 2006; Kwek et al., 2010; Chong and Ahmed, 2012). In
the words of Nadiri et al. (2009, p. 525), “if service quality is to be improved, it must be
reliably assessed and measured”. Hence, a reliable and valid instrument for measuring
service quality of higher education from the students’ perspective is indispensable.
It remains a major challenge to develop an adequate model for measuring service
quality in higher education (Chong and Ahmed, 2012). Most relevant studies (Cuthbert,
1996; Soutar and McNeil, 1996; Pariseau and McDaniel, 1997; Arambewela and Hall,
2006; Wong et al., 2012) sought to adapt the prominent SERVQUAL model of
Parasuraman et al. (1988), while some studies went a step further and attempted to
identify service quality dimensions and attributes from the perspective of students
(LeBlanc and Nguyen, 1997; Ford et al., 1999; Hill et al., 2003; Lagrosen et al., 2004;
Abdullah, 2006). Arguably, only a few of these previous studies attempting to measure
service quality in higher education adopted what could be termed a holistic approach,
and to the best knowledge of the researchers, none of them integrated the notion of
transformative quality in the development of service quality models. A holistic
approach here refers to the consideration of both functional (process) and technical
(outcome) quality as advocated by Gronroos (1978, cited in Kang, 2006). On the other
hand, the transformative view of educational quality implies that technical (outcome)
quality is to be operationalised in terms of the widely accepted definition (Lomas, 2007;
Zachariah, 2007; Cheng, 2011) of educational quality imparted by Harvey and Green
(1993), that of “transformative quality”. In view of bridging this gap, this study sought
to develop a hierarchical model using both a holistic and transformative perspective,
thus seeking to make a useful contribution in the quest of finding appropriate
comprehensive models for measuring service quality in the higher education context.
Accordingly, the research objectives are:
• to identify service quality dimensions and sub-dimensions in higher education; and
• to test the measurement scales for reliability and validity.
2. Literature review
2.1 Operationalisation of service quality in higher education
The extant literature suggests that there are primarily two key issues to be addressed
when attempting to develop an instrument for measuring service quality: the
operationalisation of the service quality construct (Brady et al., 2002; Abdullah, 2006)
and the identification of appropriate service quality dimensions (Kang and James, 2004).
The most acknowledged and extensively used instrument for measuring service quality
is the SERVQUAL scale developed by Parasuraman et al. (1985, 1988) (Calvo-Porral
et al., 2013). As noted by Narang (2012, p. 359), the SERVQUAL scale and its adaptations
have been widely used in various services such as “banking, retail, wholesale, health and
education”.
Parasuraman et al. (1988, p. 15) describe service quality “as a form of attitude, related
but not equivalent to satisfaction, and results from comparison of expectations with
perceptions of performance”. Applied to the higher educational context, service quality
can thus be defined as “the difference between what a student expects to receive and
his/her perceptions of actual delivery” (O’Neill and Palmer, 2004, p. 42). This notion has
been extensively used in various studies aiming to measure service quality in higher
education (Cuthbert, 1996; Soutar and McNeil, 1996; Tan and Kek, 2004; Arambewela
and Hall, 2006; Barnes, 2007; Wong et al., 2012). Despite its wide acceptance and
popularity, this operationalisation of service quality “has been under extensive criticism”
(Trivellas and Dargenidou, 2009, p. 383). Cronin and Taylor (1992) proposed an
alternative, termed SERVPERF, an instrument focusing on the performance level of
possesses better psychometric properties (Brady et al., 2002). Moreover, it has also been
argued that in the context of higher education, the performance-only approach is more
suitable (Oldfield and Baron, 2000; Abdullah, 2006; Begum, 2009). This is mainly due to
problems related to attempting to capture the expectations of students. Researchers
have argued that students may not be able to have clear expectations of the higher
educational service (Joseph and Joseph, 1997; Ford et al., 1999; Angell et al., 2008). The
present study concurs with the latter view and therefore adopts a performance-only
operationalisation of service quality.
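The difference between the two operationalisations can be stated concretely. A minimal sketch follows (the Likert-scale numbers and variable names are invented for illustration; they are not data from this study):

```python
import numpy as np

# Hypothetical 5-point Likert ratings for one student across four attributes.
expectations = np.array([5, 4, 5, 4])  # what the student expected
perceptions = np.array([4, 4, 3, 5])   # what the student actually perceived

# SERVQUAL operationalises quality as the perception-minus-expectation gap.
servqual_score = (perceptions - expectations).mean()

# SERVPERF (performance-only) uses the perception ratings alone.
servperf_score = perceptions.mean()

print(servqual_score)  # -0.5: on average, delivery fell short of expectations
print(servperf_score)  # 4.0
```

The performance-only choice adopted here avoids having to elicit the expectations vector at all, which is precisely the problematic step when students lack clear expectations of the service.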
2.2 Higher educational service quality dimensions and the need for a holistic approach
As stated by Iacovidou et al. (2009), the literature suggests that quality in higher
education is not a unidimensional concept and is in fact best described as a set of
dimensions. A number of studies seeking to measure service quality in the context of
higher education (Cuthbert, 1996; Soutar and McNeil, 1996; Pariseau and McDaniel,
1997; Wong et al., 2012) adopted the five dimensions of the SERVQUAL model as
originally proposed by Parasuraman et al. (1988), namely responsiveness, assurance,
tangibles, empathy and reliability.
Instead of building on the generic SERVQUAL framework, there are a number of
studies that sought to identify higher educational service quality (HESQUAL)
dimensions by using an exploratory phase consisting of qualitative research methods.
One such pioneering study is that of Leblanc and Nguyen (1997) who identified 38
service quality attributes. These were grouped into seven dimensions: contact
personnel, reputation, physical evidence, administration, curriculum, responsiveness
and access to facilities. In a subsequent study conducted in three different countries, UK,
Australia and Sweden, Lagrosen et al. (2004) identified 11 factors comprising 31 items.
The service quality dimensions were corporate collaboration, information and
responsiveness, courses offered, campus facilities, teaching practices, internal
evaluations, external evaluations, computer facilities, collaboration and comparisons,
post-study factors and library resources. Similar studies were carried out in various
settings. All these studies were thoroughly examined so as to develop the conceptual
framework proposed in this present study.
An important issue to be considered is that most models, including the SERVQUAL
model, take into account only the element of functional quality and neglect the technical
quality aspect (Kang, 2006). The extant literature suggests that this is largely the case
in the higher education context as well, though there are a few notable exceptions, such
as Holdford and Reinders (2001), Chong and Ahmed (2012) and Clemes et al. (2013).
Such a holistic approach is adopted in this research.
3. Conceptual framework
The measurement/factor model has been developed based on a review of the extant
literature and qualitative data collection and analysis. Five primary dimensions of
HESQUAL have been identified from the extensive literature review conducted and
qualitative data collection in the form of interviews and focus groups with students and
academics. These are administrative quality, physical environment quality, core
educational quality, support facilities quality and transformative quality.
The dimensions identified from the extensive review of the literature were found
consistent with the findings from the in-depth interviews and focus group interviews
with academics and students. This was expected since, as demonstrated by Ford et al.
(1999), service quality attributes tend to be basically the same across cultures. Table I
provides an account of some examples of statements from the interviewees. The
appropriateness of a hierarchical model was also further strengthened. The next phase
of the research was to test for potential sub-dimensions through exploratory factor
analysis (EFA) and also test for the validity and reliability of the measurement scales.
Table I. Dimensions identified, literature sources and sample statements with respect to service quality dimensions

Administrative quality
Literature sources: Leblanc and Nguyen (1997), Kwan and Ng (1999), Oldfield and Baron (2000), Holdford and Reinders (2001), Joseph et al. (2005), Abdullah (2006), Zachariah (2007), Trivellas and Dargenidou (2009), Kwek et al. (2010), Sultan and Wong (2011), Narang (2012)
Sample statements: “There should not be much bureaucracy”; “Procedures should be clearly stated”; “Good communication between administration and students”; “Appropriate system in place”; “Administrative staffs should always be willing to help students out”; “Administrative staff should be good to students”

Physical environment quality
Literature sources: Cuthbert (1996), Owlia and Aspinwall (1996), Soutar and McNeil (1996), Leblanc and Nguyen (1997), Joseph and Joseph (1997), Kwan and Ng (1999), Hill et al. (2003), Lagrosen et al. (2004), O’Neill and Palmer (2004), Sohail and Shaikh (2004), Telford and Masson (2005), Angell et al. (2008), Narang (2012), Wong et al. (2012)
Sample statements: “Having a peaceful environment”; “The campus should have good looking infrastructure”; “Physical resources like library”; “A nice sports complex is important”; “Clean toilets”; “Lecture rooms must be well equipped”; “High level of security on campus”

Core educational quality
Literature sources: Owlia and Aspinwall (1996), Leblanc and Nguyen (1997), Kwan and Ng (1999), Holdford and Reinders (2001), Telford and Masson (2005), Abdullah (2006), Zachariah (2007), Trivellas and Dargenidou (2009), Kwek et al. (2010), Shekarchizadeh et al. (2011), Sultan and Wong (2011), Narang (2012)
Sample statements: “Having academics who are smart”; “Knowledge of academics”; “Ability of lecturers to transmit enthusiasm”; “Up-to-date in their field”; “Faculty members are approachable”; “Using different teaching styles”; “Makes subject look easy”; “Encourage participation of students in their learning process”; “Courses being offered”; “A culture of sharing and collaboration”; “Faculty members are willing to help”; “Expertise of faculty”; “Course designed based in requirements of the students”; “Lecturers should have some industry experience”; “Having lot of interaction with students”; “Research work of academics”; “Amount of published materials”; “Relevant curriculum content”

Support facilities quality
Literature sources: Leblanc and Nguyen (1997), Kwan and Ng (1999), Hill et al. (2003), Lagrosen et al. (2004), Sohail and Shaikh (2004), Tan and Kek (2004), Joseph et al. (2005), Kwek et al. (2010), Sultan and Wong (2011)
Sample statements: “Good transport facilities”; “Adequate cafeteria”; “Opportunities for extra-curricular activities”; “Computer laboratories available”; “Photocopy and printing services”

Transformative quality
Literature sources: Harvey and Green (1993), Harvey and Knight (1996), Watty (2005), Lomas (2007), Srikanthan and Dalrymple (2003, 2005, 2007)
Sample statements: “Become more emotionally stable”; “Learning how to deal with emotions”; “Developing critical thinking”; “Acquiring job-related skills and knowledge”; “Become open-minded”; “Be independent thinkers”; “Learn to be responsible citizens”; “Learning how to learn”; “Conduct research”
4. Research methodology
The study used a mixed methods approach, adhering to a pragmatic research
philosophy (Morgan, 2007). The first phase of the research undertook to identify service
quality attributes and ensure content validity of the scales developed. Content validity is
usually ensured through subjective expert reviews and pilot testing (Hair et al., 2010)
and by a qualitative phase such as focus group discussions (Narang, 2012). In line with
related studies (Telford and Masson, 2005; O’Neill and Palmer, 2004; Angell et al., 2008),
5.2 Exploratory factor analysis and reliability analysis for administrative quality
EFA suggested the existence of two factors with eigenvalues greater than 1, explaining
66.2 per cent of variance. As suggested by Hair et al. (2006), two criteria were used for
item retention: first, removal of items loading highly on more than one factor; and
second, retention of the items with the highest loading on each factor. The first factor
was termed “Attitude and Behaviour” and contained four items with factor loadings
ranging from 0.68 to 0.83, well above the suggested cut-off point of 0.40 for the sample
size (Hair et al., 2006). The second factor, “Administrative Processes”, related to the
non-human aspects of administration and had three items retained, with factor loadings
ranging from 0.83 to 0.85. Both scales were found to be reliable, with Cronbach’s alpha
values of 0.72 and 0.87, respectively.
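Cronbach’s alpha, used throughout Section 5 to assess scale reliability, can be computed directly from an item-score matrix. A minimal sketch follows (the formula is the standard one; the score matrix below is invented for illustration and is not data from this study):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of summed scale)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)     # sample variance per item
    total_variance = items.sum(axis=1).var(ddof=1) # variance of the scale total
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Perfectly consistent items (each respondent rates every item identically)
# give the maximum alpha of 1.
scores = np.array([[4, 4, 4],
                   [2, 2, 2],
                   [5, 5, 5],
                   [3, 3, 3]])
print(round(cronbach_alpha(scores), 6))  # 1.0
```

Real response data with measurement error yields values below 1, and the 0.70 threshold cited in the paper (Hair et al., 2006) is then applied to the computed value.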
5.3 Exploratory factor analysis and reliability analysis for physical environment quality
Two items were deleted following the results of EFA and Cronbach’s alpha tests. The
final scale for physical environment quality consisted of three dimensions comprising a
total of ten quality attributes explaining 70.6 per cent of variance. These were support
infrastructure (four items), learning setting (three items) and general infrastructure
(three items). The various sub-scales showed appropriate reliability with Cronbach’s
alpha values all being above the 0.70 threshold value (Hair et al., 2006) for the three
scales (0.81, 0.72 and 0.71).
5.4 Exploratory factor analysis and reliability analysis for core educational quality
Four factors comprising 17 variables were extracted based on the latent root criterion
and scree test, accounting for 60.4 per cent of total variance. From the rotated factor
solution, it was observed that two items had loadings below the 0.40 cut-off point
and were therefore omitted. Except for the dimension “Pedagogy”, all the
sub-dimensions within the “Core Educational Quality” dimension had Cronbach’s alpha
values greater than the threshold value of 0.70, thus demonstrating an acceptable level
of reliability. The pedagogy dimension was nevertheless retained, given that its alpha
value was still above 0.60, which is considered acceptable for new scale development
(Churchill, 1979, cited in Clemes et al., 2013). The final factor structure for core
educational quality comprised four factors: “Attitude and Behaviour” (six items),
“Curriculum” (four items), “Pedagogy” (four items) and “Competence” (three items).
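The latent root (Kaiser) criterion and variance-explained figures reported in these sections can be reproduced mechanically from a correlation matrix. A sketch on simulated data (the two-factor item structure and the loadings below are invented for illustration; only the sample size of 207 mirrors the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 207  # sample size, mirroring the study

# Simulate responses: two blocks of three items, each block driven by its
# own latent factor plus noise, mimicking a sub-dimension structure.
f1, f2 = rng.normal(size=(2, n))
noise = lambda: 0.5 * rng.normal(size=n)
data = np.column_stack([f1 + noise(), f1 + noise(), f1 + noise(),
                        f2 + noise(), f2 + noise(), f2 + noise()])

# Latent root criterion: retain factors whose correlation-matrix eigenvalue
# exceeds 1; eigenvalue / n_items is that factor's share of total variance.
eigenvalues = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
retained = eigenvalues[eigenvalues > 1]
print(len(retained))  # 2: the two simulated factors are recovered
print(round(retained.sum() / data.shape[1] * 100, 1))  # % of variance explained
```

In practice a rotated factor solution (as used in the paper) is then inspected for the 0.40 loading cut-off; the eigenvalue step above only decides how many factors to keep.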
Table (fragment). Factors (% of variance explained; eigenvalue), service quality attributes, factor loadings and Cronbach’s alpha:

Administrative quality
Attitude and behaviour (47.6%; 3.34), α = 0.775:
• Willingness of administrative staff members to help students (loading 0.825)
• Ability of administrative staff members to solve students’ problems (loading 0.764)

Curriculum (11.3%; 1.99), α = 0.761:
• Clearly defined course content and course objectives (loading 0.816)
• Usefulness of module content and design to cater for the personal needs of students (loading 0.810)
• Challenging academic standards of … (loading 0.808)
EFA revealed a unidimensional structure. All eight quality attributes (items) loaded
significantly, with loadings ranging from 0.57 to 0.81. While the literature suggested that the
transformative quality concept might be broken down into two factors, namely,
empowerment and enhancement, the qualitative phase did not provide any evidence for
it. The result of the EFA lends credence to the latter, and therefore, transformative
quality is best measured using a unidimensional scale. The measurement scale for
transformative quality obtained a satisfactory reliability score, with a Cronbach’s alpha
value of 0.87, well above the 0.70 threshold.
Figure 1. The HESQUAL model, in which overall service quality comprises five primary dimensions: Administrative Quality (sub-dimensions: Attitude and Behaviour; Administrative Processes), Core Educational Quality (Attitude and Behaviour; Curriculum; Pedagogy; Competence), Physical Environment Quality (Support Infrastructure; Learning Setting; General Infrastructure), Support Facilities Quality and Transformative Quality.
The final model was a hierarchical model termed HESQUAL, consisting of five primary
dimensions and nine sub-dimensions and including a total of 48 items. Moreover, as
discussed throughout the paper, a holistic approach was adopted whereby both the
functional and technical aspects of service quality were considered. The notion of
transformative quality has also been successfully integrated in the model and a valid
and reliable scale developed to measure this important and novel construct, which had
not been operationalised and used by previous studies.
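For readers who want to manipulate the reported structure programmatically, the dimension hierarchy summarised above can be encoded as a plain mapping (a hypothetical encoding for illustration, not an artefact of the study; dimensions without reported sub-dimensions map to an empty list):

```python
# Primary dimensions -> sub-dimensions of the final HESQUAL model.
HESQUAL = {
    "Administrative Quality": ["Attitude and Behaviour", "Administrative Processes"],
    "Physical Environment Quality": ["Support Infrastructure", "Learning Setting",
                                     "General Infrastructure"],
    "Core Educational Quality": ["Attitude and Behaviour", "Curriculum",
                                 "Pedagogy", "Competence"],
    "Support Facilities Quality": [],
    "Transformative Quality": [],
}

print(len(HESQUAL))                              # 5 primary dimensions
print(sum(len(s) for s in HESQUAL.values()))     # 9 sub-dimensions
```

The counts match the paper's summary of five primary dimensions and nine sub-dimensions.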
As the phenomenon of internationalisation and marketisation of higher education
continues to evolve, higher education institutions are bound to constantly improve the
quality of service they provide. The instrument developed provides a useful tool for
university management in their quest for continuously improving service quality. The
level of service quality can be measured in a holistic way and remedial actions taken. At
a theoretical level, the findings of this study offer additional insight to researchers in
understanding the fine detail of the concept of service quality when applied to the higher
education context. Furthermore, potential replication and testing of the model in various
higher education systems around the world would provide valuable knowledge with
respect to higher educational quality at both the local and international levels.
References
Abdullah, F. (2006), “The development of HEdPERF: a new measuring instrument of service quality for the higher education sector”, International Journal of Consumer Studies, Vol. 30 No. 6, pp. 569-581.
Angell, R.J., Heffernan, T.W. and Megicks, P. (2008), “Service quality in postgraduate education”, Quality Assurance in Education, Vol. 16 No. 3, pp. 236-254.
Arambewela, R. and Hall, J. (2006), “A comparative analysis of international education
Iacovidou, M., Gibbs, P. and Zopiatis, A. (2009), “An exploratory use of the stakeholder approach
to defining and measuring quality: the case of a Cypriot higher education”, Quality in
Higher Education, Vol. 15 No. 2, pp. 37-41.
Jackson, M.J., Helms, M.M. and Ahmadi, M. (2011), “Quality as a gap analysis of college students’
expectations”, Quality Assurance in Education, Vol. 19 No. 4, pp. 392-412.
Joseph, M. and Joseph, B. (1997), “Service quality in education: a student perspective”, Quality
Assurance in Education, Vol. 5 No. 1, pp. 15-21.
Joseph, M., Yakhou, M. and Stone, G. (2005), “An educational institution’s quest for service quality:
customers’ perspective”, Quality Assurance in Education, Vol. 13 No. 1, pp. 66-82.
Kalayci, N., Watty, K. and Hayirsever, F. (2012), “Perceptions of quality in higher education: a
comparative study of Turkish and Australian business academics”, Quality in Higher
Education, Vol. 18 No. 2, pp. 37-41.
Kang, G.D. (2006), “The hierarchical structure of service quality: integration of technical and
functional quality”, Managing Service Quality, Vol. 16 No. 1, pp. 37-50.
Kang, G.D. and James, J. (2004), “Service quality dimensions: an examination of Grönroos’s service
quality model”, Managing Service Quality, Vol. 14 No. 4, pp. 266-277.
Kwan, P.Y.K. and Ng, P.W.K. (1999), “Quality indicators in higher education – comparing Hong
Kong and China’s students”, Managerial Auditing Journal, Vol. 14 No. 1, pp. 20-27.
Kwek, L.C., Lau, T.C. and Tan, H.P. (2010), “Education quality process model and its influence on
students’ perceived service quality”, International Journal of Business and Management,
Vol. 5 No. 8, pp. 154-165.
Lagrosen, S., Seyyed-Hashemi, R. and Leitner, M. (2004), “Examination of the dimensions of
quality in higher education”, Quality Assurance in Education, Vol. 12 No. 2, pp. 61-69.
LeBlanc, G. and Nguyen, N. (1997), “Searching for excellence in business education: an exploratory
study of customer impressions of service quality”, International Journal of Educational
Management, Vol. 11 No. 2, pp. 72-79.
Lomas, L. (2007), “Zen, motorcycle maintenance and quality in higher education”, Quality
Assurance in Education, Vol. 15 No. 4, pp. 402-412.
Morgan, D.L. (2007), “Qualitative and quantitative methods”, Journal of Mixed Methods Research,
Vol. 1 No. 1, pp. 48-76.
Nadiri, H., Kandampully, J. and Hussain, K. (2009), “Students’ perceptions of service quality in
higher education”, Total Quality Management & Business Excellence, Vol. 20 No. 5,
pp. 523-535.
Narang, R. (2012), “How do management students perceive the quality of education in public
institutions?”, Quality Assurance in Education, Vol. 20 No. 4, pp. 357-371.
Oldfield, B.M. and Baron, S. (2000), “Student perceptions of service quality in a UK university
business and management faculty”, Quality Assurance in Education, Vol. 8 No. 2,
pp. 85-95.
O’Neill, M.A. and Palmer, A. (2004), “Importance-performance analysis: a useful tool for directing continuous quality improvement in higher education”, Quality Assurance in Education, Vol. 12 No. 1, pp. 39-52.
Owlia, M.S. and Aspinwall, E.M. (1996), “A framework for the dimensions of quality in higher education”, Quality Assurance in Education, Vol. 4 No. 2, pp. 12-20.
Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1985), “A conceptual model of service quality and its implications for future research”, Journal of Marketing, Vol. 49 No. 4, pp. 41-50.
Parasuraman, A., Zeithaml, V. and Berry, L. (1988), “SERVQUAL: a multiple-item scale for
measuring consumer perceptions of service quality”, Journal of Retailing, Vol. 64 No. 1,
pp. 12-40.
Pariseau, S.E. and McDaniel, J.R. (1997), “Assessing service quality in schools of business”,
International Journal of Quality & Reliability Management, Vol. 14 No. 3, pp. 204-218.
Quinn, A., Lemay, G., Larsen, P. and Johnson, D.M. (2009), “Service quality in higher education”,
Total Quality Management, Vol. 20 No. 2, pp. 139-152.
Shekarchizadeh, A., Rasli, A. and Hon-Tat, H. (2011), “SERVQUAL in Malaysian universities:
perspectives of international students”, Business Process Management Journal, Vol. 17
No. 1, pp. 67-81.
Sohail, M.S. and Shaikh, N.M. (2004), “Quest for excellence in business education: a study of
student impressions of service quality”, International Journal of Educational Management,
Vol. 18 No. 1, pp. 58-65.
Soutar, G. and McNeil, M. (1996), “Measuring service quality in a tertiary institution”, Journal of
Educational Administration, Vol. 34 No. 1, pp. 72-82.
Srikanthan, G. and Dalrymple, J. (2003), “Developing alternative perspectives for quality in higher
education”, International Journal of Educational Management, Vol. 17 No. 3, pp. 126-136.
Srikanthan, G. and Dalrymple, J. (2005), “Implementation of a holistic model for quality in higher
education”, Quality in Higher Education, Vol. 11 No. 1, pp. 69-81.
Srikanthan, G. and Dalrymple, J.F. (2007), “A conceptual overview of a holistic model for quality
in higher education”, International Journal of Educational Management, Vol. 21 No. 3,
pp. 173-193.
Sultan, P. and Wong, H. (2010), “Performance-based service quality model: an empirical study on
Japanese universities”, Quality Assurance in Education, Vol. 18 No. 2, pp. 126-143.
Sultan, P. and Wong, H.Y. (2011), “Service quality in a higher education context: antecedents and
dimensions”, International Review of Business Research Papers, Vol. 7 No. 2, pp. 11-20.
Sureshchandar, G.S., Rajendran, C. and Anantharaman, R.N. (2001), “A holistic model for total
quality service”, International Journal of Service Industry Management, Vol. 12 No. 4,
pp. 378-412.
Tan, K.C. and Kek, S.W. (2004), “Service quality in Higher Education using an enhanced
SERVQUAL approach”, Quality in Higher Education, Vol. 10 No. 1, pp. 17-24.
Teas, R. (1993), “Expectations, performance evaluation, and customers’ perceptions of quality”,
Journal of Marketing, Vol. 57 No. 4, pp. 18-34.
Telford, R. and Masson, R. (2005), “The congruence of quality values in higher education”, Quality
Assurance in Education, Vol. 13 No. 2, pp. 107-119.
Trivellas, P. and Dargenidou, D. (2009), “Organisational culture, job satisfaction and higher
education service quality: the case of Technological Educational Institute of Larissa”, The
TQM Journal, Vol. 21 No. 4, pp. 382-399.
Watty, K. (2005), “Quality in accounting education: what say the academics?”, Quality Assurance in Education, Vol. 13 No. 2, pp. 120-131.
Wong, K., Tunku, U. and Rahman, A. (2012), “Constructing a survey questionnaire to collect data
on service quality of business academics”, European Journal of Social Sciences, Vol. 29
No. 2, pp. 209-221.
Zachariah, S. (2007), “Managing quality in higher education: a stakeholder perspective”, PhD thesis, University of Leicester.
Further reading
Brady, M.K. and Cronin, J.J. (2001), “Some new thoughts on conceptualizing perceived service quality: a hierarchical approach”, Journal of Marketing, Vol. 65 No. 3, pp. 34-49.