Systematic review of research instruction methods: the case of business instruction
Ann Manning Fiegen
California State University San Marcos, San Marcos, California, USA
Received 12 April 2010; revised 4 May 2010; accepted 7 May 2010
Abstract
Purpose – The purpose of this paper is to assess the body of business instruction literature by
academic librarians against evolving models for evidence-based research.
Design/methodology/approach – The paper used systematic review and inter-rater reliability of
the literature of business information research instruction to test two attributes of research quality:
the EBL levels of evidence and the EBLIP critical appraisal checklist.
Findings – Intervention questions and case studies are the most popular research methods on the
EBL levels of evidence scale. The majority of articles score below 75 percent on the EBLIP critical appraisal
checklist. Prediction questions are represented by higher levels of evidence and study quality.
Intervention questions paired with the cohort design and exploratory questions paired with survey
design indicate strong areas of research quality. The case study method, while most popular, shows
lower scores across all question types yet revealed some high-quality benchmark examples.
Research limitations/implications – Error is possible when distinguishing between cohort and
case study – some articles may fall into one or the other study design. Rater training was conducted
only once, and best practices for inter-rater reliability recommend multiple rounds to achieve higher
rater agreement.
Practical implications – Recommendations are presented for ways to improve the evidence base of
research articles and suggest areas for professional development opportunities for librarian
researchers wishing to increase the quality of research publications.
Originality/value – The paper goes beyond the narrative review of the literature of business
instruction to measure the research methods employed in those publications against two
evidence-based standards. The results will show where the literature stands as a maturing
discipline and provide recommendations for increasing the levels of evidence for future research.
Keywords Academic libraries, Research methods
Paper type Literature review
Introduction
Evidence-based practice advocates that academic librarians look to the published
literature to find reliable and valid studies as guidance. To inform and guide this
researcher’s practice, an effort was undertaken to locate studies that could offer that
guidance for information literacy instruction for business students. The result of that
search is that business librarians are prolific authors of business information research
This study was funded in part by a BRASS Emerald Research Award 2009 and a California State University San Marcos Faculty Research Grant. The author wishes to acknowledge the participation of Martha Cooney, Cheryl Delson, Nancy Dewald, Patrick Ragains, Frank Vuotto, and Diana Wu. Portions of this report were presented at the California Academic and Research Libraries Conference 2010.
Reference Services Review, Vol. 38 No. 3, 2010, pp. 385-397. © Emerald Group Publishing Limited, 0090-7324. DOI 10.1108/00907321011070883
instruction, resulting in hundreds of published studies to read and apply to practice.
This wealth of information prompted this researcher to ask: What measure defines
high-quality evidence-based research that can reliably be applied from the many
studies published? What kind of research do business librarians undertake, and taken
in the aggregate do they suggest a maturing research methodology in the discipline?
And finally, against what standards can this body of literature be measured?
Before the content of the articles could reliably be applied to best practice, the
quality of the studies needed to be ascertained. The questions explored in this report
form a portion of a larger study that analyzes the content of the literature along the
dimensions of study objectives and results, institution setting, sample population, and
pedagogy (theorist, standard, or model) employed. The focus of this report is only on
the research methods employed. Systematic review offers a model for summarizing
and critiquing the literature to improve future practice and possibly encourage higher
levels of research methods. A systematic literature review of 30 years should reveal
evidence toward a maturing research methodology.
Academic librarians have applied theory to practice and documented improvement
efforts through the published literature of business instruction. The case study is a
popular method for describing best practices and well suited for the action research
needed by librarians. Case studies, however, are a marker for a young discipline and
considered by some not to be as rigorous as compared to higher levels of research. How
then can this discipline strive for higher levels of evidence?
The objective of this study is to assess the body of business instruction literature in
academic libraries against evolving models for evidence-based research. The results
should indicate opportunities for higher levels of research methodology in keeping with
a maturing discipline. Eldredge (2006) challenged librarians to apply and test his
proposed evidence-based librarianship levels of evidence model he applied initially to
research studies in medical librarianship. The question is whether the same can be said
for the literature of instruction in business information research. Where on Eldredge’s
proposed matrix is the state of research in this discipline, what if any are the implications
for future research, and what is the case for business instruction literature?
Academic librarians are introduced to research methods through graduate
education and must continue to learn the process essentially independently. The
expertise is gained from continuing education, professional development, and
mentoring relationships and using library collections on research methods. Guidelines
for evidence-based research can be used as a professional development tool to guide the
new researcher and as an instrument to assess the quality of existing research. The
medical field has led this effort by developing guidelines that help the researcher
assess the quality of existing research. Examples of “critical appraisal tools” can be
accessed from the International Centre for Allied Health Evidence (2009) web site
among others. Glynn (2006) studied many models to arrive at her instrument, the
EBLIP critical appraisal tool for library research. This study will show how that
instrument can be applied to the critical appraisal of business instruction literature by
testing two attributes of research quality using the business information instruction
literature as a case analysis. A systematic review of the literature will test for levels of
evidence and evidence of research method quality:
H1. Library articles on this subject will overwhelmingly be exploratory case
studies and low on the Eldredge levels of evidence hierarchy.
H2. The majority of articles will score below 75 percent as measured by the EBLIP critical appraisal checklist.
Literature review
Commonly accepted social science research method definitions vary slightly depending
on the discipline and the author’s objectives. Widely followed for case analysis is
Yin (2003) who categorizes social science research into experimental, survey, archival
analysis, history, and case study. Gorman and Clayton's (2005) Handbook for
Information Professionals divided qualitative research methods into observational,
interviewing, group discussion, and historical study. Fink (2005) emphasized assessing
quality of research studies for literature reviews and categorized studies into either
experimental or quasi-experimental families. Cooper (2010) cautioned about error when
evaluating the quality of studies.
There are a number of studies that have analyzed and critiqued the preferred research
methods used by librarians. Specific examples include Watson-Boone (2000, p. 87) who
examined 24 articles from Journal of Academic Librarianship (JAL), grouped them into
six research methods, and ranked them by order of frequency of research method used:
survey research, action research, secondary data analysis, and case study, with
evaluation research and experimental tied for last. The typical JAL article emphasized
problem-solving and managerial issues, and therefore, the Watson-Boone categories
cannot be generalized to this study. Most recently, Hildreth and Aytac (2007)
summarized library practitioner articles from 2003 to 2005 into descriptive, exploratory,
explanatory, and evaluative research. A summative approach does not support the
benchmarking objective of this study.
Eldredge (2002) borrows from clinical medicine with the intent of applying his model
for evidence-based librarianship (EBL) levels of evidence to medical librarianship
research. The matrix approach associates three types of research questions (prediction
questions, intervention questions, and exploration questions) with ranked research
methods. The highest ranked level of evidence is systematic review, followed by
meta-analysis, summing up, prospective or retrospective cohort study, qualitative
studies, descriptive study or survey, and case study. He observes that typical library
research studies fall into the lower levels of descriptive survey, case study, and
qualitative methods, areas where error and author bias will more typically occur
(Eldredge, 2002, p. 294). Given (2006) continues a long-standing debate by arguing that
the levels of evidence introduced by Eldredge favor quantitative research methods and
that relegating qualitative research to the lowest level overlooks its appropriate place in
social science, including library science research.
The Eldredge model of 2002 is adopted here as it most closely supports the objectives of
this study.
Edwards (1994) reported on the percent of research articles to non-research articles
between 1971 and 1991 when reviewing the research of bibliographic instruction.
She ranked frequency of research method used among the research articles and
frequency of library instruction topic. While the Edwards’ study is not directly
comparable for the present research, results can confirm or deny a trend. Literature
reviews of library instruction are characterized by the exploratory question (what was
published) and the descriptive narrative review method (summary of trends and
annotation of entries for a given time period). Typical of this genre are Rader (1974, 2002)
and Johnson et al. (2007). Crawford and Feldt (2007) conducted a systematic analysis of
library instruction literature using citations from the ERIC database as their source.
Their analysis expands the narrative review by including explicit research objectives, statements of
study inclusion and exclusion, and analysis of the individual articles. Koufogiannakis (2006)
reported on a systematic review and meta-analysis of the most effective method for
teaching information literacy skills to undergraduate students and found that
computer-aided instruction was as effective as traditional instruction and that
traditional and self-directed instruction are more effective than no instruction. She
recommended further research be conducted in comparative and validated research
methods and suggested that additional replication be included in existing high-quality
studies.
Examples of narrative review articles that summarize the state of information literacy
specific to business students include Jacobson (1993) who summarized the literature of
best practices for business instruction from 1985 to 1992. Most published articles about
business instruction include literature reviews. Cooney (2005) surveyed business
instruction librarians at AACSB-accredited colleges to assess the extent of business
instruction in libraries, and Zhang et al. (2007) conducted a systematic review comparing
the effectiveness of face-to-face and computer-assisted instruction; both indicate a trend
toward more evidence-based information literacy research. This study will go beyond
the narrative review to measure the research methods employed against two standards:
the EBL levels of evidence model and the EBLIP critical appraisal checklist.
Methodology
Library and business education bibliographic databases were searched for English
language publications between 1980 and spring 2009. The database was initially created
in 2004 and the searches were repeated periodically through spring 2009. Databases searched included
EbscoHost Premier, Emerald Fulltext, ERIC, Library Literature and Information
Science, LISA, ABI/INFORM Global (ProQuest), and ISI's Web of Knowledge. Hand searches
of cited references in the primary literature were also conducted. File drawer bias is
outside the parameters of the study as this study’s objective was to research only the
published literature. The databases searched replicate those used by Johnson et al. (2007)
in their annual review article of library instruction with the addition of Emerald and ABI
Inform Global. The latter were included to expand the search to internationally published
reports in library science and business management education. Each index was
searched for the terms: library and business and (instruct* or literac* or assess* or
teach*) and (academic or higher education or college or university).
Bibliographic records and abstracts were scanned resulting in an initial set of
245 articles about library instruction for business students. Further review resulted in
69 articles as the initial set of articles for this study. Criteria for inclusion were as follows:
articles were authored or coauthored by a practicing academic librarian, the subject of
the study was instruction in business research in academic libraries, and the article
was in English and published between 1980 and 2009. Excluded were non-peer-reviewed
articles; articles appearing as columns; studies authored by business faculty, or by
faculty teaching in library and information science graduate programs but
not coauthored by a practicing librarian; and business education literature about
information competencies that did not explicitly refer to library instruction with a
librarian.
The bibliographic software EndNote was used to hold the data sets. Each article was
read, coded, and color classified according to one of Eldredge's (2002) three research
questions: prediction, intervention, or exploratory. Table I shows Eldredge's levels of
evidence matrix where each article is categorized into its corresponding question type
and research method of meta-analysis, summing up, prospective or retrospective cohort
study, qualitative studies, descriptive study or survey, and case study.
Eldredge (2004) distinguishes higher order levels of evidence from the lower levels by
their “distinct hypothesis and objectives, interventions, and measurable outcomes”.
The definitions in Eldredge’s (2004) inventory of research methods guided the
classification of the data set. Cohort studies used a defined population, indicated some
kind of intervention even if it only described a change from status quo, and had a
measurable outcome. Those studies were further identified as using either prospective
or retrospective data collection methods. Articles defined as case studies described
an experience. According to Eldredge (2004), case studies are distinguished by their
description and analysis of author’s experiences, have multiple sources of evidence, and
will answer how and why questions; refer to Booth and Brice (2004) and Eldredge (2004)
for more complete descriptions of definitions and study design.
The paired response inter-rater reliability method (Fink, 2005) was used to rate the
research quality of the articles against the Glynn (2006) EBLIP Critical Appraisal
Checklist. Six raters were selected to participate in the study, five of whom were authors
writing on the subject in the last ten years. One recent library and information science
(LIS) graduate with prior research methods experience was also invited to participate.
No rater was assigned to their own study, although one rater disclosed frequent
collaboration with the author of one of the assigned articles. It was not deemed enough of a conflict to
warrant exclusion. Permissions to use the Eldredge and Glynn instruments were
obtained from the authors as were publisher supplied reprints of the Glynn article for all
raters. Exempt status was submitted and granted by the university review board since
the subjects under study, the published articles, were not human subjects.
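The paired-rater scoring procedure can be sketched in code. This is an illustrative sketch only: the number of checklist items, the yes/no coding, and the rule of averaging the two paired raters' percentage scores are assumptions made for demonstration, not a reproduction of Glynn's actual instrument or this study's exact protocol.

```python
from statistics import mean

# Hypothetical checklist responses: True = "yes", False = "no/unclear".
# Item count and scoring rule are illustrative assumptions.
def checklist_score(answers):
    """Percent of checklist items satisfied for one rater."""
    return 100.0 * sum(answers) / len(answers)

def paired_score(rater_a, rater_b):
    """Average the two paired raters' percentage scores for one article."""
    return mean([checklist_score(rater_a), checklist_score(rater_b)])

rater_a = [True, True, False, True, True, False, True, True]
rater_b = [True, True, True, True, False, False, True, True]
score = paired_score(rater_a, rater_b)
print(f"article score: {score:.1f}%")        # article score: 75.0%
print("meets the 75 percent threshold?", score >= 75)
```

Under H2, an article scoring below the 75 percent threshold would count toward the hypothesized majority.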
A training packet was mailed to the raters that included instructions describing
the expectations and compensation for participation in the study, procedures for using
the rating instrument, a copy of the Glynn article and checklist, and a sample article to
rate that was not included as part of the study. Training sessions were scheduled and
conducted in summer of 2009. Raters were instructed to read the sample article, use the
checklist to rate the article independently by noting any comments or questions,
and return their rating sheet to the principal investigator prior to the scheduled WebEx
training session. At each of the three WebEx training sessions, each pair of raters and
(one systematic, two surveys, and three cases), a stronger showing in the top tier than
intervention articles. The descriptive survey design method (n = 8) is generally clustered
high with two studies in the top tier, five in the middle tier, and only four studies in the
lower tier. This validates the hypothesis that librarians are familiar and comfortable
with exploratory survey design methodology. Case studies on the other hand were
mixed with three high scores, two in the middle range, while three scored low, as did most
of the studies deemed ineligible.
Study limitations
Every effort was made to include all published business research instruction by
academic librarians in the initial search but some may have been missed. Owing to the
ambiguity in some of the articles, error is possible when distinguishing between cohort
and case study; some articles may fall into one or the other study design. Some raters,
while among the higher ranked authors, expressed a need for additional training that
time and resources did not allow. Rater training was conducted only once, and best
practices for inter-rater reliability recommend multiple rounds to achieve higher rater
agreement. The reliability of this method may increase with higher knowledge of
research methodology and multiple rounds of training to increase inter-rater reliability.
A kappa statistic was not computed for this study; rater agreement relied instead on simple majority.
Raters were not blind to the article authors; therefore, bias is possible. This researcher’s
own article and in another case a rater’s colleague were in the set; nevertheless, the
articles were included. Outside the scope of this research is an important group of
writers of business instruction in libraries. Articles that were solely authored by LIS
professors in graduate schools or business faculty and were not coauthored by
practicing librarians were not included in this data set. Although they contribute
important studies, inclusion of those reports would skew results for this study
population. Case studies of reports of practice are outside the scope of this study but
continue to serve an important function for advancing the practice of library
instruction for business students and were represented in the first large data set. Many
other studies appear as book chapters and outside the indexing and abstracting
services and therefore may have been missed.
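In future rounds, a chance-corrected agreement statistic could replace simple majority. The following is a minimal sketch of Cohen's kappa for two raters' per-item judgments; the "yes"/"no" coding and the sample data are hypothetical, not drawn from this study's rating sheets.

```python
from collections import Counter

# Minimal Cohen's kappa for two raters over the same items.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed proportion of items on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal proportions.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes"]
print(f"kappa = {cohens_kappa(a, b):.3f}")   # kappa = 0.467
```

A kappa near zero would indicate agreement no better than chance, so even raters with high raw percent agreement can show weak chance-corrected reliability.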
References
Booth, A. and Brice, A. (Eds) (2004), Evidence-based Practice for Information Professionals:
A Handbook, Facet, London.
Cooney, M. (2005), “Business information literacy instruction – a survey and progress report”,
Journal of Business and Finance Librarianship, Vol. 11 No. 1, pp. 3-25.
Cooper, H. (2010), Research Synthesis and Meta-analysis: A Step-by-step Approach, 4th ed., Sage,
Thousand Oaks, CA.
Crawford, G.A. and Feldt, J. (2007), “An analysis of the literature on instruction in academic
libraries”, Reference & User Services Quarterly, Vol. 46 No. 3, pp. 77-87.
Edwards, S. (1994), “Bibliographic instruction research: an analysis of the journal literature from
1977 to 1991”, Research Strategies, Vol. 12 No. 2, pp. 68-78.
Eldredge, J.D. (2002), “Evidence-based librarianship levels of evidence”, Hypothesis, Vol. 16 No. 3,
pp. 10-13.
Eldredge, J.D. (2004), “Inventory of research methods for librarianship and informatics”, Journal
of the Medical Library Association, Vol. 92 No. 1, pp. 83-90.
Eldredge, J.D. (2006), “Evidence-based librarianship: the EBL process”, Library Hi Tech, Vol. 24
No. 3, pp. 341-54.
Fink, A. (2005), Conducting Research Literature Reviews: From the Internet to Paper, 2nd ed.,
Sage, Thousand Oaks, CA.
Given, L. (2006), “Qualitative research in evidence-based practice: a valuable partnership”,
Library Hi Tech, Vol. 24 No. 3, pp. 376-86.
Glynn, L. (2006), “A critical appraisal tool for library and information research”, Library Hi Tech,
Vol. 24 No. 3, pp. 387-99.
Gorman, G.E. and Clayton, P. (2005), Qualitative Research for the Information Professional:
A Practical Handbook, 2nd ed., Facet, London.
Hildreth, C.R. and Aytac, S. (2007), “Recent library practitioner research: a methodological
analysis and critique”, available at: http://myweb.cwpost.liu.edu/childret/practitioner-
research.doc (accessed 15 April 2007).
International Centre for Allied Health Evidence (2009), “Critical Appraisal Tools”, available at:
www.unisa.edu.au/cahe/CAHECATS/ (accessed 29 March 2010).
Jacobson, T.E. (1993), “Another look at bibliographic instruction for business students”, Journal
of Business and Finance Librarianship, Vol. 1 No. 14, pp. 17-24.
Johnson, A.M., Jent, S. and Reynolds, L. (2007), “Library instruction and information literacy
2006”, Reference Services Review, Vol. 35 No. 4, pp. 584-640.
Koufogiannakis, D. (2006), “Effective methods for teaching information literacy skills to
undergraduate students: a systematic review and meta-analysis”, Evidence Based Library
and Information Practice, Vol. 1 No. 3, pp. 3-43.
Rader, H. (1974), “Library orientation and instruction – 1973: an annotated review of the
literature”, Reference Services Review, Vol. 2, pp. 91-3.
Rader, H. (2002), “Information literacy 1973-2002: a selected literature review”, Library Trends,
Vol. 51 No. 2, p. 242.
Watson-Boone, R. (2000), “Academic librarians as practitioner-researchers”, Journal of Academic
Librarianship, Vol. 26 No. 2, pp. 85-93.
Yin, R.K. (2003), Case Study Research: Design and Methods, 3rd ed., Sage, Thousand Oaks, CA.
Zhang, L., Watson, E.M. and Banfield, L. (2007), “The efficacy of computer-assisted instruction
versus face-to-face instruction in academic libraries: a systematic review”, The Journal of
Academic Librarianship, Vol. 33 No. 4, pp. 478-84.
Intervention – RCT
Diamond, T. and McGee, J.E. (1995), “Bibliographic instruction for business writing students:
implementation of a conceptual framework”, RQ: Research Quarterly, Vol. 34 No. 3, pp. 340-60.
Intervention – cohort
Fiegen, A.M., Cherry, B. and Watson, K. (2002), “Reflections on collaboration: learning outcomes
and information literacy assessment in the business curriculum”, Reference Services Review,
Vol. 30 No. 4, pp. 307-18.
Lombardo, S.V. and Miree, C.E. (2003), “Caught in the web: the impact of library instruction
on business students’ perceptions and use of print and online resources”, College & Research
Libraries, Vol. 64, January, pp. 6-24.
Intervention – case
Judd, V., Tims, B., Farrow, L. and Periatt, J. (2004), “Evaluation and assessment of a library
instruction component of an introduction to business course: a continuous process”, Reference
Services Review, Vol. 32 No. 3, pp. 274-83.
Exploratory – survey
Cooney, M. (2005), “Business information literacy instruction – a survey and progress report”,
Journal of Business and Finance Librarianship, Vol. 11 No. 1, pp. 3-25.
Dewald, N.H. (2003), “Anticipating library use by business students: the uses of a syllabus
study”, Research Strategies, Vol. 19 No. 1, pp. 33-45.