
International Journal of Operations & Production Management

Journal ranking analyses of operations management research


Charles G. Petersen, Gerald R. Aase, Daniel R. Heiser

Downloaded by MANAGEMENT AND SCIENCE UNIVERSITY At 20:01 17 March 2015 (PT)

Article information:
To cite this document:
Charles G. Petersen, Gerald R. Aase, Daniel R. Heiser, (2011), "Journal ranking analyses of operations management research", International Journal of Operations & Production Management, Vol. 31 Iss 4 pp. 405-422
Permanent link to this document:
http://dx.doi.org/10.1108/01443571111119533
Downloaded on: 17 March 2015, At: 20:01 (PT)
References: this document contains references to 34 other documents.




Journal ranking analyses of operations management research

Charles G. Petersen and Gerald R. Aase
Operations Management and Information Systems, College of Business,
Northern Illinois University, DeKalb, Illinois, USA, and

Daniel R. Heiser
Department of Management, College of Commerce, DePaul University,
Chicago, Illinois, USA

Received August 2008
Revised December 2009, April 2010, June 2010, July 2010
Accepted July 2010

Abstract

Purpose – Several published studies have ranked journals based on perceived quality according to operations management (OM) researchers. The purpose of this paper is to examine the ranking of journals for OM research using meta-analysis.
Design/methodology/approach – The study begins by using a meta-analysis approach to combine the results of five recent OM journal ranking studies. A new citation analysis using OM research articles published in International Journal of Operations & Production Management, Journal of Operations Management, and Production and Operations Management between 1999 and 2005 is then presented.
Findings – Results of the meta-analysis and the citation analysis have many similarities, but there are some striking differences suggesting the evolution of OM research away from operations research and engineering. The results also illustrate the diversity of OM research, ranging from analytical modeling to empirical studies influenced by other business disciplines and the behavioral sciences.
Originality/value – This is the first time meta-analysis has been used to examine the ranking of journals for OM research. This research also provides the first current citation analysis of the OM field in over a decade. While the results of these two analyses show some similarities, with many of the same journals, there are some marked differences that may support the notion of an evolving OM field.
Keywords Operations management, Serials, Operational research
Paper type Research paper

Introduction
The field of operations management (OM) has been evolving for many years and may broadly be construed as incorporating supply chain management, quality management, product and process design, project management, and other topical areas. OM is defined as the design, operation, and improvement of the systems that create and deliver the firm's primary products and services (Chase et al., 2006). Given the breadth of the discipline and the overlap with other fields of inquiry, it is often difficult to distinguish between OM research and research related to the fields of general management (GM), industrial engineering (IE), and management information systems (MIS). This is particularly true of the overlapping but distinct field of operations research/management science (OR/MS), which is defined as the application of quantitative methods to decision making in all fields (Chase et al., 2006). As a logical consequence, there are many viewpoints about what constitutes high-quality OM research. To help identify quality OM research, several studies have examined journal rankings according to OM researchers.


There are many reasons for our interest in evaluating the quality of journals (Coe and Weinstock, 1984; Barman et al., 2001) including:
. Promotion and tenure decisions of faculty.
. Faculty merit evaluation.
. Authors determining appropriate outlets for their work.
. Editors seeking feedback on their article selection process.
. Faculty and graduate students monitoring research topics and selecting a research agenda.
. Library administrators making journal acquisition decisions.
. Practitioners keeping current with major advancements in their area of interest.
The next section examines recent journal ranking studies focused on the OM field. This review focuses on identifying the survey respondents or citation analysis base journals (i.e. the basis for each study), the measure of journal ranking used by each study (e.g. quality, influence, or relevance), and the degree to which authors distinguish the OM and OR/MS fields. We then present a meta-analysis of five recent journal ranking studies focused on the OM field. This technique allows us to develop a composite ranking of journals that mitigates inherent biases found in all studies of this nature. We then examine OM journals using a new citation analysis based on articles published in three highly regarded OM journals. Results of the two analyses are then compared and discussed.
Literature review
Opinion surveys and citation analyses are well-established methodologies for
evaluating journals within the various business fields such as finance (Oltheten et al.,
2005), information systems (Peffers and Ya, 2003; Katerattanakul et al., 2003;
Barnes, 2005), management (Podsakoff et al., 2005), international business (DuBois and
Reeb, 2000), and marketing (Koojaroenprasit et al., 1998; Baumgartner and Pieters, 2003;
Guidry et al., 2004). This section examines similar studies focused on the OM field.
The first published studies in the OM field are Saladin (1985) and Barman et al. (1991).
Saladin surveyed members of the Operations Management Association by asking them
to rate journals based on perceived quality. Barman et al. (1991) surveyed Decision
Sciences Institute members who listed OM as their primary interest by asking them to
rate journals according to their perceived quality and relevance to OM research. The
base list of journals considered within these studies influenced subsequent studies
within the OM field. Three survey studies have since been published where each study
focuses on a different respondent population. These studies are discussed next while
their respective results are summarized in Table I.
Soteriou et al. (1999) sampled European members of INFORMS (524 members) and
EurOMA (240 members) and received a total of 106 valid responses. The survey
respondents rated journals based on perceived quality and relevance to OM research.
These authors suggested the uniqueness of the OM field by omitting from their survey
several IE and OR journals considered in previous studies.
Barman et al. (2001) provided a 10-year update on the rankings of perceived relevance and quality of selected POM journals by surveying US members of the Production and Operations Management Society (POMS) whose primary interest is OM. Results showed the perception of journals had changed since their initial 1991 study, and they found several journals perceived as being high quality were not perceived as being relevant to OM research. For example, Management Science (MS) and Operations Research (OR) were the top-ranked journals with regard to perceived quality, but they were ranked much lower with regard to relevance to OM. These findings were quite insightful, but the authors fell short of distinguishing OM and OR/MS as differently bounded domains. The authors also identified a new journal, Manufacturing and Service Operations Management (MSOM), as being relevant to OM based on survey write-ins.

Table I. Meta-analysis ranking of journals for OM research

Rank  Journal                                                               Average composite score
  1   Management Science (MS)                                                       0.036
  2   Operations Research (OR)                                                      0.085
  3   Journal of Operations Management (JOM)                                        0.144
  4   Decision Sciences (DS)                                                        0.209
  5   Production and Operations Management (POM)                                    0.210
  6   International Journal of Production Research (IJPR)                           0.227
  7   IIE Transactions (IIET)                                                       0.248
  8   Harvard Business Review (HBR)                                                 0.258
  9   European Journal of Operational Research (EJOR)                               0.290
 10   Naval Research Logistics (NRL)                                                0.310
 11   Mathematics of Operations Research (MOR)                                      0.389
 12   International Journal of Operations & Production Management (IJOPM)           0.406
 13   Operations Research Letters (ORL)                                             0.428
 14   Interfaces (IF)                                                               0.438
 15   Journal of the Operational Research Society (JORS)                            0.446
 16   Academy of Management Journal (AMJ)                                           0.484
 17   Strategic Management Journal (SMJ)                                            0.496
 18   Journal of Manufacturing Systems (JMS)                                        0.522
 19   Production and Inventory Management Journal (PIMJ)                            0.522
 20   Omega (OME)                                                                   0.536
 21   Sloan Management Review (SMR)                                                 0.543
 22   Industrial Engineering (IE)                                                   0.573
 23   International Journal of Production Economics (IJPE)                          0.573
 24   Academy of Management Review (AMR)                                            0.601
 25   Journal of Industrial Engineering (JIE)                                       0.633
 26   Journal of Service Management (JSM)                                           0.717
 27   International Transactions in Operations Research (ITOR)                      0.733
 28   International Journal of Quality and Reliability Management (IJQRM)           0.737
 29   Journal of Supply Chain Management (JSCM)                                     0.746
 30   Computers and Operations Research (COR)                                       0.767
 31   Quality Management Journal (QMJ)                                              0.786
 32   Computers and Industrial Engineering (CIE)                                    0.817

Note: Average composite score is the mean across the five studies (Theoharakis et al. survey; Soteriou et al. survey; Barman et al. survey; Goh et al. citation analysis; Vokurka citation analysis) of each journal's rank divided by the number of journals ranked in that study; lower scores indicate higher-ranked journals
Theoharakis et al. (2007) examined how the diversity or background of self-identified
OM researchers affects their perceived quality and relevance of a journal. The authors
took great care to define relevant research as that which stems from real industry problems, rather than as research relevant to the OM field. Their definition of quality remained
consistent with prior studies as being the perceived quality of OM-related articles that the
journal publishes. They e-mailed surveys to 9,674 researchers worldwide and obtained
888 useable responses. The authors found the journal ratings differed significantly based
on the region of the world and by whether the researchers identified themselves as an
empiricist or as a modeler. This research also made great strides towards distinguishing
the OM and OR/MS fields by reporting results for only the 11 academic journals
perceived as being most relevant (i.e. based on real industry problems). This decision
resulted in the exclusion of journals focused primarily on OR/MS, but they stopped short of explicitly distinguishing the two fields. The authors provided a complete set of survey
results in their working paper (Theoharakis et al., 2007).
Vokurka (1996) conducted the first citation analysis study of OM journals. The author used
Decision Sciences (DS), Journal of Operations Management ( JOM), and MS as the base
journals for his study of citations from 1992 to 1994. After culling non-OM articles,
Vokurka identified 146 OM articles with a total of 4,049 citations. Of the 332 unique
journals cited, his analysis identified 25 journals that accounted for nearly 80 percent of
the total journal citations. Vokurka stated this study shows the journals with the most
importance to OM research and the relative importance of various OM journals.
We cautiously agree with the first statement, but we believe the second statement does
not represent the data accurately. While few will argue whether OR is a high-quality
journal, the mission of that journal to serve the entire OR/MS community does not
support it as being an important OM journal. We interpret these results to imply OR is
an important or influential journal for OM research.
Goh et al. (1996) describe two basic methodologies for conducting citation analysis.
The first method inspects data published in the Social Science Citation Index (SSCI), but
the authors note this is problematic since some OM-related journals are not indexed in
the SSCI. A second approach, which is generally used within published citation
analyses, involves selecting a set of base journals and then manually collecting citation
data for all OM-related articles within those base journals. Goh et al. (1997) use the
second methodology to identify the most influential journals for OM research by
examining JOM, International Journal of Production Research (IJPR), and International
Journal of Operations & Production Management (IJOPM) as the base journals for their
analysis of citations from 1989 to 1993. The results are normalized to avoid undue
weighting towards one of the three base journals. This was accomplished by dividing
the number of times a journal is cited in a base journal by the total journal citations


in that base journal. The results of this study illustrate the breadth of the OM field by
identifying citations of a large number of engineering, OR/MS and management journals
in addition to OM journals.
All of these studies are subject to inherent limitations associated with the respective
methodology used. MacRoberts and MacRoberts (1989) and Vastag and Montabon
(2002) identify and discuss several problems with citation analysis including: uncited
primary research, biased citing, self-citing, and other problems associated with the type
of publication, nationality, specialty areas, and coverage of literature. Survey studies also have several potential problems when a list of journals is provided. Beed and Beed (1996) found that a typical economist knows no more than eight journals in the relevant field, and that asking economists to rank a long list of journals often reveals personal biases toward certain journals. They also identify a study where economists claimed knowledge of journals
that did not exist. A recent example of this is found in a study of electronic commerce
journals where a journal is ranked in the top ten even though its first issue was not
printed for more than two years after the survey study was completed (Bharati and
Tarasewich, 2002).
Several citation and survey-based journal-ranking studies have been published focusing on the OM field. Although these methodologies have some limitations, they are
well established methods for evaluating journals. The results from the five most recent
studies are summarized in Table I and are the subject of the meta-analysis described in
the next section. A glance at the results reveals some obvious similarities across the five
studies. We also believe the subtle evolution of the OM field, as reflected in the discussions in these journal-ranking studies, is quite interesting. Beginning with the first study, Saladin (1985), authors gradually distinguished OM from OR/MS and other related fields. However, none of the extant studies explicitly distinguishes OM and OR/MS as distinct fields of inquiry.
Meta-analysis methodology
We use a meta-analysis approach to combine results from five recent studies while
mitigating the bias associated with the individual studies. Based on the stated goals of
the five studies, this analysis will provide a ranking of journals loosely based on
perceived quality according to OM researchers. According to Lipsey and Wilson (2001),
meta-analysis avoids problems associated with statistical significance testing by
limiting the focus to the direction and magnitude of the effects across studies. They
indicate meta-analysis is applicable to collections of research that:
. are empirical and have quantitative results;
. examine the same relationships; and
. have findings that can be configured in a comparable form to the question at hand.
The five studies examined satisfy these requirements.
This study utilizes the technique developed by Rainer and Miller (2005) to assess MIS
journals. Their method uses a common denominator to account for differences in the
number of journals in previous journal ranking studies. For each journal they divide the
rank of each journal by the number of journals ranked in that study. For example,
when MS is ranked fifth in a study of 50 journals (Mylonopoulos and Theoharakis, 2001),
the score for MS is 0.10. The scores for a journal determined across all studies

are then averaged to develop an average composite score for that journal. Scores close to
zero indicate highly ranked journals and scores closer to one indicate lower ranked
journals.
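The scoring scheme described above can be sketched in a few lines of Python. This is an illustration only: apart from the Management Science example from the text (ranked fifth in a study of 50 journals), the per-study ranks below are hypothetical.

```python
# Sketch of the Rainer and Miller (2005) composite-scoring technique:
# a journal's rank in a study is divided by the number of journals that
# study ranked, and these normalized scores are averaged across studies.

def study_score(rank, journals_ranked):
    """Normalize a journal's rank by the number of journals in that study."""
    return rank / journals_ranked

def composite_score(scores):
    """Average a journal's normalized scores across the studies ranking it."""
    return sum(scores) / len(scores)

# Management Science ranked fifth in a study of 50 journals:
print(round(study_score(5, 50), 2))  # 0.1

# A hypothetical journal ranked by three studies of different sizes:
scores = [study_score(3, 30), study_score(5, 25), study_score(12, 40)]
print(round(composite_score(scores), 3))  # 0.2
```

As in Table I, scores close to zero indicate highly ranked journals, since a low rank divided by a large journal count yields a small quotient.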
The effect size, which is the key to what makes meta-analysis possible, is the
dependent variable used to standardize findings across studies. We use the average
composite score as the effect size to minimize scaling problems associated with using
the raw scores, ranks of quality, or number of citations. Lipsey and Wilson (2001) go on
to say that any standardized index can be an effect size as long as it:
. is comparable across studies, generally through standardization;
. represents the magnitude and direction of the relationship of interest; and
. is independent of sample size.
The composite score satisfies these requirements.
We use explicit exclusion and inclusion criteria that strike a balance appropriate for
the research question at hand. Criteria that are too restrictive may limit the ability to
generalize, while criteria that are too inclusive may weaken the confidence in our
findings (Lipsey and Wilson, 2001). Consequently, the meta-analysis examines two
citation analyses with unique sets of base journals, a survey of European researchers
(Soteriou et al., 1999), a survey of US members of POMS (Barman et al., 2001), and a
survey of worldwide members of several academic societies (Theoharakis et al., 2007).
However, Olson (2005) was excluded from the meta-analysis because her study did not
rank journal quality with respect to OM research.
To eliminate bias resulting from any single journal ranking study, results are
reported only for 32 verified journals ranked by two or more of the five studies. Of the
32 journals summarized in Table I, 23 journals are ranked by three or more of the five
studies considered. Journals omitted based on this criterion were primarily IE, quality,
information systems/computer science, and other engineering journals.
Meta-analysis results
Table I presents a composite ranking of journals based upon the meta-analysis of five
journal ranking studies focused on the OM field. This analysis indicates the journals
with the highest composite rankings are MS, OR, JOM, DS, Production and Operations
Management (POM), and IJPR. While the collection of top journals would probably
surprise few OM researchers, the specific order would likely be the subject of
much debate. Nevertheless, the composite list of journals illustrates the diversity of
publication outlets available to OM researchers. Though mentioned in the discussion by
Barman et al. (2001) and ranked highly in Theoharakis et al. (2007), one notable omission
is MSOM, which was too new to be included in the other studies. As a result, MSOM was
excluded from our analysis to avoid the potential bias from a single source.
An examination of the meta-analysis results reveals that many of the highly ranked
journals are not OM journals. Rather, their primary focus is on other research domains
frequently considered by OM researchers. Table II classifies the same list of journals
according to five research domains: OM, OR/MS, GM, industrial and other engineering,
and sector/miscellaneous. The OM category is comprised of journals we consider highly
relevant OM journals. Journal of Supply Chain Management is included within this
domain because the authors feel that supply chain management is closely related to OM.


Table II. Research domain classification of journals ranked by meta-analysis

OM journals: JOM (3), POM (5), IJPR (6), IJOPM (12), PIMJ (19), IJPE (23), JSCM (29)
OR/MS journals: MS (1), OR (2), DS (4), EJOR (9), NRL (10), MOR (11), ORL (13), IF (14), JORS (15), OME (20), ITOR (27), COR (30)
GM journals: HBR (8), AMJ (16), SMJ (17), SMR (21), AMR (24)
IE/ENG journals: IIET (7), JMS (18), IE (22), JIE (25), CIE (32)
Sector/miscellaneous journals: JSM (26), IJQRM (28), QMJ (31)

Note: Number in parentheses is the meta-analysis ranking from Table I

The OR/MS domain includes many OR journals, a quantitative logistics journal (Naval Research Logistics) focused on modeling, and Interfaces, which focuses on the application of OR to specific industrial case studies. DS is also included in this domain since DS publishes articles related to the theme of decision making across all functional business disciplines. As mentioned earlier, the common thread of these journals is their focus on publishing highly rigorous modeling research. These journals clearly publish research concerning operational issues, but their stated scope is much broader than OM.
The industrial engineering/other engineering (IE/ENG) domain includes several IE journals as well as Journal of Manufacturing Systems. The GM domain includes three of the premier management journals, Academy of Management Journal, Academy of Management Review, and Strategic Management Journal, as well as two widely known general business journals, Harvard Business Review and Sloan Management Review.
The sector/miscellaneous domain includes two quality journals commonly associated with both the OM and IE domains. It also includes the Journal of Service Management, which is a common outlet for marketing, management, and OM researchers focusing on the service sector. This grouping also aligns with that used within the Academic Journal Quality Guide published by the ABS (2009). It is particularly important to note that this guide also distinguishes OM from the OR/MS research domain, calling them Operations and Technology Management and Management Science and Operations Research, respectively.

Citation analysis methodology
The first part of this study examined five published ranking studies using a meta-analysis to generate a composite ranking of journals. The survey results reported by Theoharakis et al. (2007) are fairly recent, but the results from both citation analyses are based on data from nearly two decades ago. The second part of this research entails a new citation analysis of articles published in three OM journals. We selected IJOPM, JOM, and POM as the base journals for this study.
These OM journals were selected for several reasons. Articles published in these journals tend to focus solely on the OM field. Craighead and Meredith (2008) state these journals are moving toward the OM paradigm, while MS and DS are not focused on the OM paradigm. Including these two highly regarded journals in the base journal list would require us to determine which articles are OM related. We prefer to avoid any bias associated with this type of subjective decision.
The three journals selected also represent the broad cross section of research methodologies employed by OM researchers, which will likely instill some heterogeneity into the study. Results of this study support this premise. Over time, and with the encouragement of many leading academics, the OM field has grown to include both empirical and modeling research. OM journals such as JOM and IJOPM currently cater more to empirical research, while POM tends to publish more modeling-based OM research that some OM researchers may feel falls into the OR/MS domain. Consequently, these base journals represent the breadth of the best work published in the OM field.
Citations from articles published in the three base journals between 1999 and 2005 were entered into a common database. All 954 articles published in the base journals during this period were OM related and thus included in our analysis. Table III summarizes the journal and citation statistics for the three base journals. These articles contained a total of 41,632 citations, of which 28,089, or 67.5 percent, were citations of research journal articles. Results show the citations from all three base journals are divided among journal citations, books, and other categories in approximately the same proportions. These values are consistent with previous studies (Vokurka, 1996; Goh et al., 1997).

Table III. Journal and citation statistics for IJOPM, JOM, and POM for the years 1999 through 2005

                                                    IJOPM      JOM      POM    Total
Number of issues                                       72       41       27      140
Number of articles                                    503      244      207      954
Articles per issue                                    7.0      6.0      7.7      6.8
Total citations                                    20,418   14,169    7,045   41,632
Citations per article                                40.6     58.1     34.0     43.6
Journal citations                                  12,748   10,408    4,933   28,089
  As a percentage                                    62.4     73.5     70.0     67.5
Book citations                                      6,418    2,510    1,368   10,296
  As a percentage                                    31.4     17.7     19.4     24.7
Other (proceedings, dissertations, theses, etc.)    1,252    1,251      744    3,247
  As a percentage                                     6.1      8.8     10.6      7.8
Number of unique journals cited                     1,175      759      631    1,700

Based on the procedure used by Goh et al. (1997), we normalized the data for each base journal by dividing the number of citations for each cited journal by the total number of journal citations for that base journal. These normalized indices essentially represent the percentage of citations for a journal in each base journal. For example, IJOPM was referenced a total of 1,142 times within the issues published by IJOPM between the years 1999 and 2005. This value is normalized by the total journal citations of 12,748, yielding a normalized index of 8.96 percent. The three normalized indices for a journal are then averaged across the three base journals. These average normalized indices, found in the far right column of Table IV, are then used to rank the journals in the citation analysis.
Citations within a journal are often biased toward a few select journals. This phenomenon is common, and perhaps expected, particularly in diverse fields such as OM. Moreover, journals commonly contain a larger percentage of self-citations. As pointed out by Goh et al. (1997), normalizing the data helps compensate for biases

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24

3
1
2
4
6
5
7
8
11
9
10
13
12
20
14
16
17
15
18
19
28
24
21
25

Normalized Un-normalized
rank
rank
MS
JOM
HBR
IJOPM
POM
SMJ
DS
AMR
EJOR
SMR
IJPR
Journal of Marketing ( JMK)
AMJ
OR
PIMJ
Journal of Marketing Research (JMR)
California Management Review (CMR)
IJPE
Administrative Science Quarterly (ASQ)
JSCM
IIET
IF
Journal of Product Innovation Management ( JPIM)
Journal of Management ( JM)

Journal
354
572
861
1,142
148
482
221
252
158
314
270
142
149
46
204
74
187
196
136
47
25
68
98
84

IJOPM
citations
673
1,126
560
359
200
369
404
309
165
203
161
234
276
68
203
245
134
148
198
260
57
94
155
109

605
242
222
54
416
82
134
83
175
89
87
93
57
185
44
69
61
47
34
37
109
62
21
27

JOM
POM
citations citations
1,632
1,940
1,643
1,555
764
933
759
644
498
606
518
469
482
299
451
388
382
391
368
344
191
224
274
220

7.2
6.7
5.5
4.5
3.8
3.0
2.8
2.2
2.1
2.1
1.8
1.7
1.7
1.6
1.5
1.4
1.3
1.3
1.2
1.2
1.0
0.9
0.9
0.8
(continued)

Total
Average normalized
citations
indices (%)

Downloaded by MANAGEMENT AND SCIENCE UNIVERSITY At 20:01 17 March 2015 (PT)

Operations
management
research
413

Table IV.
Citation analysis ranking
of journals
for OM research

Table IV.

23
26
29
27
30

36
31
32
47
46
39
33
52
37
50

26
27
28
29
30

31
32
33
34
35
36
37
38
39
40

International Journal of Physical Distribution and


Logistics Management (IJPDLM)
Journal of Business Logistics ( JBL)
OME
Organization Science (OS)
IJQRM
IEEE Transactions on Engineering
Management (ITEM)
JORS
Psychological Bulletin (PB)
JSM
Marketing Science (MKS)
Journal of Retailing ( JR)
Journal of Applied Psychology ( JAP)
International Journal of Logistics Management (IJLM)
NRL
Business Horizons (BH)
Manufacturing and Service Operations
Management (MSOM)

Note: Citations of articles published between 1999 and 2005

22

25

Journal

18
29
70
16
13
24
86
5
58
2

115
90
102
138
72

153

IJOPM
citations

46
97
47
28
46
74
47
12
49
25

108
103
61
43
82

96

57
19
25
47
35
17
4
49
10
43

9
20
20
14
20

JOM
POM
citations citations

121
145
142
91
94
115
137
66
117
70

232
213
183
195
174

254

0.6
0.5
0.5
0.4
0.4
0.4
0.4
0.4
0.4
0.4

0.7
0.7
0.6
0.6
0.6

0.7

Total
Average normalized
citations
indices (%)

414

Normalized Un-normalized
rank
rank

Downloaded by MANAGEMENT AND SCIENCE UNIVERSITY At 20:01 17 March 2015 (PT)

IJOPM
31,4


associated with differences in the number of journal issues published during a year and
the number of articles published annually. It also mitigates the bias associated with the
self-citation phenomenon for the base journals.
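The normalization just described can be sketched in code. This is a minimal illustration under an assumed scheme, not necessarily the paper's exact formula: each journal's citation count is expressed as a share of the base journal's total journal citations, and the shares from the three base journals are averaged. The base journal totals used here are hypothetical, chosen only to make the example runnable; the citation counts for MS are taken from Table IV.

```python
# Sketch of an average normalized citation index. Assumption: for each base
# journal, a cited journal's index is its share of that base journal's total
# journal citations; averaging across the three base journals dampens both
# the size of each base journal and its self-citation bias.

def normalized_index(cites_per_base, totals_per_base):
    """Average citation share (in percent) across the base journals."""
    shares = [100.0 * c / t for c, t in zip(cites_per_base, totals_per_base)]
    return sum(shares) / len(shares)

# Hypothetical total journal citations counted within IJOPM, JOM, and POM.
totals = {"IJOPM": 8000, "JOM": 10000, "POM": 9000}

# Citations of MS by each base journal (from Table IV).
ms_cites = {"IJOPM": 354, "JOM": 673, "POM": 605}

bases = ("IJOPM", "JOM", "POM")
index = normalized_index([ms_cites[b] for b in bases], [totals[b] for b in bases])
print(round(index, 1))  # about 6.0 with these illustrative totals
```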
Citation analysis results
Table IV shows results of the citation analysis by ranking the top 40 journals according
to the average normalized indices. These results show considerable agreement with the
results of the meta-analysis in that six of the top ten ranked journals appear on both lists.
JOM is ranked in the top ten for each of the three base journals, showing its broad
influence across a diverse field of OM researchers. In contrast, IJOPM and POM do not
appear to influence one another as extensively. IJOPM, for example, was
only cited a total of 54 times within the 207 POM articles. These findings affirm our
earlier discussion regarding the diversity of the OM field. Since IJOPM and POM cater to
different research methodologies, this outcome is reasonable. These results also support
the findings of Theoharakis et al. (2007) who found the diversity and background of OM
researchers affects their perceived quality and relevance of a journal.
The top 40 journals listed in Table IV account for 51.6, 73.7, and 69.5 percent of
the journal citations within IJOPM, JOM, and POM, respectively, while the top five
journals account for 26.8, 30.1, and 33.9 percent. These values are lower than the value
identified by Vokurka (1996) suggesting the research examined within the current
citation analysis is influenced by a broader array of journals. As seen in Table III, IJOPM
referenced 1,175 unique journals, which is far more than JOM or POM. Therefore, these
data may suggest that research published in IJOPM is influenced by a broader spectrum
of journals and other sources than research published in JOM and POM.
Results of this citation analysis strongly support the existence of a self-citation
phenomenon. Two of the three base journals (JOM and IJOPM) cited articles published
in their own journal more than any other journal. POM cited 605 articles published in MS
and 416 articles published in its own journal, which was the second most for that base
journal. Self-citations clearly represent an important bias that could affect results
significantly when ignored. This realization explains why it is important to normalize
the raw data to mitigate bias towards journals having the most total citations.
Comparison of the normalized and un-normalized ranks found in Table IV reveals the
top 13 ranked journals on both ranking lists contain the identical set of journals. Similar
statements may be made for the top 20 and the top 30 ranked journals. Statistical
analysis using the Sign test (Siegel and Castellan, 1988) reveals there is no significant
difference at the 10 percent level between the normalized and un-normalized ranks. JOM
is ranked first according to the un-normalized rank since it has the most total citations,
while MS is ranked first according to the normalized rank.
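The relationship between the two ranking schemes can be reproduced directly from the Table IV data. A minimal sketch using the top five normalized rows, where the un-normalized order sorts on total citations and the normalized order sorts on the average normalized index:

```python
# Re-derive both orderings from a few Table IV rows.
rows = {  # journal: (total citations, average normalized index %)
    "MS":    (1632, 7.2),
    "JOM":   (1940, 6.7),
    "HBR":   (1643, 5.5),
    "IJOPM": (1555, 4.5),
    "POM":   (764, 3.8),
}

by_total = sorted(rows, key=lambda j: rows[j][0], reverse=True)  # un-normalized
by_index = sorted(rows, key=lambda j: rows[j][1], reverse=True)  # normalized

print(by_total[0])  # JOM leads the un-normalized ranking
print(by_index[0])  # MS leads the normalized ranking
```

Among these five journals alone, JOM's large self-citation count is enough to lift it past MS on raw totals, while the normalization reverses the order.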
Our final observation concerns the low citation ranking of MSOM. Many may be
surprised that MSOM was cited only 70 times in our citation analysis, despite exceptional
ratings by Theoharakis et al. (2007) and the ABS (2009). Perhaps our results would differ
slightly if we selected different base journals focused more on modeling. However, we
attribute our findings to the relative newness of MSOM as a publication outlet.
Comparison of meta-analysis and citation analysis results
Examination of the journals listed in Table IV reveals some similarities to the
meta-analysis results regarding the breadth of research influencing OM research.

Operations
management
research
415



Table V classifies these journals again into five domains: OM, OR/MS, GM, marketing,
and sector/miscellaneous. These domains are similar to those shown in Table II except a
marketing domain replaces the IE/ENG domain. In general, most of the IE journals
identified within the meta-analysis are not cited enough to appear in Table IV; the
exception is the seventh-ranked IIE Transactions, which we moved into the
sector/miscellaneous category.
The seven OM journals identified in Table II remain the top OM journals listed in
Table V. Five additional OM journals are listed in Table V with three of the five journals
focusing on supply chain management. Supply chain management has clearly emerged
as an important focus within the business and academic communities as shown by the
number of citations.
The citation analysis results found in Table V also show a greater number of premier
journals referenced from the GM, marketing, and psychology fields. This confirms that
many OM researchers are aware of and are being influenced by other business fields and
the behavioral sciences. These observations suggest an evolution of OM research based
on the comparison of results for the current citation analysis to the citation analysis
results of Vokurka (1996) and Goh et al. (1997).
Further comparison of results found in Tables II and V reveals fewer OR/MS
journals are highly ranked in the current citation analysis compared to the meta-analysis
ranking. However, a more detailed comparison is necessary to conclude that journals
from the IE/ENG and OR/MS journal domains are less influential or that journals from
the OM and GM journal domains are more influential. To facilitate such comparisons,
citation analysis rankings are summarized in Table VI for the 32 journals ranked in the
meta-analysis. Since several of the journals ranked in the meta-analysis fall outside the
top 40 journals ranked in our citation analysis, the citation rankings listed in Table VI
are based on the complete list of 1,700 unique journals identified in the citation analysis.
To evaluate the direction of differences between the journal rankings shown in
Table VI, the Sign test was used. Statistical results for the Sign tests are summarized in
Table VII, which compares the meta-analysis results to various rankings from the
current citation analysis. The overall rankings of the 32 journals examined in the
meta-analysis are not statistically different from the normalized rankings from our
citation analysis ( p-value 0.068). Of the 32 journals, ten journals were ranked better
Table V. Research domain classification of journals ranked by citation analysis

OM journals: JOM (2), IJOPM (4), POM (5), IJPR (11), PIMJ (15), IJPE (18), JSCM (20), IJPDLM (25), JBL (26), ITEM (30), IJLM (37), MSOM (40)
OR/MS journals: MS (1), DS (7), EJOR (9), OR (14), IF (22), OME (27), JORS (31), NRL (38)
GM journals: HBR (3), SMJ (6), AMR (8), SMR (10), AMJ (13), CMR (17), ASQ (19), JM (24), OS (28), BH (39)
Marketing journals: JMK (12), JMR (16), MKS (34), JR (35)
Sector/miscellaneous journals: IIET (21), JPIM (23), IJQRM (29), PB (32), JSM (33), JAP (36)

Note: Number in parentheses is the normalized ranking from Table IV


Table VI. Comparison of meta-analysis and citation analysis rankings

         Research  Meta-analysis  Normalized        IJOPM             JOM               POM
Journal  domain    ranking        citation ranking  citation ranking  citation ranking  citation ranking
MS       OR/MS         1               1                 5                 2                 1
OR       OR/MS         2              14                48                29                 5
JOM      OM            3               2                 3                 1                 3
DS       OR/MS         4               7                 9                 4                 7
POM      OM            5               5                16                14                 2
IJPR     OM            6              11                 7                17                11
IIET     IE/ENG        7              21                73                32                 8
HBR      GM            8               3                 2                 3                 4
EJOR     OR/MS         9               9                13                16                 6
NRL      OR/MS        10              38               100                88                20
MOR      OR/MS        11             100               100               100                64
IJOPM    OM           12               4                 1                 6                19
ORL      OR/MS        13             100               100               100                89
IF       OR/MS        14              22                36                26                15
JORS     OR/MS        15              31                58                36                17
AMJ      GM           16              13                15                 8                17
SMJ      GM           17               6                 4                 5                13
JMS      IE/ENG       18              76                70               100                89
PIMJ     OM           19              15                10                12                23
OME      OR/MS        20              27                25                23                33
SMR      GM           21              10                 6                12                10
IE       IE/ENG       22              58                46                75               100
IJPE     OM           23              18                11                19                21
AMR      GM           24               8                 8                 7                12
JIE      IE/ENG       25             100               100               100               100
JSM      SECTOR       26              33                34                34                29
ITOR     OR/MS        27             100               100               100               100
IJQRM    SECTOR       28              29                18                39                42
JSCM     OM           29              20                46                 9                25
COR      OR/MS        30             100               100                93                30
QMJ      SECTOR       31              66               100                53               100
CIE      IE/ENG       32              56                76                57                48

Note: 100 indicates a ranking of 100 or more

in the citation analysis, 19 were ranked better in the meta-analysis, and three journals
had identical ranks across the two analyses. However, this conceals some interesting
differences at a more detailed level. At the level of the three base journals, the citation and
meta-analysis rankings are statistically different at the 5 percent level. This finding
illustrates the subtle differences associated with individual journal citation analyses.
An interesting pattern emerges when we compare the citation and meta-analysis
rankings for the OM, OR/MS, GM, and IE/ENG domain subgroups identified in Table II.
Results show the citation rankings for journals assigned to the OR/MS and IE/ENG
domain subgroups are statistically lower than in the meta-analysis. Not only are there
fewer highly ranked journals in these two domains, but the differences in rankings are
also statistically significant.
Comparison of journals in the GM domain subgroup reveals the normalized citation
rankings are significantly higher than the rankings associated with the meta-analysis.




Table VII. Sign test results comparing meta-analysis to citation analysis rankings

                               Meta-analysis   Meta-analysis   No change
Comparison group               rank is lower   rank is higher  in rank     p-value
Normalized citation ranking         10              19             3        0.068
  OM domain subgroup                 5               1             1        0.109
  OR/MS domain subgroup*             0              10             2        0.001
  GM domain subgroup*                5               0             0        0.031
  IE/ENG domain subgroup*            0               5             0        0.031
IJOPM citation ranking*              9              22             1        0.015
  OM domain subgroup                 3               3             1        0.656
  OR/MS domain subgroup*             0              12             0       <0.001
  GM domain subgroup*                5               0             0        0.031
  IE/ENG domain subgroup*            0               5             0        0.031
JOM citation ranking*               10              21             1        0.035
  OM domain subgroup                 5               2             0        0.227
  OR/MS domain subgroup*             0              11             1       <0.001
  GM domain subgroup*                5               0             0        0.031
  IE/ENG domain subgroup*            0               5             0        0.031
POM citation ranking*                8              21             3        0.012
  OM domain subgroup                 3               3             1        0.500
  OR/MS domain subgroup*             1               9             2        0.011
  GM domain subgroup                 4               1             0        0.188
  IE/ENG domain subgroup*            0               5             0        0.031

Notes: *Statistically significant at the 5 percent level; p-values are one-tailed
probabilities for the binomial test with q = p = 0.50
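Two of the rows in Table VII can be checked directly with the binomial form of the Sign test (Siegel and Castellan, 1988): ties are dropped, and the one-tailed probability of observing a count at least as large as the bigger one is computed under q = p = 0.50. A minimal sketch (the helper name `sign_test_p` is ours):

```python
from math import comb

def sign_test_p(wins_a, wins_b):
    """One-tailed Sign test p-value, ties already removed:
    P(X >= larger count) for X ~ Binomial(n, 0.5)."""
    n = wins_a + wins_b
    x = max(wins_a, wins_b)
    return sum(comb(n, k) for k in range(x, n + 1)) / 2 ** n

# Overall meta-analysis vs normalized citation ranking: 19 vs 10, 3 ties.
print(round(sign_test_p(19, 10), 3))  # 0.068, as reported in Table VII

# GM domain subgroup: 5 vs 0, no ties.
print(round(sign_test_p(5, 0), 3))    # 0.031
```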

Again, not only are there more highly rated journals in the GM domain, but the increase
in rankings is also statistically significant. For the journals ranked in the OM domain
subgroup, however, results indicate no statistically significant difference between the
rankings of the seven OM journals. Even so, a quick comparison of results found in
Tables II and V reveals a notable difference in the ranking of IJOPM, which is 12th
according to the meta-analysis and fourth according to the citation analysis.
As mentioned previously, the citation rankings for IJOPM, JOM, and POM are each
statistically different from the meta-analysis ranking with p-values of 0.015, 0.035, and
0.012, respectively. Similar comparisons focused on the domain subgroups reveal some
interesting results supporting the notion of a diverse OM field. Aside from the OM
domain subgroup, almost every domain subgroup is statistically different across the
three base journals except for the GM domain subgroup for articles published in POM,
which has a p-value of 0.188. These results suggest research published in IJOPM and
JOM is influenced more by research published in GM journals. This premise becomes
more apparent by comparing the rankings directly across the three base journals as
shown in Table VIII. These comparisons reveal the rankings of GM journals are
statistically more important for IJOPM compared to POM at the 5 percent level.
However, there is no statistical difference between JOM and the other two base journals.
These results collectively suggest GM research is more influential to IJOPM, JOM, and
POM in that order.
Similar comparisons of journals in the OR/MS domain subgroup yield additional
results worth noting. The results in Table VII show the changes in rankings
are statistically more significant for IJOPM and JOM than for POM, though all three


Table VIII. Sign test results of direct comparisons between base journal citation rankings

                                              Journal A        Journal B        No change
Direct comparisons of base journals A and B   rank is higher   rank is higher   in rank    p-value
IJOPM (A) and JOM (B) citation rankings            11               16              5       0.221
  OM domain subgroup                                4                3              0       0.500
  OR/MS domain subgroup*                            1                8              3       0.020
  GM domain subgroup                                3                2              0       0.500
  IE/ENG domain subgroup                            2                2              1       0.688
IJOPM (A) and POM (B) citation ranking             13               15              4       0.425
  OM domain subgroup                                4                2              1       0.344
  OR/MS domain subgroup*                            1               10              1       0.011
  GM domain subgroup*                               5                0              0       0.031
  IE/ENG domain subgroup                            2                2              1       0.688
JOM (A) and POM (B) citation ranking               14               16              2       0.428
  OM domain subgroup                                5                2              0       0.227
  OR/MS domain subgroup                             2                9              0       0.055
  GM domain subgroup                                4                1              0       0.188
  IE/ENG domain subgroup                            1                3              1       0.312

Notes: *Statistically significant at the 5 percent level; p-values are one-tailed
probabilities for the binomial test with q = p = 0.50

base journals are significant at the 5 percent level. These subtle differences again
become more apparent by comparing the rankings directly across the three base
journals. These comparisons in Table VIII reveal the rankings of OR/MS journals are
statistically more important for both POM and JOM compared to IJOPM with p-values
of 0.011 and 0.020, respectively. However, there is no statistical difference between POM
and JOM. These results suggest the OR/MS journals are more influential to research
published in POM, followed by JOM and IJOPM in that order. These results concur with
previous statements that IJOPM, JOM, and POM have different missions and that there
is a diversity of journal research within the field of OM.
Discussion and conclusions
This is the first time meta-analysis has been used to examine the ranking of journals for
OM research. This research also provides a current citation analysis of the OM field for
the first time in over a decade. While the results of these two analyses show some
similarities with many of the same journals, there are some marked differences that may
support the notion of an evolving OM field.
A major issue underlying this paper is how unique OM research is compared
with OR/MS, IE, and other related fields. Our citation analysis results show lower levels
of citations of OR/MS and engineering journals compared to results published in
previous citation analyses. In addition, the diverse OM field has evolved to include a
notable number of journals focused on supply chain management. Singhal et al. (2007)
discuss how Elwood Buffa influenced the development of the modern academic field of
operations in roughly the three decades of the 1960s, 1970s, and 1980s. One could say
that the drift of OM away from OR/MS and engineering began in the post-Buffa era of the
last two decades, starting with the modern quality movement and continuing to grow
through the rise of the Internet and the globalization of the world's economies.



The results of this study show that OM researchers consider and are being influenced
by other business fields and the behavioral sciences. The current citation analysis shows
a significant increase in references to management, marketing, and psychology journals
affirming an increase in the breadth of research conducted in the OM field particularly
with regard to empirical research. Fisher (2007) advocates improving the field of OM
by increasing the empirical elements of OR. However, as pointed out by Krajewski (2002)
and Theoharakis et al. (2007), OM is a diverse field that continues to embrace both
empirical and modeling research. The results of our citation analysis strongly support
this notion.
While OM is trying to distinguish itself as a separate discipline, the OR/MS field is
also attempting to distinguish itself from OM. Bell (2007) advocates advancing student
interest in the study of OR by clearly differentiating OR/MS from OM: "Operations
management is a business function [. . .] operations research provides a way of
approaching problems in finance, investment banking, marketing and consulting".
These conclusions are not novel, but rather part of a growing argument being made by
both OR and OM advocates. That being said, this is the first journal ranking paper that
explicitly distinguishes OM from OR/MS and other related research domains.
The meta-analysis ranks journals loosely based on perceived quality according to
OM researchers and not on the importance of those journals to OM research. The citation
analysis shows what journals OM researchers are referencing in their research
published in OM journals. This partially explains some of the differences between these
two approaches, though we found no statistical difference between the overall results of
the two approaches. In addition, while there appears to be a difference in the focus of the
base journals used for our citation analysis, we found no statistical difference between
the overall rankings within the three base journals. However, a more detailed analysis
revealed the diversity of the OM field, where IJOPM tends to be more empirical and POM
tends to be more focused on modeling. Results showed JOM is positioned somewhere
between IJOPM and POM on this diversity spectrum.
A limitation to this study is that only IJOPM, JOM, and POM were used as base
journals for the citation analysis of current research. We chose these three journals for
the citation analysis because they are considered to be three highly regarded OM
journals publishing a breadth of both empirical and modeling research. Other
limitations of citation analysis are that it does not show the relevance of each citation
and that journal self-citation can be a problem. However, citation analysis is a widely used
and accepted method of determining journal influence.
References
ABS (2009), Academic Journal Quality Guide, The Association of Business Schools, available at:
www.the-ABS.org.uk/
Barman, S., Hanna, M.D. and LaForge, R.L. (2001), Perceived relevance and quality of POM
journals: a decade later, Journal of Operations Management, Vol. 19 No. 3, pp. 367-85.
Barman, S., Tersine, R.J. and Buckley, M.R. (1991), An empirical assessment of the perceived
relevance and quality of POM-related journals by academicians, Journal of Operations
Management, Vol. 10 No. 2, pp. 194-212.
Barnes, S.J. (2005), Assessing the value of IS journals, Communications of the ACM, Vol. 48
No. 1, pp. 110-12.

Baumgartner, H. and Pieters, R. (2003), The structural influence of marketing journals: a citation
analysis of the disciplines and its sub-areas over time, Journal of Marketing, Vol. 67 No. 2,
pp. 123-39.
Beed, C. and Beed, C. (1996), Measuring the quality of academic journals: the case of economics,
Journal of Post-Keynesian Economics, Vol. 18 No. 3, pp. 369-96.


Bell, P.C. (2007), Marketing the profession to our students: eight ideas to effectively sell an
operations research course to MBAs, OR/MS Today, August, pp. 30-3.
Bharati, P. and Tarasewich, P. (2002), Global perceptions of journals publishing e-commerce
research, Communications of the ACM, Vol. 45 No. 5, pp. 21-6.
Chase, R.B., Jacobs, F.R. and Aquilano, N.J. (2006), Operations Management for Competitive
Advantage, 11th ed., McGraw-Hill, New York, NY.
Coe, R.K. and Weinstock, I. (1984), Evaluating the management journals: a second look,
Academy of Management Journal, Vol. 27 No. 3, pp. 660-6.
Craighead, C.W. and Meredith, J. (2008), Operations management research: evolution and
alternate paths, International Journal of Operations & Production Management, Vol. 28
No. 8, pp. 710-26.
DuBois, F.L. and Reeb, D. (2000), Ranking the international business journals, Journal of
International Business Studies, Vol. 31 No. 4, pp. 689-704.
Fisher, M. (2007), Strengthening the empirical base of operations management, Manufacturing
& Service Operations Management, Vol. 9 No. 4, pp. 368-82.
Goh, C.H., Holsapple, C.W., Johnson, L.E. and Tanner, J.R. (1996), An empirical assessment of
influences on POM research, Omega, Vol. 24 No. 3, pp. 337-45.
Goh, C.H., Holsapple, C.W., Johnson, L.E. and Tanner, J.R. (1997), Evaluating and classifying
POM journals, Journal of Operations Management, Vol. 15 No. 2, pp. 123-38.
Guidry, J.A., Hollier, B.N., Johnson, L.E., Tanner, J.R. and Veltsos, C. (2004), Surveying the cites:
a ranking of marketing journals using citation analysis, Marketing Education Review,
Vol. 14 No. 1, pp. 45-59.
Katerattanakul, P., Han, B. and Hong, S. (2003), Objective quality ranking of computing
journals, Communications of the ACM, Vol. 46 No. 10, pp. 111-14.
Koojaroenprasit, N., Weinstein, A., Johnson, W.C. and Remington, D.O. (1998), Marketing
journal rankings revisited: research findings and academic implications, Marketing
Education Review, Vol. 8 No. 1, pp. 95-102.
Krajewski, L.J. (2002), Reflections on operations management research, Journal of Operations
Management, Vol. 20 No. 1, pp. 2-5.
Lipsey, M.W. and Wilson, D.B. (2001), Practical Meta-Analysis, Sage, Thousand Oaks, CA.
MacRoberts, M.H. and MacRoberts, B.R. (1989), Problems of citation analysis: a critical review,
Journal of the American Society for Information Science, Vol. 40 No. 5, pp. 342-9.
Mylonopoulos, N.A. and Theoharakis, V. (2001), Global perceptions of IS journals,
Communications of the ACM, Vol. 44 No. 9, pp. 29-33.
Olson, J.E. (2005), Top-25-business-school professors rate journals in operations management
and related fields, Interfaces, Vol. 53 No. 4, pp. 323-38.
Oltheten, E., Theoharakis, V. and Travlos, N.G. (2005), Faculty perceptions and readership
patterns of finance journals: a global view, Journal of Financial and Quantitative Analysis,
Vol. 40 No. 1, pp. 223-39.


Peffers, K. and Ya, T. (2003), Identifying and evaluating the universe of outlets for information
systems research: ranking the journals, Journal of Information Technology Theory and
Application, Vol. 5 No. 1, pp. 63-84.
Podsakoff, P.M., MacKenzie, S.B., Bachrach, D.G. and Podsakoff, N.P. (2005), The influence of
management journals in the 1980s and 1990s, Strategic Management Journal, Vol. 26,
pp. 472-88.
Rainer, R.K. and Miller, M.D. (2005), Examining differences across journal rankings,
Communications of the ACM, Vol. 48 No. 2, pp. 91-4.
Saladin, B. (1985), Operations management research: where should we publish, Operations
Management Review, Vol. 3 No. 4, pp. 3-9.
Siegel, S. and Castellan, N.J. Jr (1988), Nonparametric Statistics, 2nd ed., McGraw-Hill,
New York, NY.
Singhal, K., Singhal, J. and Starr, M. (2007), The domain of production and operations
management and the role of Elwood Buffa in its delineation, Journal of Operations
Management, Vol. 25, pp. 310-27.
Soteriou, A.C., Hadjinicola, G.C. and Patsia, K. (1999), Assessing production and operations
management related journals: the European perspective, Journal of Operations
Management, Vol. 17 No. 2, pp. 225-38.
Theoharakis, V.T., Voss, C.A., Hadjinicola, G.C. and Soteriou, A.C. (2007), Insights into factors
affecting production and operations management (POM) journal evaluations, Journal of
Operations Management, Vol. 25 No. 4, pp. 932-55.
Vastag, G. and Montabon, F. (2002), Journal characteristics, rankings, and social acculturation
in operations management, Omega, Vol. 30, pp. 109-26.
Vokurka, R.J. (1996), The relative importance of journals used in operations management
research: a citation analysis, Journal of Operations Management, Vol. 14 No. 4, pp. 345-55.
Corresponding author
Charles G. Petersen can be contacted at: cpetersen@niu.edu

