
POLITICAL STUDIES REVIEW: 2004 VOL 2, 293–313

A Global Ranking of Political Science Departments


Simon Hix
London School of Economics

Rankings of academic institutions are key information tools for universities, funding agencies, students and faculty. The main method for ranking departments in political science, through peer evaluations, is subjective, biased towards established institutions, and costly in terms of time and money. The alternative method, based on supposedly objective measures of outputs in scientific journals, has thus far only been applied narrowly in political science, using publications in a small number of US-based journals. An alternative method is proposed in this paper: that of ranking departments based on the quantity and impact of their publications in the 63 main political science journals in a given five-year period. The result is a series of global and easily updatable rankings that compare well with results produced by applying a similar method in economics.

Rankings of academic institutions are key information tools for universities, public and private funding agencies, students and faculty. For example, to investigate whether and why European universities lag behind their competitors in the US, the European Economics Association commissioned research into the ranking of economics departments on a global scale (see, especially, Coupé, 2003). A variety of different ranking methods have been used in the natural sciences and have started to emerge in the social sciences, especially in economics (see, for example, Scott and Mitias, 1996; Dusansky and Vernon, 1998). All methods have disadvantages and trade-offs. Nevertheless, the best methods tend to have three elements: (1) they rank institutions on a global scale rather than in a single country; (2) they use objective measures of research outputs, such as publications in journals, rather than subjective peer evaluations; and (3) they are cheap to update, for example by allowing for mechanized annual updates. However, no such global, objective or easily updated method exists in political science. This research aims to fill this gap by proposing and implementing a new method for ranking departments in this field. To this end, in the next section I review the existing methods in our discipline. In the third section I then propose and justify an alternative method, based on research outputs in the main political science journals in a particular five-year period. And in the fourth section I present the results of an analysis of the content of 63 journals between 1993 and 2002.


Existing Rankings of Political Science Departments


As in other disciplines, two main methods have been used to rank political science departments: peer assessments, and content analysis of scientific journals. However, both methods, as applied thus far, have their limitations.

Peer Assessments
The most widely used method for ranking political science departments is peer assessment, where senior academics are asked to evaluate the quality of other departments. For example, this method is used by the US National Research Council and the U.S. News and World Report to rank doctoral programs in the US (see PS: Political Science and Politics, 1996a, b), and in the Research Assessment Exercise in the UK for the allocation of central government research funding. The problems with this method are well known.

First, peer assessments are subjective, by definition. No ranking method is perfectly objective. For example, the content of scientific journals is determined by the subjective judgements of journal editors and article reviewers. However, journal editors and article reviewers are experts on the subjects of the papers they publish or review. Also, peer evaluation in the journal publishing process is repeated thousands of times, which reduces bias. In contrast, rankings based on periodic peer assessments rely on a small sample of academics, who cannot possibly be experts in all areas of research produced by the institutions they rank. As a result, rankings based on peer assessments are less objective than rankings based on the content analysis of journals (if a sufficiently large sample of journals is used). The resulting biases of this subjectivity have been well documented. Because the sample of academics has only limited information about the output of all institutions, they are forced to base their judgements on other factors. This results in a bias towards large established departments and against new and up-and-coming departments (Katz and Eagles, 1996). The overall reputation of the university also affects the respondents' expectations of the performance of its political science department, known as the 'halo effect' (Lowry and Silver, 1996; Jackman and Siverson, 1996).

Second, the peer assessment method is highly costly and time-consuming. This is because of the need either to survey a large number of senior faculty (as in the cases of the US National Research Council and the U.S. News and World Report) or to prepare and read the submissions of all the universities (as in the case of the Research Assessment Exercise). Hence, rankings based on peer assessments are invariably updated only every five years (in the case of the Research Assessment Exercise and the U.S. News and World Report) or even less frequently (in the case of the US National Research Council).

Third, all existing peer assessment rankings are nationally specific. If similar methods were used in different countries, a global ranking based on peer assessments could be produced. However, different methods tend to be used


in different countries. For example, in the U.S. News and World Report, departments are scored out of five on a number of criteria, and the scores are then averaged (with the top departments scoring between 4.7 and 4.9). In the Research Assessment Exercise, departments are scored in bands from 5* to 2 (with the top departments scoring 5*). As a result, relative performance on a global scale is difficult to establish.

Content Analysis of Scientific Journals


In a conscious effort to improve on these peer assessment results, political scientists have begun to develop more objective methods of ranking political science departments. Following the practice in other disciplines, the most popular method is the analysis of the content of the leading political science journals (see, for example, Welch and Hibbing, 1983). The assumption behind this method is that, in contemporary political science, the main outlet for research results is publication in a professional journal.

Publication of books is, however, more common in political science than in economics. Hence, ideally, a ranking based on book publications as well as journal articles would produce the best results (see Rice et al., 2002). However, analysing the content of books and the number of citations to particular book series is costly, since there is no single database of book publications and book citations like the Social Science Citation Index (SSCI) for journal publications. One solution could be to use peer assessments to rank book publishers (see, for example, Goodson et al., 1999). But this would go against the aim of creating a ranking using only non-subjective assessments. Also, a ranking based on book publications may have little added value, because there is probably a high correlation between the outputs of departments in books and in journals. At the individual level, some political scientists prefer to write books while others prefer to write articles. However, top departments probably produce a lot of books as well as a lot of articles, whereas less-good departments probably produce fewer books and articles. Hence, the rankings resulting from these two measures should be similar, at least for the larger departments. Consequently, most researchers have analysed journal publications rather than book publications.

But there are problems with the way this method has been applied thus far. First, existing studies have counted only a small number of journals. Miller et al. (1996) only looked at the content of the American Political Science Review (APSR); Teske (1996) looked at APSR, the Journal of Politics (JOP) and the American Journal of Political Science (AJPS) (see Garand and Graddy, 1999); McCormick and Rice (2001) counted articles in APSR, AJPS, JOP, the Western Political Quarterly (WPQ) and Polity; and Ballard and Mitchell (1998) looked at APSR, JOP, AJPS, World Politics, Comparative Politics, the British Journal of Political Science (BJPolS), WPQ, Polity and Political Science Quarterly. But even nine journals is a rather limited sample of the main journals in political science. The SSCI contains 143 journals in the fields of political science, international relations and public administration. With modern


computer technology, there is no reason why all of these journals, or at least a larger and more representative sample, cannot be counted.

Second, and partly due to the limited sample of journals coded, the existing rankings based on the content of journals have tended to be biased towards institutions in the US. For example, although APSR is widely respected as the top political science journal, it is nonetheless the in-house journal of the American Political Science Association. Not surprisingly, only 7 per cent of articles in APSR between 1996 and 1999 were by scholars based outside the US (Schmitter, 2002). This small number may be a fair reflection of the quality or quantity of research outside the US. However, studying the content of one journal inevitably risks a high degree of error. Even in Ballard and Mitchell's (1998) study of nine journals, only one journal based outside the US was included (BJPolS). Not surprisingly, not a single non-American department appeared in their top 50. It might be the case that no department outside the US is good enough to be in the top 50. But one would be more inclined to accept this conclusion if the result were based on the content of a larger sample of journals, including more journals based outside the US.

An Alternative Method
Building on existing bibliometric research, the method proposed here ranks academic institutions on the basis of the quantity and impact of articles published in the main journals in political science in a given period. To establish this ranking, decisions were made about the following: (1) what time period to use; (2) what to count as the main political science journals; (3) what to count as a publication in a journal; (4) how to measure the impact (quality) of a journal; (5) how to construct a ranking from this information; and (6) how to create a quasi-error term.

Time Period
Creating annual rankings would have the advantage of being able to track short-term changes in the performance of departments. However, looking at the content of only one year of each journal would be a small sample size, and so would produce a high degree of measurement error. Conversely, a new ranking every ten years would be more accurate, but would not measure more subtle changes. As a result, counting articles on a rolling five-year basis would probably be the most effective method. This allows for a larger sample in each journal and allows a new ranking to be produced every year; in other words, 1993–1997, 1994–1998, and so on. This is also a similar time period to other rankings, such as the U.S. News and World Report and the Research Assessment Exercise.

The Main Political Science Journals


Four steps were taken to define the main political science journals. Step one involved the full list of journals in the field in the SSCI, which contained 143


journals in political science, international relations and public administration in 2002.

Step two involved adding some missing journals to this list. The SSCI does not include all major political science journals. The Institute for Scientific Information (ISI) follows a careful procedure for selecting which journals to include in the SSCI.1 However, several prominent international political science journals are not listed in the SSCI. For example, whereas the main journals of the British, German and Scandinavian political science associations are in the SSCI, the main journals of the French, Italian and Dutch associations are not. Also, several major sub-field journals were not included before 2002, such as the Journal of Public Policy, European Union Politics, Nations and Nationalism, History of Political Thought, the Journal of Legislative Studies, and Democratization. Adding these journals to the SSCI list makes a total of 152 journals.2

Step three involved setting and applying two simple criteria for identifying the main political science journals from this list of 152. First, many journals on the list are in fact journals in other fields of social science, such as law, economics, geography, sociology, history, psychology, social policy, communications, philosophy, or management. For the sake of simplicity, a political science journal can be defined as a journal that is (a) edited by a political scientist and (b) has a majority of political scientists on its editorial board (in departments or institutes of political science, politics, government, international relations, public administration or public policy). Second, many journals in the SSCI list have only a marginal impact on the discipline of political science. For example, almost one third of the journals received fewer than 100 citations in 2002, to articles published in any issue, from the articles published in the more than 8,000 other journals in the SSCI. Removing these non-political-science journals and journals that have only a marginal impact left 60 journals.

Step four, however, involved adding back three journals that have a low impact but are the national political science association journals of three countries: the Australian Journal of Political Science, Politische Vierteljahresschrift (published by the German political science association) and Scandinavian Political Studies. It is reasonable to include these journals despite their low impact, since the ISI had already decided that these are important journals. In other words, national political science association journals are included in the analysis either if they are in the SSCI or if they are not in the SSCI list but receive more than 100 citations per year. This left 63 journals for the analysis, which are listed in Table 1.

For the 54 journals in the SSCI, data on the content of these journals between 1993 and 2002 was purchased from the ISI. For the nine journals not in the SSCI and for the issues of the SSCI journals that are not in the database (for example, where a journal existed for a number of years prior to being included in the SSCI), the content was coded by hand. In total, the content of 495 annual volumes was collected electronically and the content of 117 volumes was collected by hand.
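To make the selection rule concrete, the following sketch applies the two criteria and the association-journal exception to a hypothetical journal record. The field names and figures are illustrative assumptions, not values taken from the SSCI data.

    # Illustrative sketch (not the actual selection code) of the rules described
    # above: discipline test, 100-citation threshold, and the exception for
    # national political science association journals already in the SSCI.

    def is_political_science_journal(journal):
        # Criteria (a) and (b): edited by a political scientist and a majority
        # of political scientists on the editorial board.
        return (journal["editor_is_political_scientist"]
                and journal["share_pol_sci_board"] > 0.5)

    def select_main_journals(candidates):
        selected = []
        for j in candidates:
            discipline_ok = is_political_science_journal(j)
            impact_ok = j["citations_2002"] >= 100        # marginal-impact threshold
            association_ok = j["national_association_journal"] and j["in_ssci"]
            if discipline_ok and (impact_ok or association_ok):
                selected.append(j["name"])
        return selected

    # Example record (all values invented for illustration):
    example = [{
        "name": "Scandinavian Political Studies",
        "editor_is_political_scientist": True,
        "share_pol_sci_board": 0.8,
        "citations_2002": 60,
        "national_association_journal": True,
        "in_ssci": True,
    }]
    print(select_main_journals(example))   # kept via the association-journal exception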


Table 1: Journals Included in the Analysis

Journal   Impact Score
American Political Science Review   8.82
American Journal of Political Science   6.91
International Organization   5.21
Foreign Affairs   4.72
Journal of Politics   4.13
International Security   3.93
Journal of Conflict Resolution   3.72
World Politics   3.66
Journal of European Public Policy   3.34
International Studies Quarterly   3.28
Public Choice   3.22
Journal of Common Market Studies   2.94
British Journal of Political Science   2.84
Journal of Peace Research   2.82
Journal of Law Economics and Organization   2.80
Comparative Political Studies   2.79
Journal of Democracy   2.75
Europe-Asia Studies   2.64
European Union Politics   2.59
Political Research Quarterly   2.58
West European Politics   2.58
Political Studies   2.56
PS: Political Science and Politics   2.53
European Journal of Political Research   2.46
Public Administration   2.44
Party Politics   2.38
European Journal of International Relations   2.30
Comparative Politics   2.27
Electoral Studies   2.26
Post-Soviet Affairs   2.18
Review of International Studies   2.18
Security Studies   2.17
Politics and Society   2.14
Governance   2.09
Legislative Studies Quarterly   2.08
Political Communication   2.08
Political Behavior   2.06
International Interactions   2.00
Journal of Theoretical Politics   2.00
American Politics Quarterly   1.99
Millennium-Journal of International Studies   1.96
Publius-The Journal of Federalism   1.93
Political Theory   1.91
Journal of Public Policy   1.85
International Affairs   1.82
Philosophy and Public Affairs   1.81
Political Science Quarterly   1.75
International Political Science Review   1.74
Democratization   1.70
Nations and Nationalism   1.70
Australian Journal of Political Science   1.69
Journal of Legislative Studies   1.69
Canadian Journal of Political Science   1.64
Political Quarterly   1.64
East European Politics and Societies   1.63
Scandinavian Political Studies   1.60
Polity   1.53
Politische Vierteljahresschrift   1.52
Revue française de science politique   1.49
Cooperation and Conflict   1.45
History of Political Thought   1.40
Acta Politica   1.38
Rivista Italiana di Scienza Politica   1.33

Note: All issues of each journal published between 1993 and 2002 were coded, either electronically from the SSCI or by hand.


Counting Articles
Several different types of articles are published in these journals. All main articles and research notes were included, and all editorial comments, book reviews and short notes were excluded. I decided to treat each article or research note in the same journal as equivalent regardless of its length, because I see no justification for assuming that a shorter article is less important than a longer article in the same journal. There were just over 18,000 such publications in the 63 journals between 1993 and 2002.

Each article was then counted as follows: an article by a single author with a single institutional affiliation, or by two or more authors from a single institution, scored 1.0 for the institution; an article by two authors from two different institutions, or by a single author with two institutional affiliations, counted as 0.5 for each institution; an article by three authors or three institutions counted as 0.333 for each institution; and so on. This method is not ideal, as it undervalues collaborative research. However, the alternative is worse: in other words, counting multi-authored articles as having more value than single-authored articles. Observations where an institutional affiliation could not be derived from the editorial information were excluded. This left a total of approximately 24,000 single observations for analysis.
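The fractional counting rule can be illustrated with a short sketch. The article records below are invented; only the splitting rule itself is taken from the description above.

    # A minimal sketch of the fractional counting rule: each article contributes
    # a total of 1.0, split equally across its distinct institutional
    # affiliations. Articles with no identifiable affiliation are dropped.
    from collections import Counter

    def credit_articles(articles):
        scores = Counter()
        for affiliations in articles:
            institutions = [a for a in affiliations if a is not None]
            if not institutions:
                continue                    # no affiliation could be derived
            unique = set(institutions)      # co-authors at one institution still score 1.0
            share = 1.0 / len(unique)
            for inst in unique:
                scores[inst] += share
        return scores

    articles = [
        ["Harvard"],                        # single author: 1.0 to Harvard
        ["LSE", "LSE"],                     # two co-authors, same institution: 1.0 to LSE
        ["Essex", "ANU"],                   # two institutions: 0.5 each
        ["Yale", "Oxford", "Geneva"],       # three institutions: 0.333 each
        [None],                             # unknown affiliation: excluded
    ]
    print(credit_articles(articles))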

Measuring Impact
Some articles are more significant than others. I assume that an article is as significant as the overall impact of the journal in which it is published. This assumption is not perfect: two different articles in the same journal may have vastly different impacts on the field, and some articles may be cited mainly because of the fame of the author. Nevertheless, if one assumes that a journal applies a common standard for acceptance of a paper for publication, it is reasonable to assume that all articles in a particular journal are of approximately equal quality.

A common measure of the relative impact of a journal is the average number of citations to a journal in a given period. For example, the ISI calculates an 'impact factor', which is the total number of citations by all other articles in the ISI database to all articles in a journal in the previous two years, divided by the number of articles in the journal in the previous two years. Using a similar method, we could calculate the average annual citations to all articles in a journal in the ten-year period. However, because it takes time for an article to be noticed, recently published articles are less cited than articles published several years ago. Hence, simply counting the average annual citations would create a bias against recently established journals that have not had long enough to build up their stock of citations.

However, if we assume that the evolution in the number of citations follows the same functional form for every journal, a fixed-effect regression model of annual citations can be estimated. This would produce a constant for each journal that is a measure of its relative importance. But the common trend in citations for a particular journal is non-linear: there tends to be a plateau in the number of


citations for several years followed by a decline in the number of citations in the most recent years. Hence, the appropriate common functional form is a negative quadratic equation:
ANNUAL_CITES_jy = b1·JOURNAL_j - b2·YEAR_jy - b3·YEAR_jy² + e_jy

where j (journal) = 1, ... , 63; y (year) = 1, ... , 10; and JOURNAL is a vector of 63 binary variables, one for each journal. Estimating this model using ordinary least-squares regression produces the following results (t-statistics in parentheses): b2 = 17.944 (2.65), b3 = 0.709 (0.590), and 63 constants, ranging from a high of 882.49 citations per year for APSR to a low of 133.49 for the Rivista Italiana di Scienza Politica (RISP).3 An impact score for each journal was then produced from the constants by dividing each journal's constant by 100 (see Table 1). In other words, a paper in APSR is about as important as seven papers in RISP.

The journal impact scores calculated by this method are highly correlated (0.757) with the SSCI impact scores in 2002 for the 54 journals in both the SSCI and my list.4 The high correlation between my index and the SSCI impact index is not surprising, as both methods are based on the number of citations to articles in journals in a given period. However, there are two advantages of my impact scores over the SSCI scores. First, my method allows for impact scores to be calculated for journals that are not included in the SSCI. Second, by assuming a common trend in the number of citations over time, my method corrects for an inherent bias against new journals in the SSCI method.

Finally, it should be noted that because journals that were not mainstream political science journals were removed from the SSCI list, the ranking does not include outputs published elsewhere in the social sciences. This may produce a bias against departments that try to contribute to general social science rather than the narrower discipline of political science. Nevertheless, the method used to calculate an impact score for each journal reintroduces a measure of the breadth of a contribution, as the impact score for a journal is calculated from all citations to articles in the journal from any journal in the SSCI.
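As an illustration of the estimation step, the sketch below fits the same functional form by ordinary least squares for two journals with invented citation counts; in the analysis itself the model is estimated on annual citations to all 63 journals over 1993–2002.

    # A sketch of the journal fixed-effects regression, using numpy's
    # least-squares solver on invented data. The journal constants, divided by
    # 100, play the role of the impact scores described in the text.
    import numpy as np

    journals = ["APSR", "RISP"]            # illustrative subset of the 63 journals
    years = np.arange(1, 11)               # y = 1, ..., 10
    # Hypothetical annual citation counts, one row per journal:
    cites = np.array([
        [900, 910, 905, 895, 880, 870, 860, 840, 760, 600],   # plateau, then recent decline
        [140, 150, 145, 150, 140, 135, 130, 120, 100,  80],
    ])

    X, y = [], []
    for j, name in enumerate(journals):
        for t, c in zip(years, cites[j]):
            dummies = [1.0 if k == j else 0.0 for k in range(len(journals))]
            X.append(dummies + [t, t ** 2])   # journal dummies, YEAR, YEAR squared
            y.append(c)
    X, y = np.array(X), np.array(y)

    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    # The fitted YEAR coefficients correspond to -b2 and -b3 in the equation
    # above, so they come out negative when citations decline in recent years.
    constants = coef[: len(journals)]      # one constant per journal
    impact_scores = constants / 100.0      # as in the text: constant divided by 100
    for name, score in zip(journals, impact_scores):
        print(name, round(score, 2))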

Construction of the Ranking


Some people may be interested in the total output of a department, whereas others may be interested in the average quality of these outputs or the average productivity of a department. For example, the central administration of a university may wish to know the relative per capita productivity of a department, whereas a prospective graduate student may seek a large department with a lot of research-active staff. So, five separate rankings were created from the data:

Rank 1 (Quantity): the total number of articles in the journals by scholars from a particular institution in a five-year period.


Rank 2 (Impact): the total number of articles in the journals by scholars from a particular institution in a five-year period, multiplied by the impact score of the journal in which each article was published.

Rank 3 (Quantity/Faculty Size): the total number of articles in the journals by scholars from a particular institution in a five-year period (as used to produce Rank 1), divided by the faculty size of the political science department of that institution.

Rank 4 (Impact/Faculty Size): the total number of articles in the journals by scholars from a particular institution in a five-year period, multiplied by the impact score of the journal in which each article was published (as used to produce Rank 2), divided by the faculty size of the political science department of that institution.

Overall Rank: the average position of the institution in the other four ranks.

The overall ranking is consequently an unweighted sum of the other four rankings (compare with Coupé, 2003). Invariably, people will have different opinions about the relative importance of Ranks 1, 2, 3 and 4. Hence, the positions of the institutions in each of the four individual ranks are also reported, so that an interested person can calculate a different overall rank using a different set of weightings of the other ranks.

The information on the size of a department was gathered from two sources. First, for the British universities, the data is the number of full-time staff submitted in the Politics and International Relations section of the 2001 Research Assessment Exercise. Second, for all other universities (including those British universities that did not make a submission for this section in 2001), we counted the number of full-time staff with a rank of full, associate or assistant professor (or equivalent) listed on a department's website in November to December 2003.5 In other words, this includes only the number of staff in a political science department plus related institutes, or the number of political scientists in a department or faculty of social science. For example, according to the Harvard University website, the number of permanent faculty in the Department of Government plus the number of permanent faculty in the Kennedy School of Government who describe themselves as political scientists is 87.

Several things are worth noting here. First, this method of counting the size of a department assumes that the number of political scientists in a particular institution remains constant, which clearly is not the case. Second, this method only counts academics in political science departments, whereas the method for counting research output counts anyone publishing in one of the journals from a particular institution, regardless of where they are based in that institution. For example, if someone from a business school, an economics department or a law department publishes in one of the journals, this person is counted as a political scientist, but is not included as a member of the political science faculty of their institution. However, although there may be people outside a political science department who do political science research, the


size of the political science department is probably a reasonable proxy for the size of the overall political science community in an institution.
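As an illustration of how Ranks 1 to 4 and the Overall Rank fit together, the following sketch computes them for a small invented dataset; the institution names and figures are hypothetical, and tied positions are ignored for simplicity.

    # A sketch of the ranking construction described above, applied to invented
    # per-institution totals of the form (articles, impact-weighted articles,
    # faculty size).
    def rank_positions(values, reverse=True):
        # Return a dict mapping item -> rank position (1 = best).
        ordered = sorted(values, key=lambda kv: kv[1], reverse=reverse)
        return {name: pos for pos, (name, _) in enumerate(ordered, start=1)}

    data = {
        # institution: (articles, impact_weighted, faculty_size) -- invented numbers
        "Alpha": (140.0, 330.0, 75),
        "Beta":  (90.0, 250.0, 30),
        "Gamma": (40.0, 70.0, 12),
    }

    rank1 = rank_positions([(n, v[0]) for n, v in data.items()])           # Quantity
    rank2 = rank_positions([(n, v[1]) for n, v in data.items()])           # Impact
    rank3 = rank_positions([(n, v[0] / v[2]) for n, v in data.items()])    # Quantity / Faculty Size
    rank4 = rank_positions([(n, v[1] / v[2]) for n, v in data.items()])    # Impact / Faculty Size

    # Overall Rank: average position across the four ranks; a lower average is better.
    average = {n: (rank1[n] + rank2[n] + rank3[n] + rank4[n]) / 4.0 for n in data}
    overall = rank_positions(list(average.items()), reverse=False)
    for name in data:
        print(name, rank1[name], rank2[name], rank3[name], rank4[name], overall[name])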

A Quasi-Error
Finally, a quasi-error in the overall rank of each institution was calculated. There are two sources of measurement error in this analysis. First, in counting the number of articles published by an institution, an article may have been missed. For example, in the computer data, an article may have been mislabelled as a minor note rather than a proper article, the institutional affiliation of an author may have been entered incorrectly (although each entry in the data was carefully checked), or an author who was listed as having no institutional affiliation may in fact have been based in a particular academic institution. Second, it is extremely difficult to measure the faculty size of a department accurately. For example, different academic systems have different ways of describing their faculty (for example, many German universities only list their full professors). Also, information on departments' websites is invariably out of date or inaccurate.

Using these two sources of error, a quasi-error was worked out by calculating where an institution would have been placed in the overall ranking if the institution had produced one more/fewer article in a journal with a mean impact score (2.52) and if the department was 5 per cent smaller/larger than it had been measured. For example, in 1998–2002, the London School of Economics, with a faculty size of 76, produced 143.31 articles with an impact of 338.87. This placed it 2nd in Rank 1 (Quantity), 4th in Rank 2 (Impact), 31st in Rank 3 (Quantity/Faculty Size), 57th in Rank 4 (Impact/Faculty Size), and 15th overall. If it had produced one more article in a mean-impact-score journal and had 5 per cent fewer staff, its position would not have changed in Ranks 1 and 2, but would have risen to 24th in Rank 3, 51st in Rank 4, and 12th overall. Conversely, if it had one fewer article and 5 per cent more staff, it would have been 18th overall. So, the quasi-error at 15th was 12–18 (or plus/minus three places).
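The perturbation itself is simple arithmetic. The sketch below recomputes the per-capita inputs for the LSE example under the best-case and worst-case assumptions; turning these into rank positions would, of course, require the full dataset.

    # A sketch of the quasi-error perturbation: add or subtract one article at
    # the mean impact score (2.52) and shrink or grow the measured faculty size
    # by 5 per cent, then recompute the per-capita figures. The LSE inputs are
    # taken from the text.
    MEAN_IMPACT = 2.52

    def perturbed_inputs(articles, impact, faculty):
        best = (articles + 1, impact + MEAN_IMPACT, faculty * 0.95)
        worst = (articles - 1, impact - MEAN_IMPACT, faculty * 1.05)
        return best, worst

    best, worst = perturbed_inputs(143.31, 338.87, 76)
    for label, (a, i, f) in (("best case", best), ("worst case", worst)):
        print(label, round(a / f, 2), round(i / f, 2))   # quantity/size, impact/size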

Results
Table 2 lists the Global Top 200 political science institutions on the basis of their output in the main political science journals in the five years between 1998 and 2002.6 Anyone with a cursory knowledge of the discipline would recognize most of the names on the list. One way of assessing the validity of the method is to compare the results to those produced using a similar method in economics (Coupé, 2003). In the political science rankings for 1998–2002, there was one department outside the US in the top 10, five in the top 20, fourteen in the top 50, thirty-six in the top 100, and 103 in the top 200. In the comparable ranking in economics, there were no departments outside the US in the top 10, one in the top 20, ten in the top 50, thirty-four in the top 100, and eighty-eight in the top 200.


One obvious criticism is that these rankings are biased towards English-speaking countries, since nine of the top 10, nineteen of the top 20, forty-eight of the top 50, ninety-one of the top 100, and 163 of the top 200 are from the US, the UK, Australia, Canada or Ireland. However, the equivalent rankings in economics are just as dominated by Anglo-Saxon institutions: all of the top 10, all of the top 20, forty-seven of the top 50, eighty-seven of the top 100, and 155 of the top 200 come from these same five English-speaking countries. In other words, the dominance of institutions from these countries may simply be a reflection of the dominant position of English as the global language of the social sciences. So, if one assumes that the global spread of good and bad departments is broadly similar in the disciplines of political science and economics, then the method outlined and applied here is as good as the most similar ranking in economics.

Table 3 shows the rank-order correlations between the five rankings, using the results for the top 200 in the 1998–2002 period. As would be expected given the calculation method, there are high correlations between Ranks 1 and 2 and between Ranks 3 and 4. However, the lower correlations between the other pairs suggest that each ranking method measures something different.

Finally, Table 4 shows the rolling rankings for the six five-year periods between 1993 and 2002. One of the striking things here is the stability of the top three, with Stanford, Harvard and Columbia swapping places at the top of the list, and none of these institutions dropping below third. Only two other institutions remained in the top 10 throughout this period (Indiana and the University of California, San Diego), and thirty-six institutions remained in the top 50 throughout the period. The biggest climbers, which rose more than thirty places between 1993–1997 and 1998–2002, were the State University of New York, Binghamton (from 117th to 19th), Aberystwyth (136th to 39th), Penn State (101st to 33rd), Geneva (104th to 43rd), Trinity College Dublin (96th to 40th), University College London (83rd to 46th), Illinois-Urbana Champaign (80th to 44th) and Georgetown (50th to 16th). Nevertheless, almost fifty per cent (twenty-four) of the institutions in the top 50 in 1998–2002 rose or fell fewer than ten places from their positions in 1993–1997.
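For readers who wish to reproduce the correlations in Table 3 on rankings of their own, a minimal sketch using the Spearman implementation in scipy is given below; the rank positions shown are invented.

    # A sketch of the rank-order correlation used in Table 3, computed with
    # scipy's Spearman implementation on hypothetical rank positions for five
    # institutions in two of the rankings.
    from scipy.stats import spearmanr

    rank_quantity = [1, 2, 3, 4, 5]
    rank_impact = [2, 1, 3, 5, 4]

    rho, p_value = spearmanr(rank_quantity, rank_impact)
    print(round(rho, 3), round(p_value, 3))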

Conclusion
A reasonably objective, easily updated and global ranking of departments in political science can be produced by borrowing a method used in other disciplines: that of measuring the research output of institutions in the main journals in the field in a given period, and controlling for the number of full-time staff in a department. This method produces a series of rankings that seem intuitively correct and compare well with the equivalent rankings in economics, in terms of the regional and national balance of institutions in the top 10, 20, 50, 100 and 200. One possible problem with these rankings is the apparent English-language bias in the results, which undermines the aspiration to be truly global.

Table 2: The Global Top 200 Political Science Departments, 1998–2002

Overall Rank   University   Country
1   Columbia   USA
2   Harvard   USA
3   Stanford   USA
4   Ohio State   USA
5   EUI   Italy
6   UC, San Diego   USA
7   UC, Irvine   USA
8   Indiana   USA
9   Princeton   USA
10   Yale   USA
11   UC, Berkeley   USA
12   Michigan State   USA
13   Chicago   USA
14   UC, Los Angeles   USA
15   LSE   UK
16=   Georgetown   USA
16=   Essex   UK
18   MIT   USA
19=   ANU   Australia
19=   SUNY, Binghamton   USA
19=   Oxford   UK
22   Birmingham   UK
23   Cambridge   UK
24   Florida State   USA
25=   Sheffield   UK
25=   Washington   USA
27   Michigan   USA
28=   Johns Hopkins   USA
28=   Texas A&M   USA
30   Emory   USA
31   Colorado   USA
32   American   USA
33   Penn State   USA
34   Bristol   UK
35   UNC Chapel Hill   USA
36   George Washington   USA
37=   Cardiff   UK
37=   UW Madison   USA
39   Aberystwyth   UK
40=   TCD   Ireland
40=   Vanderbilt   USA
42   Cornell   USA
43   Geneva   Switzerland
44   Illinois, Urbana-Champaign   USA
45   Rice   USA
46   UCL   UK
47=   SUNY, Stony Brook   USA
47=   UC, Davis   USA
49=   Arizona   USA
49=   Virginia   USA
51   Duke   USA
52   Oslo   Norway
53   Claremont Graduate   USA
54   Pittsburgh   USA
55   Leiden   Netherlands
56   Iowa   USA
57   New Mexico   USA
58   New York   USA
59   Minnesota   USA
60   George Mason   USA
61   Hebrew, Jerusalem   Israel
62   Arizona State   USA
63   Hull   UK
64   Maryland   USA
65   Georgia   USA
66=   SUNY, Buffalo   USA
66=   Houston   USA
68   Northwestern   USA
69   Pennsylvania   USA
70   Cal Tech   USA
71=   Southern California   USA
71=   Warwick   UK
73   Tel Aviv   Israel
74   Mannheim   Germany
75   Strathclyde   UK
76   Missouri   USA
77   Washington   USA
78   Aarhus   Denmark
79=   North Texas   USA
79=   Sussex   UK
81   UW, Milwaukee   USA
82   Aberdeen   UK
83   Newcastle-Upon-Tyne   UK
84   UC Santa Barbara   USA
85   Glasgow   UK
86   Leicester   UK
87   Manchester   UK
88   Rochester   USA
89   Louisiana State   USA
90=   Birkbeck, London   UK
90=   Rutgers   USA
92   Syracuse   USA
93   Toronto   Canada
94   Kansas   USA
95   UC, Riverside   USA
96   Bradford   UK
97   Humboldt   Germany
98   Western Australia   Australia
99=   Edinburgh   UK
99=   Leeds   UK
101   Durham   UK
102   Alabama   USA
103   QMUL   UK
104   Dartmouth   USA
105   South Carolina   USA
106   Montreal   Canada
107   Tufts   USA
108   Max Planck   Germany
109   Texas, Austin   USA
110   Groningen   Netherlands
111   Southampton   UK
112=   Georgia State   USA
112=   Kentucky   USA
114   Liverpool   UK
115   British Columbia   Canada
116   Amsterdam   Netherlands
117   Manchester Metropolitan   UK
118   Notre Dame   USA
119   Denver   USA
120   SUNY, Albany   USA
121   Kent   UK
122   Exeter   UK
123   Helsinki   Finland
124=   Brown   USA
124=   West Virginia   USA
126   Nottingham   UK
127   Liverpool John Moores   UK
128=   CUNY   USA
128=   Lafayette   USA
130   Murdoch   Australia
131   East Anglia   UK
132   Melbourne   Australia
133   Konstanz   Germany
134   QUB   UK
135=   UCD   Ireland
135=   St Andrews   UK
137   Twente   Netherlands
138   Texas Tech   USA
139   Truman State   USA
140   Bremen   Germany
141   William and Mary   USA
142   Keele   UK
143   Simon Fraser   Canada
144   Boston   USA
145   NUST   Norway
146   Mississippi   USA
147   Nijmegen   Netherlands
148   Uppsala   Sweden
149   Nebraska   USA
150   Loyola   USA
151   Griffith   Australia
152   Iowa State   USA
153=   Florida   USA
153=   UWE   UK
155   Copenhagen   Denmark
156   Oklahoma   USA
157   Bern   Switzerland
158   Vienna   Austria
159   Illinois, Chicago   USA
160   New South Wales   Australia
161   Southern Illinois   USA
162   Nottingham Trent   UK
163   Sydney   Australia
164   Reed   USA
165   Portland State   USA
166   Stirling   UK
167   Tasmania   Australia
168   Reading   UK
169   Lancaster   UK
170=   Sciences Po   France
170=   Bath   UK
172   INSEAD   France
173=   Bowdoin   USA
173=   TU Darmstadt   Germany
175   GIIS   Switzerland
176   Gothenburg   Sweden
177   Westminster   UK
178   De Montfort   UK
179   Lehigh   USA
180   Queensland   Australia
181   Lund   Sweden
182   Leuven (KUL)   Belgium
183   UCLAN   UK
184   Tubingen   Germany
185   Victoria   Canada
186   EU Viadrina   Germany
187   Bryn Mawr   USA
188   Florence   Italy
189   CEU   Hungary
190   Erlangen Nurnberg   Germany
191   Staffordshire   UK
192   FernUni Hagen   Germany
193   Connecticut   USA
194   York   UK
195   Bergen   Norway
196=   McMaster   Canada
196=   Southern Methodist   USA
198=   Carleton   Canada
198=   Juan March   Spain
200   Ulster   UK

Note: = means that an institution is tied in the overall ranking with the institution listed immediately before it.

Table 3: Correlations between the Ranks of the Top 200 Political Science Institutions, 1998–2002

                                 Rank 1    Rank 2    Rank 3    Rank 4
Rank 1 (Quantity)
Rank 2 (Impact)                  0.962
Rank 3 (Quantity/Faculty Size)   0.429     0.405
Rank 4 (Impact/Faculty Size)     0.507     0.583     0.896
Overall Rank                     0.862     0.879     0.759     0.832

Method: Spearman's rank-order correlation.

However, English is the international language for the publication and citation of research in political science, as in the other social sciences and the natural sciences. Because of the ease of reading, publishing in and teaching from these international journals, scholars in English-speaking universities are inevitably more closely integrated into the global discipline than scholars outside the English-speaking world. As a result, a ranking of departments using research published in the top international journals in a field is inevitably not a fair representation of the quality of departments outside the English-speaking world. One possible solution would be to include more non-English-language journals in the analysis. However, given the low number of citations to research published in non-English-language journals, it is hard to make a case for including some non-English-language journals while omitting others, or even for including non-English-language journals with low citations while excluding some journals with higher citations.

A second problem is that book publications are more common and important in political science than in economics. As discussed, if one assumes that a good department would produce a lot of articles as well as books, then only measuring journal publications may not make a difference to the ranking of institutions at the departmental level. Nevertheless, this hypothesis can only be checked if a similar ranking were established using book publications, and the results of the two rankings compared and perhaps integrated.

Despite these shortcomings, two major advantages of the method proposed here are that it would be (i) simple to mechanize and (ii) easy to add other journals or books to the dataset. If the discipline, perhaps via a committee of the International Political Science Association, could agree on a set of English and non-English-language journals and book publishers that are the main vehicles for research output in the global discipline, it would not be too difficult to modify this method and establish a mechanized system for entering and updating the dataset and for calculating new rankings every year.


Table 4: The Rolling Global Top Fifty, 1993–2002

1993–1997: Stanford; Harvard; Columbia; Indiana; = UC Berkeley; ANU; Essex; Houston; Iowa; UCSD; EUI; Princeton; Arizona; Warwick; Chicago; Georgia; Yale; = Oxford; UW Madison; Johns Hopkins; Pittsburgh; = UCLA; Ohio State; UC Davis; Birmingham; MIT; Michigan; Washington U; UCol Boulder; Michigan State; American; Florida State; Texas A&M; Pennsylvania; SUNY Stony Brook; UC Irvine; Strathclyde; Cambridge; Leiden; Glasgow; = LSE; Cal Tech; UW Milwaukee; Arizona State; South Carolina; Rice; GWU; Maryland; Vanderbilt; Georgetown.

1994–1998: Stanford; Harvard; Columbia; Indiana; UC Berkeley; EUI; Houston; Essex; UCSD; Princeton; Yale; UCLA; Ohio State; Warwick; ANU; Birmingham; Iowa; Chicago; UC Davis; = Michigan; = Oxford; Arizona; Washington U; Georgia; UCol Boulder; = UC Irvine; UW Madison; Michigan State; Vanderbilt; Johns Hopkins; American; MIT; Cambridge; Texas A&M; Glasgow; Leiden; SUNY Stony Brook; Pennsylvania; LSE; Florida State; GWU; Cal Tech; Pittsburgh; = South Carolina; Strathclyde; Emory; = Georgetown; UC Riverside; Sheffield; = Hull.

1995–1999: Stanford; Harvard; = Columbia; Essex; EUI; Indiana; UC Berkeley; UCSD; Ohio State; Yale; UCLA; Princeton; Oxford; Michigan State; Vanderbilt; American; = UC Davis; UW Madison; Texas A&M; = Houston; Johns Hopkins; Washington U; Chicago; Birmingham; UC Irvine; ANU; Warwick; UCol Boulder; Georgia; Michigan; Bristol; Cambridge; Georgetown; Arizona; = GWU; Iowa; LSE; Sheffield; Emory; MIT; Leiden; SUNY Stony Brook; Florida State; UNC Chapel Hill; Rice; New Mexico; Pennsylvania; Hull; Maryland; Glasgow.

1996–2000: Stanford; Harvard; Columbia; EUI; UC Berkeley; UCSD; Essex; = Indiana; Ohio State; Yale; Princeton; Michigan State; Birmingham; UC Irvine; UCLA; Chicago; Vanderbilt; = Washington U; UW Madison; Oxford; UC Davis; Cambridge; Johns Hopkins; Texas A&M; Bristol; Sheffield; = Georgetown; MIT; ANU; Florida State; American; = Warwick; SUNY Binghamton; GWU; Houston; Georgia; LSE; Cal Tech; SUNY Stony Brook; Michigan; Rice; Penn State; = Emory; Iowa; Hebrew; Pennsylvania; Arizona; Maryland; UCol Boulder; Claremont.

1997–2001: Columbia; Stanford; = Harvard; EUI; UC Berkeley; Ohio State; UCSD; Indiana; Princeton; Yale; Michigan State; = Chicago; MIT; = UCLA; UC Irvine; Essex; Birmingham; Vanderbilt; Johns Hopkins; Cambridge; SUNY Binghamton; Oxford; Georgetown; American; = LSE; Florida State; UW Madison; Texas A&M; Penn State; = Emory; = ANU; UC Davis; Michigan; Washington U; Sheffield; Bristol; GWU; Rice; SUNY Stony Brook; Aberystwyth; Trinity (Dublin); = Arizona; Georgia; Cardiff; = Cornell; UNC Chapel Hill; Claremont; Geneva; Houston; = UCol Boulder.

1998–2002: Columbia; Harvard; Stanford; Ohio State; EUI; UCSD; UC Irvine; Indiana; Princeton; Yale; UC Berkeley; Michigan State; Chicago; UCLA; LSE; Essex; = Georgetown; MIT; Oxford; = SUNY Binghamton; = ANU; Birmingham; Cambridge; Florida State; Sheffield; = Washington U; Michigan; Johns Hopkins; = Texas A&M; Emory; UCol Boulder; American; Penn State; Bristol; UNC Chapel Hill; GWU; Cardiff; = UW Madison; Aberystwyth; Trinity (Dublin); = Vanderbilt; Cornell; Geneva; Illinois; Rice; UCL (London); SUNY Stony Brook; = UC Davis; Arizona; = Virginia.

Note: In each list, institutions are given in rank order from 1 to 50; = means that an institution is in the same position as the institution listed immediately before it.


Ideally, each institution that wanted to be included in the rankings could be asked to provide accurate and up-to-date information about the size of its faculty.

(Accepted: 7 May 2004)

About the Author


Simon Hix, Department of Government, London School of Economics, Houghton Street, London, WC2A 2AE, UK; email: s.hix@lse.ac.uk; website: http://personal.lse.ac.uk/hix

Notes
Research for this paper was partially funded by The Nuffield Foundation (Grant No: SGS/00889/G). I would like to thank Gabriele Birnberg, Ryan Barrett and Bjorn Hoyland for data collection; Nancy Beyers at the Institute for Scientific Information for help with the purchase of the Social Science Citation Index data; and Daniele Archibugi, Rodney Barker, Kenneth Benoit, Jeff Checkel, Tom Coupé, Philip Cowley, Pepper Culpepper, Nicole Deitelhoff, Vedran Dzihic, Matthew Gabel, Fabrizio Gilardi, Nils Petter Gleditsch, Robert Goodin, Justin Greenwood, Metin Heper, Karl Magnus Johansson, Peter Kurrild-Klitgaard, Iain McLean, Martin Lodge, Peter Mair, Paul Mitchell, Andrew Moravcsik, Cas Mudde, Michael Munger, Yannis Papadopoulos, Thomas Pluemper, Ben Reilly, Christian Reus-Smit, Gerald Schneider, David Shambaugh, Gunnar Sivertsen, Ulf Sverdrup, Alec Sweet, Jon Tonge, Erik Voeten, Albert Weale and the three Political Studies Review referees for comments on the research and previous versions of this paper.

1 See 'The ISI Database: The Journal Selection Process': <http://www.isinet.com/isi/hot/essays/selectionofmaterialforcoverage/199701.html>.
2 I considered adding journals of other national political science associations (such as the journals of the Belgian, Swiss, Austrian, Irish and Japanese associations) and a number of other political science journals (such as Aussenwirtschaft). However, none of these journals met the threshold of at least 100 citations per year.
3 The adjusted R² for the model is 0.781.
4 Part of the difference between these scores and the SSCI scores is explained by the fact that my index is an average impact across several years, whereas the scores I have compared them against are only for the impact of a journal in 2002.
5 More detailed information about how this was calculated for each university can be obtained from the author.
6 Tables showing the top 400 in each five-year period between 1993 and 2002 can be found on my website: <http://personal.lse.ac.uk/hix>.

References
Ballard, M. J. and Mitchell, N. J. (1998) 'The Good, the Better, and the Best in Political Science', PS: Political Science and Politics, 31 (4), 826–8.
Coupé, T. (2003) 'Revealed Preferences: Worldwide Rankings of Economists and Economics Departments, 1969–2000', Journal of the European Economic Association, 1 (4), <http://student.ulb.ac.be/~tcoupe/ranking.html>.
Dusansky, R. and Vernon, C. J. (1998) 'Rankings of U.S. Economics Departments', Journal of Economic Perspectives, 12 (1), 157–70.
Garand, J. C. and Graddy, K. L. (1999) 'Ranking Political Science Departments: Do Publications Matter?', PS: Political Science and Politics, 32 (1), 113–16.
Goodson, L. P., Dillman, B. and Hira, A. (1999) 'Ranking the Presses: Political Scientists' Evaluations of Publisher Quality', PS: Political Science and Politics, 32 (2), 257–62.
Jackman, R. W. and Siverson, R. M. (1996) 'Rating the Rating: An Analysis of the National Research Council's Appraisal of Political Science Ph.D. Programs', PS: Political Science and Politics, 29 (2), 155–60.


Katz, R. and Eagles, M. (1996) 'Ranking Political Science Departments: A View from the Lower Half', PS: Political Science and Politics, 29 (2), 149–54.
Lowry, R. C. and Silver, B. D. (1996) 'A Rising Tide Lifts All Boats: Political Science Department Reputation and the Reputation of the University', PS: Political Science and Politics, 29 (2), 161–7.
McCormick, J. M. and Rice, T. W. (2001) 'Graduate Training and Research Productivity in the 1990s: A Look at Who Publishes', PS: Political Science and Politics, 34 (3), 675–80.
Miller, A. H., Tien, C. and Peebler, A. A. (1996) 'Department Rankings: An Alternative Approach', PS: Political Science and Politics, 29 (4), 704–17.
PS: Political Science and Politics (1996a) 'National Research Council Relative Rankings for Research-Doctorate Programs in Political Science', PS: Political Science and Politics, 29 (2), 144–7.
PS: Political Science and Politics (1996b) 'U.S. News and World Report Ranking of Graduate Political Science Departments', PS: Political Science and Politics, 29 (2), 148.
Rice, T. W., McCormick, J. M. and Bergmann, B. D. (2002) 'Graduate Training, Current Affiliation and Publishing Books in Political Science', PS: Political Science and Politics, 35 (4), 751–5.
Schmitter, P. (2002) 'Seven (Disputable) Theses Concerning the Future of "Transatlanticised" or "Globalised" Political Science', European Political Science, 1 (2), 23–40.
Scott, L. C. and Mitias, P. M. (1996) 'Trends in Rankings of Economics Departments in the U.S.: An Update', Economic Inquiry, 34, 378–400.
Teske, P. (1996) 'Rankings of Political Science Departments Based on Publications in the APSR, JOP, and AJPS, 1986–1995', State University of New York, Stony Brook, manuscript.
Welch, S. and Hibbing, J. R. (1983) 'What Do the New Ratings of Political Science Departments Measure?', PS: Political Science and Politics, 16 (3), 532–40.
