U.S. News and World Report College Rankings
An Exclusive Interview with Bob Morse
By Top Test Prep, “The Leader in Test Prep and Admissions Consulting”

Meet the man behind the single most influential list in college admissions. Bob Morse is the Director of Data Research at U.S. News & World Report and the head of its revered college ranking system. As the force behind a series of annual publications that have achieved unanticipated fame within higher education, Bob Morse has helped to create the college ranking system as it exists today. He was kind enough to sit down with Top Test Prep and answer some questions.

**Start by telling us a little bit about yourself.**

I’ve been at U.S. News since 1976. I have a BA in economics and an MBA in finance, so I have a research and quantitative background. Doing the rankings is a research and quantitative analysis project. It’s not journalism in the sense that, even though I do have a blog, the rankings themselves aren’t reporting … they’re creating information, while typical journalism is reporting on an event, analyzing an event, or giving context to something that’s happened.

**You have a blog?**

I write the blog, called “Morse Code: Inside the College Rankings,” once or twice a week. Prior to the blog, U.S. News wouldn’t really write about rankings except at the time we published the college and grad rankings, so the blog gives us the ability to … make announcements.

**How did you get connected to U.S. News & World Report?**

I worked on Wall Street briefly, at a company called E.F. Hutton. A lot of those firms don’t exist anymore – they merged away – but I used to work there in the mid-’70s. Then I was at U.S. News, but in another department that doesn’t exist anymore … a research department called the economic unit. U.S. News was moving away from doing the rankings based on reputation alone – in the very beginning, before I was involved, in ’83 and ’85, they were done very simplistically. They wanted to make them more sophisticated.

**How did the college rankings come about?**

At the beginning … we didn’t have the thick guidebook and we didn’t have the web, so it was just something that appeared in the weekly magazine in a very limited sense, sort of a top ten list. It was not some guerrilla force in admissions or higher ed – it was just information for consumers and our readers. Nobody thought it was going to evolve into anything but an occasional feature or cover story. In ’87 I was put in charge. We were going to make it more sophisticated, a combination of reputation and quantitative data, and we were going to start doing this annual guidebook. I got involved because they wanted someone with a quantitative research background.

**How do you assess a school’s reputation?**

It’s become one of the more controversial parts of the rankings … controversial among people in the higher education establishment. The rankings themselves aren’t controversial to the public. The public, obviously, uses them and is attracted to them to a significant degree – otherwise we wouldn’t keep doing them. We give college presidents and admissions deans and provosts a list of schools and we ask them to rate which ones are excellent and good, so it’s a subjective judgment about the relative standing of schools based on their academic reputations. The academic establishment doesn’t like that – or some of them don’t. Maybe liberal arts schools don’t. I think research universities do.

**What’s most interesting to you about the rankings?**

A couple of things. One, how it’s become this force in higher education.
Some colleges are publicly trying to do better in the rankings and … make educational decisions to improve in the rankings. I think that’s pretty interesting. I think that we’ve filled an informational gap. High school counseling – not at private schools, but at public schools – has been diminished by budget cuts, and the public is really searching for tools to help them decide what’s the best school for them. They’re forced to make decisions on their own and fend for themselves, so it’s been satisfying that we’ve been able to fill this informational void. People are becoming more quantitative in judging the best schools.

Another interesting thing is that we’ve been part of this accountability movement. Schools are being held accountable for how they spend money and whether they’re succeeding in educating students: how well are they doing at what they’re supposed to be doing? So it’s been interesting to be part of all these trends.
**Which colleges have seen their rankings improve the most over the last two or three years?**

The rankings are more stable than people think. Typically over a two- or three-year period, the rankings don’t move that much, but two schools … the University of Southern California and Washington University in St. Louis … have over the last decade or so had a strategy to improve themselves, and their strategy is across-the-board improvement, step by step. They take small steps each year institution-wide, and that’s the formula to improve in the rankings.

**What kind of “small steps” are colleges taking to improve their rankings?**

They’re not small in the sense that they’re little things. They just do them a little bit each year. For example, [a college] would raise the SAT average, so maybe one year it was 1200, the next year it was 1225, the next year it was 1250 … but they wouldn’t go from 1100 to 1300 in one year; they would do it over a ten-year period. Or they would increase the freshman retention rate – they’d put money into increasing freshman retention. The graduation rate would be another one, or faculty salaries. They might put more emphasis on small classes and reduce the number of large classes. They’ll do this a little bit each year, focusing on many factors of the academic environment.
**Have you seen any questionable practices put in place just so a college can increase its ranking?**

There was an event in the summer that came to light, even though I think it was debunked: that the president of Clemson … they were not voting honestly on the peer assessment survey … but we have safeguards to prevent strategic voting. Some schools have put in ways to boost their application count. They may have a one-, two-, or three-part application, and reject a student on the second part. They may not have had any intent to seriously consider the student. When they report their data, some schools leave out minorities or certain types of students … they’ll have left out special cases who are beneath their SAT or ACT profile, so it may look like their scores are higher than they are. It’s unclear why they actually do that, because they may be inhibiting people from applying. I haven’t seen any specific names.

**How does a college break into the top rankings?**

It’s very difficult. It’s relatively easy, if you’re right beneath the top half, to break into the top half, or to move up somewhat if you’re in the middle of the pack, and college presidents have built reputations doing that, like at Clemson or Northeastern. There are many schools – Arizona State and the University of Arkansas, to name some – that haven’t been that highly ranked, but because their profile isn’t that high, they’ve moved up into the bottom of the top half.
It’s very crowded at the top. It’s really hard to change your academic profile – to become another Harvard, Yale, or Princeton – for a number of reasons. It’s not impossible, but it’s difficult. It’s probably easier now in some ways than it used to be, because schools are becoming more international. There is a much bigger population in the US and the world, so the top 1 or 2% of SAT scorers is a bigger group. Places like Stanford don’t have enough spaces for the top 1 or 2% of students. You can tell by their rejection rates. They’re rejecting people with a 1600, they’re rejecting valedictorians, they’re rejecting … the saddest part of the whole process is high school students who’ve played by all the rules … they’ve created the perfect application packet, and they can get rejected, whereas 20 years ago the odds of getting into those top schools were greater. There’s a pool of these students who have to go somewhere. That’s why at Duke and MIT and Washington University and USC … the academic profile is much higher than it used to be – the admission profile. In some cases they’ve used scholarships, but in other cases there are just more people out there.

**What are some of the major trends you’ve noticed in the rankings?**

When we first started doing the rankings, they were ignored in some ways by college presidents. Now it’s become an acceptable thing among some college presidents to have improving in the college rankings as a goal. This is at places such as Northeastern and Arizona State and Clemson, so the acceptance of the rankings as an academic benchmark has certainly been one trend. Another trend along the same lines is that many schools brand themselves by how well they do in our rankings or other rankings. When the rankings first came out, they wouldn’t have considered doing that. It’s not that we were asking them to do it, but that speaks to the schools’ need for an external force telling the public they’re good.
Another trend: the schools have gotten way more sophisticated in understanding the rankings and how they work. I think the public has benefitted, because there didn’t used to be a lot of higher education data out there. With the amount of higher ed data that now exists, schools have gotten much better at producing information on themselves, so they’ve responded to the consumer’s need for comparative higher education data.

**Have some schools rebelled against the rankings?**

Reed in Oregon, St. John’s in Annapolis and its cousin schools – those are some of the … biggest rebels. They refuse to turn over their statistical data, or they refuse to fill out certain parts of the survey, so they’re taking this supposedly principled stance. They think that being against the establishment is going to be appealing to their particular applicant pool. I think that’s the main reason they do it. It’s fine if they don’t want to do that, but what those schools have to realize is that there’s so much publicly available data. They have to turn in essentially the same data to the government, so we’re able to get the same information from other sources. There’s also been a movement among certain liberal arts colleges – Amherst, Swarthmore, Reed, Oberlin – to not participate in the peer surveys. Lloyd Thacker has a movement called “college unranked.”

**How should students use the rankings?**

Nobody, a student or a parent, should ever use the rankings as the sole basis for deciding to go to one school. It should not be the most important factor. The UCLA freshman survey asks freshmen to choose which factors have been very important in choosing to go to [their] school. The rankings themselves are not a top factor, but certainly they’re more important among minority groups or international students. For people who are going to more selective schools, the rankings are more important. I understand why: if you’re coming from overseas, you want to go to a brand name, because that’s going to be important when you come back to your country. And as the price of college has gone up, paying parents want to know if they’re getting their money’s worth, trying to analyze the best value, so that’s another factor in why the rankings have become a more powerful source. I think it’s a minority who use the rankings as a primary factor, but some do. Admissions counselors or high school counselors have told stories about parents who come in and are effectively saying, “I only want my Johnny or Jane to go to a school above this line,” or “The schools you’ve recommended aren’t very highly ranked by U.S. News.” We’re not the best friend of counselors, who feel it’s offering a simplistic answer to a complex problem.

**How much would you estimate schools spend to lobby or market to improve their rankings?**

The ranking system is sort of lobby-proof. Talking to U.S. News isn’t going to improve your ranking, because the rankings are based on quantitative numbers, a formula, but certainly schools send out brochures and try to raise their profile among other presidents and deans because of the academic survey. I think the way they’re spending money to improve in the rankings is more subtle.
With Washington University or UNC, they may be spending money to improve student services so they get a higher graduation rate. The way to improve in the rankings is through the institution itself, not by lobbying U.S. News, which is actually a good thing, because students benefit from that.
**How has your formula changed over the last ten to fifteen years?**

At the beginning the rankings were 100% reputation; today they’re 25% reputation and 75% quantitative data, so that’s certainly one change. We’ve de-emphasized admissions data to some degree and switched the weight to outputs like graduation and retention rates. We’ve also dropped “yield.” At one point we had yield in the model, but now we don’t.
**Which colleges, in your opinion, will be making a jump in the rankings?**

Rochester has been falling recently. For the next few years, the rankings are going to be impacted by the recession. States have been cutting the budgets of some of the major public schools. It’ll be interesting to see whether the UCal schools can maintain their position. It’s unclear whether the tuition increase is going to be enough to cover the budget cuts. They may start taking more out-of-state students. The UCals take almost no out-of-state students, so there’s talk that they’re going to take a greater percentage of out-of-state students because their tuition is so much higher. It’s going to be harder for in-state students to get into the publics in their own state as those schools accept or enroll a greater proportion from out of state as a revenue enhancer.

**If the UC schools drop in the rankings, who comes up?**

Some of the privates that have been managed [constructively] may be able to maintain their budgets. Some of the privates’ endowments have really fallen. The way these rules work, you have to average your endowment spending over x number of years, so that will have an impact on their budgets. There are rules: you have to spend 4 or 5% of your endowment each year, so if your endowment is shrinking, that’s why schools like Harvard have to cut back. The point is, it’s hard to know how all these cutbacks and trends are going to impact the rankings, because it’s happening in both publics and privates in different ways. I know that schools have tried to emphasize their alumni giving. That’s how schools game the rankings, by boosting their alumni giving rate. We’re not counting the average contribution; we’re counting the proportion of alumni who are giving – not the amounts. But it’s not a heavily weighted factor.

**How do you see the ranking system changing over the next few years?**

Using the web, we can create a use-your-own-ranking.
Students can develop their own ranking, so if they think the student-faculty ratio is more important than U.S. News does, they can weight our factors using their own weights to come up with their own standings. We’re going to build more interactive features on our website, trying to take advantage of what the internet offers to students. I think maybe within a few years there will be more outcome measures, more ways of viewing the student experience: student engagement or student learning. That’s what’s missing from the rankings: some indicator of what’s going on in the classroom, or how much students have learned.

**Do you think that U.S. News would benefit from factoring in what students do after graduation?**

Definitely. But [right now] there’s only spotty data. We measure what happens after graduation in our MBA rankings and our law rankings because we have placement data – career outcomes for the most recent class – but there’s nothing like that available at the undergraduate level. Yes, if there were data like that, it would be pretty powerful.
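The “use-your-own-ranking” idea Morse describes – readers re-weighting published factors to produce their own ordering – amounts to a simple weighted sum. The sketch below illustrates the concept only; the school names, factor values, and weights are invented for this example and are not U.S. News’s actual data or methodology.

```python
# Minimal sketch of a build-your-own ranking: score each school as a
# weighted sum of normalized, higher-is-better factors, then sort.
# All data below is hypothetical, for illustration only.

def rank(schools, weights):
    """Return school names ordered best-first under the given weights."""
    def score(factors):
        return sum(weights[f] * factors[f] for f in weights)
    return sorted(schools, key=lambda name: score(schools[name]), reverse=True)

# Hypothetical normalized factor values (0.0 to 1.0) for three schools.
schools = {
    "School A": {"reputation": 0.9, "graduation_rate": 0.7, "student_faculty": 0.6},
    "School B": {"reputation": 0.6, "graduation_rate": 0.9, "student_faculty": 0.9},
    "School C": {"reputation": 0.8, "graduation_rate": 0.8, "student_faculty": 0.7},
}

# A reader who cares most about small classes weights student_faculty heavily.
my_weights = {"reputation": 0.2, "graduation_rate": 0.3, "student_faculty": 0.5}
print(rank(schools, my_weights))  # → ['School B', 'School C', 'School A']
```

Changing the weights changes the ordering: with `{"reputation": 1.0}` alone, the same data ranks School A first – which is exactly why letting students supply their own weights can produce a different “number one” than the published list.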
**Have any notable schools called or emailed to contest their rankings?**

Schools call and contest their rankings all the time. The schools don’t really lobby us … schools call about their rankings. A couple of years ago we had something about UC Davis saying that they’d misreported some data, and they called up all upset about it. What you find is that the very top schools – the Harvards, Yales, and Princetons – will try to stay above the fray. They don’t send out press releases and they’re not going to be in contact with us about the rankings. A lot of it is, “Why do they rank the way they do?” or “Explain how the rankings work,” or “Where’d you get that data?” – because in some cases, if they’ve assigned filling out the surveys to some other office, then when the rankings come out, a senior person in the president’s office says, “Well, that can’t be right.” Of course, we can prove that we got it from the school. Sometimes you can call up two or three offices at the same school and get slightly different answers to the same questions. So we face that when we collect data from schools.

**This concludes Top Test Prep’s in-depth interview with Bob Morse of U.S. News & World Report. Amary Wiggin, an admissions consultant, and Ross Blankenship, President of Top Test Prep, conducted this interview. Top Test Prep specializes in private tutoring, test prep, and admissions consulting to help students get into top prep schools, colleges, and graduate schools. Call (800) 501-PREP to find out more. The full transcript can be read on the admissions blog.**