Washington Monthly misses the mark in college rankings

Editorials featured in the Forum section are solely the opinions of their individual authors.

I am a faithful reader of The Tartan, and one of its biggest boosters. The paper’s phoenix-like rise from the ashes of the Natrat disaster of two years ago has been inspiring. I salute the students who have led the effort, producing a fine campus newspaper of which we can be proud.

But even the new Tartan can get things wrong, as it did in the editorial “How do we really rank?” in its 11 September 2006 edition. The editors cited a flawed Washington Monthly college ranking, and by basing sweeping claims about Carnegie Mellon on that ranking, they came to seriously incorrect conclusions about our University.

The editors were right to ask whether Carnegie Mellon “encourage[s] community service and civic duty”; whether we “go out of [our] way to offer opportunities to those who can’t afford the cost of attendance”; and how many of our “alumni are working to help stop global warming or end poverty.” But the editors were wrong to base their answers to these questions on the Washington Monthly ranking, which was based on a strange methodology. I didn’t make an “educated guess” about the methodology; with the help of our institutional research staff, I went to the trouble of finding out.

The Washington Monthly rankings are based on three equally weighted categories: community service, research, and social mobility. Each of those is based on two or three factors. Let me briefly discuss each in turn.

Community service is based on three factors: the percentage of students enrolled in Army or Navy ROTC, the percentage of alumni currently serving in the Peace Corps, and the percentage of work-study grants devoted to community service. While ROTC and the Peace Corps are worthy organizations and we’re proud to support them, these are very narrow measures of community service that overlook the countless contributions Carnegie Mellon’s faculty, staff, and students make to the community through scores of programs and volunteer efforts that have nothing to do with ROTC, the Peace Corps, or federal work-study grants. For instance, the Washington Monthly metrics completely ignore our students’ tutoring and mentoring of K-12 students; the thousands of hours of free support students provide to local non-profits and businesses through capstone project courses and independent research initiatives; the contributions of our Greek life system to local charities, food drives, blood drives, and Hurricane Katrina relief efforts; and much, much more. Furthermore, no weight is given to alumni working for organizations like Teach for America or otherwise serving in public service.

By the way, our alumni are working to reverse global warming and end poverty. Prodipto Ghosh, for example, who received a Ph.D. from the Heinz School, is the permanent secretary of India’s Ministry of Environment and Forests. Hue Lan, who received his Ph.D. in Engineering and Public Policy and is now a dean at Tsinghua University, has been leading an effort to create a sound technology base for environmental policy in China. And, did you see last week’s local news story about Nicholas A. Wilson? He graduated in May from the Heinz School and decided to volunteer this summer in the Gulf Coast region of Mississippi. There’s a long list of alumni who have devoted their careers to making the world a better place.

Research is also based on only three factors: federal research grants, the number of Ph.D.s awarded in science and engineering, and the percentage of undergraduates who go on to receive a Ph.D. (in any subject). I suppose those measures are not unreasonable, but failing to normalize them (e.g., research dollars or Ph.D.s awarded per faculty member) is very misleading. It implies not only that bigger is better, but that size is basically the only thing that matters. A relatively small university like Carnegie Mellon will always fare poorly on such a list, despite being one of the most research-intensive institutions in the country.

Social mobility is based on two factors: the percentage of students receiving Pell Grants and the difference between a university’s actual graduation rate and its predicted graduation rate, the latter computed with a formula created by Washington Monthly. The first measure does partially capture the relative size of the student population that qualifies for that form of federal support, but it completely misses the other forms of financial aid that universities provide. The second measure is bizarre: Washington Monthly uses the number of Pell Grant recipients and a school’s average SAT scores to build a regression model that predicts the school’s expected graduation rate. That prediction is then compared to the school’s actual graduation rate, and schools that outperform their predicted rate score higher than schools that fall short of it. I’m not sure what they’re trying to measure with this, but no educator I know would support a regression model that predicts graduation rate from only these two variables.

I’ve given more space to the Washington Monthly ranking than it deserves. Indeed, if that ranking proves anything, it’s that any publication — even a marginal local magazine with a tiny circulation — can create rankings and get attention for them.

The real point is that Carnegie Mellon does encourage community service. We go out of our way to provide opportunity to those who can’t afford our tuition. And Carnegie Mellon alumni (and faculty) are absolutely engaged in taking on society’s greatest challenges.