
Catching up with Conservatism: Ranking the Rankings


Editor’s note: This article was initially published in The Daily Gazette, Swarthmore’s online, daily newspaper founded in Fall 1996. As of Fall 2018, the DG has merged with The Phoenix. See the about page to read more about the DG.

College rankings are a corrupt, self-serving, meaningless, deceptive Big Business. At least, so say the students already securely admitted to America’s “best colleges.” Certainly many academics have come out against the U.S. News & World Report ranking system and its annual numbers game. But is it the game we deplore, or the specific numbers the magazine prioritizes?

If you’re a student at Swarthmore, you surely know that Swarthmore is currently ranked by U.S. News & World Report as the third-best liberal arts school, behind those New England colleges we don’t much like to mention. If you’re a real rankings spectator, you’re also aware that other peer institutions, such as Reed and Sarah Lawrence, refuse to comply with the statistical shakedown and are, consequently, buried down in the U.S. News liberal arts lineup. You might say the rankings behave as a box score for snobbery. You might argue that keeping a numbers tab on top colleges encourages an elitist impulse to foster an American gentry. I, for one, have a bit of a complex regarding the whole numbers ordeal. The rankings embarrass me, except when they make me proud of my school. I find them arbitrary and misguided, except when I admit that I researched and visited Swarthmore because of its status in the liberal arts trifecta.

The current U.S. News algorithm weighs undergraduate academic reputation, graduation and freshman retention rates, faculty resources, student selectivity, financial resources, graduation rate performance, and alumni giving. In recent years, the formula has also tried to incorporate the opinions of high school guidance counselors. Notice that affordability has no input.
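
To make the arithmetic concrete, here is a minimal sketch of how a weighted composite score of this sort works. The category names mirror the U.S. News criteria above, but the weights, the school names, and the numbers are invented for illustration; this is not the magazine’s actual formula. The one thing the sketch does share with the real thing: nowhere in the sum does the price of attendance appear.

```python
# A toy composite "ranking" score. Category names follow the U.S. News
# criteria described above; the weights and school data are hypothetical
# placeholders, not the magazine's actual formula.

# Hypothetical weights (they sum to 1.0). Affordability is absent.
WEIGHTS = {
    "reputation": 0.25,
    "graduation_and_retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_rate_performance": 0.05,
    "alumni_giving": 0.05,
}

def composite_score(metrics):
    """Weighted sum of normalized (0-100) category scores."""
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)

# Two made-up schools: nudging any single input reshuffles the "ranking".
schools = {
    "College A": {
        "reputation": 90, "graduation_and_retention": 85,
        "faculty_resources": 80, "student_selectivity": 88,
        "financial_resources": 75, "graduation_rate_performance": 70,
        "alumni_giving": 60,
    },
    "College B": {
        "reputation": 80, "graduation_and_retention": 92,
        "faculty_resources": 85, "student_selectivity": 78,
        "financial_resources": 70, "graduation_rate_performance": 88,
        "alumni_giving": 75,
    },
}

for name in sorted(schools, key=lambda s: composite_score(schools[s]),
                   reverse=True):
    print(f"{name}: {composite_score(schools[name]):.1f}")
```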

Mostly, these criteria symbolize sky-high superficiality. Boosting graduation rates and freshman retention isn’t all that hard – a university simply has to lower its standards and pass more students than its professors otherwise would. As for reputation and giving, they are part of a self-reinforcing cycle. The better the reputation of one’s alma mater, the better the career options to harness. The better the career, the greater the cash flow. The fatter the alumni wallets, the fatter the school’s endowment and the more impressive an institution’s “financial resources.” “Faculty resources” is usually code for research facilities, and the more professors are researching, the less likely they are to be interfacing with undergrads.

The craziest benchmark, and the one that carries the most weight, remains the “reputation” score, which asks college presidents, provosts, and admissions deans to grade a multitude of other universities. How on earth is a college provost supposed to have the insight to score all 261 national universities? If such a provost did possess this superhuman skill, she would probably be spending far too much time on campuses other than her own.

Pop data guru and New Yorker writer Malcolm Gladwell described an interesting case study in which a hundred top attorneys were asked to rank ten law schools. The list included “biggies” like Harvard and Yale, and some smaller names such as John Marshall. The lawyers, on average, ranked Penn State’s law school toward the middle. To them, Penn State represented the type of large, recognizable state school deserving middle-of-the-pack credit. The problem is, at the time, Penn State didn’t have a law school. Our ability to judge educational quality on a mass scale is a fat illusion.

Claremont McKenna recently drew outcry for moderately fudging its SAT stats. In other words, Claremont McKenna marks the latest school to get run over in the numbers rat race. Some have resisted blaming Claremont McKenna and instead cite the “culture” of the rankings system. Personally, I was disappointed and confused to learn of Claremont McKenna’s trickery, since I’ve admired the school as a hotbed for students who care about a classical understanding of history and government. What’s more, it’s home to the conservative-leaning Claremont Institute for the Study of Statesmanship and Political Philosophy. If you believe, as I do, that universities ought to stand as the conservators and defenders of civilization, then you’d probably respect Claremont McKenna’s curriculum. Nevertheless, cheating isn’t exactly a civilized high point. Apparently, Claremont McKenna, like so many schools, can’t decide what it wishes to stand for – a world-class education, or an easily digestible rubric.

Faced with a daunting number of variables, future college students confront a version of Friedrich Hayek’s classic knowledge problem. Hayek claimed no single economist, bureaucrat, or Treasury whiz kid could ever predict all of the ins and outs of a complex economy. Likewise, how can one person possibly have the insight to weigh all of his or her college choices and their impact on the future? We simplify by turning to U.S. News. Can you blame us? The Academy itself seems unable to articulate what it is, or ought to be.

The conservative essayist Russell Kirk once wrote, “[C]ollege and university ought not to be degree-mills: they ought to be centers for genuinely humane and genuinely scientific studies, attended by young people of healthy intellectual curiosity who actually show some interest in mind and conscience.” But how can we measure such bright-eyed moral intellect? What if a university admitted average high school graduates and, after four years, churned out bright scholars fluent in philosophy, economics, science, and literature? What if these critically minded students earned spots at respectable grad programs and contributed smarts to their fields? Wouldn’t we want to advertise such a school? A formula that relies on entering GPAs and SAT scores would never make room for such an institution.

My libertarian instincts tell me that the way to combat the flawed U.S. News model is to introduce competition – and it’s true that Forbes, The Huffington Post, and Newsweek have all offered alternatives. My favorite guide is Choosing the Right College, put out by the conservative Intercollegiate Studies Institute. Rather than number universities in a zero-sum game, ISI’s mission is to “uphold the liberal arts tradition.” As such, it prioritizes curricula, academic rigor, intellectual freedom, safety, and price. The guide also flags colleges with a red, yellow, or green light, typically based on whether or not a college features a core curriculum and political tolerance (Swarthmore earns a yellow). True, ISI is geared toward a more classically conservative audience, but it’s upfront and comprehensive about its beliefs. Different students (and parents) are going to gravitate toward different criteria, and that’s fine. But the one-size-fits-all ranking method doesn’t work for anyone – whether they be conservatives, progressives, or oboe players. It’s time to reconsider what we cherish about higher education and fold those values into the figures the public sees. Otherwise, the academic quad is little more than a costly labyrinth.

4 Comments

  1. I guess I’m with the crowd on not liking rankings, but I don’t quite see why ISI being “upfront and comprehensive” about its criteria is any different from the US News et al crowd. Their criteria aren’t secret either. People who rely on US News or similar rankings don’t read the fine print very well, I think. But I’m not sure that folks who like the ISI ranking system do either. Or they use a “core curriculum” as a proxy for enforcing an ideological litmus test, just as “percentage of alumni who donate” is used as a bad proxy for “alumni successfully placed in careers who regard their education as an important part of their success.” When you look underneath the hood, ISI’s thinking about what a “core curriculum” is, and their strategies for measuring that value, are pretty much just as contradictory or manipulable as anything US News or Forbes throws out.

    Assessment and transparency are interesting challenges. I think it’s better to ask what they ought to involve than to assume any evaluating institution is presently handling them correctly.

  2. The Washington Post, Princeton Review, and Kiplinger’s, I believe, all take into account some measure of cost and value (the latter two more prominently). Notice how Swarthmore tops or is near the top of their rankings as well…

  3. This article prompted me to look at the ISI website and learn more about them. Their leadership looks like that panel that Rep. Issa put together to discuss religious liberty—and attack contraception. Check it out: http://home.isi.org/about/people Where are the women and people of color?

  4. I feel like a consistent “turn” in your articles is to reveal the ideological blind spots of Liberal think-tanks/statistics, point those out as incorrect or at least misleading measurements, and then to point to a Conservative think-tank’s numbers as a way out of the problem. Your point about thinking outside the Liberal bubble to which Swatties are sometimes blind is well taken. However, I don’t think you are willing to take the same critically distanced look at Conservative bubbles.

    As Burke pointed out (above), the “core curriculum” ranking is just a different but equally questionable measure, something used to promote what the institution doing the ranking thinks is important. In this article, and in your last one on Obama and OWS, you imply that the way beyond a misleading liberal bias is to turn to some other institution that does things differently. You seem to use the word “Conservative” as if it were self-evidently a synonym for “good,” but this just repeats the mistake that you point out others often make with the word “Liberal.”

    I think that a better approach would be to expose the ideological choices that ALL think-tanks make, not just liberal ones. I just don’t see how you can assume that the Conservative ones are better at measuring things – surely they too are constructing their rankings through ideologically laden choices. Which variables to include and which to omit when measuring things as complex and incommensurable as colleges: that’s the fundamental problem here, and it’s one not so easily overcome.
