Brandon Wohl

College Rankings Lists Don’t Tell the Whole Story

For both eager high school seniors and current college students, few things shape views of higher education more than college rankings lists. The most famous is published by U.S. News & World Report, but students who want an alternative ranking (or perhaps one that favors their own school or prospective school) have a plethora of options. Niche.com, Forbes, The Wall Street Journal, and The New York Times are among the other popular sources many students and parents “consult.” Whichever publication you flock to, there is a list for you. Lists by school type (national universities, liberal arts colleges, regional schools, etc.), by average earnings 10 years after graduation, and by specific major, just to name a few, litter the pages of the internet.

Before I continue, I should say that I understand why these rankings exist. They are a digestible way for the masses to gain marginal insight into schools relative to one another. Picking a school is incredibly difficult, so any “information advantage” students and parents can get is crucial. Additionally, I must confess that I too have spent more hours than I am proud of reading college rankings.

Regardless of your feelings toward these lists, ranking schools has become a commodity business. U.S. News & World Report originally operated as a print magazine, though it lagged behind Time and Newsweek. The Best Colleges rankings debuted in 1983 and not only became a mainstay for U.S. News readers; they are likely the only thing the majority of students and recent grads know about the publisher. Nowadays, it is the education section (specifically these rankings) that keeps U.S. News afloat. Even though the bread and butter of some of the other sources I mentioned above is reporting the news (WSJ/NYT), the rankings lists are yet another way to draw eyeballs. As I alluded to above, “list culture” is arguably just as pervasive as other trends and social fads, yet we don’t seem to realize it. Perhaps it is the simplicity? Or maybe people experience FOMO? Answers to such questions will differ for everyone, but I can confidently answer “yes” to both of them.

Now that we have set the stage, we can discuss the rankings themselves. Many of the usual suspects – the Ivy League, MIT, Stanford, Caltech, Northwestern, Duke, etc. – populate the top 20 every year. Top public schools typically fall between #15 and #40 – think the UC system, UVA, Michigan, Georgia Tech, UNC Chapel Hill, and others. This year, U.S. News updated its methodology for the 2024 (not 2023) rankings. These changes, including removing the emphasis on smaller class sizes, caused real shifts at the top of the list and generally hurt private colleges, while public schools moved up. Broader changes to the admissions game, such as test optionality and adherence to DEI guidelines, have also accelerated in recent years.

What remains part of the criteria is “Peer Assessment,” which still accounts for 20% of a school’s score. Think of this as a popularity contest: if you run an institution that carries prestige, the odds of your peers believing you are prestigious go up. The controversy surrounding Peer Assessment stems from the fact that it is flagrantly subjective in a list many interpret as objective. The official U.S. News & World Report website states, “Each survey respondent was asked to rate the overall academic quality of peer schools' undergraduate academic programs on a scale from 1 (marginal) to 5 (distinguished). Respondents who did not know enough about a school to evaluate it fairly were asked to mark ‘don't know’ or leave it blank; neither of which were counted for or against a school's score.” I hope that after reading that, you’re just as confused as I am. So my advice to you? Remember that the “best universities” rankings are more of a popularity contest than a measure of academic achievement.

Another issue with the rankings that I do not believe is discussed enough is the fact that they can be gamed. Presidents and deans who wish to improve their standings can take specific initiatives that benefit their respective universities. Northeastern is the classic case; the details are beyond the scope of my article, but I’ll attach the sources at the bottom. Former Northeastern president Richard Freeland led the school’s rapid rise up the rankings, with Northeastern climbing from #162 to #49 between 1997 and 2013. Sweeping changes included hiring more faculty to artificially drive down student-to-faculty ratios, building new and modern dorms, and allowing applications to be submitted through everyone’s favorite, the Common Application. “There’s no question that the system invites gaming,” Freeland said in an interview with Boston Magazine in 2014. “We made a systematic effort to influence the outcome.”

Northeastern was never punished; its efforts, though debatably unethical, were permitted. Other schools opted to boost their positions by misreporting key data points. Claremont McKenna College, an elite liberal arts school in California, inflated students’ SAT scores by 10-20 points. In 2012, Emory University misreported incoming students’ GPAs, while George Washington University misreported incoming students’ class rank. Most recently, Columbia University fell from #2 to #18 in the 2023 rankings after a math professor raised serious questions about the accuracy of the data the university had submitted. All of these examples further show that schools can manipulate the rankings.

It's not that I hate the rankings – I only hate when people treat them as meaningful. Everyone has a different rankings list because everyone’s needs as a student are different. I encourage you to remember that the next time you look at the rankings. Think of the rankings as marketing rather than fact – and maybe, just maybe, resist the urge to repost them on LinkedIn.


Works Cited




