US News & World Report ranks America’s ‘best’ colleges. But is there really a way to know?

College rankings purport to tell the public which schools are worthwhile, even though many academics view the rankings as worthless. 

The latest salvo in the battle between the ranked and the rankers comes out of Reed College in Portland, Oregon. A statistics professor and a group of students say that, based on a statistical analysis, Reed College appears to be under-ranked compared with other schools in U.S. News & World Report’s ranking of “best colleges.” The publication denies that claim and questions the accuracy of the group’s work. 

Though the two parties may not come to an agreement, the debate speaks to a broader conversation about the value of quantifying the college experience. Namely, is there any?

In America, the U.S. News & World Report rankings are regarded as the gold standard. The publication’s methodology usually changes annually, but it includes student retention and graduation rates, resources available to faculty, and the opinions of fellow college leaders and high school counselors. It also creates snapshots of colleges that include cost, application deadlines and a school’s history. 

“Taken together, the rankings and profiles – combined with college visits, interviews and your own intuition – can be a powerful tool in your quest for the right college,” the company’s website reads. 

Americans’ obsession with choosing the best product also informs the longevity of the U.S. News college rankings, which began in 1983. They persist because choosing where to start a higher education career is confusing, and there are hundreds of colleges, each promising a quality education. When students can’t figure out who is telling them the truth, a list of winners and losers can seem like a clarion call cutting through the noise. 

So McConville, the statistics professor, and students Bailee Cruger, Huaying Qiu and Wenxin Du set about creating a model to test the rankings, following a suggestion by the college’s institutional research office. Using federally available government data from the Integrated Postsecondary Education Data System and their model, they said they were able to recreate the U.S. News ranking with 94% accuracy.

In their model, Reed ranks 38th for liberal arts colleges nationally, but U.S. News’ model put the college at 90th. McConville said there is “variability in the accuracy of the prediction,” but it does provide evidence Reed is under-ranked. 

“We were really surprised,” she said. “It seemed like their models implied that Reed should be ranked higher.”

The point, McConville said, wasn’t to fight over which equation was best, but to show the limitations of a one-size-fits-all model, and to demystify the process of rankings.

Robert Morse, chief data strategist at U.S. News, challenged the Reed College findings on two grounds. First, he argued, it was “not possible” to reverse-engineer the college rankings based only on IPEDS data.

Second, Morse said the publication doesn’t penalize schools that don’t participate in the surveys. He added that the magazine relies on IPEDS data when colleges decline to take part in the survey. 

Christopher R. Marsicano, a professor at Davidson College who studies higher education, said it appears the students were able to predict the U.S. News rankings with a high degree of accuracy. But, he cautioned, it would be impossible to replicate the scores exactly without the publication’s precise formula. 

Though he couldn’t perfectly replicate the students’ work, Marsicano looked at publicly available data for colleges in the U.S. News rankings in the high 30s and the low 90s. 

He said Reed appeared to share many traits with the high-30s group, including graduation and retention rates, money spent on students and SAT scores. He did say, however, it’s possible Reed got a bump for having a higher graduation rate than what U.S. News had predicted. Its prediction, Marsicano said, was five points below Reed’s lowest graduation rate of nearly the past decade. 

“It seems that U.S. News just doesn’t have a good handle on Reed in general,” he said. “And to be fair, Reed doesn’t send in the survey – so how could U.S. News adequately have a handle on Reed?”

‘Nobody should fill those forms out’

Rancor over the rankings has long been brewing. 

Douglas C. Bennett is one of those longtime critics of the rankings. While president of Earlham College in Indiana, he did not participate in the system and, in 2007, he signed a letter encouraging other leaders in higher education to do the same. 

Little has changed, he said, since his 2011 retirement. Among his complaints, Bennett said the rankings measure many factors that don’t necessarily reveal how adept a college is at educating students. In particular, a portion of the rankings ask presidents to assess other institutions. His response when he was a president: He couldn’t really know.  

“Nobody should fill those forms out,” he said. “They don’t know what’s going on. They’re just reacting to prestige, and prestige is the illusion of quality. It may get at something, but it isn’t getting at something real or trustworthy.”

A small contingent, including Reed, opted out on ideological grounds. But the rankings clearly mean something to many others.

Some list their ranking on their website as an advertising tool. Others make it part of the institution’s goal to rise in the rankings and, in extreme cases, may tie presidential bonuses to moving up, according to Robert Kelchen’s “Higher Education Accountability.”

Others lie to boost their standing. In 2019, the University of Oklahoma submitted false data about its fundraising. And, in 2018, Temple University’s Master of Business Administration program was stripped of its ranking after it was found to have lied about its students’ test data, among other things.

UC Berkeley, which often ranked as the best public school in the nation, was also recently moved to “unranked” status after misreporting alumni giving rates. In that case, the university reported the error itself. 

What are numbers without context?

Walter M. Kimbrough, president of Dillard University in New Orleans, has long refused to participate in the rankings system. He joked that he sends the U.S. News & World Report surveys back in the mail labeled as junk. 

“Since the ’80s, we have been a brand-name nation, and it applies to higher education,” Kimbrough said. “They’re looking for a brand to give themselves some kind of status.”

Rankings, though, reflect not what colleges do, but whom they serve, Kimbrough said. The colleges that do especially well in the rankings are generally the ones with the fewest students on Pell Grants, a type of financial aid given only to students from low-income backgrounds. Students from such backgrounds often have less academic preparation and more ground to cover. 

And numbers sans context, he said, mean little. His university’s six-year graduation rate in 2012 was 28 percent, he said. That was true, but it was based on the cohort who started seven weeks before Hurricane Katrina.

The institution had to shut down for a semester, and it operated partially out of a hotel the next. Some people just never came back, Kimbrough said. For the next three years, that cohort was part of the data that informed the U.S. News & World Report ranking. 

“They just see the numbers without a full interpretation of what those numbers mean,” Kimbrough said.

This story was originally published by USA TODAY. Education coverage at USA TODAY is made possible in part by a grant from the Bill & Melinda Gates Foundation. The Gates Foundation does not provide editorial input.