National college rankings don’t always tell a school’s whole story

At colleges and universities in South Carolina and across the country, the “report cards” have arrived. Rankings from US News and World Report, The Princeton Review, Forbes, and other publications have been released, and parents, trustees, alumni, and legislators pay attention.

In the US News and World Report Best College Rankings for 2022, Clemson University tied for #30 in “Top Public Schools.” Coastal Carolina University ranked #5 in “Best Value Schools.” Voorhees College placed #4 in “Social Mobility.” Winthrop University was #5 in “Best Colleges for Veterans.”

Fact is, there are so many lists, and so many categories, that every college in South Carolina can find something to cheer about. And that’s part of the challenge.

If these rankings are considered by many to be higher education’s report cards, we ought to think about how the college-ranking sausage is made.

In many of these ranking systems, points are awarded based on “expert opinion” — in other words, the reputation an institution has among higher education leaders.

As a university provost, I often fill out surveys sent by publications that are in the college rankings business. In one of those surveys, I was asked to rank the quality of undergraduate teaching for colleges across the Southeast.

I know a great deal about the quality of the undergraduate teaching at my institution: high. I know something of the quality of undergraduate teaching at the schools with which I have regular dealings.

But for some reason I was being asked to rank the quality of undergraduate teaching among universities in Arkansas. I’ve never been to Arkansas. But my “expert opinion” was requested, so I gave it – inexpertly. And somewhere in Arkansas, I suspect a provost was ranking universities in South Carolina, including my own.

US News and World Report and other publications often use rolling averages to generate certain rankings, which means the data going into some of these lists goes back as far as six years.

Institutions that are rapidly improving wait years to see changes in the rankings. In many ways, college academic rankings are like college football rankings – blue-chip programs tend to be ranked year after year, but upstarts must overcome the odds.
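To make the arithmetic concrete, here is a minimal sketch; the scores and the six-year window are invented for illustration, and real ranking formulas are more complicated. A school whose underlying score jumps thirty points in a single year sees only a fraction of that jump in its rolling average.

```python
# Hypothetical illustration: a six-year rolling average dilutes
# a sudden improvement. All numbers are invented.

scores = [60, 60, 60, 60, 60, 60, 90]  # the school improves sharply in year 7

def rolling_average(values, window=6):
    """Average the most recent `window` yearly scores."""
    return sum(values[-window:]) / window

print(rolling_average(scores[:6]))  # years 1-6: 60.0
print(rolling_average(scores))      # year 7: 65.0
```

Under these invented numbers, only five points of a thirty-point improvement show up in the year it happens; the rest trickles in over the next five years, as old data ages out of the window.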

A cynical (but very effective) approach is to hire a consultant to “goose” these rankings. For a price, a for-profit company can help a university score extra points in this or that category.

We call this “pleasing the algorithm,” and it is a tempting tactic when parents, trustees, alumni, and legislators use these rankings to make decisions.

Even more cynical is the decision to avoid students who hurt rankings. US News and World Report includes “alumni giving” as a category.

A college that wants to score more points in that area might avoid first-generation students, since they rarely come from wealthy households. Or one might discourage students from going into essential but low-paying professions such as teaching, law enforcement, and social work — such alumni do good in the world, but they may never write their alma mater a big check.

When these rankings come out, it is best to remember that different colleges have different missions.

Institutions that focus on undergraduate teaching may not be research powerhouses. Research universities may struggle to create small-scale learning opportunities.

There is a right college in South Carolina for every student in the Palmetto State, but not every college will meet every student’s needs.

The pressure to chase rankings increases for college leaders in the public eye, but pleasing the algorithm can end up hurting the students we are supposed to serve.

Daniel J. Ennis is Provost and Executive Vice President for Academic Affairs at Coastal Carolina University.