
Alex Usher is the president of Higher Education Strategy Associates.

The father of modern university rankings is James McKeen Cattell, a well-known early 20th century psychologist, scientific editor (he ran the journals Science and Psychological Review) and eugenicist.

In 1903, he began publishing American Men of Science, a semi-regular ranking of the country's top scientists, as rated by university department chairs. He then hit on the idea of counting how many of these scientists were graduates of the country's various universities. As a baseball enthusiast, he found it completely natural to arrange these results top to bottom, as in a league table. Rankings have never looked back.

Because of the league table format, reporting on rankings tends to mirror what we see in sports. Who's up? Who's down? Can we diagnose the problem from the statistics? Is it a problem attracting international faculty? Lower citation rates? A lack of depth in left-handed relief pitching? And so on.

The 2018 QS World University Rankings, released last night, are another occasion for this kind of analysis. The master narrative for Canada – if you want to call it that – is that "Canada is slipping." The evidence for this is that the University of British Columbia fell out of the top 50 institutions in the world (down six places to 51) and that we now have two fewer institutions in the top 200 than we used to (Calgary fell from 196 to 217 and Western from 198 to 210).

People pushing various agendas will find solace in this. At UBC, blame will no doubt be placed on the institution's omnishambular year of 2015-16. Nationally, people will try to link the results to problems of federal funding and argue that implementing the recommendations of the Naylor report would be a game-changer for rankings.

This is wrong for a couple of reasons. The first is that it is by no means clear that Canadian institutions are in fact slipping. Sure, we have two fewer in the 200, but the number in the top 500 grew by one. Of those who made the top 500, nine rose in the rankings, nine slipped and one stayed constant. Even the one high-profile "failure" – UBC – only saw its overall score fall by one-tenth of a point; the fall in the rankings was more a result of an improvement in a clutch of Asian and Australian universities.

The second is that in the short term, rankings are remarkably impervious to policy changes. For instance, according to the QS reputational survey, UBC's reputation has taken exactly zero damage from l'affaire Gupta and its aftermath. Which is as it should be: a few months of communications hell doesn't offset 100 years of scientific excellence. And new money for research may help less than people think. In Canada, institutional citations tend to track the number of grants received more than the dollar value of the grants. How granting councils distribute money is at least as important as the amount they spend.

And that's exactly right. Universities are among the oldest institutions in society and they don't suddenly become noticeably better or worse over the course of 12 months. Observations over the span of a decade or so are more useful, but changes in ranking methodology make this difficult (McGill and Toronto are both down quite a few places since 2011, but a lot of that has to do with changes that reduced the impact of medical research relative to other fields of study).

So it matters that Canada has three universities that are genuinely top class, and another clutch (between four and ten, depending on your definition) that could be called "world class." It's useful to know that, and to note if any institutions have sustained year-after-year changes either up or down. But this has yet to happen to any Canadian university.

What's not as useful is to cover rankings like sports, and invest too much meaning in year-to-year movements. Most of the yearly changes are margin-of-error kind of stuff, changes that result from a couple of dozen papers being published in one year rather than another, or the difference between admitting 120 extra international students instead of 140. There is not much Moneyball-style analysis to be done when so many institutional outputs are – in the final analysis – pretty much the same.
