Keep Your Virtue*…Ignore the Reputation Rankings

If a government took an “experienced, published” scholar and said they would have to spend the next 21 years working only on a vanity project, the higher education community would be outraged at the waste of time, intellect and potential.  But that’s what the THE Reputation Rankings 2021 appear to be doing by having around 11,000 such scholars submit views on which other universities are good at research and teaching.  It’s only the beginning of the problem for another dreary day in the self-interested world of the rankers.

We are told that academics are hard-pressed and facing mental burnout but that doesn’t stop the THE taking their valuable time to seek opinions about other institutions.  If each of the 10,963 respondents spent half a working day on their submission that would be 5,481 days.  Dividing that by 260 working days a year (a five-day working week times 52 with no holidays) suggests THE may have taken more than two decades of academic time away from scholarship that could save humanity.
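The arithmetic behind that estimate can be checked in a few lines (the half-day-per-submission figure is my assumption for illustration, not THE's data):

```python
# Back-of-envelope check of the time estimate.
# Respondent count is from the article; half a working day
# per submission is an assumed figure.
respondents = 10_963
days_per_submission = 0.5          # half a working day each
working_days_per_year = 5 * 52     # 260: five-day weeks, no holidays

total_days = respondents * days_per_submission
years = total_days / working_days_per_year
print(f"{total_days:.0f} days, roughly {years:.1f} years")
```

Which comes out at 5,481 days, or just over 21 years.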

Twisted by Knaves to Make a Trap for Fools

Despite, or perhaps because of, all that effort it comes as a yawn-inducing revelation that the top five in 2021 are exactly the same universities as the top five in both 2020 and 2019.  The procession of Cambridge, Harvard, MIT, Oxford and Stanford – placed alphabetically here because the whole notion of ranking them seems ludicrous – continues.  In a world where the purpose of higher education, its value and its relevance are all under question this parade of hierarchy seems irrelevant.

I wonder how Harvard’s reputation really looks to people who agree with Scott Galloway that higher education is becoming “the enforcer of a caste system” where “elite institutions no longer see themselves as public servants”.  Or Michael Crow, President of Arizona State University, when he says that HE is increasingly a system “where status is derivative of exclusion”.  Or to those who have listened to Malcolm Gladwell’s forensic dissection of the US News rankings, where he notes that they are “a self-fulfilling prophecy”.

There should also be an enquiry into the fact that California Institute of Technology (CalTech) is only placed at 14 in the Reputation Rankings.  On THE Student the compilers are telling thousands of candidates that CalTech is the best university in the US, QS ranks it at 6 in the world, it’s number 9 in US News and number 2 in THE’s own World Rankings.  There are several other examples illustrating inconsistencies which confirm that the whole exercise isn’t really about understanding or reflecting excellence.

One might guess that the opportunity to reach out and flatter 10,963 potential readers and event attendees by asking for their opinion is a primary motivator of the approach taken.  But for THE to then claim the rankings are “based on the opinions of those who truly know” seems typically hyperbolic and ill-founded.  Donald A. Norman is quoted as saying – “academics get paid for being clever, not for being right” – which is an alternative view worth considering.  

Fill the Unforgiving Minute  

The 21-year estimate of time does, of course, presume that the academics involved took the weighty task of considering the reputational implications for thousands of their colleagues at thousands of universities seriously.  Half a day hardly seems long enough to do the job properly but some cynics might suggest that it was more likely half-an-hour over a cup of coffee.  Plenty of time to see scores settled, biases reinforced and best friends rewarded. 

Even half an hour for each submission would be about two-and-a-half years of time stolen from saving the world and raises equally serious questions about the value of the exercise.  Each academic submitted around 14 names on average which, in 30 minutes, means they would take about two minutes to weigh and consider each nomination.  It’s less time and attention than most people spend on selecting their top ten favourite party records for the Christmas playlist.
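For the sceptical, the half-hour version of the sum works out like this (the half-hour submission and the eight-hour working day are both assumptions for illustration):

```python
# Same back-of-envelope sum with a half-hour per submission.
# The 8-hour working day is an assumed figure.
respondents = 10_963
hours_per_submission = 0.5
hours_per_working_day = 8
working_days_per_year = 5 * 52

years = (respondents * hours_per_submission
         / hours_per_working_day / working_days_per_year)
# roughly 2.6 years of academic time

names_per_submission = 14                       # average reported above
minutes_per_name = 30 / names_per_submission    # roughly 2.1 minutes each
```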

Make Allowance for their Doubting Too

The reputation rankings methodology** specifically gives “more weight to research”.  This is not because research is intrinsically more important but because “our expert advisers have suggested that there is greater confidence in respondents’ ability to make accurate judgements about research quality”.  It is interesting to read that THE’s advisers think academics can’t really be trusted to review the teaching quality of their peers.

Pity the poor students who believe the Reputation Rankings have much to do with the teaching they might receive.  The methodology places a premium on the score for research reputation of 2:1 over the score for teaching reputation.  This gives some idea of the importance that THE attributes to teaching in establishing an institution’s standing and the extent to which academics are contributing to perceptions about their priorities.

One Heap of All Your Winnings

It also seems that the eventual ranking is driven as much by volume as by quality.  Respondents are asked simply to name, rather than rank, up to 15 institutions which are best at research and teaching in their specific subject discipline.  But the number one institution “is the one selected most often” as being one of the best in its field.

It doesn’t seem to matter if the twenty most eminent researchers in each field believe your university is the best.  You will not be top if enough other “experienced, published” researchers pick the place where they are angling for a job, enjoy visiting or where the overlord of research is a personal friend.  There is no indication in the methodology that there is a weighting to account for the ability of the respondents to make or offer a well-informed judgement.

Or Being Lied About Don’t Deal In Lies

However, there are adjustments to ensure that the ranking represents the global distribution of scholars.  Participants are selected randomly from Elsevier’s SCOPUS database which can presumably create a sample in line with the global distribution of academics.  As responses do not reflect the UNESCO distribution of scholars, they have to be weighted.

Engineering overperformed with 15.8% of respondents and had to be levelled down to a weighted 12.7% while Business and Economics and Arts and Humanities had to be levelled up from 8.2% and 7.7% of respondents to 13.1% and 12.5% respectively.  Maybe it’s just that engineers like questionnaires and giving their opinion but it would be nice to think that some scholars are too dismissive of the process to join in.
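If the weighting works the way those quoted shares suggest, scaling each discipline's responses by its target share over its observed share, the implied correction factors are easy to compute (the scaling rule itself is my reading of the methodology, not something THE spells out):

```python
# Implied correction factors if each discipline's responses are
# scaled by (weighted share) / (observed share).  The scaling rule
# is my inference; the percentages are those quoted above.
observed = {"Engineering": 15.8,
            "Business and Economics": 8.2,
            "Arts and Humanities": 7.7}
weighted = {"Engineering": 12.7,
            "Business and Economics": 13.1,
            "Arts and Humanities": 12.5}

factors = {k: round(weighted[k] / observed[k], 2) for k in observed}
# An Engineering vote would count about 0.80, an Arts and
# Humanities vote about 1.62
```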

If the argument is that the Reputation Rankings are only a game and for entertainment then academics might consider how wise it is to be wasting their time on something as meaningful as voting on The X-Factor or Dancing With the Stars.  But if it is intended to inform politicians, funders, philanthropists and business about quality it carries the danger of reinforcing bias while devaluing other institutions and their students.  Next time the email from THE pops up in the inbox it might be a good moment to press delete and get on with doing something important.

NOTES

*Part of the title and all the sub-headings are fragments from Rudyard Kipling’s wonderful and inspirational poem ‘If’

** The page describing the methodology is, in my view, neither clear nor well written. I would be happy to clarify or correct any areas where my interpretation or understanding is incorrect on the advice of an authoritative source.

Image by Gerd Altmann from Pixabay      

2 thoughts on “Keep Your Virtue*…Ignore the Reputation Rankings”

    1. Thank you for your response. My blog focused on THE because it had just been announced but I have little doubt that QS rankings have equivalent failings. Others have commented on this and I have referenced several of their points about QS in earlier blogs. It seems reasonable to keep reminding students and universities that the rankings have become aligned with organizations that are increasingly intent on monetizing their data. This view has been reinforced by QS announcing the purchase of the aggregator StudentApply. My opinion, and a specific point in the blog, is that academics and institutions should consider declining to participate in a charade that is not in their best interests.
