FROM FUTILITY TO UTILITY

Collins Dictionary tells us that "if you say that something is futile, you mean there is no point in doing it, usually because it has no chance of succeeding."  It is difficult to think of a better description of a student scanning the Times Higher Education or QS World Rankings, or any of the multiplicity of other rankings that have proliferated from those organizations.  They don't really tell students anything useful about whether the institution is right for them as an individual or whether it will allow them to fulfil their life and career ambitions.

All the evidence suggests that the primary motivation for going into higher education is to enhance job prospects. Chegg's survey across 21 countries, INTO's research with agents and Gallup's polling all indicate that, for home and international students alike, a degree is largely a means to an end. That is not to say that people don't want to study something they enjoy – just that the credential, and what it unlocks, is the real aim.

Most existing rankings are, however, just an attempt to monetize industry data for commercial ends, and the sector collaborates, possibly because it's the way things have always been done.  The rankings, as someone said, "Xerox privilege" by reaffirming existing hierarchies, and they usually allow institutions to manipulate their data, sometimes to the point of criminality.  For the institutions they are vanity projects which lead to dubious internal resource allocation, avoid hard questions about graduate employability and distort the decision making of governments, funders and students.

Utility, on the other hand, is "the quality or property of being useful", and we may be beginning to see glimmers of media organizations developing data that is genuinely useful to students.  It is a timely and smart move because we are nearly at the point where AI will give students near-total, instant and fully personalized university search capability at their fingertips.  That should send a shudder through ranking organizations wedded to a business model and presentation based on early-2000s thinking.

Money magazine's Best Colleges 2023 may point the way.  It still has a vapid "star" system to allow colleges to be ranked, but the database begins to say some useful, student-oriented things about acceptance rate, tuition fees (both the headline price and, more importantly, the average actual price) and graduation rate.  Imagine if that database approach were married, in the US, to the work of a company like College Viability, LLC, which gives an insight into reasons why a college "…may not be financially viable for the time required to earn a degree from that college."  Then add to the mix comprehensive information on graduate outcomes and the career payback from specific degrees – the Princeton Review Best Value Colleges gives a flavour but still ends up as a ranking with limited coverage.

In the UK, the growth of private universities and the significant differences in postgraduate tuition fees between public universities make the approach equally appropriate.  Such a database would begin to answer the most pressing of student questions – will I get in, and with what grades; am I likely to graduate; and what are my career and earning prospects thereafter?  Plenty of further nuance could be added, including grades required, accommodation, measures of student experience and so on.
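
To make that concrete, here is a minimal sketch of the kind of query such a database could answer; every institution, field name and figure below is hypothetical and purely illustrative:

    from dataclasses import dataclass

    @dataclass
    class College:
        name: str
        acceptance_rate: float    # share of applicants admitted
        avg_net_price: int        # average actual price paid, not the headline fee
        graduation_rate: float    # share of entrants who complete their degree
        median_earnings: int      # median graduate earnings five years out
        financially_viable: bool  # solvency outlook for the length of a degree

    def shortlist(colleges, max_price, min_grad_rate, min_earnings):
        """Filter to a student's own criteria, best earnings first."""
        matches = [c for c in colleges
                   if c.financially_viable
                   and c.avg_net_price <= max_price
                   and c.graduation_rate >= min_grad_rate
                   and c.median_earnings >= min_earnings]
        return sorted(matches, key=lambda c: c.median_earnings, reverse=True)

    # Illustrative records only - not real institutions or data.
    colleges = [College("Alpha College", 0.62, 18_500, 0.71, 52_000, True),
                College("Beta University", 0.45, 24_000, 0.83, 61_000, True),
                College("Gamma Institute", 0.78, 15_000, 0.55, 39_000, False)]
    print([c.name for c in shortlist(colleges, 25_000, 0.70, 50_000)])

Running this returns ['Beta University', 'Alpha College'] – the student's own thresholds, not a publisher's composite formula, decide what surfaces.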

All of this could be done without the need for a grading system.  The problem with rankings is that the company doing the ranking sets an arbitrary test which institutions do their best to pass with a high grade.  This entirely excludes the student from having any input into the criteria, yet the results are then presented as an aspirational or emotional nirvana for them to consider.

A smart organization would ensure that its data collection is driven by the real-world needs and concerns of students. It's time to remove the worthies who make up the Advisory Groups and Panels of the major ranking organizations and find ways of engaging directly with potential students. The outcome would be relevant and dynamic, and would have utility for millions around the globe.

It would also be a driver for universities to engage more effectively with graduate employment, both through on-campus services and by establishing strong data on careers and jobs. Colleagues including Louise Nicol of Asia Careers Group and Shane Dillon of CTurtle have been demonstrating for years that smart use of technology makes it possible to move beyond antiquated, email-driven surveys when collecting graduate data. The answers might even begin to convince governments around the world that universities are engaging effectively and adding value to economic growth and sustainability.

McKinsey and many others have written about the personalization of the customer experience in retail, with much of the impetus coming from technology.  The insurance world has seen the rise and rise of aggregators, and there is talk of the "personalized insurance engine" that gives a fully automated customer journey.  Potential students are hungry for better decision-making options and education needs to catch up fast with the opportunities that exist.

Image by Steve Buissinne from Pixabay

Rankings Have No Deep Impact

It is increasingly difficult to take the Times Higher Education (THE) Impact Rankings seriously, but the tone-deafness, doublethink, obfuscation and self-delusion become ever more extraordinary.  The only blot on the comedic value of the Rankings is that they continue to highlight Russian and Afghan universities – one group in a country in thrall to a leader whose war has killed tens of thousands, the other in a country where women cannot enrol in higher education.  The SDGs were surely never intended to provide publicity and credibility for countries deliberately applying policies decried by the United Nations.

Same Old, Same Old

We are told that "The Impact Rankings are inherently dynamic…we expect and welcome regular change in the ranked order of institutions (and we discourage year-on-year comparisons)…".  Unfortunately, the THE corporate communications department didn't read the memo, because it announced that "Western Sydney University claims the Impact title for the second year running with a near-perfect score" – which sounds rather like a year-on-year comparison.  Further diminishing the sense of dynamism, eight of the top twelve are the same as last year.

Five of the 2023 top twelve have been in the top twelve since 2020.  The number would probably be higher, but King's College London, the University of Leeds and the University of Sydney, who were all in the top twelve in 2020, have since dropped out of the table completely.  There seems to be the possibility that some of the very best universities with strong SDG credentials are ignoring the Impact Rankings because they recognize the inherent weaknesses.

It should not be surprising that universities which choose to be part of this manipulable process are able to enhance their performance.  Universities are full of administrators and academics who are good at passing exams, so shame on Newcastle University and Hokkaido University for falling from eighth and tenth last year to 24th and 22nd in 2023.  Perhaps a new ranking should be based on calling out institutions that cannot maintain or improve their position on a yearly basis.

It is slightly bemusing that King Abdulaziz University was a non-runner in 2023 after being in fourth place the year before.  Is it possible that it could not find enough researchers willing to sign over a sufficient number of citations, or was the failure to come top too much to bear?  A related anomaly is that its International Rankings page features rankings from ARWU, QS and US News & World Report but nothing from THE.

Living In the Past

As previously noted, the data in the rankings is from 2021 – the era before ChatGPT, the Russian invasion of Ukraine and the implosion of Boris Johnson's premiership.  Unfortunately, this means that any student relying on the rankings to make judgements about institutions is going to be sadly misled.  Not that this matters to the way THE and enablers like Studyportals use rankings to monetize student eyeballs.

The most egregious example of the Impact Rankings' failure to keep up to date is the increase in the number of Afghan universities listed.  Going from two to three entrants is bad enough after a year in which they have followed their government's edict to prevent women going to university.  Two of the three have scores under SDG 5, which is specifically about Gender Equality and the aim to "Achieve gender equality and empower all women and girls", and, to add insult to grievous injury, their scores in that category are better than those of hundreds of other institutions.

It seems extraordinary that nobody at THE was paying sufficient attention to register the world's condemnation of the exclusion of women from education in Afghanistan.  As noted in previous blogs, it might be reasonable to think that the lack of women in the board ranks of THE and its owners contributes to this indifference.  It is, however, very difficult to think of a good excuse for the Advisory Board, which one might hope has some members with a broader perspective on justice, equity and decency.

From Glasnost to Skrytnost

It was all the rage to celebrate glasnost and perestroika in the 1980s, but openness and restructuring in Russia have long given way to autocratic rule and whim.  Maybe that's why THE treats some Russian university scores in the spirit of what the Washington Post termed "skrytnost" – derived from the Russian verb skryt, meaning "to conceal".   It is unacceptable for a ranking that trumpets its supposed transparency to offer no explanation for blanking Russian university scores for SDG 16, Peace, Justice and Strong Institutions.

It must be bad enough for the compilers that Russian universities continue to supply the single largest number of entrants to the Impact Rankings, but totally infuriating that many choose to be scored on their support for the very virtues the country currently seems to lack.  Unfortunately, THE seems to accept whatever is submitted, add it to the total, then blank it out as if it were some secret.  There is no explanation in the methodology, which only reminds everyone that the scoring itself is a matter of, um, autocratic rule and whim.

The continued presence of Russian universities in the league tables, and the way they are publicized as study destinations by THE Student, is another reminder that the entire premise of the tables is to commercialize data and sell consultancy rather than enrich the sector.  While the Ukrainians are on a counter-offensive to remove the aggressors from their lands, the Impact Rankings celebrate universities whose Rectors publicly endorsed Putin's war.  If that endorsement was based on a quick Russian victory, it is time to reconsider.

Reputation Bust?

For all the noise from those going up in the Impact Rankings, an analysis shows that only three of the top 12 institutions (Manchester, Arizona State and Alberta) feature in THE's own World Reputation Rankings.  This might suggest that academics see the Impact Rankings as a refuge for those who feel the need to please their governing bodies, not as a genuine marker of global quality.  It's a bit like football fans getting excited when their team wins the Europa Conference League while those supporting serial contenders for the Champions League are not so easily impressed.¹

Nobody expects the THE to give up on its money-go-round of league tables any time soon but it is remarkable that after five years most universities have declined to spend the time, effort or money to engage in the Impact Rankings.  One might argue this is because they recognize the dangers of being involved in a competition that is easily rigged and where the referees might just be willing to tip the scales a different way to create a headline.  The evidence suggests that absence does not impact the credibility of absentees at all.

Notes

1. For those who do not follow European football, the Europa Conference League is the third tier of European competition after the Champions League and Europa League.  With apologies to West Ham United fans, I would say it has much in common with any other conference – you go not knowing anything about the people you'll meet, you end up in many dreary rooms discussing irrelevant things and you return to a pile of work.  Football fans will know that in that sentence you can replace "people" with "teams", "rooms" with "stadiums" and "pile of work" with "relegation trouble".  If you're lucky you get a certificate of attendance (known as the Europa Conference League trophy).

Image by WikiImages from Pixabay

THE’s Russian Ranking Reprise

Despite a year of slaughter, destruction and probable war crimes in Ukraine, the Times Higher Education (THE) continues to turn its eyes away from the obvious step of excluding Russian universities from its rankings.  As the drumbeat starts for the launch of the 2023 Impact Rankings at the end of May 2023, THE has already announced that Russia will again have the most institutions taking part.  We are also told that they are "expecting data to come from a single academic year: 2021", so there would appear to be no chance of revulsion at an institution's support for unprovoked war, death and a refugee crisis having any impact on its ranking.

The Sustainable Development Goals are a decent and positive attempt to build a better world and universities are right to consider how they might play a part in that endeavour.  This makes it particularly unfortunate that the THE Impact Rankings have ignored the underlying principles to give continued encouragement to institutions that have backed Putin’s war.  There is even more to suggest how this distorted world view undermines the credibility of the rankings and the organization.      

Indifference and Inaction

The THE Chief Executive Officer expressed "solidarity with Ukrainian people" on behalf of the company in March 2022 and claimed, "we will allow the rankings to do what they are designed to do, and show the world the impact of those [Russian government] decisions…".  He conveniently forgot to mention that it would be years before the rankings reflected the impact of the war, and may even have hoped, in best WW1 jingoistic fashion, that it would all be over by Christmas.  Imagine if every other business, government and individual that has supported Ukraine through resources, funding, boycotts or direct action had decided to wait more than two years before doing anything.

He went on to say that “..we will of course keep the situation under constant review, and will not hesitate to take further steps if we believe it is necessary to do so.”  As far as one can see there has been no further action, no further statements and no further interest despite more than a year of bloodshed and atrocities.  In that respect, the Impact Rankings have become a monument to the indifference of the THE’s leadership.      

Lack of Transparency

Even THE doesn't seem able to stomach the notion of Russian universities parading their credentials on SDG 16 – Peace, Justice and Strong Institutions.  It is difficult to see any other reason why they would blank the scores for this SDG in the rankings of Russian institutions.  However, there is no explanation in the methodology as to whether the score still counts towards the overall ranking of the institution, whether it is zeroed, or whether there is some other fix.

When a senior data scientist at THE was asked to explain the methodology, no response was received.¹  It's not a very good look given the claims about the openness and integrity of the rankings, but it should be a timely reminder to every participant that the methodology is subject to the whims of the compilers.

Allowing Brand Endorsement

Meanwhile, Russian universities remain entirely content to maximise the publicity they get from featuring in the rankings. For example, Altai State University features its ranking, complete with blanked-out boxes on SDG 16, as part of its marketing.  Its corporate statement reflects glowingly on what it positions as "the third nomination, in which the university was awarded, was…Goal No. 16."

It seems beyond belief that THE cannot see that its logo, rankings and reputation are being used as an endorsement for Russian universities.  Neither does it seem to realize that league table endorsement is exactly what the Russian government requires of the institutions. The minimal efforts made by THE to reduce these bragging rights have manifestly failed and allow Putin's regime to claim a semblance of normality and acceptance in the world university community.

Promoting Russia as a Study Destination

THE continues to actively promote Russian universities, allowing easy, searchable access to 457 courses in the Russian Federation.  Courses from HSE University (shown here) are also publicized, along with those of many other Russian universities, by Studyportals, who act as a THE partner and facilitator in exploiting student eyes on league tables.  It is difficult to see how this is not contributing to Russia's continued success in attracting international students.

Hapless, Hopeless or Worse

It seems entirely reasonable for the Ukrainian group Progresylni to take any opportunity to understand how they can raise the profile of Ukraine and its fight for academic survival. We should all feel humbled by their willingness to look forward while facing a devastating attack on their country. The uncomfortable truth is that THE's unwillingness to act means the names of Ukrainian institutions in the rankings continue to stand next to those from an invading power that continues to build a reputation for crushing academic freedom.

In the Impact Rankings, Ukrainian institutions are outnumbered by around three Russian universities to one Ukrainian which, according to Statista, makes the ratio slightly better than the advantage Russia has in active soldiers.  With a single decision THE could allow Ukrainians to enjoy the rankings without the presence of the aggressors. A reformulation of a line from David Sedaris might suggest that these are circumstances where "humbled" can be found between "hapless" and "hypocrisy" in the dictionary.

Keeping Bad Company

Nobody really expects THE to give up on the money-go-round that is university rankings, and it may have already anticipated an end game in the war.  It could come down to a calculation of the odds on who prevails, or who will have the most university buildings left standing in the long term.  The needs of its private equity owner, Inflexion, may also make it seem important to keep the doors to revenue open for all possibilities.

What we do know is that the Impact Rankings are manipulable and there is an emerging consistency about those who most want to be involved.  The top three countries involved in 2023 will be Russia (92 institutions), Japan (91) and Turkey (84), with two sharing the distinction of having a so-called "hard man" at the top and all three being in the bottom 40% of the Academic Freedom Index. In the 2022 Impact Rankings the five countries with the most entries – Russia (94), Japan (76), Pakistan (63), India (61) and Turkey (57) – were all in the bottom 40% of the Index.

In Turkey (which is in the bottom 10% of the Index), President Erdogan signed a decree that allowed him to appoint a president to any university in the country, and did so at Boğaziçi University, which he claimed "failed to understand and incorporate itself to the nation's values."   He appointed Melih Bulu as president and, while protests erupted and students were arrested, "Bulu kept repeating his main promise of improving Bogazici's international university ranking…".  While Bulu was eventually removed,² it suggests how pernicious the rankings can be in creating a lever for politicians to ride roughshod over academic freedom.

Even in countries considered to be relatively liberal democracies, the rankings have become a dumbed-down touchstone for awarding visas in a way that is both vapid and discriminatory.  It is not too far-fetched to believe that rankings are already a vanity project for every wannabe dictator or authoritarian government that wants credibility on the world stage, and are becoming a simplistic measure for politicians to judge value in higher education.  It is, after all, much easier to expect universities to manipulate their rankings submission than to allow academics and students to build a liberal, challenging community where governments are critiqued and challenged.

NOTES

  1. The individual had been openly looking at my LinkedIn profile. After the message was sent they disappeared from view on my account. Strange behaviour.
  2. Before cheering the demise of Melih Bulu it’s worth noting that Professor Mehmet Naci Inci was appointed (by Erdogan) despite the opposition of 95% of the institution’s academics. In January 2022 he removed three deans of school for their part in protests then in August 2022 he suspended 16 academics who protested “..against presidentially appointed rectors at the school..”. In February 2023 an Istanbul court sentenced 14 Boğaziçi University students each to six months in prison for staging a protest over his appointment.

Image by Andrew Martin from Pixabay 

(What’s So Funny ‘Bout) Peace, Justice and Strong Institutions?

Readers might need some inspiration for trivia questions about the latest Times Higher Education Impact Rankings, so I thought I'd help out.  The Russian Federation had more universities represented than any other country in the 2022 rankings BUT in which SDG category are none of them listed?  It's a good test of whether anyone can remember all the SDGs but, for those who can, the unsurprising answer would be Peace, Justice and Strong Institutions.

This is despite the fact that 49* Russian institutions were listed in that category in the 2021 rankings.  It highlights the big problem when you allow universities to self-report and select which categories they enter: they can be really quite good at some things but ignore, or even oppose, others.  It is difficult to see how a Russian institution could be good at, say, "peace" when mentioning "war" or "invasion" can bring 15 years in prison and other serious penalties.  Strong institutions also come under serious pressure when Russia scores 5/40 for Political Rights and 14/60 for Civil Liberties according to Freedom House.

Universities can be enfeebled as institutions by political power, and the outcome can be that they even become agents of coercion and repression.  Examples include the Higher School of Economics restricting political activism on campus in 2020 and, more recently, reports of hundreds of students being expelled and of some students playing an active role in hunting down activist teachers.  The tightening of the Russian government's grip on senior administrative appointments and strategic direction is well documented, and one author has suggested that "controlling universities via rector appointment may serve as an instrument for controlling young minds."

THE Fails On Effective Action to Minimise Credibility, Prestige and Marketing

Also unsurprising is that the universities have continued to use their presence in the Impact Rankings to publicise themselves, despite the words of Times Higher Education Chief Executive Paul Howarth on 4 March that "we will be taking steps to ensure that Russian universities are not using branding or other promotional opportunities offered by THE until further notice."  Here's a snip from Peter the Great St Petersburg Polytechnic University showing how feeble that statement was and why THE should ban the institutions from the rankings.

It would be good to think that the full weight of THE's legal machinery might come crashing down on the Russian universities that are continuing to use the organization's logo and public properties for their own promotional purposes.  But, as we have seen in a previous blog, Studyportals, THE's partner in monetizing student mobility, also continues to promote the THE ranking of Russian universities.  The statement from THE looks increasingly like a cynical PR ploy to play for time and hope that nobody remembered the promises made.

Lilliput or Brobdingnag

In Jonathan Swift's Gulliver's Travels, Gulliver becomes a giant among the people of the island country of Lilliput on his first voyage because they are only six inches tall.  His second voyage takes him to a peninsula called Brobdingnag, where he lives with a farmer who is about 72 feet tall.  It is a reminder that there is a perspective to most things, and the Impact Rankings are worth considering in that respect.

So, another good trivia question might be: universities in which countries seem uninterested in the Impact Rankings?  A good answer might be the USA, where only 42 universities are shown, but even lower is China, where only 13 universities are featured.  The USA number is even down on last year's 45.

It is difficult to believe that the USA does not have more institutions than that with a strong record in sufficient SDG categories to make a bid for the top place.  As it is, only one of the 12 US institutions that rank in THE's own world top 20 seems to have taken part.  Neither of the Chinese universities in the world top 20 is mentioned in the Impact Rankings, and the three from the UK are also missing.

Professor Barney Glover of the table-topping Western Sydney University recognised the problem and commented that "there are too many of the very strong and powerful universities in the world that are not recognised" in the Impact Rankings.  His university's website doesn't go so far as to acknowledge that situation, or that less than half the universities in the World Rankings feature in the Impact Rankings.  But I think he may realize that WSU was visiting Lilliput on this occasion.

Writing in University World News, Dr Anand Kulkarni makes the point that, while the number of Indian universities participating grew, "what is also noticeable is that, unlike the World University Rankings, the famed Indian Institutes of Technology are not as prominent." Rankings expert Ellen Hazelkorn noted that the absence of many leading universities "may not be due to their poor(er) performance but rather their choice not to participate" and commented on the Impact Rankings' reliance on "self-reported and interpreted data".  If, as THE chief knowledge officer Phil Baty claims, the Impact Rankings are "redefining excellence in global higher education", it rather makes one wonder why THE don't have the courage of their convictions and drop their other league tables.

There is a tradition in the English Football Association Challenge Cup (the FA Cup) that there are qualifying rounds before the First Round Proper, when teams from the bottom two tiers of the professional leagues join.  The Third Round Proper then finally sees the teams from the top two tiers, including the Premier League, enter the competition.  The Impact Rankings look a little like selecting the winner of the FA Cup long before the Third Round Proper.

Not The Only Game In Town

This is not to argue that many of the institutions which enter aren't doing magnificent work in areas related to the SDGs.  But it does suggest that some institutions have recognized that the THE Impact Rankings are just another attempt to build rankings for commercial benefit and private equity gain, or are simply unwilling to undertake the extra administration for little return.  There are also other channels: Carnegie Mellon, Georgia Tech, Rice, Harvard and Northeastern do not feature in the Impact Rankings but were all mentioned in a recent United Nations Foundation blog highlighting innovative ways universities in the US are driving progress on the SDGs.

There is also increasing evidence that students are less interested in rankings and more focused on employability, while interest in the SDGs seems less evident.  THE's own research suggests that "only 16 per cent said they would choose a university that had a worse reputation for teaching and research if it had a better reputation for sustainability."  It may be that the refusal of significant numbers of universities to become involved is a sign that the merry-go-round of league table mania has passed its peak.

Note:

The title of this blog is a small nod to the classic tune "(What's So Funny 'Bout) Peace, Love and Understanding", written by Nick Lowe and originally released by his band Brinsley Schwarz in 1974.  It became more famous when recorded by Elvis Costello and the Attractions in 1978, but even then was only a B-side.  It has been played by many artists to reflect hope in troubled times and the message seems very pertinent right now.

 Image by Joan Cabras from Pixabay 

*As a note of clarification: in its Overall Rankings list, THE shows only an institution's top three SDG scores plus its SDG 17 ranking. It lists separately, presumably for all institutions registering a score in a specific category, the ranking for each individual SDG. Thus, in 2021, 27 of 75 Russian institutions had SDG 16 count towards their overall ranking, but 49 were listed as having a score in the SDG 16 category.
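
For anyone wanting to see how that structure behaves, here is a minimal sketch of an overall score assembled on the top-three-plus-SDG-17 basis described above. The 22%/26% weights reflect my reading of THE's published methodology and should be treated as an assumption:

    def impact_overall(sdg_scores):
        """Combine SDG 17 with the three best scores among the other SDGs.
        Weights (SDG 17: 22%; each of the top three: 26%) are an assumption
        based on my reading of THE's published methodology."""
        others = sorted((v for k, v in sdg_scores.items() if k != "SDG17"),
                        reverse=True)[:3]
        return 0.22 * sdg_scores["SDG17"] + 0.26 * sum(others)

    # A hypothetical submission: only the SDGs a university chooses to enter are scored.
    print(round(impact_overall({"SDG17": 80.0, "SDG4": 90.0, "SDG5": 70.0,
                                "SDG9": 85.0, "SDG16": 60.0}), 1))   # 81.3

The self-selection is the manipulable part: a weak SDG 16 score simply never enters the calculation if the university declines to submit it.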

Matryoshka Dolls for THE and Studyportals

The Times Higher Education made a big play about "solidarity with Ukraine, and our rejection of Russia's aggression" back on 2 March 2022 and said it would keep the situation under constant review.  Despite all that has happened since then, nothing seems to have changed in the THE response and it continues to promote Russian universities in its current league tables.  It seems to have taken no action at all to reduce the THE Student promotion of Russian universities, with detailed information about the institutions being routed through its partner in inaction, Studyportals.

It's a little like a Matryoshka doll (commonly known as a Russian doll), where the parts are nested inside each other so you can't see the entire thing at once. It starts when a student searches on the THE Student site and finds that there are 359 courses in the Russian Federation, all given equal prominence to any other course.

The curious student clicks a specific course, say Mechanical and Aerospace Engineering, and finds that they can study this in Russia.  The neat trick is that they have to go off to the Studyportals site to pursue their enquiries, so THE can presumably say it is not carrying specific publicity about the institution on its own site.

A month or so ago Studyportals said it was "terrified and upset at our core seeing the war in Ukraine unfold" but was careful not to say anything about taking action even to reduce the prominence of Russian universities on its website.  So here is the link from THE Student promoting study at Peter the Great St Petersburg Polytechnic University (SPbPU).

It probably goes without saying that Andrey I. Rudskoy, Rector of the University, is one of the signatories to the infamous statement from the Russian Union of Rectors saying it is "..very important these days to support our country, our army, which defends our security, to support our President…".

Rector Rudskoy also comments in an interview that league table rankings are "a marketing tool for attracting external audiences and working with academic reputation".  His position echoes that of the Russian government in its desire for credibility and prestige through rankings, so it is no real surprise to see Studyportals focus on its rankings position.

And then the whole circle becomes complete with the list of league table rankings, which Studyportals will be able to continue publicizing even if the Ukraine war goes on until next year, because THE is doing nothing to suspend Russian universities from its rankings. QS and US News & World Report also continue to promote the institutions.

To compound matters, Studyportals has no compunction about promoting Russia as a study destination.

The page goes on to tell us about Russians who have become globally famous. “From athletes like Anna Kournikova and Maria Sharapova, to composers like Rachmaninoff, Tchaikovsky, or Shostakovich, to great authors like Nabokov, and Dostoyevsky (and all other “-evsky”s, and “-ov”s and “-ova”s), Russia gave us of the most influential people in history.” 

Sharp-eyed observers will note that, of the two living people, one became a US citizen in 2010 and the other has lived in the US since she was seven.  The others are all dead, and there is a conspicuous failure to mention the Russian who is the most globally influential and notorious at the current time. The missing word in the sentence probably reflects the level of care and attention to detail, and does nothing to hide the flimsiness of the insight.

That's it really.  There are no notes on any of the pages to suggest that students might be wary of attending a country where they can be carted off to jail for using the word "war".  There is no reflection of the "assault on academic freedom" in Russia, which has accelerated in recent years, with universities having their licences suspended, students expelled, government control imposed on foreign academic collaboration and academics prevented from attending international conferences.

Matryoshka dolls are often carved to reflect a theme and embody the concept of an idea within an idea. The idea that THE Student and Study Portals seem to be capturing is that everything is normal and there is no reason to raise realities or suggest that anything is amiss. That seems quite wrong.

Making Music or Chasing Placing

When Simon Rattle was interviewed about his move from the Berlin Philharmonic to the London Symphony Orchestra he made the point, “There are a few great orchestras in the world, thank goodness. Although some people do put them in ranking order, it’s not like a snooker match. Each orchestra has different things to offer. In some ways these two orchestras are as different as you can imagine.”  He went on to comment that, “So many of the things I believe deeply in, including this idea of access for everybody, that education and growth should be at the centre of an orchestra, are exactly what the LSO have been doing.”  Universities share some characteristics with orchestras and access, education and growth should always come before rankings.

Regrettably, the University of Southampton's recently published strategic plan is a reminder that some universities are willing to treat the empty credibility of league tables as equal to the needs of students, communities and society.  However, my review of 50 UK university strategic plans* suggests that most are avoiding the temptation of using rankings as a measure of performance, with the President & Principal of King's College London even writing in a preamble to their plan, "This is not about league tables but about the real contributions we make to the world around us."  Some who have built their measurement around league table rankings are finding that their statements are not ageing terribly well.

University of Southampton

The University of Southampton has been good enough to leave the September 2021 Consultation Draft Strategy on its website, so it is possible to see how the final version developed a more bombastic tone that leaned towards rankings as a sign of success.  For example, the draft Purpose and Vision's rather modest "we aspire to achieve the remarkable" becomes the heroic "we inspire excellence to achieve the remarkable".  Even this is slightly less overstated than Queen Mary University of London's "the unthinkable, achieved".

A triple helix of Education, Enterprise and Research becomes more convoluted with the insertion of Knowledge Exchange (KE) in front of Enterprise to make it, more logically, a quadruple helix.  Research England's Knowledge Exchange Framework confirms KE as reflecting "..engagement through research, enterprise and public engagement.", so it could stand alone. One suspects that some enterprising (sic) apparatchik pointed out that you can't have a PVC Research and Enterprise without using the word (perhaps PVC Research and Knowledge Exchange would be a better option).

The draft suggests that the "suite of KPIs, should position us to achieve a stretching ambition of being a top 10 UK and pushing towards a top 50 internationally recognized university..".   There is much less room for doubt in the final version, where "..success will be Southampton positioned as a top 10 UK and towards a top 50 internationally recognised university..".  One oddity in all this posturing is that the University's website home page carries a statement about being a "Top 15 UK University; Top 100 in the World" but links to a page of rankings where it is shown as a Top 16 UK University. This is presumably because the University thinks the Sunday Times is more credible than the Complete University Guide (where it is 15th).

Not In a League of their Own

The University of Southampton is not on its own in having league table aspirations, and the table below shows others in the sample of 50 who are explicit about ranking being a strategic plan objective.  The point here is that if something is in the strategic plan you would expect a university to devote time, money and effort specifically towards achieving it.  That is quite different to prioritising what is best for the student, the community or the great global challenges.

Many universities focus on self-improvement through enhancing their performance in, for example, the National Student Survey or the Research Excellence Framework, or through measures such as financial stability, attrition rates and graduate outcomes. This seems more reflective and service-oriented than deciding to compete in myriad and meaningless 'best of' tables that have little direct relevance to students or staff. It is noticeable that universities in the Russell Group are more likely to cite rankings as a performance criterion, which suggests they may be a little insecure about their credentials to be in a Group that claims members as "world-class, research-intensive universities."

Several of those reviewed have, somewhat sneakily but probably wisely, left the provenance of their measurement to be chosen at the discretion of a future Vice-Chancellor. It is also relatively easy to sign off on a heroic objective if you know you will not be accountable for delivering it. Others have nailed their colours firmly to a specific mast and may regret it.

University – Statement in Strategic Plan
Lancaster – We will measure this goal by making further progress towards a top 100 position in key global rankings of universities.
Manchester – We will be recognised as among the best universities in the world, in the top 25 in leading international rankings.
Birmingham – Our aspiration to establish Birmingham in the top 50 of the world's leading universities.
Cardiff – We aim to remain in the world top 200 as measured by the QS World University Rankings, the Times Higher Education World University Rankings, the Academic Ranking of World Universities and the Best Global Universities Ranking, and in the top 100 of at least one of these. We aim to enter the UK top 20 in The Times and Sunday Times Good University Guide.
Durham – The Times/Sunday Times League Tables Top 5.
Bristol – By 2030, we will: be firmly established among the world's top-50 universities (draft).
Liverpool – …will be among the top 20 UK universities in the world rankings.
QUB – Be ranked in the top 175 in global league tables. Be a top 50 university for our global impact.
Surrey – Reach a top 15 position in appropriate national league tables; be in the top 100 position in global league tables.
Essex – In 2025 we will be recognised nationally (top 25 Times Good University Guide) and globally (top 200 Times Higher Education World Rankings).
East Anglia – …will focus on consolidating our position as a top 20 university in all of the main UK university league tables.

Cardiff's approach may have looked reasonable in 2018 when the strategy was launched and they were in the 101-150 grouping of the ARWU (they are now in the 151-200 group).  However, the most recent tables show they have failed to achieve a single top 100 international ranking, and their current Times/Sunday Times rank is 35.  The strategy runs until 2023 so there may still be time, and it's always possible to blame the pandemic, but the next iteration of their strategy may be slightly less prescriptive.

The University of East Anglia says, "We also recognise the importance of league tables and will focus on consolidating our position as a top 20 university in all of the main UK university league tables."  Regrettably, the most recent round of league tables finds them at 22 in the Complete University Guide, 41 in The Guardian, 26th in the UK in the THE World Rankings and the THE Table of Tables, and 27 in The Times/Sunday Times.  Not one top 20 place to consolidate as yet, but the strategy allows until 2030 to put things right.

One observation is that the University of Warwick, which seems obsessed with league table measurements on the front page of its website, does not explicitly suggest that success will be measured by them – its main claim seems to be that it will be 'larger than now'.   Another is that UCL is currently consulting on its 2022-2027 strategy as a contribution to "maintain the trajectory established by UCL 2034" and uses league tables to highlight issues as part of its discussion papers.  UCL's approach is rich in content and may be worth a review by anybody looking to write their own strategy or simply to understand this end of the higher education landscape.

The Things They Say

No review of Strategic Plans would be complete without reflecting briefly on the tendency to reach for the most hyperbolic forms of expression to convey even the simplest of ideas.  It is as if the universities are writing the higher education version of the September Dossier rather than setting out a sober-minded and responsible plan. For some there is a reflex to state the blindingly obvious as if it were the musings of a Zen master:   

University of Exeter – Together we create the possible

University of Warwick – Excellence with purpose

University of Strathclyde – The place of useful learning

While, occasionally, there are some phrases which just feel, um, worth recording:

University of York – collaborating unconventionally

Leicester University – we don’t want to make a negative impact

Summary

There is increasing evidence that students consider other factors more important than league tables, so for universities to make them a key measure seems more about internal vanity than external need. INTO University Partnerships claimed recently that research shows "Gen Z students have adjusted their focus from rankings to outcomes amid COVID-19", and even Universities UK has got round to suggesting eight "core metrics" which could easily form the basis for both degree and institutional measurement. Regrettably, this has not stopped some relative newcomers to the rankings party presenting machine learning and AI as the answer to achieving transparency, objectivity and non-gameability, so the merry-go-round continues.

Making league table positions a measure of university strategy puts marketing before meaning, or Style Over Substance (a new SOS for the sector).  I have discussed the most obvious failings in "Keep Your Virtue…Ignore the Reputation Rankings" and "Rank Hypocrisy", and it is good to see that most of those reviewed seem to recognize the vacuousness of this form of measurement.  To place ranking as a strategic ambition diverts time, energy and money away from delivering results for students, partners and the great global challenges.

NOTES

* The review of 50 University strategic plans considered documents publicly available on their websites. A combination of search mechanisms and text review was used to determine if league table rankings were specifically and meaningfully mentioned as an objective of the plan. A number of strategic plans reviewed mention current league tables in their text but do not elevate them as a specific strategic objective. The author is pleased to consider any authoritative challenges to the material identified and will post updates/corrections if they prove to be valid.

**The review was based on the documents identified as the main strategic plan of the university in question. It is recognized that operational plans at theme level (e.g. research) or school-of-study level (e.g. biological sciences) may suggest objectives related to league table rankings.

***The review focused on references to league table rankings identified as the THE World University Rankings, ARWU, QS University Rankings, the main UK newspaper rankings (e.g. Times/Sunday Times, Guardian etc.) or, more broadly, as, for example, "key global rankings".

Keep Your Virtue*…Ignore the Reputation Rankings

If Government took an "experienced, published" scholar and said they would have to spend the next 21 years working only on a vanity project, the higher education community would be outraged at the waste of time, intellect and potential.  But that's what the THE Reputation Rankings 2021 appear to be doing by having around 11,000 such scholars submit views on which other universities are good at research and teaching.  It's only the beginning of the problem for another dreary day in the self-interested world of the rankers.

We are told that academics are hard-pressed and facing mental burnout, but that doesn't stop THE taking their valuable time to seek opinions about other institutions.  If each of the 10,963 respondents spent half a working day on their submission, that would be 5,481 days.  Dividing that by 260 working days a year (a five-day week times 52, with no holidays) suggests THE may have taken more than two decades of academic time away from scholarship that could save humanity.
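
The back-of-envelope arithmetic is easy to check:

    respondents = 10_963
    total_days = respondents * 0.5        # half a working day per submission
    years = total_days / (5 * 52)         # 260 working days a year, no holidays
    print(total_days, round(years, 1))    # 5481.5 days, about 21.1 years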

Twisted by Knaves to Make a Trap for Fools

Despite, or perhaps because of, all that effort, it comes as a yawn-inducing revelation that the top five in 2021 are exactly the same universities as the top five in both 2020 and 2019.  The procession of Cambridge, Harvard, MIT, Oxford and Stanford – placed alphabetically here because the whole notion of ranking them seems ludicrous – continues.  In a world where the purpose of higher education, its value and its relevance are all under question, this parade of hierarchy seems irrelevant.

I wonder how Harvard’s reputation really looks to people who agree with Scott Galloway that higher education is becoming “the enforcer of a caste system” where “elite institutions no longer see themselves as public servants”.  Or Michael Crow, President of Arizona State University, when he says that HE is increasingly a system “where status is derivative of exclusion”.  Or to those who have listened to Malcolm Gladwell’s forensic dissection of the US News rankings, where he notes that they are “a self-fulfilling prophecy”.

There should also be an enquiry into the fact that the California Institute of Technology (Caltech) is only placed at 14 in the Reputation Rankings.  On THE Student the compilers tell thousands of candidates that Caltech is the best university in the US; QS ranks it 6th in the world, it is number 9 in US News and number 2 in THE's own World Rankings.  There are several other examples of inconsistencies which confirm that the whole exercise isn't really about understanding or reflecting excellence.

One might guess that the opportunity to reach out and flatter 10,963 potential readers and event attendees by asking for their opinion is a primary motivator of the approach taken.  But for THE to then claim the rankings are "based on the opinions of those who truly know" seems typically hyperbolic and ill-founded.  Donald A. Norman is quoted as saying that "academics get paid for being clever, not for being right" – an alternative view worth considering.

Fill the Unforgiving Minute  

The 21-year estimate does, of course, presume that the academics involved took seriously the weighty task of considering the reputational implications for thousands of their colleagues at thousands of universities.  Half a day hardly seems long enough to do the job properly, but some cynics might suggest it was more likely half an hour over a cup of coffee.  Plenty of time to see scores settled, biases reinforced and best friends rewarded.

Even half an hour for each submission would be about two-and-a-half years of time stolen from saving the world, and raises equally good questions about the value of the exercise.  Each academic submitted around 14 names on average which, in 30 minutes, means they would take about two minutes to weigh and consider each nomination.  It's less time and attention than most people spend on selecting their top ten favourite party records for the Christmas playlist.
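
The same quick check works for the more cynical scenario (the eight-hour working day is my assumption):

    respondents = 10_963
    total_hours = respondents * 0.5          # half an hour per submission
    years = total_hours / 8 / (5 * 52)       # 8-hour days, 260 working days a year
    print(round(years, 1))                   # about 2.6 years
    print(round(30 / 14, 1))                 # about 2.1 minutes per nomination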

Make Allowance for their Doubting Too

The reputation rankings methodology** specifically gives "more weight to research".  This is not because research is intrinsically more important but because "our expert advisers have suggested that there is greater confidence in respondents' ability to make accurate judgements about research quality".  It is interesting to read that THE's advisers think academics can't really be trusted to review the teaching quality of their peers.

Pity the poor students who believe the Reputation Rankings have much to do with the teaching they might receive.  The methodology places a 2:1 premium on the score for research reputation over the score for teaching reputation.  This gives some idea of the importance that THE attributes to teaching in establishing an institution's standing, and of the extent to which academics are contributing to perceptions about their priorities.
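
If the 2:1 premium simply means the overall reputation score blends the two vote shares as two parts research to one part teaching – my assumption about how the ratio is applied, not THE's published formula – the effect is easy to see:

    def reputation_score(research, teaching):
        """Blend research and teaching reputation 2:1 in favour of research.
        An assumed combination rule, not THE's published formula."""
        return (2 * research + teaching) / 3

    # A research powerhouse with weak teaching beats the mirror-image institution.
    print(reputation_score(90, 30))   # 70.0
    print(reputation_score(30, 90))   # 50.0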

One Heap of All Your Winnings

It also seems that the eventual ranking is driven as much by volume as by quality.  Respondents are asked simply to name, rather than rank, up to 15 institutions which are best at research and teaching in their specific subject discipline.  But the number one institution "is the one selected most often" as being one of the best in its field.
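
In other words, the mechanism appears to be a straight tally of unranked mentions, something like:

    from collections import Counter

    # Each ballot is an unranked list of up to 15 institutions (hypothetical names).
    ballots = [["Uni A", "Uni B"], ["Uni A", "Uni C"], ["Uni A"], ["Uni B"]]
    tally = Counter(name for ballot in ballots for name in ballot)
    print(tally.most_common(3))   # [('Uni A', 3), ('Uni B', 2), ('Uni C', 1)]

A mention is a mention: the tally has no notion of who cast it or how carefully.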

It doesn't seem to matter if the twenty most eminent researchers in each field believe your university is the best.  You will not be top if enough other "experienced, published" researchers pick the place where they are angling for a job, enjoy visiting, or where the overlord of research is a personal friend.  There is no indication in the methodology of any weighting to account for the ability of respondents to offer a well-informed judgement.

Or Being Lied About Don’t Deal In Lies

However, there are adjustments to ensure that the ranking represents the global distribution of scholars.  Participants are selected randomly from Elsevier's SCOPUS database, which can presumably create a sample in line with the global distribution of academics.  Since the responses do not reflect the UNESCO distribution of scholars, they have to be weighted.

Engineering over-performed with 15.8% of respondents and had to be levelled down to a weighted 12.7%, while Business and Economics and Arts and Humanities had to be levelled up from 8.2% and 7.7% of respondents to 13.1% and 12.5% respectively.  Maybe it's just that engineers like questionnaires and giving their opinion, but it would be nice to think that some scholars are too dismissive of the process to join in.
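
A minimal sketch of how such weights might be derived, assuming the standard post-stratification approach of scaling each discipline's responses by the ratio of target share to observed share (the shares are the ones quoted above):

    # Observed share of respondents vs target (UNESCO) share, per discipline.
    observed = {"Engineering": 0.158, "Business & Economics": 0.082,
                "Arts & Humanities": 0.077}
    target = {"Engineering": 0.127, "Business & Economics": 0.131,
              "Arts & Humanities": 0.125}

    weights = {d: target[d] / observed[d] for d in observed}
    for discipline, w in weights.items():
        print(f"{discipline}: each response counts {w:.2f}x")
    # Engineering: 0.80x (levelled down); the other two: 1.60x and 1.62x (levelled up)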

If the argument is that the Reputation Rankings are only a game and for entertainment, then academics might consider how wise it is to waste their time on something as meaningful as voting on The X Factor or Dancing with the Stars.  But if the rankings are intended to inform politicians, funders, philanthropists and business about quality, they carry the danger of reinforcing bias while devaluing other institutions and their students.  Next time the email from THE pops up in the inbox it might be a good moment to press delete and get on with doing something important.

NOTES

*Part of the title and all the sub-headings are fragments from Rudyard Kipling's wonderful and inspirational poem 'If'

** The page describing the methodology is, in my view, neither clear nor well written. I would be happy to clarify or correct any areas where my interpretation or understanding is incorrect on the advice of an authoritative source.

Image by Gerd Altmann from Pixabay