The Times Higher Education (THE) Impact Rankings are designed to measure the extent to which universities are working towards fulfilling the United Nations’ Sustainable Development Goals (SDGs). The SDGs are not focused on higher education, but they provide a shared blueprint for strategies that seek to improve health and education, reduce inequality and foster economic growth, while addressing climate change and working to preserve our oceans and forests.
Across the world, decision-makers and policy wonks have embraced the SDGs as a symbol of action and political discourse in times of uncertainty and increased inequality, and in the face of the many challenges we now confront.
It also means that many people in professional services are seeking to capitalise on the SDGs through consultancy, assessment and evaluation services. It is therefore useful to keep these competing interests in perspective so that institutions can fully address the SDGs and universities can distil their impact on society.
Last year THE inaugurated its Impact Rankings, which assess how universities are meeting the SDGs.
This year’s edition, published in late April, includes 767 institutions from 86 countries that submitted data on at least four SDGs. Compared with last year, this represents a 38% increase in the number of participating institutions.
Expanding SDG coverage: From 11 to 17 SDGs
For this year’s edition, THE expanded coverage from 11 to all 17 SDGs, developing 105 metrics and 220 measurements in total. This has been a significant undertaking on THE’s part, given that the tier classification for global SDG indicators contains more than 230 measures.
The team at THE must be very pleased with the system they have developed over the past two years, but we also need to observe how THE responds to experts’ calls for accountability.
There were 164 institutions globally (21% of all participants) that submitted data and evidence on all 17 SDGs. As observed last year, preparing a submission for this ranking is a significant endeavour for institutions because of the copious amount of information required.
It also requires that the teams working on the submission are diligent, taking every care to compile evidence and to ensure it is publicly visible on the web; failure to do so means losing points and settling for an average result.
Unlike other rankings, the THE Impact Rankings reward months of planning and preparation, combined with optimal resourcing and the ability to work across functional groups that often have competing priorities. Unfortunately, institutions with fewer resources cannot afford full participation and need to moderate their expectations of how well they are likely to perform in this ranking.
Of the 164 institutions which were able to submit data on all 17 SDGs, 95 were from high-income economies, largely drawn from the East Asia and the Pacific and Western Europe regions. There were 124 institutions from lower middle-income economies which submitted data on at least four SDGs and which were therefore given an overall rank. Half of these submitted data for four to six SDGs.
Of the universities ranked in the global top 200, 82% were universities from high-income economies, primarily from the regions of East Asia and the Pacific, Western Europe and North America.
Half of the universities from Latin America are ranked in the 301-500 range, while most universities from the Arab states and the Central and East European regions rank outside the top 400.
This ranking, like all other schemas, highlights equity issues around performance, and we need to emphasise that rankings should be viewed in context, ideally on a regional rather than a global basis.
Benchmarking
There is no doubt that institutions participating in the THE Impact Rankings are equipped with invaluable information to benchmark themselves against other institutions on any SDG once the results are published.
There is a catch, though. To make optimal use of the results, institutions need a paid three-year subscription to THE’s dashboard, which enables them to benchmark against other institutions (regardless of geography or standing) on any metric within the SDGs. Alternatively, THE publishes an online list of institutions ranked on each SDG, but it does not provide a breakdown by metric.
The results of the rankings can be used by institutions for a variety of purposes, for instance: raising awareness of the SDGs; developing a roadmap for ongoing improvement; embedding the SDGs in every facet of university activity; and ensuring university strategy aligns with the SDGs.
An alternative model for institutions which choose not to participate is to take the United Nations’ Tier Classification for Global SDG Indicators as a guide, adapt these for their own assessment and seek to partner with other interested institutions as part of an inter-institutional benchmark.
Preferred SDGs
Leaving aside SDG 17 (partnerships for the goals), which is the only compulsory SDG for institutions, some SDGs attract more interest than others.
In both editions of this ranking, we observe that SDG 4 (quality education), SDG 3 (good health and well-being), SDG 9 (industry, innovation and infrastructure) and SDG 5 (gender equality) attracted the greatest number of submissions from institutions. It is not surprising that the fewest submissions were for SDG 14 (life below water), SDG 15 (life on land) and SDG 6 (clean water and sanitation).
The selection of which SDGs to submit to THE in part reflects an institution’s profile, mission and discipline strengths. Geographical location also plays a part in the chosen SDGs.
For example, universities from Latin America were more prominent in SDG 1 (no poverty), SDG 2 (zero hunger), SDG 3 (good health and well-being) and SDG 4 (quality education), while European universities were less enthused about SDGs 1 to 3, but more prominent in SDG 8 (decent work and economic growth), SDG 16 (peace, justice and strong institutions) and SDG 12 (responsible consumption and production).
Year on year comparison
Any year-on-year comparison of the results needs to be treated with caution for the following reasons: first, the number of participating institutions increased by 211 (or 38%), from 556 in 2019 to 767 in 2020; second, the number of SDGs covered rose from 11 to 17, increasing the number of metrics ranked; third, within some SDGs, measures were added or revised and weightings were redistributed.
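For readers who want to verify the participation figures cited above, a minimal sketch (using only the 2019 and 2020 counts reported in this article) shows how the growth numbers are derived:

```python
# Participation counts reported in the article.
prev, curr = 556, 767  # institutions ranked in 2019 and 2020

increase = curr - prev                 # 211 additional institutions
pct = round(100 * increase / prev)     # rounds to 38 (percent growth)

print(f"{increase} more institutions, {pct}% growth")
```

Note that the percentage is computed against the 2019 baseline, which is why 211 additional institutions corresponds to roughly 38% growth.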
Bearing these methodological differences in mind, we can see that some institutions maintained their relative standing from last year, others saw a rapid rise in their standing, and some new entrants rocketed to the top.
In the first group, we see that New Zealand’s University of Auckland ranks first overall for a second consecutive year, while seven other institutions maintained their place among the world’s top 20.
They are: Western Sydney University (third, up from 11th), University of Bologna (sixth, up from ninth), University of British Columbia (seventh, down from third), University of Manchester (eighth, down from third), King’s College London (ninth, down from fifth), University of Waterloo (16th, down from 13th) and McMaster University (17th, down from second).
We also see that new entrants La Trobe University and Arizona State University (Tempe) rocketed to rank fourth and fifth, respectively. Meanwhile, the University of Sydney, which ranked 25th last year, ranks second this year and RMIT University moved from 82nd to 10th overall.
Comparison with THE World University Rankings
Of the institutions ranked in the Impact Rankings, 32% are not included in THE’s 2020 World University Rankings. Most of the institutions that appear in both rank outside the world’s top 400 in the Impact Rankings, and only 82 universities can be found in the top 300 spots of both rankings.
So, for many institutions, participation in the Impact Rankings seems to be a vehicle for gaining visibility at a global level.
The decision of many universities to participate in this ranking is also driven by the fact that research metrics account for about 27% of the overall score, considerably less than their weight in the World University Rankings.
Time will tell to what degree institutional variability in the Impact Rankings persists. Let us hope institutions from middle-income economies see an improvement in their standing in future editions.