Times Higher Education World University Rankings

There are many reasons to distrust or ignore the various league tables of university performance. The rankings are strongly affected by small changes in scores, and the measures used favour English-language, larger, longer-established, research-based universities. Yet the rankings may influence the choices of international students, and the reputations they suggest may affect patterns of staff recruitment. If we look at universities within Ireland, at least some of these problems are less serious. Furthermore, for some of the ranking schemes we now have a relatively long run of data, fairly decent information about how the scores are produced, and publication of some of the basic data for each university.

The rankings produced by the Times Higher Education were published earlier this week. [1] I summarise in Tables 1 and 2 below the returns for the Irish institutions. Trinity College Dublin, historically the highest-ranked Irish university, was excluded from the THE tables for 2015-16 and 2016-17 after the university was found to have inadvertently submitted incorrect data. [2]


Table 1. Some basic data and the overall ranks, THE World University Rankings

2016-17   Students   Student:staff   International   Female:male   Rank
                     ratio           students        ratio
TCD*      15,521     18.0            25%             57:43         –
NUIG      14,067     26.8            14%             56:44         201-250
QUB       17,940     17.9            30%             54:46         201-250
RCS        2,473     15.6            63%             55:45         201-250
UCD       22,193     24.5            23%             –             201-250
NUIM       7,653     28.0            11%             55:45         351-400
UCC       15,805     22.3            15%             55:45         351-400
DCU        8,546     22.9            17%             51:49         401-500
UL        12,212     19.8            13%             46:54         501-600
UU        19,622     15.8            15%             55:45         501-600
DIT       13,839     18.9            25%             41:59         601-800

(TCD – Trinity College Dublin, data are for 2014-15; NUIG – National University of Ireland, Galway; QUB – Queens University Belfast; RCS – Royal College of Surgeons, Dublin; UCD – University College Dublin; UCC – University College Cork; NUIM – Maynooth University; DCU – Dublin City University; UL – University of Limerick; UU – Ulster University; DIT – Dublin Institute of Technology)


From Table 1 we can see that, compared both with universities of comparable or higher rank and with those ranked lower, Maynooth has the most unfavourable staff:student ratio on the island, and also the lowest proportion of international students. In terms of its rank, Maynooth sits with University College Cork: below Trinity College Dublin, NUI Galway, Queens University Belfast, the Royal College of Surgeons and University College Dublin, but above Dublin City University, the University of Limerick, Ulster University and Dublin Institute of Technology.

One might expect larger institutions (many graduates in many places, and thus more likely to get a bounce from alumni) and longer-established institutions (several generations of alumni plus greater general visibility) to do well on questions about reputation. Recognising the advantages of the longer-established institutions, the Times Higher Education also produce a separate list of the 150 highest-ranked universities under 50 years of age, and last year Maynooth was the highest-placed Irish university on this list (see Table 2; the list for 2016-17 is yet to be published). [3] The methodology for measuring Teaching and Research on this score appears to differ slightly from that reported in the global rankings for 2016-17 (or perhaps innovations made in 2016-17 for the World University Rankings have yet to be applied retrospectively to the 2011-12 to 2015-16 editions of the 150 under 50 listing). With respect to the other young Irish universities, Maynooth's most significant advantage comes from its citation score.


Table 2. Irish universities in the Times Higher Education list of 150 universities under 50

2015-16                 NUIM    DCU     UL        UU
Overall                 41.1    39.2    –         –
Teaching                31.6    34.4    25.9      24.9
International Outlook   75.6    76.7    76.5      79.0
Industry Income         36.6    51.7    34.0      29.2
Research                33.4    36.4    23.9      19.1
Citations               50.2    36.4    36.6      44.1
Rank                    68      79      101-150   101-150


But to offer any interpretation at all, we should familiarise ourselves a little more with the data and what they measure. [4] Times Higher Education constructs its score from five sets of indicators: Teaching, Research, Citations, International Outlook, and Industry Income. The basic results for 2016-17 are given in Table 3.


Table 3. The component indicators in the overall score, THE World University rankings

2016-17   Teaching   International   Industry   Research   Citations
                     outlook         income
TCD*      39.4       83.9            30.6       30.3       77.6
NUIG      27.7       78.1            41.4       27.8       76.7
QUB       31.2       94.3            37.1       31.9       78.3
RCS       35.5       89.5            37.1       18.6       81.7
UCD       31.5       88.1            37.4       37.8       72.5
NUIM      25.5       77.6            37.5       24.2       61.0
UCC       27.7       76.0            50.6       25.7       54.1
DCU       26.7       78.6            46.7       26.7       42.7
UL        18.5       82.6            36.0       19.0       44.3
UU        18.7       70.7            33.1       15.3       49.5
DIT       15.0       72.6            32.1       10.4       27.7


Maynooth’s component scores are more or less consistent across the five areas, but it is also possible to examine these scores over the past half-dozen years to see whether there is anything interesting to say about trends.



Figure 1. Teaching scores, Times Higher Education, World University Rankings


For the overall university score, the THE give a 30% weighting to a variable they call ‘Teaching (the learning environment).’ The 30% is derived as follows: reputation survey (15%), staff-to-student ratio (4.5%), doctorate-to-bachelor’s ratio (2.25%), doctorates-awarded-to-academic-staff ratio (6%), and institutional income (2.25%). For Maynooth, Figure 1 is not too reassuring (Maynooth was missing from the rankings for 2014-15, hence the gap and missing value for 2015). All universities ranked higher than Maynooth had better Teaching scores, whereas five years earlier Maynooth had a better Teaching score than either University College Cork or NUI Galway.
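Because the component weights are expressed as percentage points of the overall score, a few lines of arithmetic make the structure of the pillar clear. The sketch below (in Python; the weights are those reported above) confirms that the five components sum to the pillar’s 30%, and that the reputation survey alone accounts for half of it:

```python
# Component weights (percentage points of the overall score) for the
# THE 'Teaching (the learning environment)' pillar, as reported above.
teaching_weights = {
    "reputation survey": 15.0,
    "staff-to-student ratio": 4.5,
    "doctorate-to-bachelor's ratio": 2.25,
    "doctorates-awarded-to-staff ratio": 6.0,
    "institutional income": 2.25,
}

total = sum(teaching_weights.values())
print(total)  # 30.0 -- the Teaching pillar's share of the overall score

# Within the pillar, the reputation survey dominates:
share = teaching_weights["reputation survey"] / total
print(share)  # 0.5 -- half of the Teaching pillar is reputation
```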

Much of this Teaching score comes from the 10,323 individuals who responded to a request from the publisher Elsevier and agreed to report which universities, in which countries, they understood to have a good reputation for teaching.

The questionnaire, administered on behalf of THE by Elsevier, targets only experienced, published scholars, who offer their views on excellence in research and teaching within their disciplines and at institutions with which they are familiar.

The 2016 survey was carried out between January 2016 and March 2016, and received a total of 10,323 responses from 133 countries. Contact details of scholars are drawn from Elsevier’s extensive database of published journal article authors. [5]

The questionnaire does not seem to have been published but there is a broad description given and from it we learn that the Reputation Survey assesses teaching in the following manner:

We ask all respondents to identify the best teaching institutions in their field of expertise. Additionally, those who indicate that teaching accounts for the “highest percentage of time spent” are later asked to identify the one institution they would recommend that a student attend “to experience the best undergraduate and/or graduate teaching environment” in their subject area. [6]

Most of the Irish universities in Figure 1 show some improvement over the period 2012-17. It seems plausible that Maynooth’s falling below Cork and Galway over the period is related to its reporting the most unfavourable student:staff ratio of any Irish institution (28.0; see Table 1 above). It would be interesting to know the number of doctorates awarded by field for each Irish university. For the Republic of Ireland, since 2011 the Higher Education Authority has reported only the collective category ‘Postgraduate certificates, diplomas, masters and PhD degrees awarded in Universities.’ About a tenth of these are doctorates, so we cannot use these data to examine the ratio of doctorates to academic staff or to the total number of students for each institution. A closer proxy may be produced from the HEA statistics on full-time registrations. [7] The proportion of full-time students who are studying for a PhD is perhaps indicative of what the THE wanted to register: ‘a sense of how committed an institution is to nurturing the next generation of academics [… and] suggests the provision of teaching at the highest level that is thus attractive to graduates and effective at developing them.’ [8]


Table 4. The proportion of full-time students studying for doctorates (1 March each year)

        2012    2013    2014    2015
TCD     0.128   0.111   0.100   0.095
NUIG    0.065   0.062   0.056   0.068
UCD     0.082   0.072   0.064   0.062
NUIM    0.049   0.044   0.041   0.038
UCC     0.068   0.069   0.067   0.066
DCU     0.065   0.057   0.051   0.047
UL      0.060   0.057   0.056   0.059
DIT     0.021   0.017   0.013   0.012

It would appear that universities in the Republic of Ireland have seen a decline in the share of their student body who are studying for doctorates. Maynooth has a smaller share of its student body undertaking doctoral research than two of the other younger universities, exceeding only the Dublin Institute of Technology, a ranking that is stable over the four years shown in Table 4 above.
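The decline can be checked directly against Table 4. The sketch below (values transcribed from that table) computes the change in doctoral share between 2012 and 2015 for each institution:

```python
# Doctoral share of full-time students, 2012 and 2015 (from Table 4).
share_2012 = {"TCD": 0.128, "NUIG": 0.065, "UCD": 0.082, "NUIM": 0.049,
              "UCC": 0.068, "DCU": 0.065, "UL": 0.060, "DIT": 0.021}
share_2015 = {"TCD": 0.095, "NUIG": 0.068, "UCD": 0.062, "NUIM": 0.038,
              "UCC": 0.066, "DCU": 0.047, "UL": 0.059, "DIT": 0.012}

for uni in share_2012:
    delta = share_2015[uni] - share_2012[uni]
    print(f"{uni}: {delta:+.3f}")
# Every institution except NUIG (+0.003) shows a fall over the period,
# with the sharpest declines at TCD, UCD and DCU.
```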



Figure 2. Research scores, Times Higher Education, World University Rankings


A variable that the Times Higher Education calls Research (volume, income and reputation) contributes a further 30% to the overall score, and it is made up as follows: reputation survey (18%), research income (6%), and research productivity (6%). We have already met the reputation survey; for this part of the score, respondents are asked to indicate the leading universities in their fields. In the main these fields are very broad: ‘arts and humanities; business and economics; clinical, pre-clinical and health; computer science; engineering and technology; life sciences; physical sciences; and social sciences.’ [9] Respondents can also indicate the leading institutions in their own specific sub-field within these broad categories. Data on research income are collected for these broad areas, and the score for a university reflects a weighted sum of its normalised scores, so that a good performance in social science research funding is not swamped by a mediocre performance in engineering and technology, even though the latter might represent a larger sum of money. Research productivity is based on publications:

To measure productivity we count the number of papers published in the academic journals indexed by Elsevier’s Scopus database per scholar, scaled for institutional size and normalised for subject. This gives a sense of the university’s ability to get papers published in quality peer-reviewed journals. [10]

Scopus is rather poor at capturing publications in the humanities and social sciences. In one study of the publications of 146 senior academics, Google Scholar found 54% more papers than Scopus for scholars in the life sciences, 48% more in the sciences, and 39% more in engineering. [11] For the social sciences it found 238% more, and for the humanities 343% more. Normalising the scores for each subject will help, but there is a potential circularity in arguing that you want to measure research productivity as the ability to get into the journals that Scopus happens to cover, even if that is glossed as ‘quality peer-reviewed journals.’ This shows the importance of one bibliometric tool (Scopus) for research rankings. From my own list of publications: my two books are included, but none of my four edited books, only five of forty-two book chapters, nineteen of forty-one single-authored articles, one of four joint-authored articles, and nine of thirteen review articles. Among the omissions are some open-access journals, most book chapters, and all edited books.
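Totting up that personal list gives a rough sense of the coverage gap. The counts below are taken directly from the paragraph above:

```python
# Scopus coverage of one academic's publication list (counts from the text):
# each entry is (items covered by Scopus, total items of that type).
coverage = {
    "books": (2, 2),
    "edited books": (0, 4),
    "book chapters": (5, 42),
    "single-authored articles": (19, 41),
    "joint-authored articles": (1, 4),
    "review articles": (9, 13),
}

covered = sum(c for c, _ in coverage.values())
total = sum(t for _, t in coverage.values())
print(f"{covered}/{total} = {covered / total:.0%}")  # 36/106 = 34%
```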

The data shown in Figure 2 display a good deal of volatility, but it is clear that Maynooth is not improving its standing with respect to higher-ranked Irish institutions. Without knowing the precise way normalised subject scores are weighted and aggregated into a university score, it is hard to say what mix of grant income, publication productivity and subject mix might be driving this result.
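While the THE’s exact aggregation formula is not published, the per-subject normalisation described above can be illustrated in outline. The sketch below is an assumption-laden illustration of the general technique, not the THE’s actual method: the institution names, income figures and subject weights are all hypothetical, and min-max rescaling stands in for whatever normalisation the THE actually applies:

```python
# Sketch of per-subject normalisation before aggregation: each subject's
# research income is rescaled against that subject's own range across
# institutions, so a strong social-science performance is not swamped by
# larger absolute sums in engineering. All figures are hypothetical.
incomes = {
    "social sciences": {"Uni A": 2.0, "Uni B": 1.0, "Uni C": 0.5},   # EUR m
    "engineering":     {"Uni A": 5.0, "Uni B": 40.0, "Uni C": 20.0},
}
subject_weights = {"social sciences": 0.5, "engineering": 0.5}  # hypothetical

def min_max_normalised(subject_scores):
    """Rescale one subject's scores to [0, 1] across institutions."""
    lo, hi = min(subject_scores.values()), max(subject_scores.values())
    return {u: (v - lo) / (hi - lo) for u, v in subject_scores.items()}

scores = {uni: 0.0 for uni in incomes["social sciences"]}
for subject, weight in subject_weights.items():
    for uni, norm in min_max_normalised(incomes[subject]).items():
        scores[uni] += weight * norm

print(scores)
# Uni A tops social sciences despite a tiny engineering income, so after
# normalisation the two subjects contribute on an equal footing.
```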



Figure 3. Citations scores, Times Higher Education, World University Rankings


Another 30% of the score comes from citations collected from the 23,000 journals included in Scopus. Again the data are normalised by subject. Maynooth’s citation score has risen more or less in line with those of other Irish institutions, and Maynooth remains significantly above the levels of the other young Irish universities. The faltering of University College Cork has taken it below Maynooth. The dramatic falls in the citation scores of Trinity and UCC in the first half of the 2010s are quite striking. This is perhaps the indicator where Maynooth shows most progress over the past half-decade.


International outlook

Figure 4. International outlook scores, Times Higher Education, World University Rankings


The International Outlook score contributes 7.5% to the overall score and it is composed as follows: international-to-domestic-student ratio (2.5%), international-to-domestic staff ratio (2.5%), and international collaboration (2.5%). From Table 1 above, we can see that the proportion of international students at Maynooth is below that of any other Irish institution. Maynooth’s position on Figure 4, then, probably reflects a slightly more international staff or greater degree of international collaboration than some other institutions.


Industry Income

Figure 5. Industry income rankings, Times Higher Education, World University Rankings


This category ‘seeks to capture […] knowledge transfer activity by looking at how much research income an institution earns from industry.’ [12] This sum is expressed in ratio to the number of academic staff. This measure is not clearly related to the general ranking of overall scores. Dublin City University and University College Cork do especially well on this measure, and Maynooth ranks above University College Dublin.



The comparative data show that Maynooth is relatively poorly provided with academic staff in comparison to its student numbers, that it has a very low share of international students within its community, and that it has a fairly decent level of research output as indicated by Citations. These rankings are dominated by the subjective opinions of academics contacted through Elsevier (33%) and by bibliometric measures drawn from Scopus (36%). The age and size of institutions have a big effect upon reputation (a point conceded by the very production of the 150 under 50 list), so as the second-smallest Irish institution (see Table 1) and one of the youngest Irish universities, Maynooth faces a difficult challenge. The failure of these bibliometric measures to include the bulk of book chapters and edited books, and even a good share of academic journals, is worrying, and merely normalising scores within a broad field does not necessarily mean that we are evening up from a randomly distributed set of under-reported profiles.
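The 33% and 36% figures can be reconstructed from the component weights given earlier in the post: the reputation surveys sit inside both the Teaching pillar (15%) and the Research pillar (18%), while the Scopus-derived share combines Citations (30%) with the publications-per-scholar productivity measure (6%):

```python
# Reconstructing the headline shares from the THE component weights
# reported earlier in the post.
reputation = 15.0 + 18.0     # Teaching reputation + Research reputation
scopus_based = 30.0 + 6.0    # Citations + research productivity

print(reputation)    # 33.0 -- share driven by the Elsevier-run survey
print(scopus_based)  # 36.0 -- share driven by Scopus bibliometrics
```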

Insofar as it helps in recruiting staff and students, a university might think it worthwhile to address the components of the THE rankings. Some of the things that serve the ranking might even be beneficial in their own right. Investing in new staff will improve the staff:student ratio while also improving the Teaching score. Making the university community more international may improve the educational experience for domestic students while also raising the International Outlook score. Encouraging staff to place research where it gets greatest visibility will help in the dissemination of scholarship while also improving both the Citations score and the Research score; both books and articles gather very different levels of citation depending upon where they appear. Funding staff to present high-quality work at the leading international conferences, and encouraging staff from foreign universities to spend sabbatical time with us, not only enriches our own scholarship but also serves our international visibility, contributing to the reputation elements within both the Teaching and Research scores. Investing in masters and thus very likely also doctoral scholarship not only gives staff the opportunity to pursue synergies between teaching and research but also raises the Teaching score. Promoting the commercial opportunities of research not only brings resources to the university but also gives some research a real-world relevance that can lead to the exploration of cutting-edge applications, while developing for students greater chances of post-degree employment. Needless to say, it would also raise the Industry Income score.

Yet, we also know that pursuing rankings as an end in themselves distorts scholarship and teaching, while also demoralising staff. There is a pervasive neoliberal anxiety in many quarters where means and ends are perversely confused. [13] Nevertheless, some of the things that help universities satisfy the imperatives of scholarship, learning and teaching will also gather points on many of the ranking exercises.

Gerry Kearns, 25 September 2016


[1] ‘World University Rankings 2016-17,’ Times Higher Education, https://www.timeshighereducation.com/world-university-rankings/2017/world-ranking#!/page/0/length/25/sort_by/rank_label/sort_order/asc/cols/rank_only

[2] Carl O’Brien, ‘Whoops: Data blunder sees TCD fall out of global rankings,’ Irish Times (21 September 2016), http://www.irishtimes.com/news/education/whoops-data-blunder-sees-tcd-fall-out-of-global-rankings-1.2800587

[3] ‘150 under 50 rankings 2016,’ Times Higher Education, https://www.timeshighereducation.com/world-university-rankings/2016/one-hundred-fifty-under-fifty#!/page/0/length/25/sort_by/rank_label/sort_order/asc/cols/rank_only

[4] ‘World University Rankings 2016-2017. Methodology,’ Times Higher Education, https://www.timeshighereducation.com/world-university-rankings/methodology-world-university-rankings-2016-2017

[5] ‘Academic Reputation Survey 2016 explained,’ Times Higher Education, https://www.timeshighereducation.com/world-university-rankings/academic-reputation-survey-explained

[6] ‘Global institutional profiles project, Academic reputation survey, stage 2 methodology,’ Thomson-Reuters, http://ip-science.thomsonreuters.com/m/pdfs/GIPP_AcamRep_methodology.pdf

[7] Higher Education Authority, Statistics. Higher Education, http://www.hea.ie/en/statistics/overview

[8] ‘World University Rankings 2016-2017. Methodology,’ Times Higher Education, https://www.timeshighereducation.com/world-university-rankings/methodology-world-university-rankings-2016-2017

[9] Ellie Bothwell, ‘THE World University Rankings 2016-2017: subject rankings results to be released on 28 September,’ Times Higher Education (23 September 2016), https://www.timeshighereducation.com/world-university-rankings/world-university-rankings-2016-2017-subject-rankings-results-be-released

[10] ‘World University Rankings 2016-2017. Methodology,’ Times Higher Education, https://www.timeshighereducation.com/world-university-rankings/methodology-world-university-rankings-2016-2017

[11] Anne-Wil Harzing, Satu Alakangas, ‘Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison,’ Scientometrics 106:2 (2016) 787-804, http://mail.harzing.com/download/gsscowos.pdf (calculated from Figure 3 in this article).

[12] ‘World University Rankings 2016-2017. Methodology,’ Times Higher Education, https://www.timeshighereducation.com/world-university-rankings/methodology-world-university-rankings-2016-2017

[13] Lawrence D. Berg, Edward H. Huijbens, Henrik Gutzon Larsen, ‘Producing anxiety in the neoliberal university,’ The Canadian Geographer / Le Géographe canadien 60:2 (2016) 168-180, https://www.academia.edu/19714927/Producing_Anxiety_in_the_Neoliberal_University





Alistair Fraser:

    Nice post Gerry.

    I guess the implied message from these rankings is this: publish with Elsevier.

    They do have a big presence in Geography (see here: https://www.elsevier.com/social-sciences/geography-planning-and-development/geography-planning-and-development-journals), including Political Geography, JRS, Health & Place, and Geoforum.

    But they also have a bit of a bad rep in the discipline, although I can’t remember the details of all that.

    Alistair Fraser


Thanks Alistair. Scopus covers more than Elsevier (and yes, there was a controversy about Geography and Elsevier – it concerned their sponsorship of arms-sales conventions; a number of German universities have also at various times boycotted Elsevier over the high cost of journals). Only 10% of the Scopus journals are Elsevier’s. Scopus claims to cover 564 journals in Geography, Planning and Development. Even so, not all Geography journals are covered, and you are right that work not in Scopus journals does not feature. Academics need an ORCID identifier to help the search engines find their work.



