Cream of the Crop: Rankings of the Best Universities

Although the quality and competitiveness of international education have long fascinated people, international college and university rankings did not exist until the early 2000s. Most Western countries ranked their own institutions, but how any given university, even a country’s top institution, would perform internationally was unknown. The first international ranking was created in 2003, and in the years that followed, several further attempts were made to assess international higher education. Today, three rankings may be regarded as the most popular: the ARWU, the THE ranking, and the QS World University Rankings.

The first ranking to classify and rank the world’s higher education institutions was created in 2003 – paradoxically, in China, by Shanghai Jiao Tong University. Individual countries had already ranked their own universities earlier, and a few larger Anglo-Saxon news publications had attempted rankings as well. Since 1988, Business Week, The Wall Street Journal, the Financial Times, The Economist and Forbes, among others, had all conducted rankings, but these contained only the 10-50 top universities named by the paper, in many cases focused on a particular field (such as business), and their methodology was also questionable.

ACADEMIC RANKING OF WORLD UNIVERSITIES

In June 2003, Shanghai Jiao Tong University published its first international higher education ranking, the Academic Ranking of World Universities (ARWU).

The ranking’s original purpose was to investigate how Chinese universities ranked internationally, and then to develop a higher education strategy based on the results to help Chinese higher education close the gap. As no similar comparative list had existed before, the ARWU global ranking received great attention upon publication. The Economist’s appreciative article also played a large role in its wide reception.

Universities that led the ARWU list snatched the ranking up, citing it on their websites and in their publications. In certain countries, a great debate ensued: for the first time, these countries were confronted with the global standing of their domestic institutions, and not everyone was satisfied with the results. Research projects were initiated, and a whole series of studies came out discussing why domestic universities may have performed poorly on global rankings. The best example is France, where the low rankings of French universities forced authorities to introduce new legislation granting far greater freedom to universities.

Thus, the ARWU international ranking made a substantial contribution to international higher education reform. Universities and higher education governance set out on a new, results-oriented path: poor rankings served as incentives for change.

What distinguished ARWU from previous attempts was, on the one hand, that it was a truly global list: 1,200 universities were investigated, out of which a ranking of 500 was created. On the other hand, it used a stable and transparent methodology: the Chinese university’s ranking was based on transparent, traceable and predictable indicators, so each university could see the areas where it lagged behind its competitors. This, in essence, issued a call for open competition to the world’s universities!

Naturally, Shanghai Jiao Tong University also made use of the results. In 2007, it even published a book about the large international universities, discussing the secret of their success and how, based on their experience, Chinese higher education could be made competitive. Moreover, in 2005 it organized a conference on the world’s universities, which has since been held every two years.

ARWU’s methodology has not changed since its first publication in 2003, while the other two international rankings have changed their methods more than once in the past few years. While this lends the ranking stability and makes it a good basis for comparison, it also means that criticisms of the ranking have gone entirely unaddressed.

The ranking is built around Nobel Prize winners: alumni of a university who won the prize account for 10% of the weighting, and prize winners currently teaching at the university for 20%. Highly cited researchers contribute a further 20%, papers published in Nature and Science 20%, papers indexed in the Science Citation Index and Social Science Citation Index 20%, and per capita research performance 10%.
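
To make the arithmetic concrete, the weighting can be expressed as a simple weighted sum. The sketch below (in Python) illustrates this with entirely hypothetical indicator scores; note that the real ARWU also normalizes each indicator so that the best-performing institution scores 100, a step omitted here for brevity.

```python
# Minimal sketch of an ARWU-style weighted composite score.
# All indicator scores below are hypothetical (on a 0-100 scale).

ARWU_WEIGHTS = {
    "alumni_nobel":   0.10,  # Nobel laureates among alumni
    "staff_nobel":    0.20,  # Nobel laureates among current staff
    "highly_cited":   0.20,  # highly cited researchers
    "nature_science": 0.20,  # papers published in Nature and Science
    "sci_ssci":       0.20,  # papers indexed in SCI / SSCI
    "per_capita":     0.10,  # per capita research performance
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of indicator scores."""
    assert abs(sum(ARWU_WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(ARWU_WEIGHTS[name] * score for name, score in indicators.items())

# A hypothetical university:
example = {
    "alumni_nobel":   60.0,
    "staff_nobel":    45.0,
    "highly_cited":   70.0,
    "nature_science": 55.0,
    "sci_ssci":       80.0,
    "per_capita":     65.0,
}
print(f"composite score: {composite_score(example):.1f}")  # -> 62.5
```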

On this basis, the latest ARWU ranking published in August 2015 featured the following universities in its top ten:

  1. Harvard (USA)
  2. Stanford (USA)
  3. MIT (USA)
  4. Berkeley (USA)
  5. Cambridge (UK)
  6. Princeton (USA)
  7. Caltech (USA)
  8. Columbia (USA)
  9. University of Chicago (USA)
  10. Oxford (UK)

Here we may first observe that US universities dominate the top of the ranking: eight of the top ten institutions are located in the US, and two in the UK.

Another aspect worth noting is that in the past 12 years the top of the ranking has remained virtually unchanged (the 2003 top ten: Harvard, Stanford, Caltech, Berkeley, Cambridge, MIT, Princeton, Yale, Oxford, Columbia). Only Yale dropped out, with the University of Chicago taking its place, while MIT substantially improved its position.

According to the ARWU ranking, 146 universities from the United States made it into the top 500. The United States is followed by China with 44 universities, Germany with 39, the United Kingdom with 37, France with 22, and Australia, Canada and Italy with 20 universities each. Hungary is represented by two institutions in the ranking (Eötvös Loránd University and the University of Szeged).

Many criticized Shanghai Jiao Tong University’s ranking, however, primarily because it palpably privileged the natural sciences and engineering to the detriment of the humanities.

This is partly due to ARWU’s methodology, which relies heavily on the Nobel Prize, an award that lays greater emphasis on the natural sciences. Internationally renowned humanities institutions therefore rank much lower than universities with large engineering or medical faculties.

Another criticism leveled at the ARWU ranking was that its weighting privileged English-language journals. Researchers living in Anglo-Saxon countries thus had a much greater chance of being published in an English-language journal and of being highly cited.

THE-QS WORLD UNIVERSITY RANKINGS

Times Higher Education (THE), a British weekly founded in 1971 that focuses on higher education, was one of the most important critics of the Chinese ranking. The weekly’s writers questioned, for instance, why, in the case of awards and publications, the researcher’s original alma mater received the points rather than the university that financed the famous research, or whether it made sense to rank a university higher because it had a Nobel Prize winner 40 years earlier. This sharp criticism finally spurred THE to create its own international ranking in 2004, together with Quacquarelli Symonds (QS). This became the THE-QS World University Rankings.

The THE-QS World University Rankings were based on a complex methodology: 1,200 universities from 88 countries were examined. The methodology’s key element was the Academic Reputation Survey, in which several thousand researchers were asked to give their opinion of the universities. In this way, the universities’ professional reputation became an important factor in the ranking; providing 40% of the weighting, it was in essence its most important element. The weighting also took into account indicators of the educational environment and of research. The list was then narrowed down to 800 universities.

The THE-QS ranking was heavily criticized, mostly because its methodology was based on the universities’ reputation. Leaders of certain universities emphasized in their analyses that, due to this methodology, an individual university’s rank might change from year to year without any perceivable reason, and that the ranking therefore could not be taken seriously. The criticism eventually led to disputes between THE and the British company Quacquarelli Symonds (QS): while THE’s staff recommended drastic changes, QS resisted. The two companies parted ways in 2010.

TIMES HIGHER EDUCATION WORLD UNIVERSITY RANKINGS

From 2010 onwards, THE published a new ranking based on an entirely revised methodology: the Times Higher Education World University Rankings. Although survey-based results remained in part (those concerning the educational environment and research conducted at universities), the main emphasis was no longer on reputation but on academic citations, which accounted for one-third of the weighting. THE examined 11.3 million publications over a five-year period, which meant browsing through 51 million citations. The methodology’s other two pillars were the learning environment (teaching quality, institutional income, income per academic) on the one hand, and the volume and reputation of research activity on the other. Moreover, the ranking weighted international diversity at 5%, on the basis of international professors and students. Thomson Reuters was the new collaboration partner until 2014; in 2015-16, THE collaborated with Elsevier.

THE’s 2015 ranking places the following universities in the top ten positions:

  1. Caltech (USA)
  2. Oxford (UK)
  3. Stanford (USA)
  4. Cambridge (UK)
  5. MIT (USA)
  6. Harvard (USA)
  7. Princeton (USA)
  8. Imperial College London (UK)
  9. ETH Zürich (CH)
  10. University of Chicago (USA)

Thus, this ranking includes three English and one Swiss university alongside six US universities among the top ten. Six universities from Hungary appear in the THE ranking: Semmelweis University is ranked in the 501-600 band, while the other Hungarian universities on the list are ranked between 601 and 800. Amongst our neighbors, only Austria performs better, with seven ranked universities; its best institution was ranked 142nd.

The reformed THE ranking was also criticized: according to its critics, it lays far too great an emphasis on publication citations, which disadvantages the humanities (which traditionally cite less than the natural sciences) on the one hand, and non-English-language universities on the other.

The London School of Economics and Political Science is a case in point: it was featured in 11th place in the ranking prepared in collaboration with QS, yet in the 2010 ranking prepared independently by THE, the renowned English university finished only in 86th place. Interestingly, in THE’s 2015 ranking, the London School of Economics and Political Science improved its position and now occupies 23rd place.

QS WORLD UNIVERSITY RANKINGS

After Times Higher Education and QS parted ways in 2010, QS came out with its own university ranking under the title “QS World University Rankings”. The list is compiled by Quacquarelli Symonds (QS), a British company specializing in education and study abroad that has been present on the market since 1990. In addition to the ranking, the company runs an analytics division, which prepares studies on the education market for higher education institutions.

After breaking with Times Higher Education, QS made virtually no changes to its earlier methodology and thus continued the practice of previous years. The weighting assigns 40% to evaluations by representatives of the profession and 10% to evaluations by employers, who judge the performance of the graduates of the various institutions. Half of the QS ranking thus rests on reputation-based evaluations that are opinions in character.

Professors participating in the survey may nominate up to 30 universities but may not vote for their own institution. The other half of the weighting comprises the faculty-to-student ratio (20%), citations per academic (20%), and the proportion of international students and international professors (5% each).
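
Since the weights are fully specified, the same kind of composite calculation applies here. The sketch below (again with hypothetical scores) also illustrates why critics worry about the reputation component: because the two surveys carry half the total weight, a modest swing in reputation moves the composite substantially.

```python
# Sketch of the QS weighting described above; all scores are hypothetical (0-100).

QS_WEIGHTS = {
    "academic_reputation":    0.40,  # survey of academics
    "employer_reputation":    0.10,  # survey of employers
    "faculty_student_ratio":  0.20,
    "citations_per_faculty":  0.20,
    "international_faculty":  0.05,
    "international_students": 0.05,
}

def qs_score(scores: dict) -> float:
    return sum(QS_WEIGHTS[name] * value for name, value in scores.items())

baseline = {name: 70.0 for name in QS_WEIGHTS}        # a flat 70 on every indicator
boosted = {**baseline, "academic_reputation": 90.0}   # +20 on reputation only

print(qs_score(baseline))  # 70.0
print(qs_score(boosted))   # 78.0 -- a 20-point reputation swing shifts the
                           # composite by 8 points (0.40 * 20)
```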

The top ten of the latest ranking published by QS is as follows:

  1. MIT (USA)
  2. Harvard (USA)
  3. Stanford (USA)
  4. Cambridge (UK)
  5. Caltech (USA)
  6. Oxford (UK)
  7. University College London (UK)
  8. Imperial College London (UK)
  9. ETH Zürich (CH)
  10. University of Chicago (USA)

This ranking includes the fewest American universities of the three: the top ten contains five American, four English and one Swiss institution. As the ranking is largely based on surveys, outstanding universities of certain regions may receive greater emphasis and better rankings. For instance, 21 Russian universities are featured in the QS ranking; the best of them, Lomonosov Moscow State University, is ranked 108th. In comparison, the THE ranking features thirteen Russian universities, and the ARWU ranking a mere seven. We see similar results in other regions as well: Brazil is represented by 22 universities in the QS ranking, compared to 17 in THE and a mere 12 in ARWU.

Hungary is represented by four universities in the QS ranking: the University of Szeged received the best ranking, followed by the University of Debrecen, Eötvös Loránd University and Corvinus University of Budapest. Amongst our neighbors, Austria (seven universities, its best institution ranked 153rd) and Ukraine (six universities) received better rankings. Romania is represented on the list by four institutions.

As half of the ranking consists of impressions and evaluations based on personal opinion, it continues to receive criticism similar to that leveled at the THE-QS ranking launched in 2004. Universities focused on the natural sciences in particular question its methodology. Due to these methods, relatively significant swings may occur from one year to the next: Imperial College London, for instance, was ranked 8th in 2015, whereas in 2014 it was ranked 2nd.

Despite the differences between the three largest higher education rankings, seven universities appear in the top ten of all three: Harvard, Stanford, MIT, Cambridge, Oxford, Caltech and the University of Chicago.

ALTERNATIVE MEASUREMENT METHODS AND RANKINGS

In addition to the best-known rankings, there are alternative methods for measuring the international higher education field, which likewise emerged in the wake of the first global rankings of the early 2000s. One such list is Webometrics – Ranking Web of Universities, prepared by the Cybermetrics Lab of the Spanish National Research Council (CSIC). The essence of Webometrics is that it ranks universities according to their online presence. It has existed since 2004 and was originally created to give universities an incentive to increase their online presence. The ranking takes into account each university’s commitment to openness and knowledge sharing: the number of subpages, citations, shared files and so on. This ranking is wholly dominated by the United States: the top twelve comprises only US universities, while Oxford and Cambridge, which traditionally fare well, received places 13 and 14. Amongst Hungarian universities, Eötvös Loránd University and the Budapest University of Technology and Economics finish in the top 500 (365th and 411th), followed by the University of Szeged (530th), the University of Debrecen (596th) and the University of Pécs (896th).

The G-Factor University Rankings used a similar principle (Google Factor): this international ranking also measured online presence, but ranked universities by counting only the links received from other university websites. However, this ranking is no longer accessible.

Spurred by the Chinese higher education ranking published in 2003, other regions also prepared their own rankings: for instance, Moscow’s Global University Ranking. This ranking was short-lived, as it was quickly discredited for ranking Lomonosov Moscow State University 5th, ahead of Harvard, Stanford and Cambridge; both the collaborating company and the website have since ceased to exist. The better-known university rankings also include HEEACT, created in Taiwan in 2007; the CWTS Leiden Ranking, published by the Dutch Leiden University since 2007; the ranking of the Saudi Arabian Center for World University Rankings; and the Russian Round University Ranking, launched in 2013.

THE ANCHORING EFFECT OF GLOBAL INDICES

A serious criticism leveled against global rankings is the so-called “anchoring effect”. According to certain studies, the first global rankings published in the early 2000s substantially influenced international opinion about universities. The essence of the anchoring effect is that human beings rely heavily on the first piece of information they receive, on prior knowledge, when making decisions: in other words, a cognitive bias influences our judgments and opinions. The first published global rankings may thus have shaped prior knowledge and opinion about the great universities; later shifts within existing rankings are no longer in a position to effect great changes in opinion.

This “anchoring effect” is easily measurable: compare those who have never heard of the rankings with those who have seen at least one of them. According to the studies, a subject who has already seen the rankings is far more likely to name one of the top-ranked universities as the best than any other institution. These studies therefore call into question the results of rankings that base their research on surveys.
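
As a rough illustration of how such a study might test the effect, the sketch below compares the two groups with a standard two-proportion z-test; all counts are invented for the example.

```python
import math

# Hypothetical survey counts: how many respondents in each group named a
# top-ranked university as "the best"? All numbers are invented.
seen_top, seen_total = 172, 200      # respondents who had seen a ranking
unseen_top, unseen_total = 121, 200  # respondents who had not

p1 = seen_top / seen_total
p2 = unseen_top / unseen_total

# Pooled two-proportion z-test for H0: the two groups answer alike.
p_pool = (seen_top + unseen_top) / (seen_total + unseen_total)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / seen_total + 1 / unseen_total))
z = (p1 - p2) / se

print(f"saw rankings: {p1:.0%}, never saw them: {p2:.0%}, z = {z:.2f}")
# |z| > 1.96 would mark the gap as significant at the 5% level,
# consistent with an anchoring effect.
```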

According to the studies, all of this may have quite far-reaching and surprising effects: the results of international higher education rankings may influence a university’s research performance and the behavior of its professors and students. Thus, year by year, a kind of reverse effect may emerge: it is not the university’s performance that shapes its prestige and evaluation, but the ranking itself that becomes the driver of its prestige and performance.

written by: Anton Bendarzsevszkij

Anton Bendarzsevszkij studied at the University of Pécs and holds MA degrees in History and in Media and Communications. In 2009 he also studied Political Science at the University of Leicester. He specializes in the domestic and foreign policy of Russia, Ukraine and Belarus, as well as in energy security and security policy. He is a regular guest at international conferences and forums dealing with these topics, and since 2011 he has been a regular expert commentator on post-Soviet countries in the Hungarian media.

