Today, no self-respecting university website is complete without a claim that the institution ranks in the "top 100" for this or that.
Rankings have become a staple of universities’ reputation and branding, helping them attract students, staff, and research investment.
Some university authorities may be skeptical of these lists, but they are all well aware of their importance.
College rankings are a phenomenon of the past decade, and yet their influence is undeniable.
“It’s fair to say that a university that doesn’t care how these tables are constructed is foolish,” says Oxford University admissions officer Mike Nicholson.
But the rankings seem to be dominated by a select group of institutions that appear again and again.
The Massachusetts Institute of Technology (MIT) has just been crowned, for the third consecutive year, as the best university in the world, according to the list of QS World University Rankings.
The top 20 includes 11 American institutions; most of the rest are European.
What is the secret of their success? What does it take to rank among the best in higher education worldwide?
How to be the best
The most important factor in the QS ranking is academic reputation, calculated by polling more than 60,000 academics around the world on the merits of institutions.
Ben Sowter, director of QS, explains that this means universities with an established name and a strong brand are more likely to rank higher.
The second factor, “citations per faculty member,” measures the strength of a university’s research, based on the number of times its research articles are cited by other scholars.
Another element of enormous weight is the ratio of academics to the number of students.
These three aspects account for four-fifths of the score.
Points are also awarded for being more international, in terms of teachers and students.
All of this means that large, prestigious, research-intensive universities with strong science departments and great international collaboration are more likely to rise to the top.
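The weighting described above can be sketched as a simple composite score. This is an illustration only: the indicator names and the exact percentage split are assumptions chosen so that reputation, citations, and the faculty-student ratio together make up four-fifths of the total, as the article states; they are not QS's published methodology.

```python
# Illustrative sketch of a ranking-style composite score.
# The weights and indicator names are assumptions for illustration only,
# arranged so that the first three indicators sum to four-fifths (0.8)
# of the total, matching the article's description of the QS approach.

WEIGHTS = {
    "academic_reputation": 0.40,    # survey of academics worldwide
    "citations_per_faculty": 0.20,  # research strength
    "faculty_student_ratio": 0.20,  # teaching-capacity proxy
    "international_mix": 0.20,      # international staff and students
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of indicator scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

# A hypothetical institution that is strong on reputation and research:
example = {
    "academic_reputation": 95.0,
    "citations_per_faculty": 88.0,
    "faculty_student_ratio": 90.0,
    "international_mix": 70.0,
}
print(round(composite_score(example), 1))  # → 87.6
```

The sketch makes the article's point concrete: because reputation alone carries the largest single weight, an institution with an established brand starts with a substantial head start before any research metric is counted.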
Is this a fair way to determine which universities are the best? Does it say anything about the quality of the teaching or the abilities of the students?
“We don’t take a comprehensive look at what universities are doing,” Sowter says.
“It will always be an unrefined instrument,” he says, adding that this characteristic constitutes the strength and weakness of such lists.
The overall effect of a decade of rankings has been positive, argues Sowter: it has pushed universities to look more closely at how they stack up against their rivals.
There have always been “unwritten lists based on stereotypes,” he says. So having more transparency allows for a more open discussion.
But creating a ranking of this nature has its own dynamics, and Sowter admits there have been unintended consequences.
“Some people get too obsessed,” he says. Improving their position in the rankings has become part of some universities’ stated missions.
It has also acquired a quasi-official status. The Danish immigration system awards additional points to applicants based on their university’s position in the rankings.
The pressure to climb has led some universities to try to manipulate the results by submitting false data.
The Times Higher Education Ranking (THE) is even more specific on the criteria that place a university in the top 200.
It includes total income of over US$750,000 per academic, a student-faculty ratio of nearly 12:1, international faculty and students each making up about one-fifth, and research income of about US$230,000 per academic.
“It takes a lot of money: paying salaries is essential to attract and retain the best specialists, and to build the necessary facilities,” explains editor Phil Baty.
Regardless of how they’re calculated, the rankings are alluringly simple.
“For better or worse, rankings have been very influential among students, as well as among political leaders and some universities in various countries,” says Philip Altbach, director of the Center for International Higher Education at Boston College.
But he cautions about the criteria used for the measurement. Should universities that do not do research be compared using instruments designed for those that do so intensively?
An attempt to create a different kind of academic comparison was launched this year by the European Union, with the U-Multirank project.
The project puts less emphasis on reputation and allows students to choose their own benchmarks.
The idea is that a student who wants to enroll in an art course won’t get much from rankings based on the number of international science research projects.
Andreas Schleicher, director of education at the OECD, the organization that created the PISA test for schools, wants to start making similar comparisons in higher education.
Schleicher says the public demands that the quality of universities be assessed.
But instead of looking at what goes into universities – money, staff, facilities – he wants to know what comes out, in the form of what students are learning.
A proposal to develop a different classification will be submitted to the OECD for consideration shortly, he said.
It is not difficult to see the limits of university rankings, currently dominated by a certain type of institution and without much consideration for the attributes of their students.
But these lists have an undeniable appeal.
“The fact that people are discussing the issue is an incentive for change,” Sowter admits.