The product, now known popularly as the Shanghai Ranking, uses public information to rank the world's best universities. Over the years, it has developed into a ranking system widely noticed, discussed and sometimes criticised.
When Shanghai Jiao Tong University published its ranking in 2003, nine Chinese universities made it to the top 500. Three Indian institutions also figured in it — the Indian Institute of Science (IISc), IIT Kharagpur and IIT Delhi. Last year, the Shanghai Rankings featured 54 Chinese universities in the top 500, while India had just one, the IISc. The IITs had dropped out.
At least on one set of parameters, Indian institutions were slipping. Now India has started emulating the Chinese, in principle if not in method. Last week, the ministry of human resource development (MHRD) announced its list of top Indian universities, based on its own ranking methodology. It was the second year in succession, but the first thorough job, as its debut had been hurried.
Unlike the Chinese system, the ministry has stuck to Indian universities, though its unstated aim is to benchmark Indian institutions against the best in the world sometime in the future. "We want to improve our institutions," says Surendra Prasad, former IIT Delhi director, who is chairman of the National Board of Accreditation, and a core member of the National Institutional Ranking Framework.
IISc led the MHRD rankings, followed by the older IITs, Jawaharlal Nehru University and some of the newer IITs. In an unusual exercise, President Pranab Mukherjee on Monday gave away awards to the heads of the top 10 institutions and the top institution in subject categories. In another unusual statement, HRD minister Prakash Javadekar suggested that the top institutions might get more funding.
Heads of state usually do not bother tracking news of academic rankings but Mukherjee is an exception. He has been speaking about India's poor performance in international rankings whenever the opportunity has arisen. He even organised a retreat for some heads of IITs in 2014 and got Phil Baty, editor of Times Higher Education World University Rankings, to speak to them about ranking systems.
This annual meeting has now been enlarged in scope and institutional participation. Just before leaving office, former Prime Minister Manmohan Singh had also expressed disappointment that India did not have even one institution in the top 200 of the QS World University Rankings, an annual publication. Other politicians soon caught on to the President's interest.
Smriti Irani became interested in university rankings when she became the HRD minister. With such interest at the highest level, it was only a matter of time before India developed its own ranking system. The IIT Council, governing body of all IITs, made the first move two years ago when IIT Kharagpur director Partha Chakrabarti made a presentation to it on why rankings are important.
Some of Chakrabarti's suggestions became the foundation for the current MHRD rankings. Chakrabarti had argued that India could not ignore international rankings, but that institutions also could not be diverted from the mandates given to them within the country.
Indian institutions operate under circumstances not captured in international rankings, and these have to be reflected in a different set of parameters. He recommended that, like the Chinese, India should develop its own ranking system. Once the methodology stabilised, it could be used to compare Indian institutions fairly with the best in the world. Most heads of Indian institutions and senior faculty had always looked at international rankings with suspicion.
The QS and Times Higher Education World University rankings had subjective criteria and some data from them was not public. Reputation surveys were generally skewed towards US and European universities, as Asian universities were not well known around the world.
Shanghai Rankings emphasised Nobel Prize winners excessively, it was felt, as most universities had no Nobel Prize winners and thus got no points. Many questioned the concept of rankings itself, as universities are so varied around the world. How can you capture such diverse entities in one number?
Despite these objections, many faculty members, policy-makers and even politicians kept an eye on the rankings. International institutions that do the rankings started engaging more and more with Indian institutions. Although Indian institutions — except IISc — dropped out of the Shanghai Rankings list, they were somewhat stable in the QS rankings. A close look at their methodologies reveals why Indian institutions do not do well.
QS rankings use six criteria — academic reputation (40%), employer reputation (10%), student-faculty ratio (20%), citations per faculty (20%), and internationalisation of faculty and of students (5% each). Other than citations per faculty, Indian institutions would falter on all of them.
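These weights make the arithmetic of the problem easy to see. Below is a minimal sketch — not QS's actual formula, and using hypothetical sub-scores — of how such a weighted composite works, and why an institution with no data on an indicator (as with IISc on employer reputation and internationalisation) loses the full weight of that indicator:

```python
# Hypothetical sketch of a weighted composite score using the six QS
# indicator weights cited above. Sub-scores are assumed to be already
# normalised to a 0-100 scale; this is an illustration, not QS's method.

QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "student_faculty_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def composite_score(scores: dict) -> float:
    """Weighted sum of indicator scores. A missing indicator contributes
    zero, which is how an institution with no international-facing data
    would be penalised."""
    return sum(w * scores.get(k, 0.0) for k, w in QS_WEIGHTS.items())

# A hypothetical institution strong on citations but with no scores for
# employer reputation or internationalisation:
example = {
    "academic_reputation": 55.0,
    "student_faculty_ratio": 60.0,
    "citations_per_faculty": 90.0,
}
print(round(composite_score(example), 1))  # 52.0
```

Even a very high citations score (90 here) cannot lift the composite much when 20% of the total weight — employer reputation plus both internationalisation indicators — contributes nothing.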
Academic reputation and employer reputation are assessed with the help of surveys. The Scopus database, a large research database, provides the citations data. Indian institutions have a limited reputation abroad and thus cannot be expected to score highly in such surveys. "The IITs have good reputation abroad," says Bhaskar Ramamurti, IIT Madras director, "but I don't think many can distinguish between each IIT separately."
Indian institutions do not have many international faculty or students. In the QS rankings last year, the top Indian institution, IISc, had no scores for employer reputation, international faculty or international students. Institutions in India are at a disadvantage here, as the domestic demand is high. China has high demand too, but there is a difference.
"In India demand is an increasing problem," says Ben Sowter, QS director, "but in China it is a decreasing problem." Indian elite institutions, under pressure to admit more students, are not likely to go international in the near future. QS stands for Quacquarelli Symonds, a British company that specialises in education and overseas studies.
Indian institutions are at a disadvantage even in the Shanghai rankings system, which has a stable methodology and thus relatively stable rankings. No Indian researcher working in India has won a Nobel Prize since CV Raman but 30% of the weight for the rankings is for Nobel Prizes. So India competes only in the rest of the parameters.
This should not take them totally out of contention. China has had only one Nobel Prize winner working in the country but it has 54 universities in the top 500. The difference is in the research output of Indian and Chinese universities. By intent, the Shanghai rankings are skewed towards superstar professors and elite institutions. India does not have them in large numbers, but things seem to be improving.
"Some Indian institutions are fast approaching the top 500," says Xuejen Wang, rankings manager of ShanghaiRanking Consultancy. These are some of the IITs, JNU, the University of Calcutta and the University of Delhi. Shanghai Jiao Tong University does not make public the list of institutions ranked below 500.
The MHRD rankings were created to remove the disadvantages Indian institutions face in international rankings, by developing a set of parameters relevant to the Indian situation. It was also an exercise to force Indian institutions to collect and document data on themselves. Since these data were to be made public, the objectivity of the exercise was not to be in doubt.
"We found the data collection to be a useful exercise," says TA Abinandan, chairman of the materials science engineering department at IISc. "We didn't know many things about ourselves."
When the first committee began debating the creation of the National Institutional Ranking Framework (NIRF), Smriti Irani made it clear that students were the most important stakeholders in this exercise. The committee recommended a set of parameters to be judged, most of which were to be based on objective criteria. Since colleges were an important part of the Indian education system, NIRF created a separate ranking system for colleges.
The NIRF looks for a good learning environment, a good research culture, impact of graduates, social inclusivity and, finally, reputation among the public, peers and employers. The last criterion was not supposed to carry much weight. Three sources were used for research impact: the Web of Science database, the Scopus database, and the Indian Citation Index. The first two are international databases, while the third is an Indian database. Universities and colleges responded well, as NIRF got 3,139 applications. Only a few medical, law and architecture institutions participated.
The data were substantially richer this year than in the first year, allowing policy-makers, and the public, to draw some conclusions, especially when compared to international rankings. In this year's ranking, the smaller institutions lost out as NIRF introduced points for an institution's size. There was little dispute about India's best institutions.
The order changed a bit, but the top ten Indian institutions were largely the same in the QS and Indian rankings. The faculty-student ratio was poor in many institutions, with some having more than 50 students per teacher. The top 100 institutions accounted for 89% of the research output. Research culture was not very deep-rooted, but things were improving here.
"Research is more widespread than we thought," says Prasad. This was especially true of engineering research, with significant contributions coming from the National Institutes of Technology (NIT) and state universities. The rankings show private universities have also been improving standards, especially in research.
Amrita Vishwa Vidyapeetham, based in Coimbatore, is ranked 16th among all institutions, ahead of IIM Ahmedabad, Pune University and Aligarh Muslim University. Private universities have started taking the rankings seriously, and have been engaging with the international ranking agencies in recent times. Over the next decade, they could provide serious competition to India's best public institutions.