I was chatting yesterday with a professor emeritus and we got on to the subject of rankings. “You know”, he said, “they really represent a kind of A grade for the whole community”. He was right. We have recently seen several positive results across a number of rankings exercises. In Maclean’s rankings we moved up a notch overall to 4th spot, behind McGill, Toronto and UBC, all three much larger institutions. In the recent Globe and Mail student-driven rankings, we scored more A grades than any other school and finished number 1. Yesterday’s Research Infosource rankings, which measure research activity and intensity, showed us moving up a position into sixth spot (see story on this and related links at http://www.queensu.ca/news/articles/queens-moves-research-ranking). And in the international exercise of the Times Higher Education Rankings (an exercise we sat out last year because of concerns with the methodology, which have since been addressed), we continue to place in the top 200 schools in the world.
My well-known scepticism about rankings exercises aside (they are too often subject to impressionistic ‘reputational’ data, and a small change in one or two inputs can have a disproportionate effect on overall standing), the collective picture is very clear. We are doing well by most indicators; the good results reflect tremendous effort by faculty, staff and students; and most students remain highly satisfied with their experience. This is occurring in circumstances that are scarcely ideal: the university continues to face funding shortfalls, class sizes have expanded in most faculties, and some of our indicators (for instance, the student-to-faculty ratio) have climbed. We had a difficult time of it last year with several student deaths and some very complicated labour negotiations.
We should therefore take some collective pride in the results of all of these exercises combined, while making sure that we pay due attention to any warning signals the data contain. Our institutional analysis unit, which now reports to the Provost via Vice-Provost (Planning) Jo-Ann Brady (till Oct 31, our Registrar), will assist in analyzing the results of these rankings (and other indicators, for instance those in the National Survey of Student Engagement, on which our own Chris Conway is a recognized expert). These sorts of data can support evidence-based decision-making. They are, of course, not the only factors driving decisions; qualitative evidence, such as student comments and feedback from faculty and staff, also counts. Together they will assist in integrated planning across all portfolios of the university.
Another thing some rankings exercises do is help us see what other universities are up to and where we can learn from them. We are all part of a provincial PSE structure, and we relate also to other institutions outside Ontario. It is helpful to read about the pedagogical and research innovations going on elsewhere, some of which (though not all) may be suitable for experimentation at Queen’s (and other schools will probably be imitating some of our activities).
So, let’s be pleased by the rankings results, as we should be, and glean from them what we can, especially in areas where we can improve: teaching, research, and administration. At the same time, let’s acknowledge that we have some significant challenges ahead of us if we want to keep those overall results positive. Let’s be willing to experiment and try new ideas. And let’s admit that just as our students learn from each other as much as they do from their professors and TAs, so institutionally we can improve by keeping a close eye on what goes on elsewhere in higher education. But for now, congratulations to all members of the community for a strong performance in often difficult circumstances.