The University of Alberta prides itself on innovation and research, but a new study ranks it ninth in engineering and science.
The report by Higher Education Strategy Associates (HESA), which has developed a new ranking system for post-secondary institutions, also places the university's humanities and social sciences research in fourth place, behind UBC, McGill and the University of Toronto.
The study, Measuring Academic Research in Canada: Field-Normalized University Rankings 2012, examines around 55 universities across Canada. To put all institutions on a level playing field, HESA measures the impact of researchers from each university in their respective fields, rather than simply the number of publications or citations.
“They’re well-intentioned — they want to create a new metric and we think that’s important,” said University of Alberta Associate Vice President (Research) Renee Elio.
Elio added that the University of Waterloo also ranked low in HESA's published findings: despite its reputation in science and technology, it placed tenth in that field.
“This was sort of an indicator (of the study’s reliability). That was an interesting result,” Elio said. “What would a computing science student who looked at this ranking think?”
For Elio, the main concern is that the new metric HESA uses to rank Canadian universities relies on an "h-factor," which measures the impact of professors' research by looking at their citations and publications across university faculties.
While she says the organization's attempt to rank these institutions is admirable, Elio also sees red flags. One example is a branch of the University of Quebec, which ranked fifth largely because most of its research lies in the field of marine biology.
When compared to lower-ranked schools with a wider breadth of research, such as the University of Waterloo, Elio sees fault in HESA’s attempts to create a normalized ranking system.
“(The University of Quebec) has a great marine biology program,” Elio acknowledged. “But (these rankings) don’t match what I know.”
In deciding on the study's legitimacy, Elio says methodology is key. The system used to calculate the "h-factor" is complicated and difficult to understand. Lists of faculty names obtained from universities sometimes include staff who are not actively researching in a particular field, such as deans, and when these names are included in the calculation, they lower the h-factor of the entire faculty.
Asked for her opinion on the study's validity, Elio recommends that students investigate how the information was collected and processed.
“Be critical, analytical thinkers,” she suggests.
“Look at (this study) in the context of the whole university ranking business.”
The study's authors could not be reached for comment.