Thursday, 31 July 2014

What official rankings don’t tell you about where to go to university

The ivory tower isn’t immune to funny business.

Educational rankings are big business in the US. Attracting talent and maintaining prestige depend largely on where an American university falls in official tallies. Rankings are also a cash cow for the companies that produce them, like US News and the recently acquired Princeton Review.


That kind of dependence breeds misconduct. Prominent schools like Tulane’s Freeman School of Business in New Orleans and Emory University in Atlanta have been known to inflate their admissions test scores (like the GMAT and SAT) to try to boost their rankings. More recently, alleged wrongdoing at the University of Missouri-Kansas City’s Bloch School of Management has been spotlighted in a new series of investigations by the Kansas City Star.


The investigation found “inaccuracies and mischaracterizations of fact” in data that the school provided to the Princeton Review, which rated the school’s entrepreneurship program highly. It also discovered conflicts of interest between UMKC and the researchers behind a paper published in the Journal of Product Innovation Management. The paper ranked the school above top business schools like Harvard, MIT and Stanford for innovation management research, based on the number of papers published by various schools and researchers in certain journals in the field. (The research is still touted on the school’s website.)


The journal (and school) didn’t disclose that the two Chinese researchers who wrote the paper were former colleagues of Michael Song, a professor who had joined UMKC from a part-time gig at a Chinese university. At the time of publication, authors PianPian Yang and Lei Tao were visiting scholars at UMKC. They shared an office with Song, came to the school at his invitation and consulted with him on the paper before it was published.


The investigation, conducted by two independent business journal editors and Retractionwatch.com, uncovered a number of quirks in the paper’s methodology that boosted Song’s and UMKC’s rankings. (Song ranked as the number one scholar in the field.) The time horizon of the dataset aligned with the years when Song published most of his research, and the researchers limited the set of journals included in the dataset in ways that favored Song and UMKC.


Instead of crediting the university where a researcher worked when a paper was published, as is common practice, the study credited an author’s entire output to his or her current employer, another boost to UMKC.
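
To see how much the attribution rule matters, here is a minimal sketch in Python (with hypothetical authors, schools and paper counts, not the study’s actual data) contrasting the two counting methods:

```python
from collections import Counter

# Each record: (author, school where the author worked when the paper was published)
papers = [
    ("author_a", "School X"),
    ("author_a", "School X"),
    ("author_a", "School Y"),  # author_a later moved to School Y
    ("author_b", "School Y"),
]

# Hypothetical current employer of each author
current_employer = {"author_a": "School Y", "author_b": "School Y"}

# Common practice: credit each paper to the school where the work was done
by_publication_school = Counter(school for _, school in papers)

# The study's approach: credit an author's entire output to the current employer
by_current_employer = Counter(current_employer[author] for author, _ in papers)

print(by_publication_school)  # Counter({'School X': 2, 'School Y': 2})
print(by_current_employer)    # Counter({'School Y': 4})
```

Under the first rule the two schools tie; under the second, every paper author_a ever wrote follows him to School Y. That is the kind of shift that benefited UMKC once Song joined it.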


Here’s the ranking that appeared in the paper:

1. University of Missouri-Kansas City
2. Massachusetts Institute of Technology
3. Michigan State University
4. INSEAD (an international business school)
5. Harvard University
6. University of Pennsylvania
7. Northeastern University
8. Texas A&M University
9. Stanford University
10. Delft University of Technology in the Netherlands

Here’s the ranking that would have resulted from crediting research to the school where it was done, rather than to a researcher’s current employer:

1. Michigan State University
2. Massachusetts Institute of Technology
3. University of Michigan
4. University of Pennsylvania
5. Harvard University
6. Delft University of Technology
7. INSEAD
8. Northeastern University
9. University of Texas at Austin
10. University of North Carolina at Chapel Hill and Carnegie Mellon (tie)

Among the misleading data UMKC provided to the Princeton Review was its 2012 claim that 100% of the students in the school’s entrepreneurship program went on to start a business after graduating. That figure came from a school survey of students enrolled in a one-year certificate program that required them to start a business.


In a response to the Star, UMKC said the relationships revealed in the investigation don’t invalidate the study’s findings, and attributed the complaints to a disgruntled professor who raised a red flag about the paper after Song declined to recommend him for a promotion. The university also denied that it engaged in a “pattern of exaggeration” to boost its rankings.


The situation is a healthy reminder that even well-respected ranking organizations rely on self-reported, unaudited information from the schools they rank.



