Tuesday, July 26, 2016
Annals of "bullshit" rankings
Rankings are fun, sure, but it's good to figure out whether the metric means something (anything!) lest one produce nonsense. Case in point: ranking law reviews by Google Scholar h-indices. The problem (we've encountered it in philosophy in the past, but by now everyone there knows Google Scholar is worthless for measuring journal impact) is that there is no control for the volume of publishing by each journal, so any journal that publishes more pages and articles per year will do better than a peer journal with the same actual impact that publishes fewer articles and pages.
UPDATE: In the case of philosophy, Synthese was the number 1 journal in "impact" according to the nonsense Google number--this was obviously ludicrous, as everyone in academic philosophy knew. But Synthese also publishes five to ten times as many articles per year as the actual leading journals in the field. One philosopher adjusted the results for volume of publication, and lo and behold, Synthese's rank fell dramatically.
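To make the point concrete, here is a minimal sketch (in Python, with made-up citation numbers, not real Google Scholar data): two hypothetical journals whose articles have identical citation profiles, where the one publishing five times as many articles per year ends up with twice the h-index, even though a simple per-article average shows they are equally "impactful."

def h_index(citations):
    """Largest h such that h articles have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return max((i for i, c in enumerate(ranked, start=1) if c >= i), default=0)

# Hypothetical journals: identical per-article citation profile, different volume.
profile = [80, 40, 20, 10, 5]           # citations per article, same for both
journal_a = profile * 8                 # 40 articles per year
journal_b = profile * 40                # 200 articles per year

print(h_index(journal_a))               # 20
print(h_index(journal_b))               # 40 (higher h purely from publishing more)

# A crude volume adjustment: average citations per article is identical.
print(sum(journal_a) / len(journal_a))  # 31.0
print(sum(journal_b) / len(journal_b))  # 31.0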
https://leiterlawschool.typepad.com/leiter/2016/07/annals-of-bullshit-rankings.html