Wednesday, September 29, 2021
Professor Matthew Sag (Loyola/Chicago) called my attention to his recent critique of and alternative to Professor Sisk's scholarly impact rankings. I have not had an opportunity to look at his paper, but I did read his blog post. Professor Sag states:
Gregory Sisk and his team release these rankings of the top 67 or so schools every three years. And so every three years I find myself wondering: “Really? Can it be true that all these schools have higher academic impact scores than Loyola Chicago, DePaul, and Houston Law?” The short answer is: no, it’s not remotely true. There are quite a few schools that Sisk leaves out who would outrank those he includes on almost any conceivable method of aggregating citation counts.
This is not correct, however: Sisk et al. studied DePaul and Houston, but not Loyola/Chicago. If you are only trying to rank the top third of U.S. law schools by scholarly impact, you need only study those schools for which there is reason or evidence to think they will be in the top third. Sisk et al. studied 99 schools. (I list only the top 50, since I'm very confident that is the top 50 in impact based on citations.) (Professor Sisk says more, below, about how schools were chosen for inclusion.)
Professor Sag presents an alternative using HeinOnline; it is remarkably similar to the results of the Sisk study. But Hein has more problems than the Westlaw database, ones that I would have thought were now notorious given USNews.com's flirtation with using it for an impact study. Hein only picks up citations to articles in the Hein database. That means books that are widely cited by law professors vanish in Hein. And articles in economics, philosophy, political science, etc. journals that are widely cited by law professors vanish in Hein. Hein's lists of doctrinal faculty are also not reliable, as Professor Sisk discusses, below.
Professor Sisk posted a long comment on Professor Sag's blog post, which I excerpt here, since it makes sound points:
I found his paper to be a powerful endorsement of our triennial Scholarly Impact Ranking. For Professor Sag to use a different database, a different set of faculty at each school, and a different calculation method for scholarly impact — and yet to find a 95% correlation with the ranking results that we independently achieved (as Professor Sag notes in his blog post) — is rather remarkable. This should be grounds for celebrating the strong alignment between us and the confirmation yet again of the robust strength of citation-based rankings. And on top of that, Professor Sag ranks my own school, the University of St. Thomas in Minnesota, way up at #11, way above the #23 that our ranking produced.
Unfortunately, rather than conveying this positive and unifying message, the theme of Professor Sag’s paper is that our Leiter-Sisk Scholarly Impact Ranking is exclusionary and unfair. Fortunately, the factual assertions that draw him to that conclusion are mostly inaccurate. He suggests, for example, that we have excluded such schools as Houston, DePaul, and Seton Hall, when we simply have not. Indeed, he says in a comment to his blog post that DePaul has never been included in our study. To the contrary, DePaul has always been included in our study and, in 2015, achieved the top-third ranking. Yes, it is true that Professor Sag’s own institution, Loyola-Chicago, was not included in this year’s study. That’s a fair grievance. In fact, we have included Loyola-Chicago in the past, when it did not approach the top-third ranking. But faculties change, and, based on Professor Sag’s findings, I agree that we should include Loyola-Chicago again. And I promise we will next time around. Yes, we are that open to inclusion.
Our approach to including law schools for the intensive phase of study has been open and transparent. We share the list of about 100 law schools publicly before we conduct the study through the associate deans’ listserv to which every accredited law school belongs. We invite law schools that are not on the list to conduct their own citation study and share it with us. And schools do so every time. While most of those schools do not end up making it into the top-third ranking, that does happen on occasion. And we welcome it. And lest there be any doubt, we do a full work-up of all of these schools, meaning that this year we fully vetted the faculty rosters and did a full citation count, including sampling, etc., of all 99 schools studied.
To be sure, there are variations between our rankings, even though the correlation is tight overall. The reasons for those variations are likely to be found in (1) different databases (we use Westlaw and Professor Sag used HeinOnline), and (2) a different point of study (we carefully verify rosters of tenured faculty with traditional scholarly expectations and Professor Sag apparently simply accepted a HeinOnline designation of “doctrinal” teaching)....
My greater concern is with Professor Sag’s choice of the faculty to study. Preparing, vetting, and verifying the faculty rosters is one of the most time-consuming parts of our ranking study every three years. I preside over our work on identifying which faculty members at each law school have tenure, which have traditional scholarly expectations, and which are moving to other institutions. I then transparently share those preliminary rosters with the deans at each school, asking to be informed of possible errors and learn of recent changes. We insist on making the final choice, being consistent among all law schools.
But Professor Sag bypasses that entire painstaking stage. He apparently includes all faculty designated as teaching a doctrinal course, which I think means he includes not only tenured faculty, but also untenured faculty and even those who are not on the tenure track at all. In addition, several schools have confirmed to us that their tenured faculty teaching in clinics have the same scholarly expectations, and so for those schools we include tenured clinical faculty. Not Professor Sag.
And Professor Sag doesn’t account for recently announced lateral moves, which often is critical. Those lateral moves are a key part of the dynamic nature of the Scholarly Impact Ranking.
Getting the faculty rosters right is hard work for us, but it makes all the difference.
Moreover, Professor Sag’s paper confirms the wisdom of our not trying to rank all the way down to every ABA-accredited law school. While he imposes an ordinal ranking on schools from 1 to 193, I know from looking at the mean and median data that the differences among the schools after about the one-third point (after roughly rank 69 or 70) are too small to justify separating them through a misleading ordinal ranking. It just is not fair to rank further as the differences among the schools’ scholarly impact shrink to the minuscule.
But let me end by again accentuating the positive. Despite all of these differences, we find again that citation-based rankings tend to bolster one another. For that, I am thankful.