Tuesday, October 29, 2019

Society for Empirical Legal Studies (SELS) objects to use of HeinOnline citation data to measure "scholarly impact"

Their letter to USNEWS.com editor Robert Morse is here.  I agree with a lot of this; herewith a few comments, sometimes expanding on the points made in the SELS letter, sometimes disagreeing.

The SELS letter states that while "no ranking system is perfect, one strength of the existing ranking approach--as U.S. News officials themselves have argued--is that it provides several accurate metrics for consumers to evaluate for themselves."   This did make me laugh, although I understand the good intentions behind the statement.  In fact, as we all know, USNEWS.com has regularly provided consumers with misinformation, since it never audits the self-reported data schools submit, whether about expenditures or job placement.  

The letter continues: 

Unlike other indicators like graduation rate and bar-passage rate, however, HeinOnline’s current citation system does not appear to accurately capture what it represents to. HeinOnline’s metric would purportedly measure a faculty member’s “scholarly impact.” But the method suffers from a variety of systemic measurement flaws so significant that they undermine its validity as a measure of scholarly impact—and with it, the validity of any metric incorporating it. Making the HeinOnline data part of the Best Law Schools ranking would therefore deviate from your longstanding practice of offering readers accurate information.

A small point:  while U.S. News college rankings incorporate graduation rates, the law school rankings do not.  The main concern of the SELS letter is that USNEWS.com may add the Hein impact data to the overall ranking formula.   I hereby predict with confidence that USNEWS.com will do exactly that within the next two years.  The trend in all their professional school rankings in the last few years has been to try to add "objective" indicia; citation data is the best candidate in the case of law schools.

Of course, the Hein data has exactly the problems that the SELS letter notes (and that we have discussed previously): books and book chapters are invisible, and partly because of that, and partly because Hein is a database of law-related journals only, interdisciplinary scholarship will get less weight in the "scholarly impact" measure. Then again, it might reasonably be said that "scholarly impact" for a law school should be reflected in law publications, not, e.g., in impact in philosophy or economics journals. (The two examples given--a highly cited article co-authored by Lucian Bebchuk [Harvard] in a non-law journal and the highly cited historian Samuel Moyn [Yale], whose citations derive primarily from books--are apt, but probably not typical. Bebchuk will surely do extremely well by a Hein-only measure even if that one article is excluded, while Moyn won't; but does anyone think that would have factored into Yale's hiring decision?)

But the real question about adding the Hein data is a comparative one. Right now the USNEWS.com ranking of law schools measures [sic] the scholarly quality of faculties through an academic reputation survey that has become simply an echo chamber: if a school's overall USNEWS.com rank increases, its reputation score increases; and vice versa. The Hein data--or any scholarly impact data--would make the measurement of scholarly quality independent of the reputation echo chamber. (In USNEWS.com, Harvard, Stanford, and Yale typically tie at #1 in academic reputation, while Chicago, Columbia, and sometimes NYU come in at #4; contrast that with what scholarly impact data reveals. The differences with impact data become even more dramatic further down the academic reputation hierarchy.)

So if the choice is between academic reputation data and no measure of scholarly impact, versus adding the Hein impact data, I'd vote for the latter.  (I agree with the SELS letter that Google Scholar would be a better metric, but USNEWS.com policy is not to do anything that requires real work on their part, and using Google Scholar would be time- and labor-intensive.)

The SELS letter is right that, as things stand, a Hein impact measure would be weighted against junior faculty:

HeinOnline considers citations to all past publications, even to publications written several decades ago. Consider two legal scholars: a junior scholar who has been cited highly in the last ten years for a prolific string of recent, innovative work; and a semi-retired professor who has been cited many times mostly for articles written decades ago.  Both are undoubtedly a great asset to students, colleagues, and the academic community. Yet the HeinOnline system would likely assign a higher rank to the latter and a lower rank to the former. This is true even if, as has been proposed, less recent citations were de-prioritized or omitted.

The solution here is simply to do what I used to do, and what Greg Sisk and colleagues have continued to do: study a circumscribed time range, and limit the study to tenured faculty. The former is more important than the latter.

Of course, USNEWS.com has been running American legal education for two decades now, completely determining admissions policies and the award of financial aid, for example. Thus, the SELS letter is surely right to worry that

Were HeinOnline’s citation metric to become part of Best Law Schools, it would likewise shape law schools’ faculty hiring and retention decisions. Law schools would increasingly aim to hire or retain scholars based largely on an arbitrary criterion: their HeinOnline citation score. Conversely, schools would feel pressure to devalue those scholars with lower HeinOnline scores, even though it would often mean passing on scholars with greater promise, significant real-world research impact, or special expertise to offer students. This perverse hiring incentive would exist even assuming, as one commentator has argued, the HeinOnline scores generally correlate reasonably well with one other citation measure [the Sisk-Leiter impact metric].

This is without a doubt the strongest reason to hope that my prediction is wrong, and that the Hein impact measure remains separate from the overall ranking. However, the fact that the Hein results are not that different from the Sisk-Leiter impact measure (the latter picking up, e.g., citations to books and to non-law articles) suggests that the effect on personnel decisions, and especially on interdisciplinary scholars, may be less dramatic than feared. I'll just add my anecdote to the mix of evidence here: because Hein's database includes many more law journals internationally than Westlaw, my total citation count in Hein appears to be roughly comparable even though citations to my books and non-law-review articles are lost; legal philosophy is an international scholarly community, and I pick up lots of citations I never realized I had in law journals abroad. Make of that anecdote what you will, but I expect there may be similar effects in legal history, empirical legal studies, and so forth. Perhaps someone will examine this systematically.

https://leiterlawschool.typepad.com/leiter/2019/10/society-for-empirical-legal-studies-sels-objects-to-use-of-hein-on-line-citation-data-to-measure-sch.html

