Brian Leiter's Law School Reports

Brian Leiter
University of Chicago Law School


Friday, September 7, 2012

More citation studies

This time from James Phillips, a PhD student at Berkeley's JSP program, and John Yoo (Berkeley). 

The two most interesting things they do are consulting citations in the "Web of Science" database (to pick up citations for interdisciplinary scholars--this database includes social science and humanities journals) and calculating a citations-per-year score for individual faculty.  A couple of caveats:  (1) they look at only the top 16 schools according to the U.S. News reputation data, so not all law schools, and not even a few dozen law schools; and (2) they make some contentious--bordering in some cases on absurd--choices about what "area" to count a faculty member for.  (This is a dilemma, of course, for those who work in multiple areas, but my solution in the past was to try to gauge whether three-quarters of the citations to the faculty member's work were in the primary area in question, and then also to include a list of highly cited scholars who did not work exclusively in that area.)  Many of those decisions affect the ranking of schools by "area."  The limitation to the top 16 schools by reputation in U.S. News would also affect almost all of these lists.  See also the comments here.
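To make concrete what a citations-per-year score involves, here is a minimal sketch of the arithmetic.  This is my own illustration, not Phillips & Yoo's actual procedure; the function, field names, and numbers are all hypothetical.

```python
# Minimal sketch of a citations-per-year score that merges counts from a
# law-oriented citation database with Web of Science counts. Illustration
# only, not Phillips & Yoo's actual method; names and numbers are hypothetical.

def citations_per_year(law_db_cites: int, web_of_science_cites: int,
                       years_in_teaching: float) -> float:
    """Total citations across both databases divided by years in teaching."""
    if years_in_teaching <= 0:
        raise ValueError("years_in_teaching must be positive")
    return (law_db_cites + web_of_science_cites) / years_in_teaching

# A hypothetical interdisciplinary scholar: modest pickup in the law database,
# but a large number of additional citations captured only by Web of Science.
print(citations_per_year(law_db_cites=400,
                         web_of_science_cites=650,
                         years_in_teaching=25))  # 42.0
```

Whatever the authors actually use as the denominator (years in teaching, years since first publication, or something else), that choice is what drives some of the oddities discussed below.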

I liked their discussion of "all stars" versus "super stars," but it was a clear error to treat the top fifty faculty by citations per year as "super stars"--some are, most aren't.  Citation measures are skewed, first off, toward certain areas, like constitutional law.  More importantly, "super stars" should be easily appointable at any top law school, and maybe a third of the folks on the top fifty list are.  Some aren't appointable at any peer school.  And the citations-per-year measure has the bizarre consequence that, e.g., a business school professor at Duke comes in at #7 (Wesley Cohen, whom I suspect most law professors have never heard of), and very junior faculty who have co-authored with actual "super stars" show up in the top 50.

I was also puzzled as to why the authors thought "explaining" the U.S. News peer reputation scores was relevant--the more closely a measure correlates with that, the more dubious it would seem to be, I would have thought.  But that's minor.

As for Appendix 5, on publications per year, it was utterly mysterious to me how the results were arrived at!

That's enough commentary for now--there's lots of interesting data here, and perhaps this will inspire others to undertake additional work in this vein. 

UPDATE:  A couple of readers asked whether I thought, per the title of the Phillips & Yoo piece, that their citation study method was "better."  I guess I think it's neither better nor worse, just different, but having different metrics is good, as long as they're basically sensible, and this one certainly is.  On the plus side, it's interesting to see how adding the Web of Science database affects things, and also how the citations-per-year measure affects results.  On the negative side, a lot of the "impact" picked up in the Web of Science database may be of dubious relevance to impact on law and legal scholarship.  And the citations-per-year measure has the odd result of propelling very junior faculty with just a year or two in teaching into elevated positions just because they may have co-authored a piece with a senior scholar that then got a few dozen citations.  No metric is perfect (what would that even mean?), but this one certainly adds interesting information to the mix.  It's particularly notable that the results are basically the same at the high end (Yale, Harvard, Chicago, Stanford, Columbia, NYU), but with some interesting movements up and down thereafter.
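To see the arithmetic behind the point above about very junior faculty, here is a quick hypothetical comparison (the numbers are mine, not the study's):

```python
# Hypothetical numbers only: a single co-authored piece that picks up a few
# dozen citations, divided by a one- or two-year career, can match a senior
# scholar's decades-long record on a per-year basis.

senior_per_year = 600 / 30   # 600 citations over a 30-year career -> 20.0
junior_per_year = 40 / 2     # 40 citations over a 2-year career   -> 20.0

print(senior_per_year, junior_per_year)  # 20.0 20.0
```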

Of course, the biggest drawback of their approach is not the approach itself but that they only examined 16 law schools.  But someone else could rectify that.

http://leiterlawschool.typepad.com/leiter/2012/09/more-citation-studies.html

Rankings