Tuesday, February 26, 2008

Ranking Law Schools by the Number of "Deadwood"

That, a bit oversimplified, is the idea behind the Green Bag's new "Deadwood Report," which is described in some detail by Ross Davies (George Mason), editor of the magazine, here and reported on here, where I'm quoted (and so I won't repeat myself).  Law schools would be well-advised to note the extent to which the editors will rely on web-based information about faculties in deciding whom to scrutinize.  This would be a good time to clean up, and update, those faculty lists!

UPDATE:  Over at Inside Higher Ed (second link, above) an anonymous commenter writes:

It’s not surprising that faculty (and the 1% of elite students [who] would want to be faculty some day) view the quality of faculty as the measure of a good law school.

But law school applicants don’t. They look at the USNWR rankings. So do law firms that are looking to hire law students. And the USNWR rankings are driven by the LSATs of the entering students.

In the great tradition of the anonymous ignorant in Cyberspace, this individual makes a number of claims for which I am aware of no factual support:

(1)  I know of no evidence that only 1% of applicants to law school have any interest in the "quality of faculty"; it is one of the criteria that applicants to UT most often check off on their applications, and the high level of discussion board interest in high-profile faculty moves would also seem to suggest otherwise.  In any case, no appraisal of an academic institution that anyone pays attention to--this is true even of U.S. News--neglects faculty quality (or some proxy for faculty quality) as a measure.  Someone making the preposterous claim that only 1% of applicants to law school are interested in faculty quality might feel some burden to produce evidence.

(2)  What is the evidence that law firms look at the USNWR rankings?  I have a lot of accumulated anecdotes over the years, all of which seem to confirm that while USNWR rankings make for coffee room banter, the overwhelming majority of law firms work with their own "internal" rankings of schools based on past experience.  Once in a blue moon, USNWR might affect the internal rankings at the margin, but that is rare.  Is there any actual evidence on this point?

(3)  The USNWR rankings are not driven by the LSATs of entering students, which are not even the most important factor and are weighted only slightly more heavily than GPAs.  The really corrupt engine of the US News rankings is the data on "expenditures" that the magazine does not print.  These urban legends about the US News rankings are right up there with the nonsense category "top 14," which has also taken hold in certain anonymous corners of Cyberspace.




Listed below are links to weblogs that reference Ranking Law Schools by the Number of "Deadwood":

» The Deadwood Report from Ideoblog
No, it's not about the HBO show, but still could be useful. As I said last June, “[t]here is perennially a lot of moaning about the US News rankings. My answer to that moaning has been: there's a market for [Read More]

Tracked on Feb 26, 2008 10:25:49 AM