Brian Leiter's Law School Reports

Brian Leiter
University of Chicago Law School

A Member of the Law Professor Blogs Network

Tuesday, August 2, 2005

NY Times Exposé of How Law Schools Manipulate the US News Rankings

Many readers will no doubt have seen this useful article, which covers much of the territory reviewed here, but with some nice, concrete examples.  (Mr. Wellen, the article's author, told me that it was rather hard, unsurprisingly, to get Deans or others to own up to the various tactics employed, though he did quite well, I think, in eliciting "confessions," as it were.)

But not all tactics are equal, and that deserves some emphasis. Inflating employment statistics by hiring graduates as glorified research assistants (as, e.g., Northwestern admits to having done, though it is not alone in this) arguably misleads students about their real employment prospects, and so harms real people. Inflating expenditures, by hook or by crook, so that one's school is ranked more in accord with its academic merits than it otherwise would be under the screwball U.S. News methodology harms no one, and arguably gives students who take the overall U.S. News rank seriously better information than they would otherwise get. (Extensive correspondence with prospective law students over the years convinces me that the better students assign no importance to the overall U.S. News rank, concentrating instead on the underlying data.)

Perhaps most commentary on the Times article has focused on this long-standing practice at the University of Illinois College of Law:

Consider library costs at the University of Illinois College of Law in Urbana-Champaign. Like all law schools, Illinois pays a flat rate for unlimited access to LexisNexis and Westlaw's comprehensive online legal databases. Law students troll them for hours, downloading and printing reams of case law. To build user loyalty, the two suppliers charge institutions a total of $75,000 to $100,000 a year, far below per-use rates.

But in what it calls a longstanding practice, Illinois has calculated a fair market value for these online legal resources and submitted that number to U.S. News. For this year's rankings, the school put that figure at $8.78 million, more than 80 times what LexisNexis and Westlaw actually charge. This inflated expense accounted for 28 percent of the law school's total expenditures on students, according to confidential data filed with U.S. News and the bar association and provided to The New York Times by legal educators who are critical of rankings and concerned about the accurate reporting of data.

Let's start with the most important fact about this: despite boosting its expenditure figures this way, Illinois is still ranked too low by U.S. News (26th most recently, which is absurd)! When I say "too low" I mean "too low" relative to the considerations one might have thought relevant to the assessment of an academic institution: the quality of its faculty and students, and the accomplishments and attainments of the students upon graduation. I do not know of anyone knowledgeable about law or the legal academy who would put Illinois outside the top 25, and most would put it in the top 20. So the real story here is that despite Illinois's "inflated" expenditure figures, the U.S. News "methodology" for ranking law schools is so screwed up that it still produces a silly result.

Now I have it on good authority from someone at Illinois that,

the practice to which the article imperfectly refers was introduced in the mid-nineties after extensive Faculty Executive Committee discussion and after orally securing what was legitimately assumed to be explicit ABA approval; it was specifically reviewed by the 1997 ABA Site Inspection Team (comprised of two deans and a co-author of a leading book on corporate finance) and found at that time to be unproblematic; it was explicitly highlighted in subsequent years in the section of the Annual Report to the ABA that called for special comments, at no point triggering questions or concerns by the ABA; and it was forthrightly disclosed to, and discussed with, the ABA Site Inspection Team that visited the College in the Spring of 2004.

None of this, of course, speaks to why they would want to report the expenditures this way, but the answer is obvious:  because some not-very-savvy journalists at a second-rate news magazine think that if you spend eight million dollars on computer-based legal research instead of $100,000, you're running a "better" law school that should be ranked more highly.

This highlights the fundamental problem:  given that the U.S. News "method" of ranking law schools is unprincipled and indefensible, why should law schools cooperate at all in providing reliable information to the magazine, except in those cases where unreliable information will be published and mislead students (as with the employment statistics)?

I've opened comments; anonymous comments or those not right on point are unlikely to see the light of day.

http://leiterlawschool.typepad.com/leiter/2005/08/ny_times_expose.html

Rankings | Permalink

Comments

After reading the NYT article, a colleague has suggested that we ought to calculate the "fair market value" of our faculty--who we would assume would be earning $500,000 a year as a partner in a respected large firm had they not chosen to "donate" their services as a faculty member (and we could probably do even better with our adjuncts, many of them partners in large firms--billing at $500 an hour). We could further "enhance" our budget by calculating the "fair market value" of our recently built library and recently rehabilitated faculty and classroom building on the basis of local rents for first class office space, which is $40 per square foot per month.

Of course, the USNews rankings are absurd, and the only comparative "measure" the magazine uses that makes sense is the LSAT score tabulation--though even that can be manipulated somewhat. Assuming all schools admit the "best" students possible, i.e., generally (but not exclusively) the students with the highest LSATs, it is the only statistic that shows what students perceive to be a quality law school.

Posted by: PLM | Aug 4, 2005 2:07:48 PM

On your last point: is there any reason to think student perception of quality is well-correlated with academic or professional quality? More worrisome, it seems to me, is that student reasons for choosing particular schools have quite a lot to do with regional preferences and financial incentives, not judgments about relative quality. Cass Sunstein discusses some of the pros and cons of "revealed preference" rankings like this in his forthcoming contribution to the Indiana L.J. symposium that Paul Caron organized.

Posted by: BL | Aug 4, 2005 2:15:37 PM

There is no guarantee that student perception of quality is well correlated with the actual quality of a law school. Certainly regional preferences and financial incentives have a lot to do with the choices students make. But I think most students will choose to go to what they perceive to be the “best” law school available to them within those regional and financial constraints. My second comment, though, was directed at the absurdity of the USNews rankings and was not a claim that what students perceive is the most accurate indicator of law school quality. Outside of the top handful of law schools, I think student perceptions of quality as revealed by LSAT figures are probably a far more accurate guide to quality than whatever a bunch of anonymous professors or anonymous judges and practitioners might think about a law school. (And for the very “top” schools, the student and the professional perceptions are highly correlated.) Perhaps some percipient professors, judges or practitioners can make valid judgments about the “top” ten or even twenty schools, and maybe a few other schools in the region where they teach, judge or practice. But it defies reason to think that any of them could rank-order the 160-plus law schools in the U.S. with any semblance of accuracy. It would simply be garbage in, garbage out. Virtually every other factor USNews uses, apart from the LSAT data, has similar problems. Thus, even though LSAT data may itself be a flawed measure, I stand by my assertion that it is the only statistic (of those USNews uses) which has any clear relation to the actual quality of a law school.

Posted by: PLM | Aug 4, 2005 3:58:45 PM

I confess I'm still a bit skeptical, because of the substantial influence of geographical preferences on student decisions about where to go to law school. I'm inclined to think that the reputational surveys, as flawed as they are, are about the only thing that prevents the US News rankings from being completely absurd--this notwithstanding the fact that I agree that no one can meaningfully rank 160 schools along any dimension. But the whole point of a survey, of course, is to aggregate opinions based on imperfect information. Still, one reason I have confined my own studies of faculty quality to smaller numbers of schools is that I agree with you that meaningful evaluations of, e.g., faculty quality can only be conducted over a more modest number of institutions.

Posted by: BL | Aug 4, 2005 5:15:37 PM

Back in the mid-1990s (when I was picking a law school), USN&WR used to include a piece of data that, as a prospective customer, I found more helpful than all the others: it listed in the far right column (IIRC) the average starting salary of each school's graduates. The salaries seemed to be pretty well correlated with the USN&WR ranking, ranging from the low $70,000s at Yale and HLS down to $30,000 at the fourth-tier school I was originally considering based on location and financial aid. USN doesn't provide that data anymore; I don't know if anyone else does. But salaries, combined with Prof. Leiter's data on how many academic hires have degrees from each school, are probably the best indicators for the prospective law student asking him/herself, "Which school will be most efficacious for me?"

Posted by: John P. | Aug 5, 2005 6:46:40 AM

US News dropped median starting salaries for a sensible reason: they make for poor comparisons unless adjusted for differences in cost of living. When US News factored in median starting salaries without adjustment, it essentially gave a boost to any school in a high cost-of-living region (since almost all law schools, with a half-dozen exceptions, place the vast majority of their graduates regionally).

Posted by: BL | Aug 6, 2005 12:46:04 PM

Professor Leiter wrote: "[S]tudent reasons for choosing particular schools have quite a lot to do with regional preferences and financial incentives, not judgments about relative quality."

Two points here. I was accepted into the University of Illinois but ultimately attended Pepperdine. My LSAT/GPA would have put me in the upper 25% of students applying to USC, and somewhere around the median for UCLA applicants. But I didn't seek to attend the most highly ranked school in the Los Angeles area.

My spouse (we met in law school) was accepted into a Top 10 law school and several Top 25 schools, but also attended Pepperdine. We know more than a handful of people who made similar decisions.

Prospective law students do a lot of quirky things, and weigh a lot of factors not listed in USNews. E.g., I place a high value on where I live. Bigotry and small-mindedness serve as a sort of psychic pollution. Just as I'd not swim in the Boston Harbor, I do not like to live near bigots. Thus, I moved to California. But USC is in a bad area of town, so I didn't even apply. I also did not want to attend any law schools in the Midwest because I was tired of the close-mindedness I encountered in that region. But, since I grew up in a rural area, attending UCLA or an East Coast school would have been intimidating, since I had never lived in a big city. Thus, Pepperdine (re: nice area, not a huge city, good people, etc.).

Maybe my spouse and I are unique in not looking to USNews as our guiding light in the law school application process. But I tend to agree with Leiter's statement.

Posted by: AnonymousLawGrad | Aug 10, 2005 11:16:52 AM
