Tuesday, August 2, 2005
Many readers will no doubt have seen this useful article, which covers much of the territory reviewed here, but with some nice, concrete examples. (Mr. Wellen, the article's author, told me that it was rather hard, unsurprisingly, to get Deans or others to own up to the various tactics employed, though he did quite well, I think, in eliciting "confessions," as it were.)
But not all tactics are equal, and that deserves some emphasis. Inflating the employment statistics by hiring graduates as glorified research assistants (as, e.g., Northwestern admits to having done, though they are not alone in doing that) arguably misleads students as to their real employment prospects, and so harms real people. Inflating expenditures, by hook or by crook, so that one's school is ranked more in accord with its academic merits than it otherwise would be under the screwball U.S. News ranking methodology harms no one, and arguably gives students who take the overall U.S. News rank seriously better information than they would otherwise get. (Extensive correspondence with prospective law students over the years convinces me that the better students assign no importance to the overall U.S. News rank, concentrating instead on the underlying data.)
Perhaps most commentary on the Times article has attended to this long-standing practice at the University of Illinois College of Law:
Consider library costs at the University of Illinois College of Law in Urbana-Champaign. Like all law schools, Illinois pays a flat rate for unlimited access to LexisNexis and Westlaw's comprehensive online legal databases. Law students troll them for hours, downloading and printing reams of case law. To build user loyalty, the two suppliers charge institutions a total of $75,000 to $100,000 a year, far below per-use rates.
But in what it calls a longstanding practice, Illinois has calculated a fair market value for these online legal resources and submitted that number to U.S. News. For this year's rankings, the school put that figure at $8.78 million, more than 80 times what LexisNexis and Westlaw actually charge. This inflated expense accounted for 28 percent of the law school's total expenditures on students, according to confidential data filed with U.S. News and the bar association and provided to The New York Times by legal educators who are critical of rankings and concerned about the accurate reporting of data.
Let's start with the most important fact about this: despite boosting its expenditure figures this way, Illinois is still ranked too low by U.S. News (26th most recently, which is absurd)! When I say "too low" I mean "too low" relative to the considerations one might have thought relevant to the assessment of an academic institution: the quality of its faculty and students, and the accomplishments and attainments of the students upon graduation. I do not know of anyone knowledgeable about law or the legal academy who would put Illinois outside the top 25, and most would put it in the top 20. So the real story here is that despite Illinois's "inflated" expenditure figures, the U.S. News "methodology" for ranking law schools is so screwed up that it still produces a silly result.
Now I have it on good authority from someone at Illinois that
the practice to which the article imperfectly refers was introduced in the mid-nineties after extensive Faculty Executive Committee discussion and after orally securing what was legitimately assumed to be explicit ABA approval; it was specifically reviewed by the 1997 ABA Site Inspection Team (comprised of two deans and a co-author of a leading book on corporate finance) and found at that time to be unproblematic; it was explicitly highlighted in subsequent years in the section of the Annual Report to the ABA that called for special comments, at no point triggering questions or concerns by the ABA; and it was forthrightly disclosed to, and discussed with, the ABA Site Inspection Team that visited the College in the Spring of 2004.
None of this, of course, speaks to why they would want to report the expenditures this way, but the answer is obvious: because some not-very-savvy journalists at a second-rate news magazine think that if you spend eight million dollars on computer-based legal research instead of $100,000, you're running a "better" law school that should be ranked more highly.
This highlights the fundamental problem: given that the U.S. News "method" of ranking law schools is unprincipled and indefensible, why should law schools cooperate at all in providing reliable information to the magazine, except in those cases where unreliable information will be published and mislead students (as with the employment statistics)?
I've opened comments; anonymous comments or those not right on point are unlikely to see the light of day.