Brian Leiter's Law School Reports

Brian Leiter
University of Chicago Law School

A Member of the Law Professor Blogs Network

Wednesday, March 10, 2010

An Open Letter to Bob Morse of U.S. News

MOVING TO FRONT FROM MARCH 31, 2008, SINCE IT'S THAT TIME OF YEAR AGAIN

Dear Mr. Morse:

A recent article in the ABA Journal reports your continued interest in suggestions for improving the law school rankings.  I know from experience with you and members of your staff that U.S. News does take suggestions seriously, and, to your credit, you made several important changes during the 1990s.  (Contrary to popular mythology, I also know that U.S. News does not make formula changes each year--indeed, U.S. News has not made any major adjustments to the ranking formula since 1999, the year you began adjusting expenditures for differences in cost-of-living in different regions of the country.  Since then, U.S. News has made some adjustments to definitions of certain items of data, but those were all minor and were, correctly, designed to increase reliability by more closely tracking ABA data.)

The rationale for your law school rankings is to provide information and assistance to students, the "consumers" of legal education.  In some measure you have done that through the data your magazine collects each year.  Unfortunately, the success of the enterprise has largely undermined its original rationale.  Because of keen student interest in the rankings, the U.S. News rankings have become, as I am sure you know, the tail that wags the law school dog.  Because moving up in the U.S. News rankings requires no explanation, while falling invariably does, schools have grown increasingly sophisticated--or sometimes just duplicitous--in how they report data to the ABA and to U.S. News in order to secure favorable results, results that are increasingly unhinged from any actual educational or professional accomplishments.

In consequence, the almost exclusive way in which a school improves its U.S. News rank (apart from some arbitrary fluctuations in reputational scores, which schools cannot control) is very clear:  manipulation, trickery and, at worst, deceit.  You know this as well as I do.  Schools hire unemployed graduates as research assistants, hand out fee waivers to hopeless applicants to improve their acceptance rates, inflate their expenditures data through creative accounting or simple fabrication, cut their first-year enrollment (to boost their medians) while increasing the number of transfers (to make up the lost revenue), and so on.  Because more than half the total score in U.S. News depends on manipulable data, schools intent on securing the public relations benefits of a higher rank simply "cook the books" or manipulate the numbers to secure a more favorable U.S. News outcome.  Schools vary, to be sure, in how aggressive they are about data manipulation, and one expects that public law schools, whose records are subject to scrutiny, are especially careful.  But there is no one in legal education who will deny, with a straight face, that a significant number of law schools, probably the majority, now "massage" their reporting, often within the letter, if not the spirit, of the rules.

So the question you confront is how to restore the integrity of the ranking enterprise, so that you continue to provide meaningful consumer information.  Here are a few suggestions, in something like ascending order of importance:

1.  Contrary to what one sometimes hears, it is clear to me, and I imagine any other informed observer of school evaluations, that the reputational surveys are the one component of the U.S. News ranking that actually keeps the results tethered to reality.  Unfortunately, as Professor Stake of Indiana has shown, the superficial survey method U.S. News employs is increasingly producing an echo chamber effect, with the reputation of a school essentially tracking the overall rankings from prior years by U.S. News.  In order to minimize that effect, I suggest you switch to an on-line survey system with academics (your response rate from academics is already quite high, and I imagine that for an on-line survey it will be even higher), in which evaluators are presented with concrete information about each school, rather than simply a school name:  e.g., a current faculty roster, numerical credentials of the student body, a list of distinguished alumni (let the school provide a list, limited to 50 names, say), and so on.   Ask academics to evaluate the scholarly and professional excellence of the school, not simply the "reputation" they associate with a name.

The lawyer/judge survey, by contrast, currently gets such a low response rate that the results are highly suspect.  I do not know how you can increase the response rate, but you may need to put in place measures to ensure geographic and practice-area diversity in the response pool.  In any case, you need to make public the geographic distribution of the respondents, since I suspect that will shed important light on the reputational results.

2.  To the extent you continue to employ data self-reported by the schools, you really must undertake more aggressive audits of the data.  This year--to take the most notorious example that has already attracted widespread attention--the University of California at Berkeley claimed an astounding 99% of its students employed at graduation, a fact to which Professor Lindgren of Northwestern has already called attention.  In prior years, Berkeley has reported (going backwards by year) 97.2% employed at graduation, 74.4%, 89.8%, 88.7%, 96.8%, and 93.2%.  Berkeley is a state school, subject to open record requirements.  Have you assigned a reporter for your magazine to investigate anomalous data reporting by schools?  The integrity of the enterprise surely demands an occasional follow-up investigation.

3.  Since what can only be facetiously called the "objective" data that schools self-report is the source of most of the egregious trickery and deceit that renders the results dubious, why not take steps to reduce your reliance on this data?  (That was a primary consideration in the Canadian law school rankings I designed for MacLean's.)  Eliminate expenditures altogether:  that alone would put a halt to the worst offenses.  What schools spend on utilities and secretaries and landscaping has nothing to do with anything.  Per capita expenditures systematically penalize larger schools for their economies of scale and reward inefficiency:  a school with twice the enrollment need not double its library, administrative, or physical-plant costs, so its per-student spending looks lower even when its educational quality is identical.  There is simply no denying this.  Even expenditures on faculty salaries are a very poor proxy for faculty quality, and would be, in any case, redundant upon well-done reputational surveys or citation studies, which would provide a direct measure.

You should also eliminate the self-reported employment data, which is, as you well know, a work of fiction:  it bears some resemblance to reality, but it is mainly a work of the imagination.  Substitute data in the public domain, like the representation of school graduates as associates at leading law firms nationwide, or in federal clerkships.  Eliminating expenditures data, and substituting public data on employment success for self-reported employment statistics, would immediately increase the credibility of the results, and would get U.S. News out of the business of rewarding trickery and deceit.

I hope you will make some significant changes to the ranking formula this coming year.  "Gaming" the rankings only works because schools know the rules of the "game."  Change them, and do so in ways that will increase accuracy and that won't permit new gaming.  Your existing methods are discredited and are now disserving students, rather than informing them.  (If you think the methods are not discredited and are not disserving students, then I hope you will make public a defense of the methods in light of the problems noted above, and noted here.)  Do keep a reputational component, but improve your survey methods, so that evaluators are asked to respond to concrete information about schools, such as their current faculty rosters.  Avoid self-reported data as much as possible, substituting information in the public domain such as the success of graduates in securing clerkships or employment at leading law firms.   Scrap the nonsense about expenditures, which is responsible for some of the worst offenses in data reporting and which, in any case, has at best only a distant relationship to educational quality or professional outcomes.  These kinds of changes are fully consistent with--indeed, demanded by--the standards of objectivity and accuracy that are the aspirations of any reputable news organization.  I trust U.S. News will rise to the occasion, and seize the opportunity to restore the integrity of its law school evaluations as a source of consumer information.

I would be happy to discuss these issues with you further, either in private or in a public forum.

Sincerely yours,
Brian Leiter

http://leiterlawschool.typepad.com/leiter/2010/03/an-open-lette-1.html
