Friday, April 14, 2006
I am no longer surprised, but continually amused, at the misinformation that circulates in Cyberspace, misinformation posted with complete confidence by (almost always) anonymous posters. A student e-mailed me the other day to ask about this item:
[Leiter] rails against the "T14" as an arbitrary concept with no grounding in reality, yet when he talks about the top law schools he talks about "the top 14" with Texas replacing Duke.
What evidence was cited for this odd assertion? None, of course. The only thing I could come up with that was even remotely related was from the 2002 edition of the Philosophical Gourmet Report, where in the section on studying philosophy in law schools, I gave a list of the 14 strongest law schools in terms of faculty quality, and only that (since faculty quality is what matters in terms of prospects for getting into law teaching). Is this what the anonymous poster meant? Perhaps. But by 2004, Duke had begun improving its faculty, and the 2003 reputational survey suggested the need for a new appraisal, so the Philosophical Gourmet Report for 2004-06 listed 17 top schools in terms of faculty quality. (A sidenote: USC may no longer be in this group, after losing two of its top faculty--Chemerinsky and Talley--in the last two years.) So much for "top 14," even limited to questions of faculty quality.
I did get a good deal of correspondence in response to my earlier posting about the nonsense category "top 14," some from law professors puzzled as to how any students could be so easily conned by U.S. News (answer: most aren't), and some from students insisting the category was meaningful--though oddly no one could cite any actual information undermining my point that "top 14" does not pick out the best faculties, the best student bodies, the best job placement, the best prospects for law teaching, or anything else. The closest one correspondent came was to reference an informative, though very different, study of job placement (which appeared in Jurimetrics this past summer) that itself undermines "top 14" talk, since it finds Vanderbilt in "the top 14" for national placement, and Georgetown outside the "top 14." Like all studies (mine included), this one has its own methodological peculiarities; as I noted in the article on "How to Rank Law Schools":
This is a quite interesting and informative study...but the reader must approach with care what its results mean. Its regional placement results (the most interesting part of the study) are affected by the number of graduates of each school seeking to find work in that region; hence, for example, in the region that includes New York and Philadelphia, it turns out that North Carolina ranks ahead of Penn and Cornell! This plainly doesn't mean a student looking to work in these Northeastern legal markets ought to go to North Carolina instead of Cornell or Penn; rather, the result is an artifact of the very small number of UNC students seeking work in these markets, combined with the fact that they will be a self-selected few with unusually good credentials (the average UNC student presumably doesn't bother to try to land a job at a firm in New York City). This limitation of the regional results, however, would be apparent to anyone who reads the ranking methodology carefully. More problematic is the way the author aggregates the regional results into a ranking of schools by "national placement...." [The author] opts to aggregate regional placement results based on each region's share of the market for elite law firms. But since student geographic preferences play an enormous role in where students choose to work (as the author elsewhere notes), any school located in a geographic region with fewer "elite" firms will fare less well by this aggregation method. Moreover, since "elite" firms are determined in part by revenues, and since revenues are, in part, a function of cost-of-living in different regions of the country (which affects fees charged), the results will also be skewed in favor of schools located in higher cost-of-living areas.
Whatever its own virtues and limitations, this study, like all the other data points I referenced originally, confirms that "top 14" correlates with nothing in the real world: not job placement, not faculty quality, not student quality, not clerkships, and so on (see here for lots of data). Like so much on the misinformation superhighway known as the Internet, it flourishes mainly in obscure corners of Cyberspace, though with some leakage here and there into the real world.