April 18, 2017
Mark Hall and Glenn Cohen have extended Brian Leiter's approach to ranking faculty by scholarly citations (based on Sisk data) to the field of health law.
According to Hall and Cohen, the most cited health law scholars in 2010-2014 (inclusive) are:
| Rank | Name | School | Citations | Approx. Age in 2017 |
|------|------|--------|-----------|---------------------|
| 2 | Mark A. Hall | Wake Forest | 480 | 62 |
| 3 | David A. Hyman | Georgetown | 360 | 56 |
| 4 | I. Glenn Cohen | Harvard | 320 | 39 |
| 5 | John A. Robertson | Texas | 310 | 74 |
| 6 | Michelle M. Mello | Stanford | 300 | 46 |
| 10 | George J. Annas | Boston U | 270 | 72 |
The full ranking is available here.
April 12, 2017
April 11, 2017
Bill Henderson (Indiana) comments. (I'm more skeptical than Henderson appears to be that Harvard's adoption of the GRE had anything to do with rankings, though. Harvard's US News problem has to do with its size, and nothing else--if it were even half the size it is, it would be #1 every year. But being more than twice the size of Yale, Stanford, and Chicago means it is punished in the per capita expenditures measure because of economies of scale.)
Isn't it a bit nutty that law school admissions in the United States are run by a guy who works for a ranking website?
March 14, 2017
...by promoting random movement in the "overall" US News rank as meaningful, rather than as noise. This only came to my attention because their PR office actually sent it to me! They should do some research about whom they send this stuff to! What's especially unfortunate about press releases like this is that they legitimize the US News metrics, which can only come back to haunt schools when the "overall" nonsense number moves in the opposite direction for no discernible (or, in any case, meaningful) reason.
UPDATE: More superficial reporting, treating random movements as having meaning, or as worthy of note. 95% of movement in the US News "overall" rank is attributable to schools puffing, fudging, or lying more than their peers in how they report the data to US News (or the reverse, for schools that drop); US News, recall, audits none of the self-reported data.
March 13, 2017
MOVING TO FRONT (ORIGINALLY POSTED OCT. 3, 2011, WITH MINOR REVISIONS), SINCE IT IS TIMELY AGAIN
I've occasionally commented in the past about particular schools that clearly had artificially low overall ranks in U.S. News, and readers e-mail me periodically asking about various schools in this regard. Since the overall rank in U.S. News is a meaningless nonsense number, permit me to make one very general comment: it seems to me that all the law schools dumped into what U.S. News calls the "second" tier--indeed, all the law schools ranked ordinally beyond the top 25 or 30 based on irrelevant and trivial differences-- are unfairly ranked and represented. This isn't because all these schools have as good faculties or as successful graduates as schools ranked higher--though many of them, in fact, do--but because the metric which puts them into these lower ranks is a self-reinforcing one, and one that assumes, falsely and perniciously, that the mission of all law schools is the same. Some missions, to be sure, are the same at some generic level: e.g., pretty much all law schools look to train lawyers and produce legal scholarship. U.S. News has no meaningful measure of the latter, so that part of the shared mission isn't even part of the exercise. The only "measures" of the former are the fictional employment statistics that schools self-report and bar exam results. The latter may be only slightly more probative, except that the way U.S. News incorporates them into the ranking penalizes schools in states with relatively easy bar exams. So with respect to the way in which the missions of law schools are the same, U.S. News employs no pertinent measures.
But schools differ quite a bit in how they discharge the two generic missions, namely, producing scholarship and training lawyers. Some schools focus much of their scholarship on the needs of the local or state bar. Some schools produce lots of DAs, and not many "big firm" lawyers. Some schools emphasize skills training and state law. Some schools emphasize theory and national and transnational legal issues. Some schools value only interdisciplinary scholarship. And so on. U.S. News conveys no information at all about how well or poorly different schools discharge these functions. But by ordinally ranking some 150 schools based on incompetently done surveys, irrelevant differences, and fictional data, and dumping the remainder into a "second tier," U.S. News conveys no actual information; it simply rewards fraud in data reporting and gratuitously insults hard-working legal educators and scholars and their students and graduates.
March 09, 2017
February 13, 2017
This is amusing, courtesy of law professor Ryan Whalen (Dalhousie), a recent JD/PhD graduate of Northwestern. One minor drawback is that faculty who retire and move elsewhere are treated as ordinary lateral moves. (So, too, with moves to assume Deanships: there, too, the reasons for the move are different from those for ordinary lateral moves.)
January 31, 2017
Brad Hillis called this data compilation he did to my attention; I haven't verified its accuracy, but the recent (2005-17) data looks roughly right. Readers can weigh in at Wikipedia to correct the data if need be. Neither list is adjusted for class size.
Here are the twenty law schools that have produced the most Supreme Court clerks since 1882:
| Rank | Law School | # Clerks | % of All Clerks |
|------|------------|----------|-----------------|
| 1 | Harvard | 607 | 27% |
| 2 | Yale | 396 | 18% |
| 3 | Chicago | 156 | 7% |
| 4 | Stanford | 137 | 6% |
| 5 | Columbia | 135 | 6% |
| 6 | Virginia | 110 | 5% |
| 7 | Michigan | 87 | 4% |
| 8 | Georgetown | 61 | 3% |
| 9 | Berkeley | 59 | 3% |
| 10 | NYU | 54 | 2% |
| 11 | Penn | 48 | |
| 12 | Northwestern | 42 | |
| 13 | Texas | 35 | |
| 14 | GW | 26 | |
| 15 | Duke | 21 | |
| 16 | UCLA | 19 | |
| 17 | Notre Dame | 17 | |
| 18 | BYU | 13 | |
| 19 | Indiana | 11 | |
And here is Mr. Hillis's list of the top 20 law schools that have produced the most clerks from 2005 through 2017 (again, note that Harvard is more than twice the size of Yale, Stanford, and Chicago; that Virginia, Columbia, and NYU are about twice the size of the latter; etc.):
January 17, 2017
We just updated our charts about law journal submissions, expedites, and rankings from different sources for the Spring 2017 submission season, covering the main journals of 203 law schools.
A couple of the highlights from this round of revisions are:
First, the chart again includes as much information as possible about which law reviews are not accepting submissions right now and when they say they'll resume accepting them. Most of this is not in the form of specific dates, because the journals tend to post only imprecise statements that they are not currently accepting submissions but will start doing so at some point in the spring.
Second, while 72 law reviews still prefer or require submission through ExpressO, the movement toward Scholastica continues: 27 schools now require Scholastica as the exclusive avenue for submissions, with 25 more preferring or strongly preferring it, and 25 accepting articles submitted through either ExpressO or Scholastica.
The first chart contains information about each journal’s preferences about methods for submitting articles (e.g., e-mail, ExpressO, Scholastica, or regular mail), as well as special formatting requirements and how to request an expedited review. The second chart contains rankings information from U.S. News and World Report as well as data from Washington & Lee’s law review website.
Information for Submitting Articles to Law Reviews and Journals: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1019029
The Washington & Lee data on citations to law reviews is not very useful, since it does not correct for volume of publication. As a rule of thumb, law review status tracks the hosting law school's status, though the further down the hierarchy one goes, the less meaningful the distinctions become. Second-tier specialty journals at some top schools can often be a better bet than the main law review at other schools--you need to ask colleagues in your specialty to find out.
January 09, 2017
With the start of a new year, here they are:
1. Cass Sunstein (Harvard), 266,146 downloads of 232 papers (posting papers since 1996)
2. Daniel Solove (George Washington), 263,111 downloads of 45 papers (remarkably, more than 60% of the downloads are due to a single paper!) (posting papers since 2001)
3. Lucian Bebchuk (Harvard), 249,457 downloads of 174 papers (posting papers since 1996)
4. Mark Lemley (Stanford), 188,578 downloads of 161 papers (posting papers since 1996)
5. Bernard Black (Northwestern), 178,719 downloads of 155 papers (posting papers since 1996)
6. Stephen Bainbridge (UCLA), 123,522 downloads of 98 papers (posting papers since 1997)
7. Dan Kahan (Yale), 122,574 downloads of 69 papers (posting papers since 1996)
8. Brian Leiter (Chicago), 122,416 downloads of 67 papers (posting papers since 2000)
9. Orin Kerr (George Washington), 108,160 downloads of 54 papers (posting papers since 2002)
10. Eric Posner (Chicago), 105,954 downloads of 135 papers (posting papers since 1997)