January 31, 2017
Supreme Court clerks by school, 1882-2017, 2005-2017 (UPDATED & CORRECTED)
Brad Hillis called to my attention this data compilation he did; I haven't verified its accuracy, but the recent (2005-17) data look roughly right. Readers can weigh in at Wikipedia to correct the data if need be. Neither list is adjusted for class size.
Here are the twenty law schools that have produced the most Supreme Court clerks since 1882:
Rank / Law School / # clerks / % of all clerks
1) Harvard 607 27%
2) Yale 396 18%
3) Chicago 156 7%
4) Stanford 137 6%
5) Columbia 135 6%
6) Virginia 110 5%
7) Michigan 87 4%
8) Georgetown 61 3%
9) Berkeley 59 3%
10) NYU 54 2%
11) Penn 48
12) Northwestern 42
13) Texas 35
14) GW 26
15) Duke 21
16) UCLA 19
17) Notre Dame 17
18) BYU 13
19) Indiana 11
And here is Mr. Hillis's list of the top 20 law schools that have produced the most clerks from 2005 through 2017 (again, note that Harvard is more than twice the size of Yale, Stanford, and Chicago; that Virginia, Columbia, and NYU are about twice the size of the latter; etc.):
Rank / Law School / # clerks 2005-2017
1 Harvard 124
2 Yale 120
3 Stanford 39
4 Virginia 32
5 Columbia 24
5 Chicago 24
7 NYU 17
8 Georgetown 14
8 Michigan 14
10 Northwestern 10
11 Berkeley 9
11 Duke 9
13 Penn 7
13 GW 7
15 Georgia 6
16 BYU 4
16 Texas 4
16 Notre Dame 4
16 Utah 4
20 Cornell 2
20 Minnesota 2
20 Pepperdine 2
20 Vanderbilt 2
January 30, 2017
"Malevolence tempered by incompetence"
This is one of the more incisive and damning analyses of Trump's executive order on refugees and visas.
January 27, 2017
Startling development: Harvard Law students think they are more important than they really are!
Some want to play an "indispensable" role in the search for a new Dean. I'm sure student feedback on candidates will receive some weight, but that's about it. Were I a betting man (I am not), I would bet on John Goldberg or John Manning--both current HLS faculty--to be chosen as the new Dean.
January 26, 2017
Cost-benefit analysis and the late Justice Scalia spell trouble for Trump's proposed border wall
Interesting analysis by my colleagues Daniel Hemel, Jonathan Masur, and Eric Posner.
January 23, 2017
The increasingly ugly demise of Charlotte Law School
A local news outlet records the details. Charlotte is now betting on an Education Department under Betsy DeVos being more favorable to it!
January 19, 2017
Latest LSAC data: applicants down 4.2% from January 2016 (UPDATED!)
UPDATED: MOVING TO FRONT FROM YESTERDAY
Here's the report:
As of 1/6/17, there are 134,007 applications submitted by 21,711 applicants for the 2017–2018 academic year. Applicants are down 4.2% and applications are down 2.2% from 2016–2017.
Last year at this time, we had 40% of the preliminary final applicant count.
Although there has been a trend towards increasingly later applications, this figure does suggest that we are going to see a slight, but not negligible, decline in applicants this cycle.
UPDATE: But now LSAC reports that LSAT-takers in December were up nearly 8% from the prior year! The likely explanation, though, is a scheduling change, which led more applicants to skip the early fall LSAT in favor of the December one. But that would also account for the decline in applicants noted in the 1/6/17 report. So my guess now is that we won't be seeing any decline in the applicant pool this year, so we really are at "the new normal."
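The projection implied by the LSAC quote above can be sketched as a back-of-the-envelope calculation: if roughly 40% of the eventual applicant pool has typically applied by early January, the current count implies a final pool in the mid-50,000s. The figures come from the quoted report; the simple proportional model is my own illustration.

```python
# Back-of-the-envelope projection from the LSAC figures quoted above:
# 21,711 applicants as of 1/6/17, at a point in the cycle when (per LSAC)
# about 40% of the final applicant count had historically been submitted.

applicants_so_far = 21_711
share_of_final = 0.40  # last year's share of the final count at this date

projected_final = applicants_so_far / share_of_final
print(f"Projected final applicant pool: about {round(projected_final):,}")
```

This naive projection assumes application timing this cycle mirrors last cycle's, which, as the post notes, the December LSAT scheduling change may have disrupted.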
Established datasets, proxies, and customized data collection: The case of international LLMs (Michael Simkovic)
How should researchers make tradeoffs between the costs of data collection, the speed of the analysis, the precision of the measurements, reproducibility by other researchers, and broader context about the meaning of the data: how we might compare one group or one course of action to another, how we might understand historical trends, and the like?
Must we always measure the precise group of interest, with zero tolerance for over-inclusion or under-inclusion? Or might one or a series of proxy groups be sufficient, or even preferable for some purposes? What if the proxies have substantial overlap with the groups of interest and biases introduced by use of proxy groups are reasonably well understood? How close must the proxy group be to the group of interest?
These are important questions raised by a group of legal profession researchers which includes several of the principal investigators of the widely used After the JD dataset.
Professors Carole Silver, Ethan Michelson, Robert Nelson, Nancy Reichman, Rebecca Sandefur, and Joyce Sterling (hereinafter, Silver et al.) recently wrote a three-part response (Parts 1, 2, and 3) to my two-part blog post from December about International LLM students who remain in the United States (Part 1) and International LLM students who return to their home countries (Part 2). The bulk of Silver et al.’s critique appears in Part 2 of their post, and focuses mainly on Part 1 of my LLM post.
My post, which I described as “a very preliminarily, quick analysis intended primarily to satisfy my own curiosity” used U.S. Census data from the American Community Survey and two proxy groups for international LLM (“Masters of Law”) graduates to make inferences about the financial benefits of LLM degrees to international students who remain in the U.S. Silver et al. agree with several of the limitations of this analysis that I noted in paragraphs 5 through 8 of Part 1 of my post. They also note that historically, many LLMs have returned to their home countries and argue that the benefits of LLM programs to returning students may be greater than the benefits to those who remain in the United States. (While I am skeptical of this last claim—especially if we focus exclusively on pecuniary benefits—it seems likely that both groups benefit).
Silver et al. have also helpfully made several additional points about limitations in my proxy approach and ways in which proxies could over-count or under-count foreign LLMs. The most important of these limitations can be addressed with a few modifications to the LLM proxy group approach. Those interested in the technical details are encouraged to read footnote 1 below.
Returning to broader questions about the use of proxy groups, my view is that proxy groups can be helpful and potentially necessary for certain kinds of analysis.
Suppose that we wish to know the temperature in New York’s Central Park before we take a stroll, but we only have temperature readings for LaGuardia and Newark airport. While neither of those proxies will tell us the precise temperature in Central Park, they will usually be sufficiently close that we can ascertain with a reasonable degree of certainty whether we should bring our winter coats, wear sweaters, or proceed with short sleeves. Indeed, readings from Boston or Philadelphia will probably suffice, particularly if we’re aware of the direction and magnitude of typical temperature differences relative to Central Park.
Should we refuse to venture out until we can obtain a temperature reading from Central Park itself?
Perhaps if we need accuracy to within one or two degrees Celsius. Otherwise, the airport readings may be good enough, and the cost and delay required for further data collection may be prohibitive relative to the benefits.
Now suppose that we wish to know when we can pack our winter clothes into storage based on historical seasonal weather patterns. If we have a precise current reading for our location, but only have long term data for adjacent proxies, it may be more sensible for us to focus on the proxy data rather than the data for our current location.
In the context of legal education and the legal profession, there are many advantages to proxy data using large, nationally representative government data sets such as the American Community Survey, particularly if one wishes to make comparisons to other groups or other periods of time, and resources are limited. Since many other researchers use these datasets, their properties and any response biases tend to be relatively well understood. Many datasets also are updated regularly and routinely, and they are carefully administered and weighted to be as representative as possible.
ACS is not the only data set currently available to assess LLM programs. There are other off-the-shelf surveys, more targeted toward immigrants, that may be useful, and which Professor Silver and her co-authors may wish to consult.
Custom data sets can address problems that off-the-shelf data cannot because they can be designed to answer very specific questions. But if such surveys are not designed carefully, they risk losing the broader context that enables the results to be readily interpretable. Thus a data set that only reports on the earnings or other outcomes for “LLM graduates” is less useful for assessing the benefits of such programs than one that also provides the same information for a relevant control group who did not obtain LLMs, but are reasonably similar in important respects that predict outcome variables.
In many cases, results from off-the-shelf and custom data sets can be mutually reinforcing. For example, the results of After the JD III suggested that most law graduates were doing well financially 12 years after graduation, while The Economic Value of a Law Degree suggested that they probably could not have done nearly so well had they entered the labor market with only a bachelor’s degree. Timing Law School suggested that the results of AJD III were not a fluke due to respondents graduating in a good year.
Silver et al.’s interests extend beyond earnings premiums, and they believe that they can advance our understanding of the benefits of LLM programs by building a custom data set. I look forward to their findings.
[1] Perhaps the most important of these points is that foreign-born individuals could include those who immigrated to the United States prior to obtaining their bachelor’s degrees, and therefore do not resemble the typical international LLM graduate. The typical international LLM graduate has obtained a bachelor’s degree outside of the United States and a graduate degree in the United States.
Fortunately, this problem can be readily addressed. ACS includes variables for both year of birth and year of immigration. These variables can be used to exclude those who immigrated to the United States prior to the age at which they likely completed their bachelor’s degrees (i.e., age 22-26), depending on the country from which they immigrated.
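The exclusion rule described above can be sketched in a few lines. The variable names below (YRIMMIG, BIRTHYR) follow IPUMS ACS conventions, and the age-22 cutoff is one illustrative choice from the 22-26 range mentioned in the text; this is a sketch of the approach, not the author's actual code.

```python
import pandas as pd

def drop_early_immigrants(df: pd.DataFrame, cutoff_age: int = 22) -> pd.DataFrame:
    """Keep only respondents who immigrated at or after the cutoff age,
    i.e., those likely to have completed a bachelor's degree abroad
    before arriving in the United States."""
    age_at_immigration = df["YRIMMIG"] - df["BIRTHYR"]
    return df[age_at_immigration >= cutoff_age]

# Toy sample: immigrated at ages 25, 5, and 38 respectively.
sample = pd.DataFrame({
    "YRIMMIG": [2010, 1995, 2008],
    "BIRTHYR": [1985, 1990, 1970],
})
print(drop_early_immigrants(sample))  # keeps the age-25 and age-38 rows
```

In practice the cutoff might vary by country of birth, as the footnote suggests, since typical bachelor's-completion ages differ across education systems.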
Silver et al. also object to the exclusion of Hispanics from the analysis because LSAC data suggests that approximately 18 percent of LLMs in recent years come from Central and South America and the Caribbean. While many immigrants from these regions do not typically describe themselves to the Census as Hispanic—for example, those from Brazil, Belize or Trinidad—the objection to excluding Hispanics is reasonable.
Re-running the original analysis with Hispanics included does not change the results very much—earnings for both the non-LLM control group and the LLM proxy group both fall a bit, and the implied earnings premium in dollars decreases slightly. (Compare either the first proxy with and without Hispanics; or second proxy with and without Hispanics).
Silver et al. also argue that my proxy approach could underestimate the benefits of an international LLM, because they believe that international LLMs who remain in the U.S. are very likely to work as lawyers and judges and very unlikely to work as paralegals or legal assistants. It would be simple enough to construct an LLM proxy group that includes only foreign-born lawyers and judges with Masters degrees who immigrated to the United States after the age at which they likely completed their bachelor’s degrees. In combination with the broader proxy groups, this would provide a range for the earnings premium. Frank McIntyre and I have used a similar three-proxy-group approach in our research on the value of a law degree by college major.
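The range idea above, bounding the earnings premium between a broad proxy (all legal occupations) and a narrow one (lawyers and judges only), can be sketched as follows. All earnings figures here are invented for illustration and bear no relation to the actual ACS estimates.

```python
import statistics

def premium_pct(proxy_earnings: list, control_earnings: list) -> float:
    """Percent earnings premium of a proxy group over a control group,
    rounded to one decimal place."""
    ratio = statistics.mean(proxy_earnings) / statistics.mean(control_earnings)
    return round((ratio - 1) * 100, 1)

# Invented figures: similar workers without LLMs (control), a broad proxy
# (all legal occupations), and a narrow proxy (lawyers and judges only).
control      = [90_000, 110_000, 100_000]
broad_proxy  = [100_000, 120_000, 110_000]
narrow_proxy = [130_000, 150_000, 140_000]

low = premium_pct(broad_proxy, control)
high = premium_pct(narrow_proxy, control)
print(f"Implied premium range: {low}% to {high}%")
```

Reporting the premium as a range across proxy definitions, rather than a single point estimate, is what makes the known over- and under-inclusion of each proxy group tolerable.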
Silver et al. also ask how the Census deals with unemployment and occupation. In IPUMS ACS, individuals who are unemployed report their most recent occupation. There is a separate occupation category for those who are seeking their first job and have never worked, and for those who have been unemployed for more than 5 years straight.
January 17, 2017
Professor Levit & Rostron's guide to submitting to law reviews is updated
We just updated our charts about law journal submissions, expedites, and rankings from different sources for the Spring 2017 submission season, covering the main journals of 203 law schools.
A couple of the highlights from this round of revisions are:
First, again the chart includes as much information as possible about what law reviews are not accepting submissions right now and what dates they say they'll resume accepting submissions. Most of this is not specific dates, because the journals tend to post only imprecise statements about how the journal is not currently accepting submissions but will start doing so at some point in spring.
Second, while 72 law reviews still prefer or require submission through ExpressO, the movement toward Scholastica continues: 27 schools now require Scholastica as the exclusive avenue for submissions, 25 more prefer or strongly prefer it, and 25 accept articles submitted through either ExpressO or Scholastica.
The first chart contains information about each journal’s preferences about methods for submitting articles (e.g., e-mail, ExpressO, Scholastica, or regular mail), as well as special formatting requirements and how to request an expedited review. The second chart contains rankings information from U.S. News and World Report as well as data from Washington & Lee’s law review website.
Information for Submitting Articles to Law Reviews and Journals: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1019029
The Washington & Lee data on citations to law reviews is not very useful, since it does not correct for volume of publication. As a rule of thumb, law review status tracks the hosting law school's status, though the further down the hierarchy one goes, the less meaningful the distinctions become. Second-tier specialty journals at some top schools can often be a better bet than the main law review at other schools--you need to ask colleagues in your specialty to find out.
Science at work: the "most influential" people in legal education
Blog Emperor Caron reports. Thanks to the many Deans and law faculty who have been regular readers and correspondents over the years!
January 16, 2017
U of Washington/Tacoma to delay opening new law school
This is probably wise, absent clear evidence of unusually strong and growing demand for legal services in the Pacific Northwest.