Thursday, January 19, 2017
UPDATED: MOVING TO FRONT FROM YESTERDAY
Here's the report:
As of 1/6/17, there are 134,007 applications submitted by 21,711 applicants for the 2017–2018 academic year. Applicants are down 4.2% and applications are down 2.2% from 2016–2017.
Last year at this time, we had 40% of the preliminary final applicant count.
Although there has been a trend towards increasingly later applications, this figure does suggest that we are going to see a slight, but not negligible, decline in applicants this cycle.
UPDATE: But now LSAC reports that LSAT-takers in December were up nearly 8% from the prior year! The likely explanation, though, is a scheduling change, which led more applicants to skip the early fall LSAT in favor of the December one. But that would also account for the decline in applicants noted in the 1/6/17 report. So my guess now is that we won't be seeing any decline in the applicant pool this year, so we really are at "the new normal."
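The "40% of the preliminary final applicant count" figure implies a simple projection. A minimal sketch, assuming (purely for illustration) that this cycle follows the same pacing as last year's:

```python
def project_final_applicants(current_applicants: float,
                             pace_fraction: float) -> float:
    """Project the final applicant count, assuming the current
    cycle follows the same pacing as the prior year."""
    return current_applicants / pace_fraction

# Figures from the 1/6/17 report: 21,711 applicants so far, and
# last year this point in the cycle represented ~40% of the final count.
projected = project_final_applicants(21_711, 0.40)
print(projected)  # roughly 54,278 applicants
```

Of course, as the post notes, the pacing assumption is exactly what the LSAT scheduling change calls into question, so the projection is only as good as that assumption.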
Established datasets, proxies, and customized data collection: The case of international LLMs (Michael Simkovic)
How should researchers trade off the costs of data collection, the speed of the analysis, the precision of the measurements, reproducibility by other researchers, and the broader context needed to interpret the data: how one group or course of action compares to another, how historical trends should be understood, and the like?
Must we always measure the precise group of interest, with zero tolerance for over-inclusion or under-inclusion? Or might one or a series of proxy groups be sufficient, or even preferable for some purposes? What if the proxies have substantial overlap with the groups of interest and biases introduced by use of proxy groups are reasonably well understood? How close must the proxy group be to the group of interest?
These are important questions raised by a group of legal profession researchers that includes several of the principal investigators of the widely used After the JD dataset.
Professors Carole Silver, Ethan Michelson, Robert Nelson, Nancy Reichman, Rebecca Sandefur, and Joyce Sterling (hereinafter, Silver et al.) recently wrote a three-part response (Parts 1, 2, and 3) to my two-part blog post from December about International LLM students who remain in the United States (Part 1) and International LLM students who return to their home countries (Part 2). The bulk of Silver et al.’s critique appears in Part 2 of their post, and focuses mainly on Part 1 of my LLM post.
My post, which I described as "a very preliminary, quick analysis intended primarily to satisfy my own curiosity," used U.S. Census data from the American Community Survey and two proxy groups for international LLM ("Masters of Law") graduates to make inferences about the financial benefits of LLM degrees to international students who remain in the U.S. Silver et al. agree with several of the limitations of this analysis that I noted in paragraphs 5 through 8 of Part 1 of my post. They also note that historically, many LLMs have returned to their home countries, and argue that the benefits of LLM programs to returning students may be greater than the benefits to those who remain in the United States. (While I am skeptical of this last claim--especially if we focus exclusively on pecuniary benefits--it seems likely that both groups benefit.)
Silver et al. have also helpfully made several additional points about limitations in my proxy approach and ways in which proxies could over-count or under-count foreign LLMs. The most important of these limitations can be addressed with a few modifications to the LLM proxy group approach. Those interested in the technical details are encouraged to read footnote 1 below.
Returning to broader questions about the use of proxy groups, my view is that proxy groups can be helpful and potentially necessary for certain kinds of analysis.
Suppose that we wish to know the temperature in New York’s Central Park before we take a stroll, but we only have temperature readings for LaGuardia and Newark airport. While neither of those proxies will tell us the precise temperature in Central Park, they will usually be sufficiently close that we can ascertain with a reasonable degree of certainty whether we should bring our winter coats, wear sweaters, or proceed with short sleeves. Indeed, readings from Boston or Philadelphia will probably suffice, particularly if we’re aware of the direction and magnitude of typical temperature differences relative to Central Park.
Should we refuse to venture out until we can obtain a temperature reading from Central Park itself?
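The analogy's key point--that a proxy with a known, stable bias can be corrected and still support the decision at hand--can be sketched as follows. The offsets here are invented for illustration only:

```python
def corrected_estimate(proxy_reading: float, typical_offset: float) -> float:
    """Adjust a proxy measurement by its known typical bias
    relative to the location (or group) of interest."""
    return proxy_reading - typical_offset

# Illustrative offsets only: suppose Newark typically reads 1.5 degrees
# warmer than Central Park, and LaGuardia 0.5 degrees cooler.
estimates = [
    corrected_estimate(34.0, 1.5),   # Newark reading, corrected
    corrected_estimate(31.5, -0.5),  # LaGuardia reading, corrected
]
avg = sum(estimates) / len(estimates)
print(avg)  # 32.25 -- bring the winter coat
```

The same logic applies to proxy groups in survey data: if the direction and rough magnitude of the over- or under-counting is understood, the proxy can still answer the coarse question (coat, sweater, or short sleeves) even when it cannot pin down the exact value.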
Tuesday, January 17, 2017
We just updated our charts about law journal submissions, expedites, and rankings from different sources for the Spring 2017 submission season, covering the main law journal at each of 203 law schools.
A couple of the highlights from this round of revisions are:
First, the chart again includes as much information as possible about which law reviews are not currently accepting submissions and when they say they will resume. Most journals do not give specific dates; they tend to post only imprecise statements that they are not currently accepting submissions but will resume at some point in the spring.
Second, while 72 law reviews still prefer or require submission through ExpressO, the movement toward Scholastica continues: 27 journals now require Scholastica as the exclusive avenue for submissions, 25 more prefer or strongly prefer it, and 25 accept articles submitted through either ExpressO or Scholastica.
The first chart contains information about each journal’s preferences about methods for submitting articles (e.g., e-mail, ExpressO, Scholastica, or regular mail), as well as special formatting requirements and how to request an expedited review. The second chart contains rankings information from U.S. News and World Report as well as data from Washington & Lee’s law review website.
Information for Submitting Articles to Law Reviews and Journals: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1019029
The Washington & Lee data on citations to law reviews is not very useful, since it does not correct for volume of publication. As a rule of thumb, law review status tracks the hosting law school's status, though the further down the hierarchy one goes, the less meaningful the distinctions become. 2nd-tier specialty journals at some top schools can often be a better bet than the main law review at other schools--you need to ask colleagues in your specialty to find out.
Thursday, January 12, 2017
Law professors weigh in on Trump's "conflicts of interest" given his non-divestment from his businesses
Tuesday, January 10, 2017
U of Oregon President Mike Schill (former Dean of U of Chicago and UCLA Law Schools) responds to commentary on Shurtz case
Monday, January 9, 2017
With the start of a new year, here are the most-downloaded law professors on SSRN:
1. Cass Sunstein (Harvard), 266,146 downloads of 232 papers (posting papers since 1996)
2. Daniel Solove (George Washington), 263,111 downloads of 45 papers (remarkably, more than 60% of the downloads are due to a single paper!) (posting papers since 2001)
3. Lucian Bebchuk (Harvard), 249,457 downloads of 174 papers (posting papers since 1996)
4. Mark Lemley (Stanford), 188,578 downloads of 161 papers (posting papers since 1996)
5. Bernard Black (Northwestern), 178,719 downloads of 155 papers (posting papers since 1996)
6. Stephen Bainbridge (UCLA), 123,522 downloads of 98 papers (posting papers since 1997)
7. Dan Kahan (Yale), 122,574 downloads of 69 papers (posting papers since 1996)
8. Brian Leiter (Chicago), 122,416 downloads of 67 papers (posting papers since 2000)
9. Orin Kerr (George Washington), 108,160 downloads of 54 papers (posting papers since 2002)
10. Eric Posner (Chicago), 105,954 downloads of 135 papers (posting papers since 1997)