May 01, 2019
From Professor Thompson's website at Georgetown (I've added institutional affiliations):
The Corporate Practice Commentator is pleased to announce the results of its twenty-fifth annual poll to select the ten best corporate and securities articles. Teachers in corporate and securities law were asked to select the best corporate and securities articles from a list of articles published and indexed in legal journals during 2018. Just short of 400 articles were on this year’s list. Because of the vagaries of publication, indexing, and mailing, some articles published in 2018 have a 2017 date, and not all articles containing a 2018 date were published and indexed in time to be included in this year’s list.
The articles, listed in alphabetical order of the initial author, are:
Yakov Amihud (NYU Business), Markus Schmid (St. Gallen) & Steven Davidoff Solomon (Berkeley). Settling the Staggered Board Debate. 166 U. Pa. L. Rev. 1475-1510 (2018).
Tamara Belinfanti (New York Law School) & Lynn Stout (late of Cornell). Contested Visions: The Value of Systems Theory for Corporate Law. 166 U. Pa. L. Rev. 578-631 (2018).
James D. Cox (Duke) & Randall S. Thomas (Vanderbilt). Delaware’s Retreat: Exploring Developing Fissures and Tectonic Shifts in Delaware Corporate Law. 42 Del. J. Corp. L. 323-389 (2018).
Jill E. Fisch (Penn). Governance by Contract: The Implications for Corporate Bylaws. 106 Cal. L. Rev. 373-409 (2018).
Jill E. Fisch (Penn), Jonah B. Gelbach (Penn, moving to Berkeley) & Jonathan Klick (Penn). The Logic and Limits of Event Studies in Securities Fraud Litigation. 96 Tex. L. Rev. 553-618 (2018).
George S. Geis (Virginia). Traceable Shares and Corporate Law. 113 Nw. U. L. Rev. 227-277 (2018).
Cathy Hwang (Utah). Deal Momentum. 65 UCLA L. Rev. 376-425 (2018).
Dorothy S. Lund (Southern California). The Case against Passive Shareholder Voting. 43 J. Corp. L. 493-536 (2018).
Edward B. Rock & Daniel L. Rubinfeld (both NYU). Antitrust for Institutional Investors. 82 Antitrust L. J. 221-78 (2018).
Mark J. Roe (Harvard). Stock-Market Short-Termism’s Impact. 167 U. Pa. L. Rev. 71-121 (2018).
March 13, 2019
Dru Stevenson (South Texas) writes:
I've enjoyed your recent blog posts about the law school rankings. As far as I can tell, HeinOnline counts two-author articles as an article for each coauthor, which means that when faculty at the same school coauthor an article, citations to that article count once for each author, and twice for the institution, no? In other words, for lower-ranked law schools that are concerned about their scholarly rankings, co-authored publications from their own faculty count double. When USNews starts using HeinOnline citation counts, it will reward institutions where a lot of professors co-author articles. I'm not sure this would be a bad thing - coauthorship is much more common in some other academic disciplines, and I think the legal academy might benefit from more collaboration and scholarly mentoring relationships. But it also is susceptible to gaming, of course. Any thoughts on this?
Does anyone know if this is how Hein searches will work? And thoughts on Professor Stevenson's question also welcome. Signed comments will be strongly preferred, thanks.
March 12, 2019
Blog Emperor Caron unwisely hypes his school's favorable overall ranking in the USNews.com charade. This is unwise because it legitimates the nonsense number (i.e., the overall rank), which will likely come back to bite Pepperdine in another year (much as the school was bitten rather unfairly last year). With resources, any school can move up in the rankings by shrinking its student body (especially the 1L class) while holding everything else constant. As I've noted before, almost every change, for better or worse, in the USNews.com overall ranking has nothing to do with reality: it reflects moves to game the rankings, either by the school that does better or, for schools that do worse, by their immediate competitors.
The Blog Emperor also usefully produces the "peer [academic] reputation" scores for the most recent law school rankings. These scores typically track the overall USNews.com ranking in recent years, with small deviations. This year's amusing small deviation is for Yale, which comes in at 4.8, behind Harvard and Stanford at 4.9. Yale is still #1 in the overall ranking, while Harvard is #3, behind Stanford at #2--the way it's been for a number of years now. This result is entirely a function of one and only one factor (which USNews.com doesn't print): spending per capita. Harvard is rich but large, with economies of scale for which it is penalized in the ranking formula; Yale and Stanford are rich but very small. Hence the results.
March 07, 2019
Attention Bob Morse: this is quite important in using Hein for a scholarly impact study (UPDATED--SEE BELOW, IMPORTANT!)
The point is due to Robert Anderson (Pepperdine): "[T]o the extent that interdisciplinary work has an impact in law, it will be cited in law reviews and therefore captured in the ranking. Some of the papers most often cited in law reviews were published in economics or finance journals (Jensen and Meckling, Coase). The key here is ensuring that Hein and US News take into account citations TO interdisciplinary work FROM law reviews, not just citations TO law reviews FROM law reviews as it appears they might do. That would be too narrow. Sisk currently captures these interdisciplinary citations FROM law reviews, and it is important for Hein to do the same. The same applies to books."
It's not yet clear how they will utilize the Hein database. When I search my own name in Hein's Law Journal Library, I get a much higher count than I do with Westlaw, because Hein has a much larger number of foreign law journals than Westlaw does. I also find citations to scholarship that did not appear in law journals, including books. But maybe that isn't how it's going to be done?
UPDATE: Kevin Gerson, Director of the Law Library at UCLA, writes with extremely helpful (but also alarming) information:
I’ve been reading with interest your posts and thoughts on the new US News scholarly impact ranking (along with all of your other posts). From the information we have available so far, I think it’s pretty clear how US News will make use of the Hein database. Two years ago, with Hein’s help, I set up UCLA Law faculty author profile pages within HeinOnline. In order to create those pages, Hein sent me an information request by way of an Excel spreadsheet that included about a dozen informational columns to be filled out. The columns included such things as Known Name Variations, Affiliation Website Link, and Author E-mail Address. The tell is that when US News made their information request of law schools in mid-February, they sent a nearly identical information request by way of the same Excel spreadsheet that also included a Known Name Variations column. What this means is that US News is having Hein create the very same author profile pages that I (and others) had created for their schools. Those author pages include: (1) journal article citations to the author’s articles that are contained in Hein but only if those citations are made by other articles also contained in Hein and only if those citing articles use a recognized citation abbreviation, such as the Bluebook; and (2) case citations to the author’s articles that are contained in Hein but only if those citations are made by cases available in HeinOnline or Fastcase. Citations to books will not be included. Nor will citations to journals not contained in Hein. By using this method, US News has designed a purely automated way to calculate “impact.”
When you searched for yourself in the HeinOnline Law Journal library, you were conducting a different search than the one used to create author pages. You were thus able to pull up references to books not contained in Hein. Try instead searching in the Law Journal Library (under the advanced search) for yourself using the Author/Creator field. The results you see there, along with the citation counts, are what form the basis of your Hein faculty author page, and that is what will be used for the US News metric, IMO. The only unknown is how US News will combine that impact metric with a “productivity” metric during the same period.
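If Mr. Gerson's description is accurate, the inclusion rules reduce to a simple filter. The sketch below is purely illustrative of the rules he describes; the function and field names are hypothetical, not Hein's actual code.

```python
# Purely illustrative model of the inclusion rules Mr. Gerson describes.
# All names and fields here are invented; Hein's implementation is not public.

def counts_toward_author_page(citation):
    """Return True if a citation would appear on a Hein author profile page."""
    if citation["cited_work_type"] != "journal_article":
        return False                      # books and non-journal works excluded
    if citation["citing_source"] == "journal_article":
        # the citing article must itself be in Hein and must use a
        # recognized citation abbreviation (e.g., Bluebook form)
        return citation["citing_in_hein"] and citation["recognized_abbreviation"]
    if citation["citing_source"] == "case":
        # cases count only if available in HeinOnline or Fastcase
        return citation["citing_in_hein"] or citation["citing_in_fastcase"]
    return False

# A citation to a book, however often cited in law reviews, is never counted:
book_cite = {"cited_work_type": "book", "citing_source": "journal_article",
             "citing_in_hein": True, "recognized_abbreviation": True,
             "citing_in_fastcase": False}
print(counts_toward_author_page(book_cite))  # False
```

The point of the sketch is how much falls through the filter: citations to books, and citations from journals outside Hein, simply vanish from the "impact" measure.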
If Mr. Gerson is correct about what U.S. News is planning to do, then their impact study will be garbage, since law review citations to work in other law reviews are only a small part of the landscape of scholarship and impact, as Professor Anderson noted. Yes, it is more work to search names, as Sisk and his colleagues do, but the result is far more complete and meaningful than what appears to be in the offing.
February 25, 2019
It appears concerns about which faculty would count for impact purposes have been heard: as the Blog Emperor notes, USNews.com will still ask schools to list all tenure-stream faculty, but will also ask for each member's primary role to be identified (e.g., "doctrinal" or "clinical" or "legal research and writing"). USNews.com has not yet decided what to do with this information, but I have some advice: study the scholarly impact only of the academic or "doctrinal" faculty. Including the other categories would give schools an incentive to exclude them from the tenure track, given that they typically are not expected to produce as much scholarship as the doctrinal faculty.
February 15, 2019
...regardless of whether or not scholarly writing is part of their duties. Following up on yesterday, a colleague elsewhere writes: "I saw your post on US News’s new impact rankings. I wrote to Bob Morse earlier this week to ask for clarification about whether to include clinical, LRW, and library faculty if they are tenure/tenure-track but do not have full (or any) scholarship requirements. He wrote back to say that they are all included: US News is using the bright line of tenure/tenure-track regardless of tenure classification or scholarly requirements."
February 14, 2019
...basically on the model I used to do and Greg Sisk (St. Thomas) has continued now for several years, but with a couple of differences/unknowns. I guess they didn't want to be left behind by the new "gold standard"!
First, the similarities: they will examine only a five-year window (2014-2018, no doubt because Sisk just did 2013-2017); and they will collect data on citations to the median and mean faculty member, as Sisk did. But now the differences: they appear to be planning on including tenure-track faculty, not just tenured faculty, even though tenure-track faculty have much lower citation rates; they are using Hein instead of Westlaw; and they are also going to count publications (how is a bit unclear). Also unclear is whether they plan to combine productivity with impact measures: given Bob Morse's affection for meaningless aggregations of apples and oranges, I fear that's what they may do. But we'll see.
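The mean/median computation in a Sisk-style study is straightforward; here is a minimal sketch, with invented citation counts for a hypothetical faculty over a five-year window.

```python
# Sketch of the Sisk-style aggregates: mean and median citations per
# faculty member over a five-year window. All numbers are invented.

from statistics import mean, median

# citations accumulated 2014-2018 by each faculty member at a hypothetical school
faculty_citations = [310, 120, 95, 80, 60, 45, 30, 25, 10, 5]

print(mean(faculty_citations))    # 78
print(median(faculty_citations))  # 52.5
```

Note how the mean (78) sits well above the median (52.5): a few highly cited faculty pull the mean up, which is exactly why both statistics are reported.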
USNews.com does not plan on incorporating the impact/productivity ranking into this year's law school rankings, but I bet money they will incorporate it going forward, which is consistent with changes they've made to their overall Business and Medical School rankings, incorporating more "objective" data, although not impact metrics. Obviously USNews.com knows it has been repeatedly burned by misleading self-reporting by schools that it never carefully audits, so switching to non-manipulable metrics no doubt seems preferable. And since their academic reputation surveys are now just echo chambers of recent overall rankings, adding in an impact/productivity component would be a slight corrective to that. (Contrast, e.g., Stanford's academic reputation in U.S. News [tied with Yale and Harvard] with its scholarly impact performance.)
Schools that have their clinical faculty on the tenure stream, even when there are no publication expectations, may be in particular trouble here. Sisk's policy, which was also mine, was to exclude clinical faculty, since at many schools, even those where they have tenure, their responsibilities do not include scholarship. But USNews.com is asking for all tenured and tenure-track faculty, regardless of primary role or function.
February 01, 2019
Brian Leiter and Paul Caron both recently noted a study by Adam Chilton, Jonathan Masur, and Kyle Rozema which argues that law schools can increase average faculty productivity by making it harder for tenure track faculty to get tenure. While this seems plausible, denying tenure more often is no free lunch.
A highly regarded study by Ron Ehrenberg (published in the Review of Economics and Statistics) found that professors place a high monetary value on tenure, and a university that unilaterally eliminated tenure would either have to pay more in salary and bonuses or suffer a loss in faculty quality. After controlling for faculty quality, university rank, and cost of living, economics departments that are less likely to offer faculty tenure must pay untenured faculty more, in part to compensate for the increased risk. A reduced tenure rate is associated with higher productivity, but it is costly.
It's easy to understand why. A promising candidate with offers from otherwise comparable universities A and B would be unlikely to take an offer from A knowing that A denies tenure 70 percent of the time while B only denies tenure 10 percent of the time.
Faculty who are untenured and at an institution with high tenure denial rates would also have strong incentives to spend their most productive years avoiding publishing anything that might upset private sector employers who could give them a soft landing in the event that they are denied tenure. Quantitative measures of faculty "productivity" based on number of citations and publications don't capture the harmful qualitative shift this would produce in faculty research, particularly in an area like law.
There are numerous other advantages to tenure (and disadvantages to weakening it), which I've discussed here and here, including protecting intellectual freedom, encouraging faculty to share rather than hoard knowledge, promoting investment in specialized skills, aligning faculty and institutional incentives, increasing the rigor of teaching, and improving outcomes for students (compared to the use of adjuncts).
January 31, 2019
That's the conclusion of a study by three colleagues of mine, Adam Chilton (just tenured, easy case!), Jonathan Masur, and Kyle Rozema (our Behavioral L&E Fellow). I've not looked at the details of the study, but I wonder how much the results are affected by Harvard's historical pattern (changed in recent years) of hiring and then tenuring everyone based on good grades in law school, which results in more "dead wood" there than elsewhere. Even if Harvard has some effect on the findings, I think their basic point is correct: law schools, especially those maintaining a high scholarly profile, should be more demanding about tenure.