June 08, 2015
1. Lucian Bebchuk (Harvard) (230,377 downloads, 172 papers)
2. Daniel Solove (George Washington) (229,918 downloads, 41 papers)*
3. Cass Sunstein (Harvard) (205,141 downloads, 189 papers)
4. Mark Lemley (Stanford) (161,607 downloads, 141 papers)
5. Bernard Black (Northwestern) (161,459 downloads, 138 papers)
6. Stephen Bainbridge (UCLA) (111,432 downloads, 95 papers)
7. Brian Leiter (Chicago) (103,669 downloads, 59 papers)
8. Dan Kahan (Yale) (95,120 downloads, 54 papers)
9. Eric Posner (Chicago) (92,878 downloads, 122 papers)
10. Orin Kerr (George Washington) (89,492 downloads, 49 papers)
*A single paper accounts for nearly two-thirds of Prof. Solove's downloads!
May 28, 2015
1. Cass Sunstein (Harvard) (28,599 downloads, 24 new papers)
2. Dan Kahan (Yale) (18,796 downloads, 5 new papers)
3. Daniel Solove (George Washington) (18,503 downloads, 2 new papers)
4. Mark Lemley (Stanford) (14,973 downloads, 8 new papers)
5. Lucian Bebchuk (Harvard) (13,940 downloads, 0 new papers)
6. Orin Kerr (George Washington) (12,254 downloads, 4 new papers)
7. Brian Leiter (Chicago) (12,097 downloads, 9 new papers)
8. Bernard Black (Northwestern) (10,561 downloads, 5 new papers)
9. Jeremy Waldron (NYU) (8,214 downloads, 6 new papers)
10. Tim Wu (Columbia) (8,158 downloads, 2 new papers)
And given how close he came to the top ten, I should note that my colleague Eric Posner had 8,065 downloads and six new papers in the last 12 months.
As the cases of Solove and Bebchuk show, "oldies but goodies" can keep the downloads pouring in!
May 20, 2015
Here. Prof. Lawsky counts only tenure-track hires, whether academic or clinical; she reports a total of 70 new hires this year, slightly down from last year. (It's lower if one subtracts the tenure-track clinical hires, though I have not counted carefully.) The relatively small number of Yale JDs hired (only 6) is striking; we don't know how many graduates of each school were on the market, but based on past years I would be surprised if there weren't several dozen Yale candidates seeking positions, meaning the vast majority failed to land one. 21 of the 70 hires had Harvard JDs (though several of those were coming off Fellowships, like the Bigelow), while another 27 came from just five schools (Stanford, Yale, Chicago, Berkeley, and NYU).
May 12, 2015
The announcement in full:
The Top 10 Corporate and Securities Articles of 2014
The Corporate Practice Commentator is pleased to announce the results of its twenty-first annual poll to select the ten best corporate and securities articles. Teachers in corporate and securities law were asked to select the best corporate and securities articles from a list of articles published and indexed in legal journals during 2014. More than 525 articles were on this year’s list. Because of the vagaries of publication, indexing, and mailing, some articles published in 2014 have a 2013 date, and not all articles containing a 2014 date were published and indexed in time to be included in this year’s list.
The articles, listed in alphabetical order of the initial author, are:
Bainbridge, Stephen M. (UCLA) and M. Todd Henderson (Chicago). Boards-R-Us: Reconceptualizing Corporate Boards. 66 Stan. L. Rev. 1051-1119 (2014).
Fisch, Jill E. and Tess Wilkinson-Ryan (both Penn). Why Do Retail Investors Make Costly Mistakes? An Experiment on Mutual Fund Choice. 162 U. Pa. L. Rev. 605-647 (2014).
Fried, Jesse M. (Harvard). Insider Trading via the Corporation. 162 U. Pa. L. Rev. 801-839 (2014).
Hamermesh, Lawrence A. (Widener-Delaware). Director Nominations. 39 Del. J. Corp. L. 117-159 (2014).
Hansmann, Henry (Yale) and Mariana Pargendler (Vargas Law School, Sao Paulo). The Evolution of Shareholder Voting Rights: Separation of Ownership and Consumption. 123 Yale L.J. 948-1013 (2014).
Morley, John (Yale). The Separation of Funds and Managers: A Theory of Investment Fund Structure and Regulation. 123 Yale L.J. 1228-1287 (2014).
Roe, Mark J. (Harvard). Structural Corporate Degradation Due to Too-Big-to-Fail Finance. 162 U. Pa. L. Rev. 1419-1464 (2014).
Roe, Mark J. (Harvard) and Frederick Tung (BU). Breaking Bankruptcy Priority: How Rent-Seeking Upends the Creditors' Bargain. 99 Va. L. Rev. 1235-1290 (2013).
Strine Jr., Leo E. (CJ Delaware Supreme Court). Can We Do Better by Ordinary Investors? A Pragmatic Reaction to the Dueling Ideological Mythologists of Corporate Law. 114 Colum. L. Rev. 449-502 (2014).
Subramanian, Guhan (Harvard). Delaware's Choice. 39 Del. J. Corp. L. 1-53 (2014).
May 05, 2015
A better grading system
Professor Merritt argues that mandatory grading curves can be unfair when one class has stronger students than another. I agree.
Statistician Valen Johnson—whom I cited in my last post as an authority on grade inflation—has developed a clever solution to this problem, which involves adjusting grading curves within each class based on the ability levels of the students. A Johnson-inspired proposal was nearly adopted at Duke University in the late 1990s, but was blocked by departments that offered higher grades and attracted weaker students.
Most law schools try to balance their sections in terms of student ability levels and overall quality of faculty. Nevertheless, anomalies like a “smart section” (as Professor Merritt calls it) may occasionally occur. Johnson’s proposal would be an excellent solution to this problem.
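To make the idea concrete, here is a deliberately simplified sketch of ability-adjusted curving. This is not Johnson's actual statistical model (which estimates student ability formally); the `adjusted_curve` function, its parameters, and the numbers are all hypothetical, meant only to show how a section full of stronger students would be curved to a higher target mean.

```python
# Simplified sketch of ability-adjusted curving. This is NOT Johnson's
# actual model (which estimates latent student ability statistically);
# the function and the numbers here are hypothetical illustrations.
def adjusted_curve(class_ability, school_ability, base_mean=3.0, weight=1.0):
    """Shift a class's target mean GPA by the gap between the class's
    average student ability (e.g., performance in other courses) and
    the school-wide average ability."""
    return base_mean + weight * (class_ability - school_ability)

# A "smart section" averaging 0.2 ability points above the school norm
# is curved to a higher target mean; a weaker section, to a lower one.
smart_section = adjusted_curve(class_ability=0.2, school_ability=0.0)
weak_section = adjusted_curve(class_ability=-0.1, school_ability=0.0)
```

Under a mandatory fixed curve, both sections would be forced to the same mean; the adjustment lets comparable work earn comparable grades regardless of which section a student lands in.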
Professor Merritt asserts that there is some sort of problem with the market for lawyers and law graduates that makes competition and inequality uniquely bad in the context of law. These assertions are implausible given the low barriers to entry for both law schools and lawyers, aggressive competition between law schools for students and between lawyers for clients, and widespread inequality outside of law school and legal practice. Some form of regulation is the norm in many areas of employment and in many industries, and a licensing regime for lawyers and an accreditation system for law schools do not in any way make these occupations and institutions unique or unusual. According to a recent study, nearly a third of U.S. workers are licensed, licensing is more common as education and skill levels increase, and licensing does not affect inequality among the licensed.
As a general matter, deregulated market competition and greater inequality are a package deal. Inequality can be reduced through regulation, taxation, and politicization of compensation through unionization or growth of public sector employment.
Professor Merritt’s critiques follow the standard playbook of law school critics—take something about law schools that is widespread and common out of context, claim that it is somehow unique to law schools when it is neither unique nor unusual, and then demonize it.
Jeremy Telman responds.
May 04, 2015
April 29, 2015
A number of critics have argued against extrapolation from Professor Merritt’s study of the Ohio legal market to the national legal market. In her response, Professor Merritt makes some good points, and also several key points with which I disagree.
First, Professor Merritt suggests that an important contribution of her study is providing up-to-date information about national legal employment through the prism of Ohio. However, there is no shortage of up-to-date data that can provide a more accurate picture of national trends than a study specifically focused on Ohio.* The primary value of Professor Merritt’s study is as an isolated snapshot of a single cohort in Ohio at a particular point in time. Without additional information, it is hard to know how much, if at all, Professor Merritt’s findings should be generalized to other legal markets or other time periods.
There is no reason to believe that the single Ohio cohort tracked by Professor Merritt will better predict outcomes for those currently enrolling in law school than a national cohort. The single Ohio cohort will likely be less predictive than a long-term national average across multiple cohorts. Indeed, as Professor Merritt acknowledges, her study is not a study of going to law school in Ohio, because of selection issues: graduates leaving Ohio for larger markets, graduates coming to Ohio from other markets, and graduates who did not pass the bar.**
Year-to-year changes in employment, earnings, and economic growth can vary widely from state to state. Absent evidence of a history of correlated economic activity, a single state should not be used as a proxy for the U.S. as a whole or for other states.
There is no reason to believe that the trajectory of Ohio’s legal market from year to year will closely track national trends, particularly when the national legal market is heavily concentrated elsewhere. Washington D.C. and the top 5 states by size of legal market*** collectively account for more than half of the national legal market.
If Professor Merritt wishes to use Ohio as a proxy for the rest of the U.S., then she should supply evidence that Ohio tracks national trends, and she should compare Ohio to Ohio at different points in time and Ohio to the U.S. at the same point in time.
Second, Professor Merritt suggests that focusing on Ohio is just as reasonable as focusing on New York or California. New York and California collectively constitute 28 percent of the national legal market.*** Ohio constitutes 2.5 percent of the national legal market. Moreover, the New York legal market is unusually large relative to the New York economy, while Ohio has a legal market that is small relative to its economy.
Third, Professor Merritt suggests that Ohio can be made nationally representative by deflating salaries elsewhere by cost of living differences. Cost of living differences are not the reason corporations—who can send legal work anywhere— pay a premium for lawyers in the major legal markets such as New York, D.C., Los Angeles, Boston and Houston. Rather, corporate clients believe that differences in quality of work justify higher billing rates for important matters. New York, D.C. and other high-paying markets are importers of top legal talent from across the country.
Differences in costs of living are not random, but rather reflect real differences in quality. Cost of living indexes often focus on quantitative rather than qualitative factors. For example, a restaurant meal in Manhattan may cost more than a restaurant meal in Buffalo, but the quality of the experience in the restaurant in Manhattan will on average be higher because the high prices restaurants in Manhattan can charge will attract the most talented restaurateurs. Similarly, there may be differences in the quality of healthcare, legal services, education, policing, parks and recreation, environmental safety, transit, housing and other factors. Money attracts talent. Some amenities or opportunities may only be available in particular locations, and people are willing to pay for proximity to consumption, employment, and social opportunities.
Many costs are not local, but rather national. These include automobiles, items ordered online, higher education at major universities, and investments (stocks, bonds, etc.). For law school graduates—who will typically be able to earn far more than they consume in a given year—it is financially better to work where both income and costs are proportionately higher because this will maximize the dollar value of savings. Law graduates can always retire to a lower-cost location later in life if they wish.
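The savings arithmetic behind that point can be made concrete. The figures below are hypothetical, chosen only to illustrate the logic: when income and local costs both double, the absolute dollars left over (which buy nationally priced goods) double too.

```python
# Hypothetical figures: a high-cost market paying proportionately more
# than a low-cost one. Savings buy nationally priced goods (cars,
# online purchases, securities), so absolute dollars are what matter.
def annual_savings(income, local_costs):
    return income - local_costs

big_market = annual_savings(income=180_000, local_costs=120_000)
small_market = annual_savings(income=90_000, local_costs=60_000)

# Local costs are twice as high in the big market, but so is income,
# so the big-market graduate banks twice as many dollars each year
# toward savings and an eventual lower-cost retirement.
```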
One quantitative measure for differences in quality of life is differences in life expectancy.**** High cost, high income, high infrastructure states like New York, Connecticut, and Massachusetts generally rank well on this measure, while lower cost, lower income states rank less well. This pattern can also be seen internationally and individually—higher income and higher life expectancy are correlated.*****
There will indeed be some lucky individuals who find low-cost locales both more attractive and less expensive, and some unlucky individuals who find high cost locales unworthy of the price. Costs of living reflect the aggregation by the market of many individual preferences, not any particular person’s idiosyncratic views. Nevertheless, local prices can contain important information about quality of life that we should not assume away.
* There are numerous sources of up-to-date (2013 or even 2014) national information, including data from:
- the U.S. Department of Labor Bureau of Labor Statistics (BLS),
- the U.S. Census Bureau's
  - American Community Survey (ACS),
  - Current Population Survey (CPS), and
  - Survey of Income and Program Participation (SIPP),
- the National Association for Law Placement (NALP), and
- the American Bar Association (ABA).
NALP and ABA data are for the most recent graduating class shortly after graduation. SIPP earnings data include earnings as recently as 2013, but only through the class of 2008. ACS and CPS include young lawyers and young professional degree holders, but cannot specifically identify young law degree holders. The Department of Education also has information on student loan default rates for recent cohorts. Default rates remain much lower for former law students than for most other borrowers.
Another valuable source of information is After the JD III. Professor Merritt notes that response rates for higher income individuals may be higher in After the JD, but the After the JD researchers, like the U.S. Census, weight their sample to take into account differences in response rates.
**The selection bias issues may be more severe than Merritt has acknowledged. Looking at Ohio State’s 509 report for 2011, there were 24 students who took the NY bar vs. 136 who took the Ohio bar—a substantial percentage of the class taking a bar in a non-adjacent state. The New York bar takers had much higher bar passage rates (11% above the state average for N.Y. vs. 1.3% above the state average for Ohio), which is consistent with positive selection out of state. In any given year, roughly 25 to 50 percent of Ohio State law school graduates who are employed 9 or 10 months after graduation are employed outside of Ohio. For Case Western graduates, employment seems to be even less Ohio-centered than for Ohio State graduates.
*** Size of the legal market calculated using ACS data, multiplying number of lawyers by average total personal income per lawyer to get aggregate pay to all lawyers. In other words, the measure is a dollar count, not a body count.
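The footnote's dollar-count measure is easy to state precisely. The state figures below are made up for illustration; only the method (number of lawyers times average total personal income per lawyer) comes from the footnote.

```python
# Dollar-count measure of a state's legal market, per the footnote:
# number of lawyers times average total personal income per lawyer.
# The inputs below are illustrative placeholders, not actual ACS values.
def legal_market_size(num_lawyers, avg_income_per_lawyer):
    return num_lawyers * avg_income_per_lawyer

state_a = legal_market_size(num_lawyers=170_000, avg_income_per_lawyer=190_000)
state_b = legal_market_size(num_lawyers=35_000, avg_income_per_lawyer=110_000)

# A high-paying state's share of the total is a dollar share, not a
# headcount share, so big, high-billing markets weigh more heavily.
share_a = state_a / (state_a + state_b)
```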
**** It is probably preferable to consider life expectancy within race (life expectancy varies by race, and racial demographics vary by geography).
***** After controlling for GDP per capita, societies with less income dispersion tend to have higher life expectancy. Another issue is selection effects vs. causation. For example, those with higher life expectancy to begin with may choose to pursue additional education and therefore have the opportunity to live in high cost, high income states.
March 26, 2015
Law schools and prospective law students may be paying more attention to employment outcomes shortly after graduation than this short-term data deserves. One potential use of the aggregate data about entry level employment and salaries is to assess whether now is a good or bad time to apply to law school. But fluctuations in employment outcomes for recent graduates do not predict fluctuations in employment outcomes 3 or 4 years in the future when those currently deciding whether to enroll would graduate.
Nevertheless, law students and the press pay close attention to the short-term outcome data. Starting salary data from the National Association for Law Placement (NALP) is covered by the press and is a good predictor of the number of law school applicants two years later (we assume a one-year lag for data collection and dissemination, and a one-year lag to apply to law school).
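The two-year lag described above can be sketched directly. All the numbers below are invented for illustration (they are not actual NALP or applicant figures); the point is only the alignment: applicants in year t are paired with salary data from year t - 2, and a plain Pearson correlation on the lagged pairs measures how well one series tracks the other.

```python
# Hypothetical salary and applicant series (NOT actual NALP/LSAC data),
# illustrating the two-year lag: applicants in year t respond to
# starting-salary data from year t - 2.
salaries = {2007: 65, 2008: 72, 2009: 72, 2010: 63, 2011: 60}    # $000s
applicants = {2009: 86, 2010: 88, 2011: 79, 2012: 68, 2013: 59}  # 000s

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Align applicants in year t with salaries from year t - 2.
years = sorted(applicants)
lagged_salaries = [salaries[y - 2] for y in years]
applicant_counts = [applicants[y] for y in years]
r = pearson(lagged_salaries, applicant_counts)  # positive in this toy data
```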
Why are students responding to this data even though it does not predict their own short-term outcomes? And does the responsiveness of enrollment to short-term outcomes mean that law students care only about the short term?
Law students likely think more long term. If law students were so impatient that they only cared about one or a few years of earnings, it is doubtful that law students would have completed college, since college also makes sense only as a long-term investment. Indeed, students who were so focused on the short term might not even have finished high school. While temporal preferences can change over time, education appears to shift people toward thinking more long term. Aging from adolescence through the age of 30 is also associated with becoming more oriented toward the future.
Perhaps students are focused on the short term because they mistakenly believe that swings in short term outcomes predict more than they do. Students would not be alone in this error.
Some widely read back-of-the-envelope analyses started with initial salaries, assumed unrealistically low earnings growth along with high discount rates or an arbitrary payback period (lack of concern for the future), and reached the erroneous conclusion that going to law school does not make sense financially. (For a discussion see here; for examples of erroneous studies, see here and here.)
Students may be focused on the short term because they mistakenly believe it predicts more than it does. Or they may focus on the short term because it is the only information that is readily available to them.
Legal educators and the press can and should make greater efforts to inform students of the long term as opposed to the short-term consequences of legal education. We should also shift the discussion away from raw outcomes and toward estimates of causation and value-added relative to the next best option.
This will be a challenge. Short-term raw outcome data is embedded in American Bar Association-required disclosures, in NALP’s data collection efforts and in the U.S. News rankings. Thinking in value-added terms requires us all to understand basic principles of causal inference and labor economics. But shifting toward long-term value added is ultimately the right thing to do if we are serious about providing students with meaningful disclosure and facilitating informed decision making.
This is not meant to justify indifference to the plight of young people who have suffered the misfortune of graduating into an unfavorable economic climate over the last several years. To help alleviate youth unemployment, we must understand that the cause of this misfortune is the macro-economy, not higher education. Education is an important part of the solution. Among those who are young and inexperienced, those with more education continue to do better in the labor market than those with less, and this difference appears to be largely caused by the differences in level of education.
Insurance programs like income-based repayment of student loans and flexible and extended repayment plans can help young people manage the unpredictable and uncontrollable risk that they might happen to graduate into a bad economy. If this insurance leads to more people pursuing higher education, earning higher incomes, and paying more taxes, it will benefit not only students and educators, but also the federal government and the broader economy.
March 16, 2015
Last week, a website, USNews.com, released its annual rankings of law and other professional schools. (Since I was off-line last week, my comments had to wait.) Commentary, predictably, focused on the overall rank assigned by the website--what is known to "insiders" as the "nonsense number," since it is the upshot of an inexplicable weighting of 12 different factors, many self-reported and so of dubious accuracy anyway, with the amalgamation basically stipulative. Fortunately for USNews.com, many superficial journalists report the nonsense number, and changes in the nonsense number, as though they meant something.
So, for example, much ink was spilled on the "fact" that the University of Michigan Law School had a nonsense number of 11th, just outside "the top ten" where it usually resides. I did not see any journalist note, however, that just one raw score point (83 vs. 84) separated Michigan from Duke, Virginia, and Berkeley, all with a nonsense number of 8th. In other words, even by its own terms, the USNews.com demotion of Michigan to 11th was meaningless. (For those paying attention, Yale, because of its off-the-charts per capita expenditures, got a raw score of 100, Harvard and Stanford got 96, Columbia and Chicago 93, NYU 89, Penn 88, and then Duke et al. with 84.)
The most interest was in the nonsense number for UC Irvine, ranked for the first time this year. UCI came in at #30, the highest nonsense number debut I've ever seen in USNews.com. In USNews.com land, this puts UCI third in the UC system--behind Berkeley and UCLA, and ahead of UC Davis (31st) and UC Hastings (59th). (Hastings has probably been the most dramatic victim, over many years, of the small, private school bias in the USNews.com rankings.) Interestingly, UCI got this result despite weaker reputational scores: 29th in reputation among lawyers/judges (a rather good result, though, for a new school), and only 42nd among academics. Almost every school with a nonsense number around UCI had a higher academic reputation score, and my guess is UCI's will now improve accordingly. (The evidence for the echo chamber effect of the "overall rank" on the reputation scores in subsequent years is even greater now than in the past.) My guess is all those annoyed by the UC system starting a new law school penalized UCI in the reputational survey--if USNews.com published the median and mode, we'd have some idea, but I wouldn't be surprised if the distribution was skewed in that way. Counting against UCI is that it is still very small, and will presumably have to grow, which may affect other metrics.
USNews.com reported some curious data in various categories. I note two examples. Columbia, for the first time, reported the best student-faculty ratio in the nation: 6.3 to 1. Yale was 7.6 to 1, Stanford 7.3 to 1. Virginia reported 97.3% employed at graduation, but only 97% nine months out. This might be an artifact of USNews.com for the first time not giving schools full credit for graduates in law-school-funded positions--though many of these positions are quite legitimate.
The USNews.com reign of terror has now been going on for 25 years. It has been a disaster for legal education, though a boon for students with the favored numbers. In the 1990s, I used to try to reason with the USNews folks, and they in fact corrected some of their worst mistakes--for example, using starting salary data without taking into account regional differences; doing reputation surveys based on "quartiles" (meaning the dumbest evaluator--the one who forgot to put Harvard in the top quartile--determined the score); and failing to adjust expenditures for cost-of-living differences. But with regard to the basic problems--namely, that the weightings of the inputs are arbitrary, and that a lot of the data relied upon is bogus--they've done nothing. The only remedy for the USNews.com reign of terror will be competing systems--though hopefully not ones that simply replicate the USNews.com mistakes, which is mostly what we have had so far.