February 09, 2016
The latest unscientific fad among law school watchers is comparing job openings projections for lawyers from the Bureau of Labor Statistics* with the number of students expected to graduate from law school. Frank McIntyre and I tested this method of predicting earnings premiums--the financial benefits of a law degree--using all of the available historical projections from the BLS going back decades. This method of prediction does not perform any better than random chance.** Labor economists--including those working at the BLS--have explicitly stated that BLS projections should not be used to try to value particular courses of study. Instead, higher education should be valued based on earnings premiums.
Bloggers who report changes in BLS projections and compare projected job openings to the number of students entering law school might as well advise prospective law students to make important life decisions by flipping a coin.
Many law graduates won't practice law. Many engineering graduates won't become engineers. Many students in every field end up working jobs that are not directly related to what they studied. They still typically benefit financially from their degrees by using them in other occupations where additional education boosts earnings and likelihood of employment.
And if one's goal really is to practice law, even when practicing law is no more lucrative than the other opportunities a law degree opens, then studying law is no guarantee of a legal career, but it still dramatically improves the odds.
* BLS job opening projections--which are essentially worthless as predictors for higher education--should not be confused with BLS occupational employment statistics, which provide useful data about earnings and employment in many occupations, including for lawyers.
** There isn’t even strong evidence that changes in the ratio between BLS projected lawyer job openings and law class size predict changes in the percentage of law graduates who will practice law, although the estimates are too noisy to be definitive. Historically, the ratio of BLS projected openings to law graduates (or first-year enrollments three years prior) has systematically under-predicted, by a wide margin, the proportion of law graduates practicing law shortly after graduation, although it is clear that a large minority of law graduates do not practice law.
February 9, 2016 in Guest Blogger: Michael Simkovic, Law in Cyberspace, Legal Profession, Ludicrous Hyperbole Watch, Of Academic Interest, Professional Advice, Science, Student Advice, Web/Tech, Weblogs | Permalink
February 02, 2016
Smaller or Larger Law Class Sizes Don’t Predict Changes in Financial Benefits of Law School (Michael Simkovic)
One of the most surprising and controversial findings from Timing Law School was that changes in law school graduating class size do not predict changes in the boost to earnings from a law degree.* Many law professors, administrators, and critics believe that shrinking the supply of law graduates must surely improve their outcomes, because if supply goes down, then price—that is, earnings of law graduates—should go up.
In a new version of Timing Law School, Frank McIntyre and I explore our counterintuitive results more thoroughly. (The new analysis and discussion appear primarily in Part III.C, “Interpreting zero correlation for cohort size and earnings premium,” on pages 18-22 of the Feb. 1, 2016 draft, and in Table 10 on the final page.)
Our finding of no relationship between class size and earnings premiums was robust to many alternative definitions of cohort size that incorporated changes in the number of law graduates over several years. This robustness raises the question of whether our results are merely predictive or should be given a causal interpretation.
We considered several interpretations that could reconcile our results with a supply and demand model and with the data. The most plausible interpretation seemed to be that when law class sizes change, law graduates switch between practicing law and other employment opportunities that are equally financially rewarding. While changes in the number of law graduates might have an impact on the market for entry-level lawyers, such changes are much less likely to have a discernible impact on the much larger market for highly educated labor.
Although law graduates who practice law on average earn more than those who do not, at the margin, those who switch between practicing law and other options seem to have law and non-law opportunities that are similarly lucrative. We found that the proportion of recent law graduates who practice law does decline as class size increases, but earnings premiums remain stable. This is consistent with the broader literature on underemployment, and supports the view of law school as a flexible degree that provides benefits that extend to many occupations. (See here and here).
A related explanation is that relatively recent law school graduates may be reasonably good substitutes for each other for several years, blunting the impact of changes in class size on earnings.
Interpretations that depend on law students and law schools perfectly adjusting class size in anticipation of demand for law graduates or future unemployment seem implausible given the unrealistic degree of foresight this would require. Offsetting changes in the quality of students entering law school—an explanation we proposed in an earlier version of the paper—seems able to explain at most a very small supply effect. Although credentials of entering classes appear to decline with class sizes, these changes in credentials are relatively small even amid dramatic changes in class size, and probably do not predict very large changes in earnings. Imprecision in our estimates is another possibility, although for graduates with more than a few years of experience, our estimates are fairly precise.
Even if there are effects of law class size on law earnings premiums, they are probably not very large and not very long lasting.
* The finding is consistent with mixed results for cohort size in other econometric studies of cohort effects, but nevertheless was contrary to many readers’ intuitions.
December 14, 2015
In the era of Google Maps, instant language translation, and digital music libraries, law students still spend countless hours flipping pages to find the right subclause or definition in a statute. This process can and should be automated.
Computerized calculations have liberated STEM students from tedious, repetitive tasks so that they can focus on the more intellectually stimulating and creative aspects of math, science, and engineering. Word processing software has freed us all from applying whiteout and waiting for it to dry and from manually retyping manuscripts to correct a few errors. This has enabled us to focus on our ideas and not the mechanics of fixing them permanently on paper.
Law is an inherently conservative field, focused on precedent, tradition, and risk avoidance. But when the case for change is compelling, we are prepared to try new tools.
I’ve been thinking about the problems of statutory interpretation for years and how automation could streamline the process. I’m very excited to announce a new electronic statutory supplement, LawEdge. (Full disclosure: I helped develop it).
LawEdge aims to do for statutory interpretation what the calculator did for mathematics.
The U.S. Code includes thousands of defined terms. To understand a provision, a reader must know the meaning of each defined term that appears in it.
Unfortunately, defined terms are not always clearly labeled. Even when they are, understanding one provision may require flipping back and forth to several other locations in the code. This process can be slow and cumbersome with paper statutes. Even electronic statutes often will not take users to the precise location in the code where a definition appears, but will instead take readers to the section containing the definition, forcing them to search within it.
LawEdge makes working with defined terms simple. Defined terms are clearly labeled; clicking on one generates a popup window showing its meaning, and each defined term is also hyperlinked to the location of its definition.
Definitions are context-specific and do not apply to all sections of the code. For example, the definition of “property” in Section 317(a) of the Internal Revenue Code does not apply to Section 351 of the same title. LawEdge recognizes context and links definitions appropriately.
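One way such context-sensitive lookup could be modeled is sketched below. The data model and function names here are hypothetical illustrations, not LawEdge's actual implementation: each definition is stored with the range of sections it governs, and the narrowest applicable scope wins.

```python
# Hypothetical data model: (term, governing section range) -> definition text.
# The section numbers loosely echo the Internal Revenue Code example above.
definitions = {
    ("property", range(301, 318)): "narrow definition from s. 317(a)",
    ("property", range(1, 9000)): "general definition",
}

def lookup(term, section):
    """Return the definition of `term` whose scope covers `section`,
    preferring the narrowest applicable scope; None if no definition applies."""
    candidates = [
        (scope, text)
        for (t, scope), text in definitions.items()
        if t == term and section in scope
    ]
    if not candidates:
        return None
    # The narrowest scope wins, mirroring how a specific statutory
    # definition overrides a general one.
    _, text = min(candidates, key=lambda c: len(c[0]))
    return text
```

Under this model, looking up "property" in section 310 returns the narrow section 317(a) definition, while the same term in section 351 falls back to the general definition.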
LawEdge is easy to navigate. For example, suppose that you wish to read § 21(b)(2)(B). With a paper statutory supplement, you could flip to section 21, then look for subsection (b), then read down to paragraph (2) and finally find subparagraph (B). The entire process might take 30 seconds, and along the way you might accidentally look at the wrong provision. With LawEdge, this process is nearly instantaneous and error-free. You would simply type s21b2b in the search bar. This feature works all the way down to the subclause level.
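A query like s21b2b can be decomposed with a small parser. The sketch below assumes queries follow the pattern in the example (a section number, then alternating subdivision letters and numbers); it is an illustration, not LawEdge's actual code:

```python
import re

def parse_cite(query):
    """Parse a compact query like 's21b2b' into its components:
    section 21, subsection (b), paragraph (2), subparagraph (B)."""
    m = re.fullmatch(
        r"s(?P<section>\d+)"
        r"(?P<subsection>[a-z])?"
        r"(?P<paragraph>\d+)?"
        r"(?P<subparagraph>[a-z])?",
        query.lower(),
    )
    if not m:
        raise ValueError(f"unrecognized citation query: {query!r}")
    # Keep only the levels actually present in the query.
    parts = {k: v for k, v in m.groupdict().items() if v is not None}
    if "subparagraph" in parts:
        # Subparagraphs are conventionally capitalized: (B), not (b).
        parts["subparagraph"] = parts["subparagraph"].upper()
    return parts
```

For instance, parse_cite("s21b2b") yields section 21, subsection b, paragraph 2, subparagraph B, while parse_cite("s21") yields just the section.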
Browsing a statute is also easier and more intuitive. Structural components are color coded to be more recognizable.
Because the underlying technology is algorithmic, the supplement is easy to update and support as the U.S. Code changes.
LawEdge has all of the benefits of paper—notes, highlights, bookmarks, offline access—and many advantages only available electronically. It can be used on exams with the latest version of ExamSoft, which offers an option to block internet access but not the hard drive.
If you’re interested in trying it for your class, please feel free to contact me for an evaluation copy.
November 24, 2015
This Thanksgiving, I'm thankful for the Financial Times.
While some leading business and financial newspapers have adulterated their coverage to appeal to a mass audience or reduce costs, the Financial Times continues to produce high quality, fact-based reporting about serious business, financial, and economic issues. The FT's target audience continues to be legal and financial professionals who are prepared to pay a premium for reliable information. The FT includes a minimum of hyperbole and fluff. It also offers a more global perspective than most U.S. papers, while still providing strong coverage of important U.S. issues.
For the last 5 years, I've routinely recommended the FT to students in my business law classes, who are generally more familiar with U.S. papers. The FT is available on Lexis (with a few days delay), but is well worth the cost of a subscription.
If you're not a regular reader of the FT, but have been following U.S. newspapers' higher education coverage, you can get a sense of the differences between the FT and U.S. newspapers' approach across subject areas by reading this article about fees at public UK universities exceeding those at U.S. universities. The article is entirely focused on costs and benefits of education and how those costs and benefits are distributed between students, government, and employers. There are no unrepresentative anecdotes, no histrionics, only summaries of data. When advocacy groups are cited, their interests are noted. This is what journalism can and should be.
Pearson recently sold the FT to Nikkei. Hopefully the new owners maintain the FT's high quality.
September 01, 2015
Student loan borrowers with the highest debt levels have the lowest default rates (Michael Simkovic)
August 10, 2015
In a recent column, the New York Times’ Nicholas Kristof confessed, “One of our worst traits in journalism is that when we have a narrative in our minds, we often plug in anecdotes that confirm it.” The quote is timely, given recent controversy surrounding New York Times’ coverage.
Newspapers tend to emphasize anecdotes over data. This gives journalists, editors, and their sources tremendous freedom to frame a story. A few individuals can serve as ostensible examples of a broader phenomenon. But if those examples are unrepresentative or taken out of context, the news story can be misleading by omission and emphasis. If you get your information from the newspaper, you might worry more about stabbings and shootings than diet and exercise, but you are roughly 38 times more likely to die from heart disease than from violent crime.
Similar qualitative problems—sensationalism, reliance on extreme and unrepresentative anecdotes, lack of context, and omission of relevant data and peer reviewed research—characterized press coverage of law schools and the legal profession. (See New York Times; The Wall Street Journal; New York Times again)
Newspapers conflated a generally weak labor market—in which law graduates continued to have substantial earnings and employment advantages over similar bachelor's degree holders (see The Economic Value of a Law Degree; Timing Law School; Compared to What? (here and here); Recent Entry Level Outcomes and Growth in Lawyer Employment and Earnings)—with a law-specific problem. They criticized law schools—and only law schools—for practices that are widespread in higher education and in government. (see competitive scholarships; school-funded jobs, measuring employment / unemployment) And they uncritically reported research, no matter how flawed, that fit the anti-law school narrative. (see Failing Law Schools' problems with data and citations; a free education as a hypothetical alternative to student loans; and other inapposite comparisons (here, here and here)).
Newspapers' sensationalist law school coverage may have helped their circulation—negative coverage attracts eyeballs—but it misled students in harmful ways. Recent research suggests that each year of delaying law school—for example, to wait until unemployment declines—is counterproductive. Even taking into account the potential benefits of graduating into a better economy, these delaying strategies typically cost the prospective law student more than $30,000 per year because of the high opportunity cost of lower earnings with a bachelor's degree instead of a law degree. The longer the delay, the higher the cost.
So which newspapers and journalists provided the most negative coverage? And how has the news slant evolved over time? For an explanation of methodology, see the footnote at the bottom.*
The most negative newspapers were the Wall Street Journal, the Chicago Tribune, and the New York Times, in that order. The Wall Street Journal was exceptionally negative—more than 7 times as negative as the average newspaper. A few newspapers, such as the Orange County Register, were net positive.
The years 2011 through 2013 were exceptionally negative, with a dramatic reduction in negativity in 2014.
Did negative press coverage cause a decline in law school applications, independent of the events being covered? Differences in press coverage can move financial markets, according to research exploiting variation in local coverage of identical national events and local trading patterns, so perhaps press coverage can also affect other markets. (The leaders of Law School Transparency apparently believe that negative press coverage can reduce law school applications. One of them explained his efforts to pitch negative news stories in specific parts of the country where he thought law school enrollments were too high.) (PDF here)**
The New York Times and WSJ both went negative early, but the Wall Street Journal remained more negative for a much longer period of time. Most of the uncredited (no byline) stories in the NY Times and WSJ about law school were negative.
The WSJ had an unusually deep bench of anti-law school journalists. By contrast, most newspapers had a few very negative journalists and otherwise a fairly even mix of slightly negative and slightly positive journalists. The most anti-law school journalist was Ameet Sachdev of the Chicago Tribune, whose coverage was about twice as negative as either David Segal of the New York Times or Jennifer Smith of the Wall Street Journal.
Geographically, the hardest hit areas were New York, Illinois (Chicago), and Washington D.C. (This is counting the New York Times and Wall Street Journal as New York papers). Ohio was the only state that saw net positive coverage.
The pattern of coverage does not seem to have much relationship to the strength of the local legal employment market, but rather seems to turn more heavily on idiosyncratic editorial policies at particular newspapers that happen to be headquartered in certain states.
* I asked my research assistant (a third-year law student) to gather articles about legal education and the legal profession from the top 25 U.S. newspapers by circulation for which data were available from Proquest back to at least 2010. My RA then rated each article as "positive," "negative," or "neutral," depending on whether the article would have made him more or less likely to attend law school had he read it while deciding. For each newspaper or journalist, the number of positive articles was subtracted from the number of negative articles to arrive at a net-negative count, and newspapers were ranked on this metric. There are some obvious limitations to this approach: it doesn't measure how positive or negative each article is; it assumes that one positive article can balance out one negative article (negative articles probably have a bigger impact than positive ones); and it relies on the opinion of a single third-year law student. It also lacks context—perhaps newspaper coverage about all topics is generally negative, or perhaps coverage of all higher education was negative during this period. Nevertheless, this approach may provide some useful insights. All editions of the Wall Street Journal and New York Times tracked by Proquest are combined, but identical articles published in different editions are counted only once. The WSJ blog is included as part of the WSJ.
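The tallying described in this footnote reduces to a few lines of code. The sketch below uses made-up example ratings purely for illustration, not the actual dataset:

```python
from collections import Counter

# Hypothetical ratings (illustrative only): one (newspaper, rating)
# entry per article, as rated by the reader.
ratings = [
    ("Wall Street Journal", "negative"),
    ("Wall Street Journal", "negative"),
    ("Wall Street Journal", "positive"),
    ("Orange County Register", "positive"),
    ("Orange County Register", "neutral"),
]

def net_negative_scores(ratings):
    """Tally (negative articles - positive articles) per newspaper.
    Neutral articles leave the score unchanged; higher = more negative."""
    scores = Counter()
    for paper, rating in ratings:
        if rating == "negative":
            scores[paper] += 1
        elif rating == "positive":
            scores[paper] -= 1
    return dict(scores)

# Rank newspapers most-negative first on the net-negative count.
ranking = sorted(net_negative_scores(ratings).items(), key=lambda kv: -kv[1])
```

On this toy data, the Wall Street Journal scores +1 (two negative articles minus one positive) and tops the ranking, while the Orange County Register is net positive at -1.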
** Contrary to popular belief, there is little evidence that larger law school graduating class sizes predict worse outcomes for law school graduates, nor is there evidence that smaller graduating class sizes predict better outcomes. See (Timing Law School and a summary). In a recent robustness check considering many alternative definitions of cohort size (but not yet reported in the draft paper), McIntyre and Simkovic continued to find no evidence that smaller graduating cohorts predict higher earnings premiums for recent graduates.