February 09, 2016
The latest unscientific fad among law school watchers is comparing job openings projections for lawyers from the Bureau of Labor Statistics* with the number of students expected to graduate from law school. Frank McIntyre and I tested this method of predicting earnings premiums--the financial benefits of a law degree--using all of the available historical projections from the BLS going back decades. This method of prediction does not perform any better than random chance.** Labor economists--including those working at the BLS--have explicitly stated that BLS projections should not be used to try to value particular courses of study. Instead, higher education should be valued based on earnings premiums.
Bloggers who report changes in BLS projections and compare projected job openings to the number of students entering law school might as well advise prospective law students to make important life decisions by flipping a coin.
Many law graduates won't practice law. Many engineering graduates won't become engineers. Many students in every field end up working jobs that are not directly related to what they studied. They still typically benefit financially from their degrees by using them in other occupations where additional education boosts earnings and likelihood of employment.
And if one's goal really is to practice law, even if practicing law is no more lucrative than other opportunities opened by a law degree, then a law degree is not a guarantee, but it still dramatically improves the odds.
* BLS job opening projections--which are essentially worthless as predictors for higher education--should not be confused with BLS occupational employment statistics, which provide useful data about earnings and employment in many occupations, including for lawyers.
** There isn’t even strong evidence that changes in the ratio between BLS projected lawyer job openings and law class size predict changes in the percent of law graduates who will practice law, although the estimates are too noisy to be definitive. Historically, the ratio of BLS projected openings to law graduates (or first year enrollments 3 years prior) has systematically under-predicted by a wide margin the proportion of law graduates practicing law shortly after graduation, although it is clear that a large minority of law graduates do not practice law.
February 9, 2016 in Guest Blogger: Michael Simkovic, Law in Cyberspace, Legal Profession, Ludicrous Hyperbole Watch, Of Academic Interest, Professional Advice, Science, Student Advice, Web/Tech, Weblogs | Permalink
February 02, 2016
Smaller or Larger Law Class Sizes Don’t Predict Changes in Financial Benefits of Law School (Michael Simkovic)
One of the most surprising and controversial findings from Timing Law School was that changes in law school graduating class size do not predict changes in the boost to earnings from a law degree.* Many law professors, administrators, and critics believe that shrinking the supply of law graduates must surely improve their outcomes, because if supply goes down, then price—that is, earnings of law graduates—should go up.
In a new version of Timing Law School, Frank McIntyre and I explore our counterintuitive results more thoroughly. (The new analysis and discussion appear primarily in Part III.C, "Interpreting zero correlation for cohort size and earnings premium," on pages 18-22 of the Feb. 1, 2016 draft and in Table 10 on the final page).
Our results of no relationship between class size and earnings premiums were robust to many alternative definitions of cohort size that incorporated changes in the number of law graduates over several years. This raises questions about whether our findings are merely predictive, or should be given a causal interpretation.
We considered several interpretations that could reconcile our results with a supply and demand model and with the data. The most plausible interpretation seemed to be that when law class sizes change, law graduates switch between practicing law and other employment opportunities that are equally financially rewarding. While changes in the number of law graduates might have an impact on the market for entry-level lawyers, such changes are much less likely to have a discernible impact on the much larger market for highly educated labor.
Although law graduates who practice law on average earn more than those who do not, at the margin, those who switch between practicing law and other options seem to have law and non-law opportunities that are similarly lucrative. We found that the proportion of recent law graduates who practice law does decline as class size increases, but earnings premiums remain stable. This is consistent with the broader literature on underemployment, and supports the view of law school as a flexible degree that provides benefits that extend to many occupations. (See here and here).
A related explanation is that relatively recent law school graduates may be reasonably good substitutes for each other for several years, blunting the impact of changes in class size on earnings.
Interpretations that depend on law students and law schools perfectly adjusting class size in anticipation of demand for law graduates or future unemployment seem implausible given the unrealistic degree of foresight this would require. Offsetting changes in the quality of students entering law school—an explanation we proposed in an earlier version of the paper—seems able to explain at most a very small supply effect. Although credentials of entering classes appear to decline with class sizes, these changes in credentials are relatively small even amid dramatic changes in class size, and probably do not predict very large changes in earnings. Imprecision in our estimates is another possibility, although for graduates with more than a few years of experience, our estimates are fairly precise.
Even if there are effects of law class size on law earnings premiums, they are probably not very large and not very long lasting.
* The finding is consistent with mixed results for cohort size in other econometric studies of cohort effects, but nevertheless was contrary to many readers’ intuitions.
December 02, 2015
Developer of Law School Admission Test (LSAT) Disputes Advocacy Group's Bar Exam Claims (Michael Simkovic)
The Law School Admission Council (LSAC)--the non-profit organization which develops and administers the Law School Admission Test (LSAT)--recently issued a press release disputing claims by the advocacy group "Law School Transparency" about the relationship between LSAT scores and bar passage rates. "Law School Transparency," headed by Kyle McEntee, prominently cited the LSAC National Longitudinal Bar Passage Study (1998) as a key source for its claims that, based largely on their LSAT scores, many law schools are admitting students who are unlikely to pass the bar exam. The group's claims of bar passage risk were widely disseminated by the national press.
However, according to LSAC, the Longitudinal Bar Passage Study does not provide much support for "Law School Transparency's" claims. Moreover, "Law School Transparency's" focus on first time bar passage rates is potentially misleading:
"The LSAC [National Longitudinal Bar Passage] study did state that 'from the perspective of entry to the profession, the eventual pass rate is a far more important outcome than first-time pass rate.' This statement is as true today as it was 25 years ago. As noted by LST, the LSAC study participants who scored below the (then) average LSAT score had an eventual bar passage rate of over 90 percent."
Kyle McEntee and David Frakt responded to some of LSAC's critiques--partly on substance by pointing out disclaimers in the full version of "Law School Transparency's" claims, partly by smearing the technical experts at LSAC as shills for law school--but notably did not explain why "Law School Transparency" chose to focus on first time bar passage rates rather than seemingly more important--and much higher--eventual bar passage rates.
Eventual bar passage rates were the focus of the National Longitudinal Bar Passage Study. The LSAC study's executive summary highlights eventual bar passage rates, and detailed data are presented on pages 32 and 33. Even among graduates of the lowest "cluster" of law schools, around 80 percent eventually passed the bar exam.
According to LSAC:
"The LSAC National Longitudinal Bar Passage Study was undertaken primarily in response to rumors and anecdotal reports suggesting that bar passage rates were so low among examinees of color that potential applicants were questioning the wisdom of investing the time and resources necessary to obtain a legal education."
"Law School Transparency" has revived similar concerns, but without a specific focus on racial minorities.*
There may be legitimate concerns about long term eventual bar passage rates for some law students. However, "Law School Transparency's" back-of-the-envelope effort, focused on short term outcomes, does not provide much insight into long-term questions. The most rigorous study of this issue to date--the LSAC National Longitudinal Bar Passage Study--concluded that "A demographic profile that could distinguish first-time passing examinees from eventual-passing or never-passing examinees did not emerge from these data. . . . Although students of color entered law school with academic credentials, as measured by UGPA and LSAT scores, that were significantly lower than those of white students, their eventual bar passage rates justified admission practices that look beyond those measures."
Unfortunately, some newspapers reported "Law School Transparency's" bar passage risk claims in ways that suggested the claims were blessed by LSAC, or even originated from LSAC. For example, one prominent newspaper's editorial board wrote that "In 2013, the median LSAT score of students admitted to [one law school] was in the bottom quarter of all test-takers nationwide. According to the test’s administrators, students with scores this low are unlikely to ever pass the bar exam."
October 28, 2015
N.Y. Times is Mistaken: Law Student Loans are Safe and Profitable for the Government (Michael Simkovic)
This weekend, The New York Times Editorial Board published a sensationalist lead editorial, “The Law School Debt Crisis,” claiming that law student borrowing is harmful to taxpayers. The New York Times is mistaken.
The Times cited Florida Coastal School of Law, a for-profit institution, as its prime example of law schools “vacuuming up hordes of young people, charging them outrageously high tuition and, after many of the students fail to become lawyers, sticking taxpayers with the tab for their loan defaults.” Florida Coastal seems like an easy target—even a Federal Court which dismissed a fraud suit against Florida Coastal described it as having “some of the lowest admissions standards of accredited or provisionally accredited law schools in the nation.” The Times has repeatedly criticized for-profit colleges, which it deems “predatory” based on their unusually high student loan default rates. (See opinion, upshot, news and news again).
If the Editorial Board's accusations were true—if the “majority of law schools” really were running “a scam” in which they load down their students with “crushing amounts of debt” which “they can’t repay”—Florida Coastal and other law schools should have among the highest default rates of any institutions of higher education in the country.
They don’t and they aren’t.
For the cohort entering repayment in 2012—the most recent year of data available*—the national 3-year cohort default rate on federal student loans was 11.8 percent. The comparable figure for Florida Coastal was only 1.1 percent, less than one-tenth the national rate.
Other measures tracked by the Department of Education, like repayment rates, also show law school borrowers performing as well or better than most.
We see the same pattern across law schools and throughout the decades for which data are available.** Even low-ranked law schools with allegedly “outrageously high” tuition generally have much lower student loan default rates than either the national average or the average for institutions that grant bachelor’s or advanced degrees.
Law students not only have higher debts than most student loan borrowers; as professional students, they also pay higher interest rates on government loans than undergraduates.
Law students rarely default because the financial benefits they receive from attending law school are usually far greater than the costs.*** Law school typically boosts annual earnings by around $30,000 (median) to $60,000 per year (mean) compared to a bachelor’s degree.**** Even at the 25th percentile, toward the low end of the distribution, the annual boost to earnings is around $20,000 per year—more than enough to repay typical law school loans over the course of a career.
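The repayment arithmetic behind this claim can be sketched with a standard loan-amortization formula. The debt level, interest rate, and repayment horizon below are illustrative assumptions for the sketch, not figures from the post or the underlying studies:

```python
# Sketch (not the authors' model): compare a hypothetical annual loan payment
# against the 25th-percentile annual earnings boost cited in the post.

def annual_loan_payment(principal, rate, years):
    """Level annual payment that amortizes `principal` at `rate` over `years`."""
    return principal * rate / (1 - (1 + rate) ** -years)

debt = 120_000        # assumed law school debt (hypothetical)
rate = 0.07           # assumed federal graduate loan interest rate (hypothetical)
horizon = 25          # repayment spread over a long stretch of a career

payment = annual_loan_payment(debt, rate, horizon)
premium_25th = 20_000  # 25th-percentile annual earnings boost from the post

print(f"annual payment:   ${payment:,.0f}")   # roughly $10,300 per year
print(f"25th-pct premium: ${premium_25th:,}")
```

Even under these deliberately heavy assumptions, the low-end earnings boost comfortably exceeds the annual payment, which is the post's point about why defaults are rare.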
Taxpayers also benefit. For every extra dollar a law graduate earns, the federal government receives an extra 30 to 40 cents in payroll and income taxes. The federal government charges far more in taxes than most law schools charge in tuition.
But the government isn’t paying for most law graduates’ education. In fact, loans to law students are among the most profitable in the federal government’s student loan portfolio, thanks to high interest rates and low default rates. Many law graduates are such good credit risks, and are overcharged so much by the government, that private lenders have offered to refinance law graduate loans for substantially lower interest rates.
There are cases in which particular individuals have unusually bad outcomes and struggle to repay their loans. Thankfully, these situations are relatively rare among law graduates.
Incomes for law graduates may seem low when they first graduate, but typically climb rapidly over the next several decades. Education loans exist precisely so that borrowed money can be repaid later in life, when employment is more stable and incomes are usually higher.
The New York Times is right that many law school graduates—around 40 percent—do not practice law. But law graduates do not have to practice law or earn spectacular salaries to benefit financially from their degrees and repay their loans over their careers. They need only earn roughly $10,000 per year more than they would have earned without a law degree. The overwhelming majority of law graduates, including those not practicing law, receive substantially larger boosts to their earnings.
Thanks to income-based repayment programs with debt forgiveness and progressive taxation, gains from the overwhelming majority of successful law school graduates offset the risks of investing in education for the rare unfortunate individuals who do not benefit as much from their educations.
It would be a mistake to let the small tail of defaults wag the much larger dog of public benefits.
Scaling back access to federal student loans to law students will not benefit taxpayers. To the contrary, the loss of revenue would mean larger deficits for the government, and eventually higher taxes for the rest of us.
September 01, 2015
Student loan borrowers with the highest debt levels have the lowest default rates (Michael Simkovic)
August 10, 2015
In a recent column, the New York Times’ Nicholas Kristof confessed, “One of our worst traits in journalism is that when we have a narrative in our minds, we often plug in anecdotes that confirm it.” The quote is timely, given recent controversy surrounding New York Times’ coverage.
Newspapers tend to emphasize anecdotes over data. This gives journalists, editors, and their sources tremendous freedom to frame a story. A few individuals can serve as ostensible examples of a broader phenomenon. But if those examples are unrepresentative or taken out of context, the news story can be misleading by omission and emphasis. If you get your information from the newspaper, you might worry more about stabbings and shootings than diet and exercise, but you are roughly 38 times more likely to die from heart disease than from violent crime.
Similar qualitative problems—sensationalism, reliance on extreme and unrepresentative anecdotes, lack of context, and omission of relevant data and peer reviewed research—characterized press coverage of law schools and the legal profession. (See New York Times; The Wall Street Journal; New York Times again)
Newspapers conflated a generally weak labor market—in which law graduates continued to have substantial earnings and employment advantages over similar bachelor's degree holders (see The Economic Value of a Law Degree; Timing Law School; Compared to What? (here and here); Recent Entry Level Outcomes and Growth in Lawyer Employment and Earnings)—with a law-specific problem. They criticized law schools—and only law schools—for practices that are widespread in higher education and in government. (see competitive scholarships; school-funded jobs, measuring employment / unemployment) And they uncritically reported research, no matter how flawed, that fit the anti-law school narrative. (see Failing Law Schools' problems with data and citations; a free education as a hypothetical alternative to student loans; and other inapposite comparisons (here, here and here)).
Newspapers' sensationalist law school coverage may have helped their circulation—negative coverage attracts eyeballs—but it misled students in harmful ways. Recent research suggests that each year of delaying law school—for example, to wait until unemployment declines—is counterproductive. Even taking into account the potential benefits of graduating into a better economy, these delaying strategies typically cost the prospective law student more than $30,000 per year because of the high opportunity cost of lower earnings with a bachelor's degree instead of a law degree. The longer the delay, the higher the cost.
So which newspapers and journalists provided the most negative coverage? And how has the news slant evolved over time? For an explanation of methodology, see the footnote at the bottom.*
The most negative newspapers were the Wall Street Journal, the Chicago Tribune, and the New York Times, in that order. The Wall Street Journal was exceptionally negative—more than 7 times as negative as the average newspaper. A few newspapers, such as the Orange County Register, were net positive.
2011 to 2013 were exceptionally negative years, with dramatic reductions in negativity in 2014.
Did negative press coverage cause a decline in law school applications, independent of the events being covered? Differences in press coverage can move financial markets, according to research exploiting variation in local coverage of identical national events and local trading patterns, so perhaps press coverage can also affect other markets. (The leaders of Law School Transparency apparently believe that negative press coverage can reduce law school applications. One of them explained his efforts to pitch negative news stories in specific parts of the country where he thought law school enrollments were too high.) (PDF here)**
The New York Times and WSJ both went negative early, but the Wall Street Journal remained more negative for a much longer period of time. Most of the uncredited (no byline) stories in the NY Times and WSJ about law school were negative.
The WSJ had an unusually deep bench of anti-law school journalists. By contrast, most newspapers had a few very negative journalists and otherwise a fairly even mix of slightly negative and slightly positive journalists. The most anti-law school journalist was Ameet Sachdev of the Chicago Tribune, whose coverage was about twice as negative as either David Segal of the New York Times or Jennifer Smith of the Wall Street Journal.
Geographically, the hardest hit areas were New York, Illinois (Chicago), and Washington D.C. (This is counting the New York Times and Wall Street Journal as New York papers). Ohio was the only state that saw net positive coverage.
The pattern of coverage does not seem to have much relationship to the strength of the local legal employment market, but rather seems to turn more heavily on idiosyncratic editorial policies at particular newspapers that happen to be headquartered in certain states.
* I asked my research assistant (a third-year law student) to gather articles about legal education and the legal profession from the top 25 U.S. newspapers by circulation for which data was available from Proquest back to at least 2010. My RA then rated each article as "positive," "negative," or "neutral," depending on whether the article would have made him more or less likely to attend law school if he had read it while deciding. For each newspaper or journalist, the number of positive articles was subtracted from the number of negative articles to arrive at a net-negative count, and newspapers were ranked on this metric. There are some obvious limitations of this approach: it doesn't measure how positive or negative each article is; it assumes that one positive article can balance out one negative article (negative articles probably have a bigger impact than positive ones); and it relies on the opinion of a single third-year law student. It also lacks context—perhaps newspaper coverage of all topics, or of all higher education during this period, was generally negative. Nevertheless, this approach may provide some useful insights. All editions of the Wall Street Journal and New York Times tracked by Proquest are combined, but identical articles published in different editions are counted only once. The WSJ blog is included as part of the WSJ.
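The tally described in the footnote can be sketched as follows. The newspaper names and ratings here are hypothetical stand-ins, since the hand-coded dataset is not public:

```python
# Minimal sketch of the net-negativity ranking: count negatives minus
# positives per newspaper, then rank from most to least net-negative.
from collections import Counter

# (newspaper, rating) pairs; rating is "positive", "negative", or "neutral"
articles = [
    ("Paper A", "negative"),
    ("Paper A", "negative"),
    ("Paper A", "positive"),
    ("Paper B", "neutral"),
    ("Paper B", "positive"),
]

net_negative = Counter()
for paper, rating in articles:
    if rating == "negative":
        net_negative[paper] += 1
    elif rating == "positive":
        net_negative[paper] -= 1  # one positive offsets one negative
    # neutral articles contribute nothing

ranking = sorted(net_negative.items(), key=lambda kv: -kv[1])
print(ranking)  # [('Paper A', 1), ('Paper B', -1)]
```

The sketch makes one of the footnote's stated limitations concrete: a single positive article exactly cancels a single negative one, with no weighting for intensity.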
** Contrary to popular belief, there is little evidence that larger law school graduating class sizes predict worse outcomes for law school graduates, nor is there evidence that smaller graduating class sizes predict better outcomes. See (Timing Law School and a summary). In a recent robustness check considering many alternative definitions of cohort size (but not yet reported in the draft paper), McIntyre and Simkovic continued to find no evidence that smaller graduating cohorts predict higher earnings premiums for recent graduates.
August 05, 2015
College costs more than it used to. It's also worth a lot more than it used to be worth. The increase in value of a college education exceeds the increase in the cost of a college education by a very wide margin.
How much has the cost of college actually increased? It may be less than you think.
According to the Department of Education and the National Center for Education Statistics, at 4 year institutions, average college tuition is up about $1,900 in real (inflation-adjusted) terms in the five years from 2008-09 ($21,996) to 2012-13 ($23,872). This is an average increase of less than $500 per year. The real increase during this 5-year period has been higher at public colleges ($2,100) than at private non-profit and for-profit colleges ($1,400).
That's before taking into account scholarships and grants.
After subtracting scholarships and grants, according to the College Board, real net tuition and fees at 4-year private non-profit institutions have actually gone down. Real net tuition and fees increased at 4-year public institutions over the last 6 years by about $1,000, or about $170 per year.
So how much would the value of higher education need to increase to justify this increase in cost? The increases at public institutions come to around $5,000 more for a bachelor's degree.*
That extra $5,000 will pay for itself if 4-year colleges spend the extra money in a way that boosts their former students' real annual earnings relative to high school graduates by $220.** When we take into account increases in college completion rates over time and longer life expectancy, the required increase in annual earnings could be even lower.
So yes, improvements in the quality of education can easily pay for increases in the costs of education. If the rising earnings premiums and increase in completion rates within race over the last three decades are caused by increased college expenditures, tens of thousands of dollars in increased expenditures per bachelor's degree have more than paid for themselves so far, and by a very wide margin.***
The labor economics literature generally suggests that the marginal rate of return to higher education is high, whether the "margin" is defined as upgrading individual education from high school to 4 years, from 2 years to a bachelor's, or from a bachelor's to an advanced degree. Within a given level of education and category of institution, those with more resources can generally do more to boost their students' earnings. A high marginal rate of return to education means we should invest more in higher education if we want the economy to grow faster, and invest less in things with lower marginal rates of return. (See here).
Investing more in education without increasing taxes means that tuition will likely increase. When we consider the benefits education provides, more investment in education is a good thing. When we consider our political system's allergic reaction to tax increases, increasing tuition may be the only realistic way to get there.
* Multiplying $1,000 by 5 years (assuming it takes 5 years to complete a bachelor's degree), we get an increase of $5,000 at public 4-year institutions (and a decline in cost at private institutions). For an individual, the aggregate increase in real net-tuition during 5 years of college might be less. The idea of the estimate is to compare the aggregate cost of college for individuals who completed college 5 years apart.
** This assumes a 40 year career and nominal (real) discount rate of 6 (3) percent. The $220 figure is before taxes and represents the aggregate social benefit to the government as tax collector and to the graduate, who will earn higher wages. If the entire cost is placed on the student, assuming 35 percent tax rates on the earnings premium, real annual earnings premiums would need to increase by $340 to make the student better off after taxes.
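The footnote's arithmetic can be checked with a standard annuity calculation. This sketch uses only the assumptions the footnote states (a 40-year career, a 3 percent real discount rate, a 35 percent tax rate on the earnings premium):

```python
# Find the annual earnings increase whose present value over a 40-year
# career, at a 3% real discount rate, equals the $5,000 cost increase.

def required_annuity(cost, real_rate, years):
    """Annual amount whose present value at `real_rate` over `years` equals `cost`."""
    annuity_factor = (1 - (1 + real_rate) ** -years) / real_rate
    return cost / annuity_factor

cost_increase = 5_000
pre_tax = required_annuity(cost_increase, 0.03, 40)
after_tax = pre_tax / (1 - 0.35)  # if the student bears the full cost at a 35% tax rate

print(f"pre-tax:   ${pre_tax:,.0f}")    # about $216, matching the ~$220 figure
print(f"after-tax: ${after_tax:,.0f}")  # about $333, matching the ~$340 figure
```

The calculation reproduces both of the footnote's rounded figures, so the stated assumptions are internally consistent.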
*** The differences in earnings in the column charts are raw differences by level of education rather than estimates of causal differences. However, the change in the raw differences over time may provide a good proxy for the change in the causal earnings premium over time.
Over at TaxProf, Paul Caron covers a student loan working paper inaccurately. Caron's headline is "NY Fed: Federal Aid For College Has Jacked Up Tuition (Especially In Graduate Schools)." (Emphasis added).
I've already discussed some of the methodological limitations of the working paper in question (read the bottom of the post). Beyond these serious issues, the working paper notably is not the view of the NY Fed (it is the individual work of 3 researchers, two of whom happen to work at the Fed) and it does not make claims about graduate school tuition. The study focuses on undergraduate tuition.
From the study:
"The views expressed in this paper are those of the authors and do not necessarily reflect the position of the Federal Reserve Bank of New York or the Federal Reserve System."
"In this paper, we used a Bartik-like approach to identify the effect of increased loan supply on tuition following large policy changes between 2008 and 2010 in the maximum federal aid amounts available to undergraduate students."
Kevin Drum at Mother Jones manages to do an even worse job than either the WSJ or TaxProf, declaring "As Federal Aid Goes Up, College Costs Rise Enough to Gobble It All Up." The claim in the working paper is not that an extra dollar of aid increases tuition by a dollar. The claim is that federal aid is associated with an increase in tuition of between 0 and 65 cents for every dollar of aid--depending on the type of aid and the control variables selected by the researchers--but the study failed to account for the fact that much of that increase in tuition will be returned to students as increased grants and scholarships.
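The distinction the headline misses can be illustrated with back-of-the-envelope arithmetic. The pass-through figure below is the upper bound reported in the working paper; the grant-recycling share is a made-up illustrative number, since the working paper does not estimate it:

```python
# Sketch: even at the study's upper bound, a dollar of aid raises *sticker*
# tuition by at most 65 cents, and part of that increase flows back to
# students as grants and scholarships, so the net increase is smaller still.

extra_aid = 1.00        # one additional dollar of federal aid
pass_through = 0.65     # upper bound of the 0-65 cent range in the working paper
grant_recycle = 0.30    # assumed share of new tuition returned as institutional aid

sticker_increase = extra_aid * pass_through
net_increase = sticker_increase * (1 - grant_recycle)

print(f"sticker tuition up: ${sticker_increase:.2f} per aid dollar")
print(f"net tuition up:     ${net_increase:.2f} per aid dollar")
```

Under any plausible recycling share, the net increase per aid dollar stays well below a dollar, contradicting the claim that tuition rises enough to "gobble it all up."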