September 01, 2015

Student loan borrowers with the highest debt levels have the lowest default rates (Michael Simkovic)

August 28, 2015

"Timing Law School" in The Washington Post (Michael Simkovic)

August 10, 2015

Newspapers’ negative law school coverage, 2010-2015 (Michael Simkovic)

In a recent column, the New York Times’ Nicholas Kristof confessed, “One of our worst traits in journalism is that when we have a narrative in our minds, we often plug in anecdotes that confirm it.”  The quote is timely, given recent controversy surrounding the New York Times’ coverage.

Newspapers tend to emphasize anecdotes over data.  This gives journalists, editors, and their sources tremendous freedom to frame a story.  A few individuals can serve as ostensible examples of a broader phenomenon.  But if those examples are unrepresentative or taken out of context, the news story can be misleading by omission and emphasis.  If you get your information from the newspaper, you might worry more about stabbings and shootings than diet and exercise, but you are roughly 38 times more likely to die from heart disease than from violent crime.

Similar qualitative problems—sensationalism, reliance on extreme and unrepresentative anecdotes, lack of context, and omission of relevant data and peer-reviewed research—characterized press coverage of law schools and the legal profession.  (See New York Times; The Wall Street Journal; New York Times again)

Newspapers conflated a generally weak labor market—in which law graduates continued to have substantial earnings and employment advantages over similar bachelor's degree holders (see The Economic Value of a Law Degree; Timing Law School; Compared to What? (here and here); Recent Entry Level Outcomes and Growth in Lawyer Employment and Earnings)—with a law-specific problem.  They criticized law schools—and only law schools—for practices that are widespread in higher education and in government. (see competitive scholarships; school-funded jobs; measuring employment / unemployment)   And they uncritically reported research, no matter how flawed, that fit the anti-law school narrative. (see Failing Law Schools' problems with data and citations; a free education as a hypothetical alternative to student loans; and other inapposite comparisons (here, here and here)).

Newspapers' sensationalist law school coverage may have helped their circulation—negative coverage attracts eyeballs—but it misled students in harmful ways.  Recent research suggests that each year of delaying law school—for example, to wait until unemployment declines—is counterproductive.  Even taking into account the potential benefits of graduating into a better economy, these delaying strategies typically cost the prospective law student more than $30,000 per year because of the high opportunity cost of earning with a bachelor's degree instead of a law degree.  The longer the delay, the higher the cost.

So which newspapers and journalists provided the most negative coverage?  And how has the news slant evolved over time?  For an explanation of methodology, see the footnote at the bottom.*

Net-negative newspapers ranked

The most negative newspapers were the Wall Street Journal, the Chicago Tribune, and the New York Times, in that order.  The Wall Street Journal was exceptionally negative—more than 7 times as negative as the average newspaper.  A few newspapers, such as the Orange County Register, were net positive.

Net-negative stories by year

2011 to 2013 were exceptionally negative years, with dramatic reductions in negativity in 2014.

Did negative press coverage cause a decline in law school applications, independent of the events being covered?  Differences in press coverage can move financial markets, according to research exploiting variation in local coverage of identical national events and local trading patterns, so perhaps press coverage can also affect other markets.  (The leaders of Law School Transparency apparently believe that negative press coverage can reduce law school applications.  One of them explained his efforts to pitch negative news stories in specific parts of the country where he thought law school enrollments were too high.) (PDF here)**   

The New York Times and WSJ both went negative early, but the Wall Street Journal remained more negative for a much longer period of time.  Most of the uncredited (no byline) stories in the NY Times and WSJ about law school were negative.  

The WSJ had an unusually deep bench of anti-law school journalists.  By contrast, most newspapers had a few very negative journalists and otherwise a fairly even mix of slightly negative and slightly positive journalists.  The most anti-law school journalist was Ameet Sachdev of the Chicago Tribune, whose coverage was about twice as negative as either David Segal of the New York Times or Jennifer Smith of the Wall Street Journal.

Net negative journalists

Geographically, the hardest hit areas were New York, Illinois (Chicago), and Washington D.C.  (This is counting the New York Times and Wall Street Journal as New York papers).  Ohio was the only state that saw net positive coverage.  

Net negative stories by state

The pattern of coverage does not seem to have much relationship to the strength of the local legal employment market, but rather seems to turn more heavily on idiosyncratic editorial policies at particular newspapers that happen to be headquartered in certain states.  


*  I asked my research assistant (a third-year law student) to gather articles about legal education and the legal profession from the top 25 U.S. newspapers by circulation for which data was available from ProQuest back to at least 2010.  My RA then rated each article as "positive", "negative" or "neutral" depending on whether the article would have made him more or less likely to attend law school if he had read it while deciding.  For each newspaper or journalist, the number of positive articles was subtracted from the number of negative articles to arrive at a net-negative count, and newspapers were ranked on this metric.  This approach has some obvious limitations: it doesn't measure how positive or negative each article is; it assumes that one positive article can balance out one negative article (negative articles probably have a bigger impact than positive ones); and it relies on the opinion of a single third-year law student.  It also lacks context—perhaps newspaper coverage about all topics is generally negative, or perhaps coverage of all higher education was negative during this period.  Nevertheless, this approach may provide some useful insights.  All editions of the Wall Street Journal and New York Times tracked by ProQuest are combined, but identical articles published in different editions are counted only once.  The WSJ blog is included as part of the WSJ.
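For concreteness, here is a minimal sketch of the tallying procedure described above, with hypothetical article ratings standing in for the RA's actual data:

    # Minimal sketch of the net-negative tally described above (hypothetical ratings).
    from collections import Counter

    # Each article is a (newspaper, rating) pair; ratings are "positive", "negative", or "neutral".
    articles = [
        ("Wall Street Journal", "negative"),
        ("Wall Street Journal", "negative"),
        ("New York Times", "negative"),
        ("Orange County Register", "positive"),
        ("Chicago Tribune", "neutral"),
    ]

    net_negative = Counter()
    for paper, rating in articles:
        if rating == "negative":
            net_negative[paper] += 1
        elif rating == "positive":
            net_negative[paper] -= 1
        # neutral articles leave the score unchanged

    # Rank newspapers from most to least net-negative.
    for paper, score in net_negative.most_common():
        print(paper, score)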

** Contrary to popular belief, there is little evidence that larger law school graduating class sizes predict worse outcomes for law school graduates, nor is there evidence that smaller graduating class sizes predict better outcomes.  See (Timing Law School and a summary).  In a recent robustness check considering many alternative definitions of cohort size (but not yet reported in the draft paper), McIntyre and Simkovic continued to find no evidence that smaller graduating cohorts predict higher earnings premiums for recent graduates.

August 10, 2015 in Guest Blogger: Michael Simkovic, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink

August 05, 2015

Do increases in the cost of college pay for themselves? (Michael Simkovic)

College costs more than it used to.  It's also worth a lot more than it used to be worth.  The increase in value of a college education exceeds the increase in the cost of a college education by a very wide margin.

How much has the cost of college actually increased?  It may be less than you think.   

According to the Department of Education and the National Center for Education Statistics, at 4-year institutions, average college tuition is up about $1,900 in real (inflation-adjusted) terms in the five years from 2008-09 ($21,996) to 2012-13 ($23,872).  This is an average increase of less than $500 per year. The real increase during this 5-year period has been higher at public colleges ($2,100) than at private non-profit and for-profit colleges ($1,400).  

That's before taking into account scholarships and grants.  

After subtracting scholarships and grants, according to the College Board, real net tuition and fees at 4-year private non-profit institutions have actually gone down.  Real net tuition and fees increased at 4-year public institutions over the last 6 years by about $1,000, or about $170 per year.

So how much would the value of higher education need to increase to justify this increase in cost?  The increases at public institutions come to around $5,000 more for a bachelor's degree.* 

That extra $5,000 will pay for itself if 4-year colleges spend the extra money in a way that boosts their former students' real annual earnings relative to high school graduates by $220.**  When we take into account increases in college completion rates over time and longer life expectancy, the required increase in annual earnings could be even lower.

So yes, improvements in the quality of education can easily pay for increases in the costs of education.   If the rising earnings premiums and increase in completion rates within race over the last three decades are caused by increased college expenditures, tens of thousands of dollars in increased expenditures per bachelor's degree have more than paid for themselves so far, and by a very wide margin.***

[Column charts: earnings differences by level of education]

The labor economics literature generally suggests that the marginal rate of return to higher education is high, whether the "margin" is defined as upgrading individual education from high school to 4 years, from 2 years to a bachelor's, or from a bachelor's to an advanced degree.  Within a given level of education and category of institution, those with more resources can generally do more to boost their students' earnings.  A high marginal rate of return to education means we should invest more in higher education if we want the economy to grow faster, and invest less in things with lower marginal rates of return. (See here).

Investing more in education without increasing taxes means that tuition will likely increase.  When we consider the benefits education provides, more investment in education is a good thing.  When we consider our political system's allergic reaction to tax increases, increasing tuition may be the only realistic way to get there.  

* Multiplying $1,000 by 5 years (assuming it takes 5 years to complete a bachelor's degree), we get an increase of $5,000 at public 4-year institutions (and a decline in cost at private institutions).  For an individual, the aggregate increase in real net-tuition during 5 years of college might be less.  The idea of the estimate is to compare the aggregate cost of college for individuals who completed college 5 years apart.

** This assumes a 40 year career and nominal (real) discount rate of 6 (3) percent.  The $220 figure is before taxes and represents the aggregate social benefit to the government as tax collector and to the graduate, who will earn higher wages.  If the entire cost is placed on the student, assuming 35 percent tax rates on the earnings premium, real annual earnings premiums would need to increase by $340 to make the student better off after taxes.
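As a quick check of the arithmetic—under the stated assumptions of a 40-year career and a 3 percent real discount rate, and treating the earnings boost as a level annual annuity (a simplification)—the present values come out to roughly $5,000:

    # Sketch of the annuity present-value arithmetic behind the $220 and $340 figures,
    # assuming a 40-year career, a 3 percent real discount rate, and level annual payments.
    def pv_annuity(annual_amount, rate, years):
        """Present value of a level payment received at the end of each year."""
        return annual_amount * (1 - (1 + rate) ** -years) / rate

    career_years = 40
    real_rate = 0.03

    # Pre-tax earnings boost needed to cover roughly $5,000 in added cost:
    print(round(pv_annuity(220, real_rate, career_years)))              # about 5,100

    # If the student bears the full cost and 35 percent of the premium goes to taxes,
    # a $340 pre-tax boost leaves $221 after tax, with about the same present value:
    print(round(pv_annuity(340 * (1 - 0.35), real_rate, career_years)))  # about 5,100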

*** The differences in earnings in the column charts are raw differences by level of education rather than estimates of causal differences.  However, the change in the raw differences over time may provide a good proxy for the change in the causal earnings premium over time.

August 5, 2015 in Guest Blogger: Michael Simkovic, Of Academic Interest, Professional Advice, Science, Student Advice, Web/Tech, Weblogs | Permalink

August 03, 2015

Public versus Private Student Loans (Michael Simkovic)

John Brooks (Georgetown) and Jonathan Glater (UC Irvine) argue in today’s Los Angeles Times that the Federal Government should raise the borrowing limit on federal student loans so that college students can borrow more from the government and less from private lenders.*

“Banks and other lenders offer so-called private loans, which often have higher interest rates and less flexible repayment terms [than Federal Student loans]. . . Private student loans are usually much more costly for students; a government report from 2012 found interest rates in excess of 16%, and nothing has improved since then. By contrast, the rate on the most widely used federal student loan currently is 4.29%.

[B]ecause federal loan caps have not budged even as tuition has increased, private lending is rising . . . borrowing is going to happen in some form anyway. This is not about whether college is a good investment (although it is), it is about whether students should be forced to take out loans that put them at greater risk of repayment difficulty and possible default.”

Brooks and Glater have effectively framed the student loan debate.  Federal student loan policy is not a question of how much students should be allowed to borrow, but rather of whom they should borrow from, how much they should pay, and when they should pay.  Any government-imposed loan limit is simply the point at which borrowers shift to more expensive private sources of credit. 

In other words, private student lenders have a strong incentive to push for scaling back public student loan programs.  The less available and less generous public programs become, the larger the market opportunity for private lenders.  (It is possible that higher or lower interest rates could affect the amount that students ultimately borrow—i.e., the quantity of credit demanded may respond to the price of credit—but Brooks and Glater are clearly correct that a federal student loan limit is not a hard cap on borrowing). 

The idea that increases in federal student loan availability or other public higher education funding programs will increase tuition is sometimes called the “Bennett Hypothesis,” and those who wish to scale back public investment in higher education frequently tout it.  However, there is little evidence in the peer-reviewed literature that increases in the availability of public student loans drive up tuition net of scholarships and grants at non-profit and public institutions of higher education (there is some evidence that this might be the case at for-profit trade schools).  The evidence of harm to students is even slimmer when one considers the potential benefits of tuition increases, which can fund better instruction, better administrative support, more modern facilities, and more generous scholarships, and the possible role of public funding in increasing enrollment and completion rates. By contrast, higher interest payments will generally only benefit student lenders, unless higher rates convey useful information about risk to which students respond. For a review of the literature, see here and here.**

Those advocating scaling back federal student loans argue that it is theoretically possible that income-based repayment with debt forgiveness could lead to an explosion of tuition growth because, for some students, the marginal cost of additional borrowing will be zero and these students will not be price sensitive.  (See here.)

However, federal student loan critics have not shown that the introduction of IBR with debt forgiveness, or changes to the terms of these programs, has actually affected the rate of tuition increase net of scholarships and grants.  (Indeed, tuition increases, net of scholarships, have been relatively mild in recent years.)  And this is not surprising—most students do not know in advance whether they will need or qualify for debt forgiveness, and will not know for sure until 10 or 20 years after they graduate.  Most of them will likely ultimately repay their loans in full.  Ex ante and in expectation—when they are shopping for a college or professional school—student borrowers do not face zero marginal cost. 

Similarly, think tank arguments about high costs to taxpayers from income-based repayment and debt forgiveness rely on dubious assumptions such as:

  1. Starting salaries for recent college and professional school graduates will grow at an extremely low rate (much lower than one could reasonably forecast after examining the historical data)
  2. Every single dollar of debt forgiveness is a cost of the debt forgiveness program, because if not for debt forgiveness programs, no borrower would ever fail to repay their loans and the government would collect every last dollar on time
  3. A loan in which the government recoups partial payments with a present value exceeding the amount of the original loan is not a profitable loan; it’s actually a loss (a stylized counterexample appears in the sketch after this list)
  4. The cost of lending $100,000 and receiving partial payments over the next 10 or 20 years is somehow much higher to the government than the cost of giving away $100,000 today and receiving no payments in return (this is related to assumptions 2 and 3 above, as well as inappropriate uses of discount rates and growth rates).
  5. Income based repayment plans have no impact on enrollment and provide no benefits to the government in the form of a more educated workforce that pays higher taxes and depends less on taxpayer funded social services
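To make points 3 and 4 concrete, here is a stylized sketch with hypothetical numbers; the discount rate and payment stream are assumptions chosen for illustration, not estimates of actual program terms:

    # Stylized illustration of points 3 and 4 above (hypothetical numbers): a loan that is
    # never fully amortized and ends in forgiveness can still be profitable if the present
    # value of the payments received exceeds the amount originally lent.
    def present_value(payments, discount_rate):
        """Discount a list of annual payments received at the end of each year."""
        return sum(p / (1 + discount_rate) ** (t + 1) for t, p in enumerate(payments))

    loan_amount = 100_000
    discount_rate = 0.02           # assumed government borrowing cost
    payments = [7_000] * 20        # 20 years of income-based payments, then forgiveness

    pv = present_value(payments, discount_rate)
    print(round(pv))               # about 114,000: more than the 100,000 lent, so the loan
                                   # is a profit even though a remaining balance is forgiven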

* Brooks and Glater also praise income based repayment plans as a progressive-income-tax-like system of higher education finance in which those who earn more pay more.  These arguments will be familiar to those who have followed Brooks and Glater’s research. (here  and here)

**The Wall Street Journal publicized a recent working paper that claims to have found a possible link between federal student loan availability and tuition growth at undergraduate institutions. While some of the nuance may not have been reflected in the WSJ's coverage, the authors of that working paper note that: (1) They do not have good data that can distinguish an increase in borrowing from a shift between public and private loans; (2) They are only looking at sticker tuition, not actual tuition paid less scholarships and grants; (3) There are many ways in which public funding can benefit students even if it does increase sticker tuition, and their findings do not demonstrate that public funding is a harmful policy; (4) Variables omitted from their analyses could be driving tuition increases, and the introduction of additional controls dramatically changes their results.

August 3, 2015 in Guest Blogger: Michael Simkovic, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink

July 31, 2015

What Deregulated Law Schools Really Look Like (Michael Simkovic)

One of the key claims of critics of legal education in general, and of ABA-approved law schools in particular, is that accreditation requirements drive up the costs of legal education without improving quality. If only we could deregulate law schools and unleash the creative power of free market competition and the awesome technological potential of online learning, legal education would become cheaper without any loss of quality.  Or so the story goes.

Fortunately, deregulated law schools exist alongside regulated law schools, so we can get a sense of what deregulation might look like.  And while unaccredited California law schools are less expensive than their accredited counterparts, their completion rates and bar passage rates are much lower than those of even the lowest-ranked ABA-approved law schools, as revealed by a recent Los Angeles Times investigation.

This is likely due at least in part to the incoming academic credentials and life circumstances of the students who enroll in unaccredited schools, and not simply due to differences in quality of education.  But there is no law preventing unaccredited law schools from competing with accredited law schools for the best students who want to stay in California, a large and prosperous state where many lawyers will spend their entire careers.  If accreditation is really an inefficient waste of time and resources, the unaccredited schools should have substantial advantages in the competition for students, and those students should have advantages in the competition for jobs.

At first glance, deregulation hardly looks like the panacea its advocates have made it out to be. ABA accreditation also looks pretty plausibly like standard consumer protection--a paternalistic attempt to eliminate low quality, low cost, and high-risk options--rather than a self-serving scheme to inflate prices. 

There are usually tradeoffs between cost and quality. It's not surprising that as goes the world, so goes legal education.  

July 31, 2015 in Guest Blogger: Michael Simkovic, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Web/Tech, Weblogs | Permalink

The Wall Street Journal’s Coverage of Law School Funded Jobs (Michael Simkovic)

The Wall Street Journal’s recent story about law-school funded jobs is a good example of the slant that has pervaded its law school coverage for the last several years.  The general outline of the WSJ story is as follows: job outcomes for law school graduates have become so terrible that law schools are creating fake jobs for their graduates, not to help students succeed, but to game the U.S. News rankings.   The implication of the story is that law school is not only a bad idea for economic reasons, but that law schools are fundamentally corrupt and dishonest.  

The problem is that the WSJ has taken information out of context and presented it in a way that is misleading.   Like a Rorschach test, the story reveals more about the Wall Street Journal than it reveals about the subject of the story.

Here are some problems with the WSJ's coverage:

1. The data visualizations are misleading

There is a standard and widely accepted way to present percentage data.  The minimum possible value is 0 percent.  The maximum possible value is 100 percent.  Therefore, a figure showing percentages should almost always be scaled from zero to 100 percent.  The Wall Street Journal violates this rule of data visualization in ways that are revealing.

  [Charts: WSJ figures on overall employment, school-funded jobs, and the share of school-funded jobs that are full-time, long-term legal positions]


The WSJ scaled the figure at the left, showing law school employment, from 60 percent to 95 percent.  This makes law school employment look lower than it really is, and exaggerates the decline in employment. 

The middle chart, showing law-school funded employment is scaled from 0 to 6 percent.  This makes law-school funded jobs look like a huge proportion of employment rather than a tiny one (4 to 5 percent).  Contrary to the thrust of the WSJ’s story, there does not seem to be much of a relationship between overall employment outcomes and the proportion of school-funded jobs.

(The third chart, showing the proportion of school-funded jobs that are full time, long-term legal jobs increasing over time, is not commented on in the text of the story).
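To illustrate the scaling point with made-up numbers (the data below are hypothetical, not the WSJ's), compare the same series plotted on a 0-100 percent axis and on a truncated axis:

    # Illustration of how a truncated y-axis exaggerates small changes in a percentage series.
    # The employment rates below are made up for illustration.
    import matplotlib.pyplot as plt

    years = [2011, 2012, 2013, 2014]
    employed_pct = [87, 85, 84, 86]

    fig, (ax_full, ax_zoom) = plt.subplots(1, 2, figsize=(8, 3))

    # Scaled 0 to 100: a modest change in a high employment rate looks modest.
    ax_full.plot(years, employed_pct, marker="o")
    ax_full.set_ylim(0, 100)
    ax_full.set_title("0-100% scale")

    # Truncated to 60-95: the same data looks like a dramatic collapse.
    ax_zoom.plot(years, employed_pct, marker="o")
    ax_zoom.set_ylim(60, 95)
    ax_zoom.set_title("Truncated 60-95% scale")

    plt.tight_layout()
    plt.show()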

2. There is no discussion of what percentage of graduates of other programs are working in positions funded by their institutions, and little discussion of whether such jobs might be helpful

School-funded jobs are not unique to law schools.  Whereas press coverage of law schools hiring their own graduates has been overwhelmingly negative, coverage of colleges hiring their own graduates has generally been positive or the issue simply hasn’t been covered.  People might have doubts about educational institutions that never hire their own graduates for open positions, just as we might doubt a manufacturer or retailer that did not use any of its own products.

Are law schools more likely than other educational programs to hire their own graduates?  Are law-school funded jobs better or worse than these other school-funded jobs?  Are law-school funded jobs more or less likely to lead to good outcomes over the long term?

None of these important contextual issues are raised by the WSJ.

Even Above the Law provided a more balanced discussion of the possible upsides and downsides of school-funded jobs.

A similar issue arose with press coverage of competitive merit scholarships. Law schools were condemned harshly for policies that are also widely used by colleges and state governments, whereas colleges generally received more balanced coverage.  This was the case even though law students were actually more likely to keep their competitive scholarships than were many undergraduates.

3. There is no discussion of how overall law school employment compares to employment for recent college graduates or graduates of other programs. 

When it comes to apples-to-apples comparisons of law school graduates to similar bachelor’s degree holders with similar levels of work experience at the same point in time, law school graduates are more likely to be employed, more likely to be employed full time, and no less likely to be employed in a job that is related to what they studied.  They are also likely to be earning substantially more money than their less educated counterparts.  For the overwhelming majority of law school graduates, the lifetime boost to earnings more than makes up for the cost of law school.

The problem is not law school employment outcomes.  The problem is that the labor market in general is challenging for everyone, especially the young and inexperienced.  Law graduates generally do better than similar college graduates, who in turn generally do better than similar high school graduates. 

Law schools are not the employment story.  The employment story is the debate about aggregate demand and fiscal stimulus, and how best to provide more workers with the benefits conferred by higher education.

4. There is no discussion of how law school employment reporting compares to employment reporting for other educational institutions or standard definitions of “employment” used by the government

Under standard definitions of employment used by the U.S. government and just about everyone else, employment counts as employment whether it is school-funded or not, whether it is long term or full time or not, whether it is highly paid or not.

The use of non-standard definitions by law schools makes law school difficult to compare to alternatives.  This does not reflect higher or lower ethical standards—it simply reflects data collection and reporting practices that are not well thought out.  It can bias the presentation of the data in a way that makes law school look worse relative to alternatives when in fact law school employment outcomes under standardized measurements are usually better than many likely alternatives.

The standard definition of employment is not the only interesting measure of outcomes, so law schools may also want to consider other measures.  But any measure they use needs to be standardized and comparable across educational programs rather than used exclusively by law schools.

July 31, 2015 in Guest Blogger: Michael Simkovic, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink

July 22, 2015

Rostron & Levit's useful compilation of information about submitting to law reviews...

...has been updated again.  They write:

We just updated our charts about law journal submissions, expedites, and rankings from different sources for the Fall 2015 submission season, covering the main journals of 204 law schools.  

A couple of the highlights from this round of revisions are: 

First, the chart now includes as much information as possible about which law reviews are not accepting submissions right now and when they say they'll resume accepting submissions.  In most cases this is not a specific date, because journals tend to post only imprecise statements that they are not currently accepting submissions but will start doing so at some point in the spring.

Second, there continues to be a gradual increase in the number of journals using and preferring Scholastica instead of ExpressO or accepting email submissions: 22 journals prefer or strongly prefer Scholastica, 14 more list it as one of the alternative acceptable avenues of submission, and 10 now list Scholastica as the exclusive method of submission.  

The first chart contains information about each journal’s preferences about methods for submitting articles (e.g., e-mail, ExpressO, Scholastica, or regular mail), as well as special formatting requirements and how to request an expedited review.  The second chart contains rankings information from U.S. News and World Report as well as data from Washington & Lee’s law review website.

The Washington & Lee data, I should note, is mostly silly (among other things, it does not control for publication volume by the journals).  Law review prominence and visibility track law school reputation, full stop.  For some specialty journals, the W&L data is somewhat useful, but that's about it.

July 22, 2015 in Advice for Academic Job Seekers, Of Academic Interest, Professional Advice | Permalink

July 14, 2015

What attributes predict student success in law school?

Two Colorado law professors (actual scholars, not the notorious clown!) have undertaken an interesting longitudinal study of law school success, though it looks at data from just two schools:  Colorado and Case Western.   It is informative about schools with similar profiles, but I wonder whether the results would hold at much stronger or much weaker schools.

(Thanks to Dean Rowan for the pointer.)

July 14, 2015 in Legal Profession, Of Academic Interest, Professional Advice, Student Advice | Permalink

June 18, 2015

Risk Based Student Loans in Bloomberg (Michael Simkovic)

Here.  For earlier coverage, see here.  For the original paper, see here.

June 18, 2015 in Guest Blogger: Michael Simkovic, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink