August 10, 2015
Jake Brooks at Georgetown comments on the Department of Education's proposed regulations for Revised Pay As You Earn (REPAYE). Brooks examines the cap on monthly payments, notch-and-cliff rules around the repayment period, and definitions surrounding interest rates, focusing mainly on technical problems with several of the proposed rules.
Gregory Crespi at SMU comments as well, arguing that spousal income inclusion rules and the long repayment period (25 years) will discourage many professional students from enrolling. Crespi thinks the Department of Education's enrollment estimates are too high by a factor of 3. If Crespi is correct, then estimates of the cost of the program to taxpayers, and of the benefits to professionals, may be greatly exaggerated.
Frank A. Pasquale at Maryland also comments, arguing that the marriage penalty (previously identified by Phil Schrag) should be softened, the repayment period should be shortened, and estimates of the costs of the program should also include its potential benefits to taxpayers in terms of increasing the educational level, and therefore the income, of the workforce, which would increase tax revenue and reduce costs of various social programs. In other words, the Department of Education should use cost-benefit analysis rather than just cost analysis.
In a recent column, the New York Times’ Nicholas Kristof confessed, “One of our worst traits in journalism is that when we have a narrative in our minds, we often plug in anecdotes that confirm it.” The quote is timely, given recent controversy surrounding New York Times’ coverage.
Newspapers tend to emphasize anecdotes over data. This gives journalists, editors, and their sources tremendous freedom to frame a story. A few individuals can serve as ostensible examples of a broader phenomenon. But if those examples are unrepresentative or taken out of context, the news story can be misleading by omission and emphasis. If you get your information from the newspaper, you might worry more about stabbings and shootings than diet and exercise, but you are roughly 38 times more likely to die from heart disease than from violent crime.
Similar qualitative problems—sensationalism, reliance on extreme and unrepresentative anecdotes, lack of context, and omission of relevant data and peer-reviewed research—characterized press coverage of law schools and the legal profession. (See New York Times; The Wall Street Journal; New York Times again)
Newspapers conflated a generally weak labor market—in which law graduates continued to have substantial earnings and employment advantages over similar bachelor's degree holders (see The Economic Value of a Law Degree; Timing Law School; Compared to What? (here and here); Recent Entry Level Outcomes and Growth in Lawyer Employment and Earnings)—with a law-specific problem. They criticized law schools—and only law schools—for practices that are widespread in higher education and in government. (see competitive scholarships; school-funded jobs; measuring employment/unemployment) And they uncritically reported research, no matter how flawed, that fit the anti-law school narrative. (see Failing Law Schools' problems with data and citations; a free education as a hypothetical alternative to student loans; and other inapposite comparisons (here, here and here)).
Newspapers' sensationalist law school coverage may have helped their circulation—negative coverage attracts eyeballs—but it misled students in harmful ways. Recent research suggests that each year of delaying law school—for example, to wait until unemployment declines—is counterproductive. Even taking into account the potential benefits of graduating into a better economy, these delaying strategies typically cost the prospective law student more than $30,000 per year because of the high opportunity cost of lower earnings with a bachelor's degree instead of a law degree. The longer the delay, the higher the cost.
So which newspapers and journalists provided the most negative coverage? And how has the news slant evolved over time? For an explanation of methodology, see the footnote at the bottom.*
The most negative newspapers were the Wall Street Journal, the Chicago Tribune, and the New York Times, in that order. The Wall Street Journal was exceptionally negative—more than 7 times as negative as the average newspaper. A few newspapers, such as the Orange County Register, were net positive.
The years 2011 through 2013 were exceptionally negative, with a dramatic reduction in negativity in 2014.
Did negative press coverage cause a decline in law school applications, independent of the events being covered? Differences in press coverage can move financial markets, according to research exploiting variation in local coverage of identical national events and local trading patterns, so perhaps press coverage can also affect other markets. (The leaders of Law School Transparency apparently believe that negative press coverage can reduce law school applications. One of them explained his efforts to pitch negative news stories in specific parts of the country where he thought law school enrollments were too high.) (PDF here)**
The New York Times and WSJ both went negative early, but the Wall Street Journal remained more negative for a much longer period of time. Most of the uncredited (no byline) stories in the NY Times and WSJ about law school were negative.
The WSJ had an unusually deep bench of anti-law school journalists. By contrast, most newspapers had a few very negative journalists and otherwise a fairly even mix of slightly negative and slightly positive journalists. The most anti-law school journalist was Ameet Sachdev of the Chicago Tribune, whose coverage was about twice as negative as either David Segal of the New York Times or Jennifer Smith of the Wall Street Journal.
Geographically, the hardest hit areas were New York, Illinois (Chicago), and Washington, D.C. (This counts the New York Times and Wall Street Journal as New York papers.) Ohio was the only state that saw net positive coverage.
The pattern of coverage does not seem to have much relationship to the strength of the local legal employment market, but rather seems to turn more heavily on idiosyncratic editorial policies at particular newspapers that happen to be headquartered in certain states.
* I asked my research assistant (a third-year law student) to gather articles about legal education and the legal profession from the top 25 U.S. newspapers by circulation for which data were available from Proquest back to at least 2010. My RA then rated each article as "positive," "negative," or "neutral" depending on whether the article would have made him more or less likely to attend law school if he had read it while deciding. For each newspaper or journalist, the number of positive articles was subtracted from the number of negative articles to arrive at a net-negative count, and newspapers were ranked on this metric. There are some obvious limitations of this approach--it doesn't measure how positive or negative each article is, it assumes that one positive article can balance out one negative article (negative articles probably have a bigger impact than positive ones), and it relies on the opinion of a single third-year law student. It also lacks context--perhaps newspaper coverage of all topics is generally negative, or perhaps coverage of all of higher education was negative during this period. Nevertheless, this approach may provide some useful insights. All editions of the Wall Street Journal and New York Times tracked by Proquest are combined, but identical articles published in different editions are counted only once. The WSJ blog is included as part of the WSJ.
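For readers who want to see the tally mechanics, the net-negative scoring described above can be sketched in a few lines of Python. The ratings below are hypothetical placeholders; the actual hand-coded dataset is not reproduced here.

```python
from collections import Counter

# Hypothetical article ratings for illustration only; the real dataset was
# hand-coded by a research assistant from Proquest search results.
ratings = [
    ("Wall Street Journal", "negative"),
    ("Wall Street Journal", "negative"),
    ("Wall Street Journal", "neutral"),
    ("Chicago Tribune", "negative"),
    ("Orange County Register", "positive"),
    ("Orange County Register", "negative"),
]

def net_negative_scores(ratings):
    """Net-negative count per outlet: negative articles minus positive ones.

    Neutral articles are collected but do not move the score.
    Higher scores indicate more negative coverage on balance."""
    scores = Counter()
    for outlet, rating in ratings:
        if rating == "negative":
            scores[outlet] += 1
        elif rating == "positive":
            scores[outlet] -= 1
    return scores

# Rank outlets from most negative to most positive.
ranked = sorted(net_negative_scores(ratings).items(),
                key=lambda kv: kv[1], reverse=True)
```

Note that this metric, as the footnote concedes, weights every positive article as exactly offsetting one negative article and ignores intensity.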
** Contrary to popular belief, there is little evidence that larger law school graduating class sizes predict worse outcomes for law school graduates, nor is there evidence that smaller graduating class sizes predict better outcomes. (See Timing Law School and a summary.) In a recent robustness check considering many alternative definitions of cohort size (but not yet reported in the draft paper), McIntyre and Simkovic continued to find no evidence that smaller graduating cohorts predict higher earnings premiums for recent graduates.
August 05, 2015
Over at TaxProf, Paul Caron covers a student loan working paper inaccurately. Caron's headline is "NY Fed: Federal Aid For College Has Jacked Up Tuition (Especially In Graduate Schools)." (Emphasis added).
I've already discussed some of the methodological limitations of the working paper in question (read the bottom of the post). Beyond these serious issues, the working paper is notably not the view of the NY Fed (it is the individual work of three researchers, two of whom happen to work at the Fed), and it does not make claims about graduate school tuition. The study focuses on undergraduate tuition.
From the study:
"The views expressed in this paper are those of the authors and do not necessarily reflect the position of the Federal Reserve Bank of New York or the Federal Reserve System."
"In this paper, we used a Bartik-like approach to identify the effect of increased loan supply on tuition following large policy changes between 2008 and 2010 in the maximum federal aid amounts available to undergraduate students."
Kevin Drum at Mother Jones manages to do an even worse job than either the WSJ or TaxProf, declaring "As Federal Aid Goes Up, College Costs Rise Enough to Gobble It All Up." The claim in the working paper is not that an extra dollar of aid increases tuition by a dollar. The claim is that federal aid is associated with an increase in tuition of between 0 and 65 cents for every dollar of aid, depending on the type of aid and the control variables selected by the researchers. Moreover, the study failed to account for the fact that much of that increase in tuition will be returned to students as increased grants and scholarships.
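To make the arithmetic concrete, here is a hypothetical illustration. The 65-cent figure is the upper bound reported in the working paper; the grant-share figure is an assumption for illustration only, not a number from the study.

```python
# Hypothetical pass-through arithmetic, not figures from the working paper
# (apart from the 65-cent upper bound on the tuition response).
extra_aid = 1.00        # additional federal aid per student, in dollars
pass_through = 0.65     # upper-bound sticker-tuition response per dollar of aid
grant_share = 0.40      # ASSUMED share of new tuition revenue returned as grants

sticker_increase = extra_aid * pass_through
net_price_increase = sticker_increase * (1 - grant_share)

# Even at the upper bound, the net price increase is well below the extra
# dollar of aid once institutional grant-backs are taken into account.
```

Under these illustrative assumptions, a dollar of extra aid raises sticker tuition by 65 cents but the net price by only about 39 cents, which is the gap Drum's headline elides.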
August 04, 2015
...which will remain in their current locations (Camden and Newark), but operate as a single unit. Two points worth noting: Rutgers began developing the merger before Hamline/William Mitchell, and unlike the latter, the impetus was not dwindling enrollments.
July 31, 2015
One of the key claims of critics of legal education in general, and of ABA-approved law schools in particular, is that accreditation requirements drive up the costs of legal education without improving quality. If only we could deregulate law schools and unleash the creative power of free market competition and the awesome technological potential of online learning, legal education would become cheaper without any loss of quality. Or so the story goes.
Fortunately, deregulated law schools exist alongside regulated law schools, so we can get a sense of what deregulation might look like. And while unaccredited California law schools are less expensive than their accredited counterparts, their completion rates and bar passage rates are much lower than those for even the lowest-ranked ABA-approved law schools, as revealed by a recent Los Angeles Times investigation.
This is likely due at least in part to the incoming academic credentials and life circumstances of the students who enroll in unaccredited schools, and not simply due to differences in quality of education. But there is no law preventing unaccredited law schools from competing with accredited law schools for the best students who want to stay in California, a large and prosperous state where many lawyers will spend their entire careers. If accreditation is really an inefficient waste of time and resources, the unaccredited schools should have substantial advantages in the competition for students, and those students should have advantages in the competition for jobs.
At first glance, deregulation hardly looks like the panacea its advocates have made it out to be. ABA accreditation also looks pretty plausibly like standard consumer protection--a paternalistic attempt to eliminate low quality, low cost, and high-risk options--rather than a self-serving scheme to inflate prices.
There are usually tradeoffs between cost and quality. It's not surprising that as goes the world, so goes legal education.
The Wall Street Journal’s recent story about law-school funded jobs is a good example of the slant that has pervaded its law school coverage for the last several years. The general outline of the WSJ story is as follows: job outcomes for law school graduates have become so terrible that law schools are creating fake jobs for their graduates, not to help students succeed, but to game the U.S. News rankings. The implication of the story is that law school is not only a bad idea for economic reasons, but that law schools are fundamentally corrupt and dishonest.
The problem is that the WSJ has taken information out of context and presented it in a way that is misleading. Like a Rorschach test, the story reveals more about the Wall Street Journal than it reveals about the subject of the story.
Here are some problems with the WSJ's coverage:
1. The data visualizations are misleading
There is a standard and widely accepted way to present percentage data. The minimum possible value is 0 percent. The maximum possible value is 100 percent. Therefore, a figure showing percentages should almost always be scaled from zero to 100 percent. The Wall Street Journal violates this rule of data visualization in ways that are revealing.
The WSJ scaled the figure at the left, showing law school employment, from 60 percent to 95 percent. This makes law school employment look lower than it really is, and exaggerates the decline in employment.
The middle chart, showing law-school funded employment, is scaled from 0 to 6 percent. This makes law-school funded jobs look like a huge proportion of employment rather than a tiny one (4 to 5 percent). Contrary to the thrust of the WSJ’s story, there does not seem to be much of a relationship between overall employment outcomes and the proportion of school-funded jobs.
(The third chart, which shows the proportion of school-funded jobs that are full-time, long-term legal jobs increasing over time, is not commented on in the text of the story.)
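The axis-scaling point can be demonstrated with a short matplotlib sketch. The employment percentages below are hypothetical, chosen only to show how a truncated axis exaggerates a small decline; they are not the WSJ's data.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headlessly
import matplotlib.pyplot as plt

# Hypothetical employment percentages for illustration only.
years = [2011, 2012, 2013, 2014]
employed_pct = [87.0, 86.0, 85.0, 86.5]

fig, (ax_truncated, ax_full) = plt.subplots(1, 2, figsize=(8, 3))

# Truncated axis: a two-point dip fills most of the vertical space,
# making the decline look dramatic.
ax_truncated.plot(years, employed_pct)
ax_truncated.set_ylim(60, 95)
ax_truncated.set_title("Truncated axis (60-95%)")

# Full 0-100% axis: the same data shown in proper context,
# where the dip is visibly small.
ax_full.plot(years, employed_pct)
ax_full.set_ylim(0, 100)
ax_full.set_title("Full axis (0-100%)")

for ax in (ax_truncated, ax_full):
    ax.set_ylabel("Employed (%)")

fig.savefig("employment_axes.png")
```

The same two-percentage-point dip reads as a collapse on the left panel and as a minor fluctuation on the right, which is the whole objection to the WSJ's scaling choices.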
2. There is no discussion of what percentage of graduates of other programs are working in positions funded by their institutions and little discussion of whether such jobs might be helpful
School-funded jobs are not unique to law schools. Whereas press coverage of law schools hiring their own graduates has been overwhelmingly negative, coverage of colleges hiring their own graduates has generally been positive or the issue simply hasn’t been covered. People might have doubts about educational institutions that never hire their own graduates for open positions, just as they might doubt a manufacturer or retailer that did not use any of its own products.
Are law schools more likely than other educational programs to hire their own graduates? Are law-school funded jobs better or worse than these other school-funded jobs? Are law-school funded jobs more or less likely to lead to good outcomes over the long term?
None of these important contextual issues are raised by the WSJ.
Even Above the Law provided a more balanced discussion of the possible upsides and downsides of school-funded jobs.
A similar issue arose with press coverage of competitive merit scholarships. Law schools were condemned harshly for policies that are also widely used by colleges and state governments, whereas colleges generally received more balanced coverage. This was the case even though law students were actually more likely to keep their competitive scholarships than were many undergraduates.
3. There is no discussion of how overall law school employment compares to employment for recent college graduates or graduates of other programs
When it comes to apples-to-apples comparisons of law school graduates to similar bachelor’s degree holders with similar levels of work experience at the same point in time, law school graduates are more likely to be employed, more likely to be employed full time, and no less likely to be employed in a job that is related to what they studied. They are also likely to be earning substantially more money than their less educated counterparts. For the overwhelming majority of law school graduates, the lifetime boost to earnings more than makes up for the cost of law school.
The problem is not law school employment outcomes. The problem is that the labor market in general is challenging for everyone, especially the young and inexperienced. Law graduates generally do better than similar college graduates, who in turn generally do better than similar high school graduates.
Law schools are not the employment story. The employment story is the debate about aggregate demand and fiscal stimulus, and how best to provide more workers with the benefits conferred by higher education.
4. There is no discussion of how law school employment reporting compares to employment reporting for other educational institutions or standard definitions of “employment” used by the government
Under standard definitions of employment used by the U.S. government and just about everyone else, employment counts as employment whether it is school-funded or not, whether it is long term or full time or not, whether it is highly paid or not.
The use of non-standard definitions by law schools makes law school difficult to compare to alternatives. This does not reflect higher or lower ethical standards—it simply reflects data collection and reporting practices that are not well thought out. It can bias the presentation of the data in a way that makes law school look worse relative to alternatives when in fact law school employment outcomes under standardized measurements are usually better than many likely alternatives.
The standard definition of employment is not the only interesting measure of outcomes, so law schools may also want to consider other measures. But any measure they use needs to be standardized and comparable across educational programs rather than used exclusively by law schools.
July 16, 2015
A new empirical article by Tom Ginsburg and Thomas J. Miles finds evidence of possible complementarity between scholarly output and quality of teaching at the University of Chicago.
From the conclusion:
The recent debate on the mission of American law schools has hinged on the assumption that a trade-off exists between teaching and research, and this article’s analysis, although limited in various ways, casts some doubt on that assumption.
Tom Ginsburg & Thomas J. Miles, The Teaching/Research Trade-Off in Law: Data From the Right Tail, 39 Evaluation Rev. 46 (2015).
July 14, 2015
Two Colorado law professors (actual scholars, not the notorious clown!) have undertaken an interesting longitudinal study of law school success, looking at data, though, from just two schools: Colorado and Case Western. It is informative about schools with similar profiles, but I wonder whether the results would hold at much stronger or much weaker schools.
(Thanks to Dean Rowan for the pointer.)
July 10, 2015
According to LSAC, June 2015 LSAT takers were up 6.6% from June 2014, the first increase since June 2010 and the biggest since June 2009. I wouldn't assume that this means we will see a significant increase in applicants, but it certainly seems likely that we've hit a plateau.