March 26, 2015
Law schools and prospective law students may be paying more attention to employment outcomes shortly after graduation than this short-term data deserves. One potential use of aggregate data about entry-level employment and salaries is to assess whether now is a good or bad time to apply to law school. But fluctuations in employment outcomes for recent graduates do not predict fluctuations in employment outcomes three or four years in the future, when those currently deciding whether to enroll would graduate.
Nevertheless, law students and the press pay close attention to the short-term outcome data. Starting salary data from the National Association for Law Placement (NALP) is covered by the press and is a good predictor of the number of law school applicants two years later (we assume a one-year lag for data collection and dissemination, and a one-year lag to apply to law school).
Why are students responding to this data even though it does not predict their own short-term outcomes? And does the responsiveness of enrollment to short-term outcomes mean that law students care only about the short term?
Law students likely think more long term. If law students were so impatient that they only cared about one or a few years of earnings, it is doubtful that law students would have completed college, since college also makes sense only as a long-term investment. Indeed, students who were so focused on the short term might not even have finished high school. While temporal preferences can change over time, education appears to shift people toward thinking more long term. Aging from adolescence through the age of 30 is also associated with becoming more oriented toward the future.
Perhaps students are focused on the short term because they mistakenly believe that swings in short term outcomes predict more than they do. Students would not be alone in this error.
Some widely read back-of-the-envelope analyses started with initial salaries, assumed unrealistically low earnings growth along with high discount rates or an arbitrary payback period (reflecting a lack of concern for the future), and reached the erroneous conclusion that going to law school does not make sense financially. (For a discussion, see here; for examples of erroneous studies, see here and here.)
Students may be focused on the short term because they mistakenly believe it predicts more than it does. Or they may focus on the short term because it is the only information that is readily available to them.
Legal educators and the press can and should make greater efforts to inform students of the long-term, as opposed to short-term, consequences of legal education. We should also shift the discussion away from raw outcomes and toward estimates of causation and value-added relative to the next best option.
This will be a challenge. Short-term raw outcome data is embedded in American Bar Association-required disclosures, in NALP’s data collection efforts and in the U.S. News rankings. Thinking in value-added terms requires us all to understand basic principles of causal inference and labor economics. But shifting toward long-term value added is ultimately the right thing to do if we are serious about providing students with meaningful disclosure and facilitating informed decision making.
This is not meant to justify indifference to the plight of young people who have suffered the misfortune of graduating into an unfavorable economic climate over the last several years. To help alleviate youth unemployment, we must understand that the cause of this misfortune is the macro-economy, not higher education. Education is an important part of the solution. Among those who are young and inexperienced, those with more education continue to do better in the labor market than those with less, and this difference appears to be largely caused by the differences in level of education.
Insurance programs like income-based repayment of student loans and flexible and extended repayment plans can help young people manage the unpredictable and uncontrollable risk that they might happen to graduate into a bad economy. If this insurance leads to more people pursuing higher education, earning higher incomes, and paying more taxes, it will benefit not only students and educators, but also the federal government and the broader economy.
March 25, 2015
Many legal educators believe that shrinking class sizes will help the students they do admit find higher paid work more easily and boost the value of legal education. They reason that if the supply of law graduates shrinks, then the market price law graduates can command should increase.
According to another hypothesis, now popular in the press, a decline in the number of law school applicants reflects the wisdom of the crowds. Students now realize that a law degree simply isn’t worth it, and smaller class sizes reflect a consensus prediction of worse outcomes for law graduates in the future.
Frank McIntyre and I investigated whether changes in law cohort size predict earnings premiums. Historically, they have not. Not for recent graduates, and not for law graduates overall. Nor have changes in cohort size predicted much of anything about the entry-level measures used by the National Association for Law Placement (NALP)—starting salary, initial employment, initial law firm employment.
How can both of these theories be wrong? One possibility is that they are both right, but the two effects offset each other. This seems unlikely, however. If neither macroeconomic data nor Bureau of Labor Statistics (BLS) employment projections can predict law employment conditions at graduation, then how likely is it that recent college graduates with less information and less expertise could make a better prediction?
A more likely possibility is that there are other factors at play that prevent any strong predictions about the relationship between cohort size and outcomes or value added. For example, law schools may become less selective as cohort size shrinks and more selective as it increases. In addition, the resources available to law schools, and therefore the quality of education and training they are able to provide, may also change with cohort size. Since physical facilities expenses are not particularly variable in the ordinary course, most budgetary adjustments at law schools presumably take place with respect to personnel.
Anecdotally, many law schools appear to be managing the recent decline in enrollments by shrinking their faculties and administrations and using remaining personnel to teach classes and perform functions outside of their areas of expertise. Reduced specialization and a lack of economies of scale could affect the quality of service provided to students, offsetting any benefits to students from less competition at graduation.
Previous research in labor economics has found that resources per student are an important predictor of value added by college education, and that the use of adjuncts can lead to worse outcomes for students. (See here for a review)
Much of this is speculative—we do not yet understand why changes in cohort size do not predict law graduate outcomes, only that they do not predict outcomes. Given the historical data, it is probably not advisable to read too much into what the decline in law school enrollment means for students who will graduate over the next few years.
Instead, we should focus on the long-term historical data and the value of a law degree across economic cycles and enrollment levels.
March 23, 2015
Labor economists have long cautioned against the misuse of Bureau of Labor Statistics (BLS) employment projections.
In 2004, Michael Horrigan at the BLS explained that the BLS projections should not be used to value education or to attempt to predict shortages or surpluses of educated labor. Instead, the value of education should be measured based on earnings premiums—the measure used in The Economic Value of a Law Degree and Timing Law School.
The general problem with addressing the question whether the U.S. labor market will have a shortage of workers in specific occupations over the next 10 years is the difficulty of projecting, for each detailed occupation, the dynamic labor market responses to shortage conditions. . . . Since the late 1970s, average premiums paid by the labor markets to those with higher levels of education have increased.
It is the growing distance, on average, between those with more education, compared with those with less, that speaks to a general preference on the part of employers to hire those with skills associated with higher levels of education.
The BLS takes the same position in its FAQ. The BLS does not project labor shortages or surpluses.
In 2006, Richard Freeman back-tested the BLS projections and found that “the projections of future demands for skills lack the reliability to guide policies on skill development.”
The BLS employment projections are not only unreliable; comparing occupation-specific employment projections to the number of graduates in related fields also systematically underestimates the value of higher education.
In 2011 David Neumark, Hans Johnson, & Marisol Cuellar Mejia wrote:
If there are positive returns to education levels above those indicated as the skill requirement for an occupation in the BLS data – and especially if these wage premia are similar to those in other occupations – then relying on the BLS skill requirements likely substantially understates projected skill demands.
For nearly every occupational grouping, wage returns are higher for more highly-educated workers even if the BLS says such high levels of education are not necessary. For example . . . for management occupations, the estimated coefficients for Master’s, professional, and doctoral degrees are all above the estimated coefficient for a Bachelor’s degree, which is the BLS required level. . . .
If the BLS numbers are correct, we might expect to see higher unemployment and greater underemployment of more highly-educated workers in the United States. As noted earlier, we do not find evidence of this kind of underemployment based on earnings data. Similarly, labor force participation rates are higher and unemployment rates are lower for more highly educated workers.
Neumark et al. also noted that recent BLS projections appeared to be much too low for managerial and legal services occupations.
Starting around 2012, many law professors and pundits argued that the number of job openings for lawyers projected by the BLS relative to the number of expected law graduates suggested that too many students were attending law school and that they would not get much value out of their degrees.
The Bureau [of Labor Statistics]’s occupational employment projections . . . answer the very question that many law school applicants want to know: How many new lawyers will the economy be able to absorb this decade?
The Bureau currently estimates that the economy will create 218,800 job openings for lawyers and judicial law clerks during the decade stretching from 2010 through 2020. That number, unfortunately, falls far short of the number of aspiring lawyers that law schools are graduating.
The oversupply of entry-level lawyers deprives many graduates of any opportunity to practice law. At the same time, the lawyer surplus constrains entry-level salaries.
Merritt notes the possibility that law might be a versatile degree with value outside of legal practice.
Further evidence that law degrees are unlikely to become more valuable going forward can be found in the projections of the Bureau of Labor Statistics (BLS) . . . [which suggest many more law graduates than job openings].
In 2013, Brian Tamanaha wrote:
The U.S. Bureau of Labor Statistics estimates about 22,000 lawyer openings annually through 2020 (counting departures and newly created jobs). Yet law schools yearly turn out more than 40,000 graduates. This bleak job market coexists with astronomically high tuition.
Several commentators and journalists also began comparing BLS projected job openings with graduate numbers to make much the same argument.
In 2013, unaware of the problems with job openings projections, I (Simkovic) suggested that projections might be used to make adjustments to more objective historical baselines for risk-based student loan pricing.
On the chance that BLS projections, which perform poorly in other contexts, might perform well in the legal education context, Frank McIntyre and I analyzed the extent to which BLS projections predict law graduate outcomes (earnings premiums). The answer: they predict no better than random chance.
As in other areas, BLS employment projections are not reliable or meaningful for predicting earnings premiums and are therefore not useful for valuing legal education.
But what about the number of job openings for lawyers? Can BLS projections at least predict that reasonably well?
It is unclear at this point if the new job opening projections method will predict earnings premiums better than the old ones. In any case, that was never their intended purpose, and it would be safer to predict earnings premiums and value education based on historical earnings premiums.
It remains likely that many law school graduates will not practice law. Such has been the case in the past, and such is the case in other fields. Many engineering, math and science graduates do not work as engineers, mathematicians or scientists in their fields of study. Most fields of study do not have a one-to-one correspondence with a particular occupation, but are more broadly useful in the labor market, and law is no exception. In spite of many individuals working outside their degree fields, higher education typically has been, and likely will remain, an investment with positive returns.
The best way to tell whether there is too much or too little investment in education is to consider relative returns that take into account risks and variability in employment. Are the returns to education higher or lower than returns that can be had elsewhere with similar levels of risk? The returns to education are generally much higher, and risk does not appear to explain this difference adequately. The high relative returns to education suggest underinvestment in education.
March 19, 2015
How can we test predictions about the future when we don’t yet have data showing what will happen in the future? One answer is hindcasting. You already believe in hindcasting if you believe in the science behind global warming (see also here and here).
“Hindcasting” (or “backtesting”) is using historical data to test prediction methods and it is widely used in finance, engineering, and climate science. The basic idea is that a prediction method can be reduced to a set of rules or mathematical formulas. Historical data from the more distant past can be fed into these rules and formulas, and the resulting predictions about the “future” (relative to the distant past that provided the data) will also be predictions about the past (relative to the period in which the researcher conducts the backtest).
Since data about the “future” is now available, predictions generated by the prediction method can be compared to what actually happened. A prediction method does not have to be correct all of the time to be useful; if a prediction method performs a bit better than random chance, it might still be useful in many contexts, especially in investment management. If it performs better than the next best prediction method, then it is still useful even if it is imperfect. But if a prediction method does not perform any better than random chance, it is discredited and discarded.
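The mechanics of a hindcast can be sketched in a few lines of code. This is a toy illustration only: the cohort sizes and earnings premiums below are invented round numbers, and the rule being tested (a shrinking cohort predicts a rising earnings premium) is a stand-in for the actual prediction methods evaluated in the paper.

```python
# Toy historical series: (year, cohort_size, earnings_premium).
# All values are hypothetical, chosen only to show the procedure.
history = [
    (1984, 40_000, 1.50), (1988, 43_000, 1.42), (1992, 44_000, 1.55),
    (1996, 42_000, 1.48), (2000, 39_000, 1.60), (2004, 45_000, 1.45),
    (2008, 47_000, 1.52), (2012, 41_000, 1.49),
]

def rule_predicts_up(prev_cohort, cohort):
    """Prediction rule under test: a shrinking cohort predicts a
    rising earnings premium for the next observed cohort."""
    return cohort < prev_cohort

# Hindcast: apply the rule at each point in the past, then score it
# against the outcome that actually followed.
hits = trials = 0
for i in range(1, len(history) - 1):
    _, prev_size, _ = history[i - 1]
    _, size, premium = history[i]
    _, _, next_premium = history[i + 1]
    predicted_up = rule_predicts_up(prev_size, size)
    actually_up = next_premium > premium
    hits += (predicted_up == actually_up)
    trials += 1

hit_rate = hits / trials
print(f"rule hit rate: {hit_rate:.2f} (random chance ~0.50)")
```

A rule worth keeping would need a hit rate reliably above what a coin flip achieves on the same data; on this invented series, the cohort-size rule does no better than chance.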
Using this hindcasting approach, Frank McIntyre and I test popular prediction methods used by various pundits and professors to try to predict whether now is a good or bad time to go to law school. (See Timing Law School) As in our previous research, our primary outcome variable of interest is law earnings premiums—the earnings of law school graduates relative to the earnings of similar bachelor’s degree holders. This is the relevant measure, because it goes to the value added by law school, and can be compared to the cost of attendance.
The peer-reviewed labor economics literature finds that a law degree has been a lucrative investment for the overwhelming majority of law school graduates compared to entering the labor market with just a bachelor’s degree. Nevertheless, questions persist about whether now is an unusually good or bad time to start law school.
According to one popular hypothesis, now is an unusually bad time to go to law school because employment outcomes for recent graduates 9 months after graduation have deteriorated. These graduates, it is argued, will not have the same career success as law school graduates in the past. Moreover, deterioration in outcomes for those who graduated last year predicts poor outcomes three or four years in the future and beyond for those who are entering law school now.
According to another popular hypothesis, now is an unusually good time to go to law school because so few people are doing it. When these small cohorts of law students eventually graduate, they will all be more likely to find a high paying job than the larger cohorts of the past. A variation on this argument is that now is still a bad time to go to law school in spite of falling enrollments because the number of law school graduates will still be greater than the number of BLS projected job openings for lawyers. (For a discussion of newer BLS projection methods showing more job openings, see here)
Our analysis includes graduates from 1964 through 2008 and earnings data from 1984 to 2013. This period captures numerous economic booms and recessions. As in The Economic Value of a Law Degree, our main source of data is the U.S. Census Bureau’s Survey of Income and Program Participation. We were able to backfill the data to include older versions of the survey and capture more years of macroeconomic variation thanks to grant funding from Access Group, Inc., and LSAC. (Because the older data has some limitations, those who are interested in the value of a law degree rather than the size of cohort effects should still consult our 2014 article).
None of the prediction methods we tested perform better than random chance. Cohort size is not predictive. Cohort size relative to BLS projections is equally useless. Although those who graduate in a boom, when unemployment is low, do indeed have higher earnings premiums in their first few years after graduation than those who graduate when unemployment is high, the effect fades after the first four years. More importantly, it is not possible to predict whether unemployment will be high or low four years in the future based on currently available data. Even those who are unlucky enough to graduate into a weak economy still generally benefit substantially from their law degrees.
Delaying law school to attempt to “time the market” is an imprudent strategy. It does not improve one’s chances of graduating into a favorable economic climate. It entails substantial opportunity costs in the form of fewer years of higher, post-law-school earnings. The cost of every year of delay averages tens of thousands of dollars. Popular prediction methods for market timing are not only scientifically baseless; they also appear to be financially toxic to prospective students who take them seriously.
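The opportunity cost of delay can be made concrete with back-of-the-envelope arithmetic. The figures below are hypothetical round numbers chosen for illustration, not estimates from the study: assume a $40,000 annual earnings premium, a 3% discount rate, three years of law school, and a fixed retirement date, so that delaying enrollment by a year simply drops one premium year from the end of the stream.

```python
def pv_premium(first_premium_year, retire_year=40,
               annual_premium=40_000, r=0.03):
    """Present value of the law-degree earnings premium if it begins in
    `first_premium_year` and runs until a fixed retirement date.
    All parameters are illustrative assumptions."""
    return sum(annual_premium / (1 + r) ** t
               for t in range(first_premium_year, retire_year))

# Enroll now: three years of school, premium begins in year 3.
# Delay a year: premium begins in year 4, retirement date unchanged,
# so one year of the premium stream is lost.
cost_of_delay = pv_premium(3) - pv_premium(4)
print(f"approximate cost of a one-year delay: ${cost_of_delay:,.0f}")
```

Under these assumptions the present-value cost of a single year of delay is on the order of tens of thousands of dollars, consistent with the figure cited above; different premium and discount-rate assumptions change the number but not the order of magnitude.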
The best guide to the future continues to be the long-term historical data. Short-term fluctuations around these averages are not readily predictable. Instead of trying to predict the unpredictable, it may be more prudent to focus on helping students manage these risks, for example through insurance programs similar to Income-Based Repayment of student loans. (See also here)
But what about more recent graduates? How much can we say about those who graduated after 2008, and is this time different? How can we explain our results in light of previous research on cohort effects focused on bachelor’s degree holders?
For answers to some of these questions, look for our next blog post.
January 16, 2015
Reader Gerard Ambreson writes:
In your capacity of chief commentator on legal education, perhaps you could provide your thoughts on your blog on named professorships/chairs. Some questions it would be interesting to see you address:
What should the criteria for awarding chairs be? For example, should schools take into account things besides scholarship, such as other contributions to the school, like excellent teaching or service? Is it appropriate to consider non-legal writings for chairs in law schools? See, for example, St. John's Law School's Reverend Joseph T. Tinnelly, C.M., Professor of Law, Lawrence Joseph, who may be better known for his poetry than his writing on legal matters.
My impression is that criteria vary with school and often with the particular endowment--many schools have named positions to recognize excellent teaching, for example.
December 04, 2014
MOVING TO FRONT FROM DEC. 1--THANKS TO THOSE WHO HAVE ALREADY COMMENTED, I WOULD LOVE TO HEAR FROM SOME OTHER TITLE IX EXPERTS ON THESE ISSUES
In my other academic field, philosophy, there has been much discussion of the move by the University of Colorado at Boulder to fire a tenured philosophy professor (David Barnett) for "retaliation" against a female complainant in a sexual assault case. A university investigation found against a male graduate student in philosophy (with whom Barnett had worked); Barnett conducted his own investigation of the university's investigation, and sent the University Chancellor a 38-page report alleging mistakes and misconduct in the university investigation. (A copy of this report has not been made public to my knowledge.)
So what constitutes "retaliation" under Title IX? Can alleging a university investigation was flawed constitute retaliation? How does "retaliation" under Title IX interact with the First Amendment rights of faculty and students? Any insight from readers would be welcome.
October 08, 2014
Barry Friedman (NYU) writes with an excellent set of questions and observations:
Here’s a thought worth maybe tooting on your blog. It never ceases to catch my attention how much school hiring is driven by signals from other schools. School X will interview candidate Y and love him/her, or will love him/her on paper, but will never move forward for an interview absent a strong signal from some number of schools they consider competitive. Yet, in this tight market, those signals get fewer – especially at the call back and offer stage. It has the effect I think of killing candidates that otherwise would get interviews or offers. Yet, paradoxically, if schools had confidence in their internal assessments (and it is not like this is one person deciding; it is an entire faculty or faculty committee) this sort of market provides a real opportunity to steal that person you loved without a fight.
So why do schools do this? I think in most cases it is because they lack confidence in their own judgments. But what do readers think? I would prefer signed comments, but you must, in any case, include a valid e-mail address, which will not appear.