March 23, 2015
Labor economists have long cautioned against the misuse of Bureau of Labor Statistics (BLS) employment projections.
In 2004, Michael Horrigan at the BLS explained that the BLS projections should not be used to value education or to attempt to predict shortages or surpluses of educated labor. Instead, the value of education should be measured based on earnings premiums—the measure used in The Economic Value of a Law Degree and Timing Law School.
The general problem with addressing the question whether the U.S. labor market will have a shortage of workers in specific occupations over the next 10 years is the difficulty of projecting, for each detailed occupation, the dynamic labor market responses to shortage conditions. . . . Since the late 1970s, average premiums paid by the labor markets to those with higher levels of education have increased.
It is the growing distance, on average, between those with more education, compared with those with less, that speaks to a general preference on the part of employers to hire those with skills associated with higher levels of education.
The BLS takes the same position in its FAQ: it does not project labor shortages or surpluses.
In 2006, Richard Freeman back-tested the BLS projections and found that “the projections of future demands for skills lack the reliability to guide policies on skill development.”
The BLS employment projections are not only unreliable; comparing occupation-specific employment projections to the number of graduates in related fields also systematically underestimates the value of higher education.
In 2011 David Neumark, Hans Johnson, & Marisol Cuellar Mejia wrote:
If there are positive returns to education levels above those indicated as the skill requirement for an occupation in the BLS data – and especially if these wage premia are similar to those in other occupations – then relying on the BLS skill requirements likely substantially understates projected skill demands.
For nearly every occupational grouping, wage returns are higher for more highly-educated workers even if the BLS says such high levels of education are not necessary. For example . . . for management occupations, the estimated coefficients for Master’s, professional, and doctoral degrees are all above the estimated coefficient for a Bachelor’s degree, which is the BLS required level. . . .
If the BLS numbers are correct, we might expect to see higher unemployment and greater underemployment of more highly-educated workers in the United States. As noted earlier, we do not find evidence of this kind of underemployment based on earnings data. Similarly, labor force participation rates are higher and unemployment rates are lower for more highly educated workers.
Neumark et al. also noted that recent BLS projections appeared to be much too low for managerial and legal services occupations.
Starting around 2012 many law professors and pundits argued that the number of job openings for lawyers projected by the BLS relative to the number of expected law graduates suggested that too many students were attending law school and that they would not get much value out of their degrees.
The Bureau [of Labor Statistic]’s occupational employment projections . . . answer the very question that many law school applicants want to know: How many new lawyers will the economy be able to absorb this decade?
The Bureau currently estimates that the economy will create 218,800 job openings for lawyers and judicial law clerks during the decade stretching from 2010 through 2020. That number, unfortunately, falls far short of the number of aspiring lawyers that law schools are graduating.
The oversupply of entry-level lawyers deprives many graduates of any opportunity to practice law. At the same time, the lawyer surplus constrains entry-level salaries.
Merritt notes the possibility that law might be a versatile degree with value outside of legal practice.
Further evidence that law degrees are unlikely to become more valuable going forward can be found in the projections of the Bureau for Labor Statistics (BLS) . . . [which suggest many more law graduates than job openings].
In 2013, Brian Tamanaha wrote:
The U.S. Bureau of Labor Statistics estimates about 22,000 lawyer openings annually through 2020 (counting departures and newly created jobs). Yet law schools yearly turn out more than 40,000 graduates. This bleak job market coexists with astronomically high tuition.
Several scholars and journalists also started comparing BLS job opening projections to graduate numbers to make much the same argument.
In 2013, unaware of the problems with job openings projections, I (Simkovic) suggested that projections might be used to make adjustments to more objective historical baselines for risk-based student loan pricing.
On the chance that BLS projections, which perform poorly in other contexts, might perform well in the legal education context, Frank McIntyre and I analyzed the extent to which BLS projections predict law graduate outcomes (earnings premiums). The answer: they predict no better than random chance.
As in other areas, BLS employment projections are not reliable or meaningful for predicting earnings premiums and are therefore not useful for valuing legal education.
But what about the number of job openings for lawyers? Can BLS projections at least predict that reasonably well?
It is unclear at this point if the new job opening projections method will predict earnings premiums better than the old ones. In any case, that was never their intended purpose, and it would be safer to predict earnings premiums and value education based on historical earnings premiums.
It remains likely that many law school graduates will not practice law. Such has been the case in the past, and such is the case in other fields. Many engineering, math and science graduates do not work as engineers, mathematicians or scientists in their fields of study. Most fields of study do not have a one-to-one correspondence with a particular occupation, but are more broadly useful in the labor market, and law is no exception. In spite of many individuals working outside their degree fields, higher education typically has been, and likely will remain, an investment with positive returns.
The best way to tell whether there is too much or too little investment in education is to consider relative returns that take into account risk and variability in employment. Are the returns to education higher or lower than returns available elsewhere at similar levels of risk? The returns to education are generally much higher, and risk does not appear to explain the difference adequately. These high relative returns suggest underinvestment in education.
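The comparison described above can be sketched with a simple risk-adjusted return calculation. This is only an illustration of the reasoning, not the authors' methodology: the figures and the Sharpe-style ratio are assumptions introduced here to make the comparison concrete.

```python
# Toy comparison of risk-adjusted returns: is the return to education
# higher than alternatives of similar risk? All figures are hypothetical.

def risk_adjusted(mean_return, risk_free, volatility):
    """Excess return per unit of volatility (a Sharpe-style ratio)."""
    return (mean_return - risk_free) / volatility

# Hypothetical annual figures for illustration only.
education = risk_adjusted(mean_return=0.10, risk_free=0.03, volatility=0.12)
stocks = risk_adjusted(mean_return=0.07, risk_free=0.03, volatility=0.15)

# If education's ratio exceeds that of comparable alternatives, the
# pattern is consistent with underinvestment in education.
```

Under these illustrative numbers, education's ratio comes out higher, which is the shape of the argument in the text: higher returns that are not fully explained by higher risk.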
March 19, 2015
How can we test predictions about the future when we don’t yet have data showing what will happen in the future? One answer is hindcasting. You already believe in hindcasting if you believe in the science behind global warming (see also here and here).
“Hindcasting” (or “backtesting”) is using historical data to test prediction methods and it is widely used in finance, engineering, and climate science. The basic idea is that a prediction method can be reduced to a set of rules or mathematical formulas. Historical data from the more distant past can be fed into these rules and formulas, and the resulting predictions about the “future” (relative to the distant past that provided the data) will also be predictions about the past (relative to the period in which the researcher conducts the backtest).
Since data about the “future” is now available, predictions generated by the prediction method can be compared to what actually happened. A prediction method does not have to be correct all of the time to be useful; if a prediction method performs a bit better than random chance, it might still be useful in many contexts, especially in investment management. If it performs better than the next best prediction method, then it is still useful even if it is imperfect. But if a prediction method does not perform any better than random chance, it is discredited and discarded.
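The walk-forward logic described above can be sketched in a few lines of Python. This is a minimal illustration of backtesting in general, not the authors' actual analysis: the data series, the cutoff years, and the naive prediction rule are all hypothetical stand-ins.

```python
import random

random.seed(0)  # deterministic illustration

# Hypothetical annual series: year -> an observed outcome (e.g., a premium).
history = {year: 50 + random.gauss(0, 5) for year in range(1990, 2014)}

def naive_rule(train):
    """A stand-in prediction rule: forecast that next year moves in the
    same direction as the most recent observed change."""
    years = sorted(train)
    return train[years[-1]] >= train[years[-2]]  # True = predict "up"

def backtest(series, rule, start=2000):
    """Walk forward through time: at each cutoff, give the rule only the
    data available before that cutoff, then score its prediction against
    what actually happened the following year."""
    hits = trials = 0
    for cutoff in range(start, max(series)):
        train = {y: v for y, v in series.items() if y <= cutoff}
        predicted_up = rule(train)
        actually_up = series[cutoff + 1] >= series[cutoff]
        hits += (predicted_up == actually_up)
        trials += 1
    return hits / trials

accuracy = backtest(history, naive_rule)
# A rule that reliably beats ~0.5 out of sample may be useful;
# one that hovers at coin-flip accuracy is discarded.
```

The key discipline is that the rule never sees data from after its cutoff year, which is what makes the resulting accuracy an honest estimate of predictive power.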
Using this hindcasting approach, Frank McIntyre and I test popular prediction methods used by various pundits and professors to try to predict whether now is a good or bad time to go to law school. (See Timing Law School) As in our previous research, our primary outcome variable of interest is law earnings premiums—the earnings of law school graduates relative to the earnings of similar bachelor’s degree holders. This is the relevant measure, because it goes to the value added by law school, and can be compared to the cost of attendance.
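The earnings-premium measure just described can be made concrete with a short sketch. The dollar figures below are hypothetical, chosen only to illustrate the definition: the value added by a law degree is measured against similar bachelor's degree holders, not against zero.

```python
# Illustrative sketch of the earnings-premium concept.
# All dollar figures are hypothetical.

law_grad_earnings = 110_000   # hypothetical annual earnings, JD holder
similar_ba_earnings = 65_000  # hypothetical earnings, comparable BA holder

premium = law_grad_earnings - similar_ba_earnings   # absolute premium
premium_pct = premium / similar_ba_earnings         # relative premium

print(f"Earnings premium: ${premium:,} ({premium_pct:.0%})")
# The premium, accumulated over a career and discounted to present
# value, is what gets compared against the cost of attendance.
```

Measuring the premium against comparable bachelor's degree holders is what isolates the value added by the law degree itself from the value of the education its holders already had.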
The peer-reviewed labor economics literature finds that a law degree has been a lucrative investment for the overwhelming majority of law school graduates compared to entering the labor market with just a bachelor’s degree. Nevertheless, questions persist about whether now is an unusually good or bad time to start law school.
According to one popular hypothesis, now is an unusually bad time to go to law school because employment outcomes for recent graduates 9 months after graduation have deteriorated. These graduates, it is argued, will not have the same career success as law school graduates in the past. Moreover, deterioration in outcomes for those who graduated last year predicts poor outcomes three or four years in the future and beyond for those who are entering law school now.
According to another popular hypothesis, now is an unusually good time to go to law school because so few people are doing it. When these small cohorts of law students eventually graduate, they will all be more likely to find a high paying job than the larger cohorts of the past. A variation on this argument is that now is still a bad time to go to law school in spite of falling enrollments because the number of law school graduates will still be greater than the number of BLS projected job openings for lawyers. (For a discussion of newer BLS projection methods showing more job openings, see here)
Our analysis includes graduates from 1964 through 2008 and earnings data from 1984 to 2013. This period captures numerous economic booms and recessions. As in The Economic Value of a Law Degree, our main source of data is the U.S. Census Bureau’s Survey of Income and Program Participation. We were able to backfill the data to include older versions of the survey and capture more years of macroeconomic variation thanks to grant funding from Access Group, Inc., and LSAC. (Because the older data has some limitations, those who are interested in the value of a law degree rather than the size of cohort effects should still consult our 2014 article).
None of the prediction methods we tested perform better than random chance. Cohort size is not predictive. Cohort size relative to BLS projections is equally useless. Although those who graduate in a boom, when unemployment is low, do indeed have higher earnings premiums in their first few years after graduation than those who graduate when unemployment is high, the effect fades after the first four years. More importantly, it is not possible to predict whether unemployment will be high or low four years in the future based on currently available data. Even those who are unlucky enough to graduate into a weak economy still generally benefit substantially from their law degrees.
Delaying law school to attempt to “time the market” is an imprudent strategy. It does not improve one’s chances of graduating into a favorable economic climate. It entails substantial opportunity costs in the form of fewer years of higher, post-law-school earnings. The cost of every year of delay averages tens of thousands of dollars. Popular prediction methods for market timing are not only scientifically baseless; they also appear to be financially toxic to prospective students who take them seriously.
The best guide to the future continues to be the long-term historical data. Short-term fluctuations around these averages are not readily predictable. Instead of trying to predict the unpredictable, it may be more prudent to focus on helping students manage these risks, for example through insurance programs similar to Income-Based Repayment of student loans. (See also here)
But what about more recent graduates? How much can we say about those who graduated after 2008, and is this time different? How can we explain our results in light of previous research on cohort effects focused on bachelor’s degree holders?
For answers to some of these questions, look for our next blog post.
January 16, 2015
Reader Gerard Ambreson writes:
In your capacity of chief commentator on legal education, perhaps you could provide your thoughts on your blog on named professorships/chairs. Some questions it would be interesting to see you address:
What should the criteria for awarding chairs be? For example, should schools take into account things besides scholarship, such as other contributions to the school, like excellent teaching or service? Is it appropriate to consider non-legal writings for chairs in law schools? See, for example, St. John's Law School's Reverend Joseph T. Tinnelly, C.M., Professor of Law, Lawrence Joseph, who may be better known for his poetry than his writing on legal matters?
My impression is that criteria vary with school and often with the particular endowment--many schools have named positions to recognize excellent teaching, for example.
December 04, 2014
MOVING TO FRONT FROM DEC. 1--THANKS TO THOSE WHO HAVE ALREADY COMMENTED, I WOULD LOVE TO HEAR FROM SOME OTHER TITLE IX EXPERTS ON THESE ISSUES
In my other academic field, philosophy, there has been much discussion of the move by the University of Colorado at Boulder to fire a tenured philosophy professor (David Barnett) for "retaliation" against a female complainant in a sexual assault case. A university investigation found against a male graduate student in philosophy (with whom Barnett had worked); Barnett conducted his own investigation of the university's investigation, and sent the University Chancellor a 38-page report alleging mistakes and misconduct in the university investigation. (A copy of this report has not been made public to my knowledge.)
So what constitutes "retaliation" under Title IX? Can alleging a university investigation was flawed constitute retaliation? How does "retaliation" under Title IX interact with the First Amendment rights of faculty and students? Any insight from readers would be welcome.
October 08, 2014
Barry Friedman (NYU) writes with an excellent set of questions and observations:
Here’s a thought worth maybe tooting on your blog. It never ceases to catch my attention how much school hiring is driven by signals from other schools. School X will interview candidate Y and love him/her, or will love him/her on paper, but will never move forward for an interview absent a strong signal from some number of schools they consider competitive. Yet, in this tight market, those signals get fewer – especially at the call back and offer stage. It has the effect I think of killing candidates that otherwise would get interviews or offers. Yet, paradoxically, if schools had confidence in their internal assessments (and it is not like this is one person deciding; it is an entire faculty or faculty committee) this sort of market provides a real opportunity to steal that person you loved without a fight.
So why do schools do this? I think in most cases it is because they lack confidence in their own judgments. But what do readers think? I would prefer signed comments, but you must, in any case, include a valid e-mail address, which will not appear.
April 16, 2014
...that things aren't as awful as the various charlatans and other law-school haters claim, and, predictably (given the social psychology), the charlatans and haters go crazy. I won't link to the hysterical reactions (they are easy enough to find with Google), but they boil down to one complaint: Chemerinsky & Menkel-Meadow cited NALP data without treating it as bogus (e.g., that JD Advantage jobs are really jobs [actually many of them are, but never mind]). That's true, they linked to the NALP data, but they didn't spend the rest of their piece debunking that data based on speculation, skepticism, and occasionally other actual evidence. This has certainly been a standing problem in the debate about American legal education, as when serious data analysis showed that legal education was a sound economic investment for the vast majority of students, and critics refused to believe that was true, though without any contrary evidence or analysis. So we can all agree that we should be more careful about how we present data and its import.
That being said, my main disagreement with Chemerinsky & Menkel-Meadow is about the necessity of three years of legal education, as I've said before: two years could work, and work very well for many students. In reality, the biggest obstacle to reducing costs in legal education, however, is unnoted in their op-ed: it remains the lax tenure standards and the unwillingness of universities to terminate tenured faculty for cause, i.e., when they manifestly do not do their job.
Imagine, for example, a law school that pays a six figure salary (closing in on 200K) to someone with almost no legal experience and an M.A. in literature who teaches the same couple of substantive courses year in and year out, courses in which he has no experience, whose teaching evaluations are consistently below average, who hasn't written any serious legal scholarship in years, who is regarded as a joke by his colleagues at his own school and in the academy at large, and who mostly spends his time insulting, defaming, and blackmailing colleagues who do their jobs. It endangers the institution of tenure when universities do not initiate proceedings to terminate malevolent charlatans like this. Many law schools, as we've noted before, are offering financial inducements to "buy out" senior faculty, most of whom are not charlatans. Real cost reduction, however, will require universities to move against the charlatans and the de facto retired in their midst, even those who have tried to insulate themselves from termination for cause by setting up frivolous retaliation claims.
UPDATE: More thoughts on reforming legal education from Michael Madison (Pitt).
April 10, 2014
Anyone following Al Brophy's reports on the LSAC data will notice that, while applications are still down from last year, they are down a bit less with each subsequent report. That's consistent with anecdotal reports from colleagues who teach undergraduates and who report being asked to write letters of recommendation later and later in the season than just a few years ago. One surmises that at least part of what is happening is that (1) students wavering about going to law school are realizing that they don't have other tangible professional plans, and (2) students are realizing their chances of getting good admissions offers--either in terms of the caliber of the school and/or the cost--are much better this year than just a few years ago. Along with this indicator, I suspect the decline in applications is about to bottom out. It will still take a couple more years, though, for most law schools to begin hiring new faculty again given the dramatic decline in applications and enrollments of the last few years.