Friday, April 17, 2015
MOVING TO FRONT--ORIGINALLY POSTED AUGUST 22, 2014
These are appointments with tenure that will begin in 2015; I will move this to the front at various intervals during the year; recent additions are bolded. There should be a senior departure from Yale to announce before the week is out.
*Owen Anderson (oil & gas law, natural resources) from the University of Oklahoma, Norman to the University of Texas, Austin.
*Jennifer Bard (health law, constitutional law) from Texas Tech University to the University of Cincinnati (to become Dean).
*Christopher Buccafusco (intellectual property, behavioral/experimental law & economics) from Chicago-Kent College of Law to Cardozo Law School.
*Aaron Bruhl (legislation, statutory interpretation, federal courts) from the University of Houston to the College of William & Mary.
*Irene Calboli (intellectual property, international trade, comparative law) from Marquette University to Texas A&M University.
*Joshua Cohen (political philosophy) resigned from Stanford University (where he taught in Law, Philosophy & Political Science) in October 2014 to join Apple University. He will now also be part-time at the University of California, Berkeley.
*Matthew Diller (administrative law, social welfare law & policy) from Cardozo Law School to Fordham University (as Dean).
*Marcella David (international law, foreign relations law) from the University of Iowa to Florida A&M University (as Provost).
*William Dodge (international law, international transactions, international dispute resolution) from the University of California, Hastings to the University of California, Davis.
*Susan Fortney (legal ethics, legal professions, legal malpractice, bioethics, torts) from Hofstra University to Texas A&M University.
*Brian Galle (tax) from Boston College to Georgetown University.
*Nuno Garoupa (law & economics, comparative law) from the University of Illinois to Texas A&M University.
*Elizabeth Garrett (legislation, administrative law) from the University of Southern California to Cornell University (to become President).
*Andrew Guzman (international law and trade, law & economics) from the University of California, Berkeley to the University of Southern California (as Dean).
*C. Scott Hemphill (antitrust, intellectual property, law & economics) from Columbia University to New York University.
*Sonia Katyal (intellectual property, civil rights, privacy, property, law & sexuality) from Fordham University to the University of California, Berkeley.
*Daniel Katz (empirical legal studies, computational legal studies, criminal procedure) from Michigan State University to Chicago-Kent College of Law.
*Paul Kirgis (alternative dispute resolution, evidence) from St. John's University to the University of Montana (to become Dean).
*Gillian Lester (employment law) from the University of California, Berkeley to Columbia University (as Dean in January 2015).
*Erik Luna (criminal law & procedure) from Washington & Lee University to Arizona State University.
*Glynn S. Lunney, Jr. (intellectual property, law & economics) from Tulane University to Texas A&M University.
*Timothy Lytton (regulatory law and policy, administrative law, torts) from Albany Law School to Georgia State University.
*Andrei Marmor (legal philosophy) from the University of Southern California to Cornell University.
*Andrea Matwyshyn (law & technology, cyberlaw, privacy) from the Wharton School at the University of Pennsylvania (untenured) to Northeastern University.
*Paul McGreal (constitutional law, law & religion, business ethics) from the University of Dayton to Creighton University (as Dean).
*Paul Ohm (law & technology, computer law, privacy, intellectual property) from the University of Colorado, Boulder to Georgetown University.
*Dave Owen (environmental law, natural resources, water law, administrative law) from the University of Maine to the University of California, Hastings.
*Mary-Rose Papandrea (constitutional law, media law, national security law) from Boston College to the University of North Carolina, Chapel Hill.
*Dylan Penningroth (legal history) from Northwestern University (History Dept.) and American Bar Foundation to the University of California, Berkeley.
*James Salzman (environmental law) from Duke University to the University of California, Los Angeles (Law) and the University of California, Santa Barbara (Environmental Science & Management).
*Michael Schill (property, real estate law, urban policy) from University of Chicago to the University of Oregon (as President).
*David Schwartz (patents, intellectual property, empirical legal studies) from Chicago-Kent College of Law to Northwestern University.
*Kenneth Simons (torts, criminal law, law & philosophy) from Boston University to the University of California, Irvine.
*Alexander Somek (EU law, comparative constitutional law, legal theory) from the University of Iowa to the University of Vienna.
*Eric Talley (corporate law, law & economics) from the University of California, Berkeley to Columbia University (in July 2015).
*Steve Vladeck (federal courts, national security law, constitutional law) from American University to the University of Texas, Austin (effective 2016).
*Melanie Wilson (criminal law, criminal procedure, evidence) from the University of Kansas to the University of Tennessee (as Dean).
*Peter Yu (intellectual property, communications law and policy, and comparative and international law) from Drake University to Texas A&M University.
*Kathryn Zeiler (torts, health law, law & economics, empirical legal studies) from Georgetown University to Boston University.
Tuesday, April 14, 2015
It breaks my heart to have to post this, since Mike Schill has been a terrific Dean here the last 5 1/2 years, but we all knew he was in demand elsewhere: he will be the new President of the University of Oregon, come July 1. Oregon is damn lucky, and I know I speak for everyone at Chicago in saying that Mike Schill will be greatly missed here.
Monday, April 13, 2015
Stephen Diamond (Santa Clara) has the details. He won't be missed, it's fair to say.
Saturday, April 11, 2015
Deborah Merritt and Kyle McEntee conflated “response rates” with nonresponse bias and response bias. After I brought this error to light, Professor Merritt explained that she and Mr. McEntee were not confused about basic statistical terminology, but rather were being intentionally vague in their critique to be more polite* to the law schools.
Professor Merritt also changed the topic of conversation from Georgetown’s employment statistics—which had been mentioned in The New York Times and discussed by me, Professor Merritt, and Kyle McEntee—to the employment statistics of the institution where I teach.**
What Professor Merritt meant to say is that law schools have not been properly weighting their data to take into account nonresponse bias. This is an interesting critique. However, proper weights and adjustments to data should take into account all forms of nonresponse bias and response bias, not just the issue of over-representation of large law firms in NALP salary data raised by Professor Merritt.
While such over-representation would have an effect on the mean, it is unclear how much impact, if any, it would have on reported medians—the measure of central tendency used by The New York Times and critiqued by Mr. McEntee.
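A toy simulation makes the point concrete (the salary figures below are invented for illustration, not actual NALP data): over-sampling large-firm respondents can inflate the mean substantially while leaving the median untouched.

```python
import statistics

# Toy example with made-up salary figures (not NALP data): 100 graduates,
# 15 of whom earn a large-firm salary of $160,000.
population = [55_000] * 60 + [75_000] * 25 + [160_000] * 15

# Suppose large-firm graduates respond to the salary survey at 80% and
# everyone else at 40%, so respondents over-represent the $160,000 group:
# 24 + 10 + 12 = 46 respondents.
respondents = [55_000] * 24 + [75_000] * 10 + [160_000] * 12

# The mean is pulled up by the over-sampled high salaries...
print("population mean:", statistics.mean(population))    # 75,750
print("respondent mean:", statistics.mean(respondents))   # ~86,739

# ...but the median is unchanged: more than half of each group earns $55,000.
print("population median:", statistics.median(population))
print("respondent median:", statistics.median(respondents))
```

Because the median depends only on the middle of the distribution, it is insensitive to over-representation in the upper tail so long as the over-sampled group remains above the midpoint.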
Other biases such as systematic under-reporting of incomes by highly educated individuals,*** under-reporting of bonuses and outside income, and the like should be taken into account.**** To the extent that these biases cut in opposite directions, they can offset each other. It’s possible that in aggregate the data are unbiased, or that the bias is much smaller than examination of a single bias would suggest.
Moreover, focusing on first-year salaries as indicative of the value of a lifetime investment is itself a bias. As The Economic Value of a Law Degree showed, incomes tend to rise rapidly among law graduates, and they do not appreciably decrease until the fourth decade of employment.
If Professor Merritt’s view is that differences between NALP, ABA, and U.S. Census Bureau data collection and reporting conventions make law school-collected data more difficult to compare to other data sources and make law school data less useful, then I am glad to see Professor Merritt coming around to a point I have made repeatedly.
I have gone further and suggested that perhaps the Census Bureau and other government agencies should be collecting all data for graduate degree programs to ensure the accuracy and comparability of data across programs and avoid wasting resources on duplicative data collection efforts.
This could also help avoid an undue focus on short-term outcomes. In light of the rapid growth of law graduates’ earnings as they gain experience, a focus on the short term can mislead students who are not aware of that growth trajectory and how it compares to the likely trajectory of earnings without a law degree.
** This tactic, bringing up the employment statistics of the institution where those with whom she disagrees teach, is something of a habit for Professor Merritt. (See her response to Anders Walker at St. Louis.)
*** Law graduates outside of the big firms are highly educated, high-income individuals compared to most other individuals in the United States. That is the benchmark researchers used when they identified the reporting biases in census data that lead to under-reporting of incomes.
**** The risk of under-reporting income in law may be particularly high because of opportunities for tax evasion for those who run small businesses or have income outside of their salary.
UPDATE (4/14/2015): I just confirmed with NALP that their starting salary data does not include end of year bonuses.
Friday, April 10, 2015
Did law schools behave unethically by providing employment and earnings information without simultaneously reporting survey response rates? Or is this standard practice?
The answer is that not reporting response rates is standard practice in communication with most audiences. For most users of employment and earnings data, response rates are a technical detail that is not relevant or interesting. The U.S. Government and other data providers routinely report earnings and employment figures separate from survey response rates.*
Sometimes, too much information can be distracting.** It’s often best to keep communication simple and focus only on the most important details.
Nonresponse is not the same thing as nonresponse bias. Law school critics do not seem to understand this distinction. A problem only arises if the individuals who respond are systematically different from those who do not respond along the dimensions being measured. Weighting and imputation can often alleviate these problems. The critics’ claims about the existence, direction, and magnitude of biases in the survey data are unsubstantiated.
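To make the weighting point concrete, here is a minimal sketch of the standard nonresponse adjustment; the job categories, counts, and salaries are hypothetical. Each respondent is weighted by the inverse of the response rate in his or her stratum, which restores each stratum's population share in the weighted estimate.

```python
# Hypothetical post-stratification example: for each employment category,
# (population size, number of respondents, mean reported salary).
strata = {
    "large firm": (150, 120, 160_000),   # 80% response rate
    "small firm": (500, 200, 65_000),    # 40% response rate
    "government": (350, 140, 55_000),    # 40% response rate
}

# Unweighted respondent mean: large-firm graduates are over-represented
# relative to the other strata.
total_resp = sum(resp for _, resp, _ in strata.values())
unweighted = sum(resp * sal for _, resp, sal in strata.values()) / total_resp

# Weighting each respondent by pop/resp (the inverse response rate) makes
# the weighted mean equal to the population-share-weighted mean.
total_pop = sum(pop for pop, _, _ in strata.values())
weighted = sum(pop * sal for pop, _, sal in strata.values()) / total_pop

print(f"unweighted mean: {unweighted:,.0f}")  # 86,739
print(f"weighted mean:   {weighted:,.0f}")    # 75,750
```

The sketch adjusts only for the single over-representation problem it builds in; as noted above, a full adjustment would need to account for all relevant nonresponse and response biases, not just one.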
High non-response rates to questions about income are not a sign of something amiss, but rather are normal and expected. The U.S. Census Bureau routinely finds that questions about income have lower response rates (higher allocation rates) than other questions.
Law school critics claim that law school graduates who do not respond to questions about income are likely to have lower incomes than those who do respond. This claim is not consistent with the evidence. To the contrary, high-income individuals often value privacy and are reluctant to share details about their finances.***
Another potential problem is “response bias”, in which individuals respond to survey questions in a way that is systematically different from the underlying value being measured. For example, some individuals may under-report or over-report their incomes.
The best way to determine whether or not we have nonresponse bias or response bias problems is to gather additional information about non-responders and responders.
Researchers have compared income reported to Census surveys with administrative earnings data from the Social Security Administration and Internal Revenue Service. They find that highly educated, high-income individuals systematically under-report their incomes, while less educated, lower-income individuals over-report (assuming the administrative data are more accurate than the survey data).
Part of the problem seems to be that bonuses are underreported, and bonuses can be substantial. Another problem seems to be that high-income workers sometimes report their take-home pay (after tax withholding and deductions for benefits) rather than their gross pay.
Other studies have also found that response bias and nonresponse bias lead to underestimation of earnings and employment figures.
In other words, there may indeed be biases in law school earnings data, but if there are, they are likely in the opposite direction of the one the law school critics have claimed.
Of course, the presence of such biases in law school data would not necessarily be a problem if the same biases exist in data on employment and earnings for alternatives to law school. After all, earnings and employment data is only useful when compared to a likely alternative to law school.
As with gross employment data, the critics are yet again claiming that an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government, is somehow scandalous when done by law schools.
The only thing the law school critics have demonstrated is their unfamiliarity with basic statistical concepts that are central to their views.
* Reporting earnings and employment estimates without response rates in communication intended for a general audience—and even some fairly technically sophisticated audiences—is standard practice for U.S. government agencies such as the U.S. Census Bureau and the U.S. Department of Labor, Bureau of Labor Statistics. A few examples below:
- Earnings and unemployment by education level
- Unemployment rates
- Employment population ratio
- Tabular summaries from
** Information on response rates is available for researchers working with microdata to develop their own estimates, and for those who want to scour the technical and methodological documentation. But response rates aren’t of much interest to most audiences.
*** After the JD researchers noted that young law graduates working in large urban markets—presumably a relatively high-income group—were particularly reluctant to respond to the survey. From After the JD III:
“Responses . . . varied by urban and rural or regional status, law school rank, and practice setting. By Wave 2, in the adjusted sample, the significant difference between respondents and nonrespondents continued to be by geographic areas, meaning those from larger legal markets (i.e. New York City) were less likely to respond to the survey. By Wave 3, now over 12 years out into practice, nonrespondents and respondents did not seem to differ significantly in these selected characteristics.”
In the first wave of the study, non-respondents were also more likely to be male and black. All in all, it may be hard to say what the overall direction of any nonresponse bias might be with respect to incomes. A fairly reasonable assumption might be that the responders and non-responders are reasonably close with respect to income, at least within job categories.
Thursday, April 9, 2015
It was another tight year on the law teaching market, probably even more difficult than last year. Happily, most of our candidates once again were successful at securing tenure-track positions. They are:
Laura Napoli Coordes '10 who will join the faculty at Arizona State University, where she is presently a VAP. She graduated with Honors from the Law School, where she was a member of the Law Review and served as a Legal Fellow at the Student Press Law Center upon graduation, before working at Weil, Gotshal & Manges as a bankruptcy associate in New York. Her teaching and research interests include bankruptcy, commercial law, contracts, and corporate finance.
Goldburn P. Maynard, Jr. '05, who will join the faculty at the University of Louisville. He was a member of the Law Review at Chicago, and also earned an LL.M. in tax at Northwestern. He was a tax associate at Skadden Arps in Chicago, and then an estate tax attorney with the I.R.S. for four years. He was a VAP at Washington University, St. Louis and at Florida State University. His research and teaching interests include federal tax, estates and trusts, and estate and gift tax.
Joshua Sellers '08 who will join the faculty at the University of Oklahoma, Norman. At Chicago, he was Articles Editor of the Law Review and also earned a Ph.D. in Political Science with a dissertation on "The 'Crown Jewel' at a Crossroads: Appraising the Contemporary Political Function of the Voting Rights Act." He clerked for Judge Barkett on the U.S. Court of Appeals for the Eleventh Circuit, and was an associate at Jenner & Block in Washington, D.C. for three years, where he primarily litigated insurance claims. Most recently, he was a post-doc in the Maxwell School of Public Policy at Syracuse University. His research and teaching interests include election law, civil rights, constitutional law, legislation, insurance law, and torts.
Sloan G. Speck '07 who will join the faculty at the University of Colorado, Boulder. He graduated with Honors from the Law School, where he also served as Articles Editor of the Law Review. He also earned an M.A. in History from Chicago (where he focused on the history of tax and business) and an LL.M. in tax (2010) from New York University, where he has been Acting Assistant Professor of Tax Law since 2013. He will receive his PhD in History from Chicago in 2016. He was a tax associate at Skadden Arps in Chicago for five years. His teaching and research interests include tax law and policy (including corporate and international tax), as well as legal and business history.
Matthew J. Tokson '08 who will join the faculty at the Salmon P. Chase College of Law at Northern Kentucky University. He graduated with High Honors and Order of the Coif from the Law School, where he served as both Executive Articles Editor and Book Review Editor of the Law Review. He clerked for Judge Randolph on the U.S. Court of Appeals for the D.C. Circuit, served first as a Kauffman Fellow then as a Bigelow Fellow at the Law School from 2009-2011, before clerking on the U.S. Supreme Court for both Justice Ginsburg and Justice Souter in 2011-12. He was also a litigation associate at WilmerHale in Washington, D.C. His teaching and research interests include criminal procedure, privacy, intellectual property, judicial behavior, criminal law and torts.
(I will have a separate post about our Bigelows and other Fellows once their situations are all settled; as always, they all have tenure-track offers.)
Wednesday, April 8, 2015
Opportunity costs and tradeoffs are foundational principles of micro-economics. Comparison between earnings with a law degree and earnings with likely alternatives to law school is the core of The Economic Value of a Law Degree.
In her recent post, Professor Merritt raises interesting questions about whether some students who now go to law school could have had more success elsewhere if they had majored in a STEM (Science Technology Engineering & Math) field rather than humanities or social sciences.
These questions, however, don’t invalidate our analysis. Some of those who major in STEM fields, of course, go on to law school, and our data suggest that they also receive a large boost to their earnings compared to a bachelor’s degree. Some studies suggest that among those who go to law school, STEM and economics majors earn more than the rest.
Research on college major selection reveals that many more individuals intend to major in STEM fields than ultimately complete those majors. STEM/Econ majors who persist have higher standardized test scores than humanities/social science majors at the same institution and also higher scores than those who switch from STEM/Econ to humanities or social science. Those who switch out of STEM received lower grades in their STEM classes than those who persist. Compared to Humanities and Social Science majors, the STEM majors spend more time studying, receive lower grades, and take longer to complete their majors.
In other words, many of the individuals who end up majoring in the humanities and social sciences may have attempted, unsuccessfully, to major in STEM fields. (For a review of the literature, see Risk Based Student Loans and The Knowledge Tax).
In The Economic Value of a Law Degree, Frank McIntyre and I investigated whether the subset of humanities majors who go to law school had unusually high earning potential and found no evidence suggesting this. The humanities majors who attend law school are about as much above the average humanities major in terms of earning potential as the STEM majors who attend law school are above the average STEM major.
In her recent post, Professor Merritt does not suggest alternatives to law school. Instead she selectively discusses occupations other than being a lawyer. These are generally very highly paid and desirable occupations, such as senior managerial roles, and many individuals who pursue such jobs will be unable to obtain them. In other words, these high paid jobs cited by Professor Merritt are not the likely alternative outcome for most of those who now go to law school if they chose another path. (Indeed, given the high earnings premium to law school including the 40 percent of graduates who do not practice law, a law degree probably increases the likelihood of obtaining highly paid jobs other than practicing law).
Occupations are outcomes. Education is a treatment. Students choose education programs (subject to restrictive admissions policies and challenges of completing different programs), but have more limited control over their ultimate occupation. Comparing occupations as if they were purely choices would be an error. Not every MBA who sets out to be a Human Resources Manager will land that job, just as not every law school graduate will become a lawyer at a big firm. Analysis of nationally representative data from the U.S. Census Bureau using standard statistical techniques from labor economics to consider realistic earnings opportunities--rather than selective focus on the very highest paid occupations tracked by the BLS--suggests that most of the folks who go to law school would be in much less attractive positions if they had stuck with a bachelor’s degree.
Frank McIntyre and I have previously noted the importance of additional research into how the value of a law degree varies by college major, and how the causal effect of different kinds of graduate degrees varies for different sorts of people.
We appreciate Professor Merritt’s interest in these issues and look forward to discussing them in the future when more methodologically rigorous research becomes available. Professor Merritt raises some interesting ancillary issues about response rates, but discussion of those issues will have to wait for a future post.
So far the leadership of Dean Andrew Morriss seems to already be paying dividends, with five senior hires:
Irene Calboli (intellectual property, international trade, and comparative law) from Marquette University.
Susan Fortney (legal ethics, legal professions, legal malpractice, bioethics and torts) from Hofstra University.
Nuno Garoupa (law & economics, comparative law) from University of Illinois.
Glynn S. Lunney, Jr. (intellectual property, law & economics) from Tulane University.
Peter Yu (intellectual property, communications law and policy, and comparative and international law) from Drake University.
After my first post on employment definitions, a law school dean emailed me to suggest that perhaps the ABA felt it needed to be extra tough because it was worried it couldn’t trust some of the law schools to make close judgment calls in categorizing employment data.
The Census Bureau does a wonderful job collecting and reporting earnings and employment data using standard methods and definitions. Why not empower the Census Bureau to collect the relevant data about law schools and all programs of higher education?
There are two potential uses of employment outcome data of law school graduates.
(1) Comparing law school to alternatives to law school
(2) Comparing law schools to each other
Census Bureau data is very well suited to the first use, and could also be useful for high-level information about geography or rank even if not for comparisons of individual institutions. If the Current Population Survey and the American Community Survey—which have larger sample sizes and release data more regularly than the Survey of Income and Program Participation—were expanded to include questions on graduate education field (e.g., law, medicine, business) as well as level (B.A., Ph.D., master’s, or professional degree), and specific information about the institution attended, or its caliber or geography, that would go a long way toward making law school data redundant. Census surveys will not have data on every law graduate, but as long as the sample is representative, that is not much of a problem.
The Census Bureau data would likely be superior to law school data in the most important respects because it would be comparable to data for those with other educational backgrounds. Since Census Bureau data is for a representative sample of the population, it would not encourage an unhealthy and misleading fixation on short-term outcomes.
As far as comparing individual law schools to each other, student loan default data from the Department of Education might serve this function at least as well as ABA data. To the extent we are concerned about poor outcomes at any particular law school, such poor outcomes will show up in higher student loan default rates.
Default rates will reflect outcomes not only for graduates, but also for those who fail to complete the program. This data would also not be sensitive to response bias on the low end—individuals who do not respond to their student loan bills will be counted as defaulters. Another advantage of this data is that it can be compared with other educational programs. Of course, we would still need to be mindful of the issue of selection versus causation. (Although we could quibble about how the Department of Education calculates its default rates (they publish more than one), the specifics of the definition are far less important than the fact that it is applied consistently across institutions, is used for comparative purposes, and is correlated with other validated measures).
If the Department of Education required colleges and universities to release separate default rate data for every field of graduate study (and perhaps for every college major), that would go a long way to helping inform students and increasing comparability of information about risk levels across programs. (I’ve discussed the merits of this kind of granular disclosure before).
The data won’t capture differences in the boost to earnings across law schools for students in the middle or high end of the distribution, since relatively few students default on their loans. It also won’t tell us anything about the students who don’t need to borrow. Nor will it tell us which schools have the strongest alumni networks in specific geographies or industries. That purpose might be better served by expanding longitudinal studies like After the JD, Baccalaureate and Beyond, National Longitudinal Survey of Youth, and the National Survey of College Graduates to include larger samples, better information about pre-law school differences in characteristics, and more long term information on post-graduate earnings and employment.
The Census Bureau’s ethics and incentives are unimpeachable. Putting data collection in its capable hands and into the hands of similar agencies charged with broad-based data collection would enable these agencies to do more of what they do best and free law schools from the burdens of a task they may not be well equipped to handle.
Resources that are now wasted collecting very precise but not very useful data about initial outcomes for law graduates could instead be redeployed to analyzing the higher quality data. (Or if we still think short term ABA and NALP data provide incremental value that exceeds the costs of collecting, reporting, and interpreting the data—and the costs of predictable misinterpretation and misuse—we could have that much more data to work with).
Food for thought.
Tuesday, April 7, 2015
Recently, The New York Times reported on law school and the legal profession based on hard data and peer reviewed research rather than anecdote and innuendo. The New York Times came to the conclusion that anyone looking honestly at the data would naturally come to—law school seems to be a pretty good investment, at least compared to a terminal bachelor’s degree.
Mr. McEntee suggests incorrectly that The New York Times reported Georgetown’s median private sector salary without providing information on what percentage of the class or of those employed were working in the private sector. (Mr. McEntee also seems to be confused about the difference between response rates—the percentage of those surveyed who respond to the survey or to a particular question—and response bias—whether those who respond to a survey are systematically different along the measured variable from those who do not).
The New York Times wrote:
Last year, 93.2 percent of the 645 students of the Georgetown Law class of 2013 were employed. Sixty percent of the 2013 graduates were in the private sector with a median starting salary of $160,000.
Deborah Merritt disputes the accuracy of these numbers, suggesting that the 60 percent figure applies to the 93.2 percent of the graduating class who were employed, not to the class as a whole. That would come to about 56 percent of the class employed in the private sector, a small enough difference that The New York Times may have simply rounded up.
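The arithmetic behind the 56 percent figure is simple to verify:

```python
employed_share = 0.932            # 93.2% of the class employed
private_share_of_employed = 0.60  # 60% of the employed in the private sector

# Share of the whole class employed in the private sector, under
# Professor Merritt's reading of the figures.
share_of_class = employed_share * private_share_of_employed
print(f"{share_of_class:.1%}")  # 55.9%, which rounds to 56%
```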
In any case, it is clear that The New York Times provided information about the percent of graduates working in the private sector.
Mr. McEntee also repeats the odd claim that by reporting employment numbers that appear to be close to consistent with the standard definition of “employment” established by the U.S. Census Bureau and promulgated internationally by the International Labor Organization, The New York Times is somehow misleading its readers.
To the contrary, it is Mr. McEntee’s non-standard definitions of employment, taken out of context, that are likely to mislead those attempting to compare law school statistics to the employment statistics of the next best alternative. Mr. McEntee discusses full-time employment statistics for law schools without noting that the full-time employment rate for law graduates is higher than the full-time employment rate for bachelor’s degree holders with similar levels of experience and backgrounds under consistent definitions and survey methods. And he overlooks the evidence that those who do not practice law still benefit from their law degrees.
Mr. McEntee also inaccurately describes my research with Frank McIntyre, claiming incorrectly that we do not take into account those who graduated after 2008. Timing Law School was specifically designed to address this limitation of our earlier research.
Timing Law School includes an analysis of two proxies for law school graduates from the American Community Survey: (1) young professional degree holders excluding those working in medical professions, and (2) young lawyers. This analysis includes individuals who graduated as recently as 2013, and finds no evidence of a decline in recent law graduates’ outcomes relative to those of similar bachelor’s degree holders. (See also here for a discussion of recent data for the subset of law graduates who work as lawyers).
Timing Law School also simulates the long term effects on the earnings premium of graduating into a recession based on the experiences of those who have graduated into previous recessions. The differences between graduating into a recession and graduating into an average economy are not very large (there is a large boost for those graduating into a boom, but booms and recessions are not predictable at the time of law school matriculation).
Moreover, in Timing Law School we find that fluctuations in short-term outcomes for recent graduates are not good predictors of outcomes for those who are currently deciding whether or not to enter law school; long term historical data is a better predictor.
The Economic Value of a Law Degree did not include data on those who graduated after 2008 because such data was not available in the Survey of Income and Program Participation. However, it did include earnings data through 2013, and found no evidence of the earnings premium for law graduates declining in recent years to below its historical average.
Frank and I have noted repeatedly that our analysis compares a law degree to a terminal bachelor’s degree and that we think an important area for future research is careful comparative analysis of alternate graduate degrees, being mindful of selection effects (read The Economic Value of a Law Degree or, for the most recent example, see our post from two days ago). While a casual (i.e., not causal) examination of raw data suggests that a law degree likely compares reasonably well to most alternatives other than a medical degree, we’ve noted that it’s possible that more rigorous analysis will reveal that another graduate degree is a better option for some prospective law students, especially when subjective preferences are taken into account along with financial considerations.
Mr. McEntee claims incorrectly that when it comes to other graduate degrees, “McIntyre and Simkovic don’t know and don’t care; they’re convinced that the value of a law degree [is] as immutable as the laws of nature.”
Mr. McEntee insists that law graduates, even at the higher ranked schools, will find it challenging to repay their student loans. However, data from After the JD shows that law school graduates from the class of 2000/2001 have been paying down their loans rapidly.
What about those who entered repayment more recently, when tuition was higher and job prospects less plentiful?
Data from the U.S. Department of Education shows that law students, even at low ranked law schools, remain much less likely to default than most student borrowers. This is true even though law students typically graduate with higher debt levels.
Indeed, The Economic Value of a Law Degree suggests that law graduates generally have higher incomes after taxes and after paying additional debt service than they likely would have had with a terminal bachelor’s degree, even before taking into account debt forgiveness available under Income Based Repayment plans.
Based in part on our research, private student lenders have noticed how unlikely law graduates are to fail to repay their loans. These lenders offer refinancing at substantially lower rates than those charged by the federal government, further reducing the costs of legal education for many graduates (while earning a profit in the process).
No matter what new information becomes available, Mr. McEntee insists that law school is financially disastrous. This is curious for a public figure who claims that his goal is providing prospective law students more accurate information about law school.
Monday, April 6, 2015
Somehow Paul Campos--fresh from his latest smear piece on Michael Simkovic--got an opinion piece in The New York Times reporting his latest discovery: namely, that state spending on higher education has increased, not declined, over the last several decades. How did he arrive at this contrarian conclusion? By looking at absolute dollars spent on education and largely ignoring or downplaying the increase in the number of students during this time. (And nothing, as usual, about Baumol's disease, which is crucial to any serious understanding of higher education costs.) We already knew Campos was not exactly an intellectual giant, but this latest muddle disappointed even my low expectations. For more detailed discussion, see IHE and Slate. And for some actual data on the growth in student population, try this, and on the decline in state spending, this.
(Thanks to Michael Simkovic for suggesting the title.)
UPDATE: CHE collects more responses to this idiocy.
ANOTHER: Still more on this fiasco, which will hopefully mean we won't have to hear from this ignoramus in a major forum again.
Sunday, April 5, 2015
The choice of whether or not to go to law school is always a choice between law school and the next best alternative. College graduates do not vanish from the face of the earth if they choose not to go to law school. They still must find work, continue their education, or find some other source of financial support.
The question everyone who decides not to go to law school, and every critic of law schools, must answer is—what else out there is better?*
To enable prospective students to compare law school to the next best alternative, we need standardized measurements that apply to both law school and alternatives to law school.
Professor Merritt objects to the standard definition of employment used by the United States government, which she believes is too loose, since it includes an individual as employed if the individual works just one hour during the week of the interview. (This is also the international standard promulgated by the International Labor Organization and widely used around the world).
Using the standard definition of employment and consistent survey and reporting methods reveals that law graduates are more likely to be employed than similar bachelor’s degree holders.
The important thing is not the measurement itself, but rather the relative (causal) differences between law school and the next best alternative. Only by using consistent measurements for law school and the alternatives to law school can we understand those differences.
A single measurement like employment status may not provide all of the information we want. (As a discrete variable, employment status will contain less information than a continuous variable like earnings or hours of work). The solution is to use several standard measurements consistently to compare two different populations. For example, in addition to employment, we might consider work hours, the percent of individuals working “full-time” (i.e., more than 35 hours per week), earnings, or wages (earnings per hour).
In The Economic Value of a Law Degree, Frank McIntyre and I find that no matter which of these measurements we use, the results always point in the same direction. Law graduates participate more actively in the work force and are much better paid than similar bachelor’s degree holders.
If what students really care about is whether law school is a good investment financially, then no isolated measurement taken 9 or 10 months after graduation will provide much insight. (Especially since other educational programs and data collection agencies are not specifically collecting data 9 or 10 months after graduation).
To answer the investment question, we need estimates of the causal effect of education on the present value of lifetime earnings—what Frank and I try to do in The Economic Value of a Law Degree.
To the extent that measurements at or shortly after graduation are useful at all, it is only for purposes of comparison, and only then while being mindful of the differences between outcomes and causation.
Using non-standard definitions means that ABA data at best can facilitate comparisons between different law schools,** but cannot readily be used to compare law school to any alternative.
Professor Merritt argues that the American Bar Association should require law schools to use a uniquely stringent system of measuring employment. To demonstrate that the legal profession holds itself to a higher standard of ethics, law schools should report lower employment rates than everyone else by using less inclusive, non-standard definitions of employment.
I disagree with the premise that different definitions lead to higher standards. Professor Merritt’s proposal would mean that law school statistics could not be compared to any other employment statistics and, if history is any guide, would contribute greatly to student confusion and error.
The right thing to do is to report standardized measurements so that law school statistics can be readily compared to statistics of other education programs, as well as used to compare law schools to each other.
* Another graduate degree might be better than law school for a particular individual, especially when preferences for certain kinds of education or work are taken into account along with differences in financial value added. One of the frontiers of labor economics research is comparative analysis of the causal effects of different kinds of graduate education.
** I have reservations about the extent to which ABA initial outcome data should be used to compare the value added by one law school to the value added by another. There are large differences between the student bodies of different law schools along dimensions that predict earning potential—standardized test scores, GPA, college quality and college major, socioeconomic status, and demographics. The differences between students matriculating to the highest and lowest ranked law schools appear to be much larger than the differences between the average college graduate and the average law student. While law schools disclose information about their entering classes, they do not reveal information about their graduates. Entering characteristics could be different from graduating characteristics for schools that accept large numbers of transfer students or have unusually high attrition. In addition, the average growth rate of earnings at different law schools might be different and comparing only initial earnings could lead to misleading results.
Two William Mitchell law professors file suit as planned merger with Hamline may result in abrogation of tenure
Thursday, April 2, 2015
Paul Campos of the University of Colorado is once again confused by my research with Frank McIntyre. This time, the source of Professor Campos’s confusion is not present value calculations, but rather grant funding.
The Economic Value of a Law Degree was not funded through grants. No disclosure of grant funding appears in that article because there was no funding to disclose.
Two follow up studies, Timing Law School and an upcoming study about differences in the law earnings premium by college major, race and gender, are funded through grants from Access Group, Inc., a non-profit that provides financial education to students and schools and aims to promote broad access to education, and the Law School Admission Council (LSAC), which is an important provider of data and research about law schools (see here and here).
The funding provided through these grants is used to buy out time so that Frank and I can spend more time on research. I do not receive the money for my teaching buyout—Seton Hall is paid so that it can find replacements to teach the classes I would have taught. The grants also provide funding for research assistants, software and equipment, summer stipends, and conferences. The payments are scheduled over a two to three year period.
Frank and I are interested in methodological rigor, not in particular results or outcomes, which in any case are unknowable until after we analyze the data. We believe in maximizing the transparency of the methods we use for our research so that it can be replicated or challenged by future empirical researchers. There has never been any effort by LSAC or Access Group to influence or censor our results.
Frank and I are proud of our success securing funding from such highly regarded organizations. We trumpet their support in the first footnote of Timing Law School, and announced it in our first blog post about Timing Law School. I also list the grants and dollar amounts of each on my CV and on my LinkedIn page.
Curiously, Professor Campos and his followers seem to think that the fact that highly regarded non-profit organizations believe our research is worthy of funding is some sort of dirty secret. We’ve practically been shouting it from the rooftops, so I suppose we should thank him for pointing it out.
Wednesday, April 1, 2015
Recently, two criticisms have been leveled against law schools. The first is an economic critique—law school is not worth it financially compared to a terminal bachelor’s degree. This critique is incorrect for the overwhelming majority of law school graduates.
The second is a moral critique—that law schools behaved unethically or even committed fraud (see here, here, and here) by presenting their employment statistics in a misleading way. (While at least one of the 200+ American Bar Association (ABA) approved law schools misreported LSAT scores and GPAs of incoming students, and a former career services employee at another alleges specific misreporting of unemployment data at that law school, I am focusing here not on the outliers, but on the critique against all law schools generally).
The moral critique against law schools comes down to this: The law schools used the same standard method of reporting data as the U.S. Government.
According to the critics’ line of reasoning, “employment” means only full-time permanent work as a lawyer. Anything else should count as either “unemployment” or some special category of pseudo-unemployment (i.e., underemployment). (This is apparently based on an incorrect belief that law school only benefits the subset of graduates who practice law).
Employment and unemployment statistics are not meaningful in a vacuum. They only become useful when they can be compared across time, for different groups, or for a different set of choices. For example, prospective law students might want to know that law school graduates are generally less likely to be unemployed or disabled than similar bachelor’s degree holders. (Frank McIntyre and I combine the unemployment and disability rates whenever possible because of research showing that disability is often a mask for unemployment, although we’d generally get similar results for relative rates if we just used unemployment).
To avoid confusion and ensure that data are comparable, the standard definitions used by the U.S. Government should be used when reporting employment statistics, unless there is an indication that non-standard definitions are being used.
The standard government definitions of “employment” and “unemployment” are the way we all use these words in ordinary speech when we say things like “the unemployment rate went down this year.” These are not obscure definitions. Googling “unemployment definition” and checking the first few results—Investopedia, Wikipedia, About.com, and the U.S. Bureau of Labor Statistics (BLS) website—will get you to the right answer.
So how does the United States government define “employment”?
The most commonly reported and cited official government employment statistics include individuals as “employed” whether such individuals are employed full-time or part-time, whether in permanent or nonpermanent positions, whether in jobs that do or do not require the level of education they have obtained.*
In other words, the U.S. Government counts individuals as employed even if they are employed in part-time, temporary jobs that do not require their level of education. Indeed, individuals count as employed even if they are self-employed or worked without pay in a family-owned business.
When the government reports education-level-specific employment statistics** it uses the same definitions and does not restrict employment to those who are employed in jobs that require their education level. Employment includes any employment, whether full-time or part-time, whether temporary or permanent, whether in a job that requires a given level of education or not.
What about the standard definition of “unemployment”?
Unemployment is not the absence of employment. Instead, there are three categories—employed, not-in-labor-force, and unemployed. An individual only counts as “unemployed” if he or she “had no employment during the reference week”, was “available for work, except for temporary illness” and recently “made specific efforts to find employment.”
Those who are not working and are not actively seeking work for whatever reason—for example, caring for dependents, disability, pursuing additional education—are not counted as part of the labor force. Unemployed persons as defined by CPS are used to calculate the widely cited “unemployment rate.” The unemployment rate is defined as unemployed persons as a percent of the labor force--in other words, excluding those who are neither working nor seeking work.***
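The three-category scheme can be made concrete with a small numeric sketch. The figures below are invented for illustration, not actual CPS data:

```python
# Hypothetical counts for a small population, illustrating the three
# CPS categories: employed, unemployed, and not-in-labor-force (NILF).
employed = 900
unemployed = 50            # not working, available, recently sought work
not_in_labor_force = 250   # e.g., caregivers, students, the disabled

# The labor force excludes NILF individuals entirely.
labor_force = employed + unemployed

# The widely cited unemployment rate: unemployed as a share of the
# labor force, not of the whole population.
unemployment_rate = unemployed / labor_force          # 50 / 950, about 5.3%

# By contrast, the employment-population ratio divides by everyone.
population = employed + unemployed + not_in_labor_force
employment_population_ratio = employed / population   # 900 / 1200 = 75%
```

Note how moving a person from "unemployed" to "not in labor force" lowers the unemployment rate without anyone finding a job, which is why the categories must be defined consistently across the groups being compared.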
Some law school critics have claimed that anyone who fails to respond to a survey about their employment status should be assumed to be unemployed. The Census and BLS disagree, and instead weight the data to account for non-respondents.
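The difference between the two treatments of non-response can be sketched as follows. The numbers are hypothetical, and actual Census/BLS weighting is more sophisticated (adjusting for demographics, for example):

```python
# Hypothetical survey of 100 graduates: 80 respond, 20 do not.
respondents_employed = 70
respondents_not_employed = 10
non_respondents = 20
sample_size = respondents_employed + respondents_not_employed + non_respondents

# Critics' approach: treat every non-respondent as unemployed.
pessimistic_rate = respondents_employed / sample_size           # 70%

# Census/BLS-style approach (simplified): weight respondents up so they
# stand in for non-respondents, i.e., assume non-respondents resemble
# respondents on average.
weighted_rate = respondents_employed / (respondents_employed
                                        + respondents_not_employed)  # 87.5%
```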
In addition to top-level information about employment status, some data sources such as the CPS may also include fields with more detailed information about full- or part-time work status, industry or sector, and occupation. Law schools have also historically provided a detailed breakdown of employment categories shortly after graduation in the ABA-LSAC Official Guide to ABA-Approved Law Schools. In the last few years, law schools have provided even more detail in ABA-required disclosures. (We’ve previously noted some of the problems with focusing on employment outcomes shortly after graduation rather than long-term value added; the ABA’s new employment data protocols have additional problems with their definition of “unemployed,” discussed below.****) The National Association for Law Placement (NALP) also provides high-level data and a more detailed breakdown.
The inclusion or non-inclusion of more detailed information does not alter the meaning of top-level information about employment status: the meaning of “employed” is well established. Commonly used and cited employment statistics have been reported by the BLS from 1948 through the present and are widely understood by users of employment data.
Indeed, the BLS has noted for decades in its Occupational Outlook Handbook that many law school graduates do not work as lawyers. Law schools and bar examiners publish bar passage rate statistics which clearly show that many recent law school graduates cannot legally be working as lawyers (unless everyone who failed a bar exam in one state passed a bar exam in another).
Comparing apples to apples using standard definitions reveals that law school graduates are doing relatively well compared to similar bachelor’s degree holders. By contrast, critics of law schools and plaintiffs lawyers have used non-standard definitions and compared apples to oranges.
It is not surprising that the courts have dismissed the lawsuits against law schools. If only the New York Times and the Wall Street Journal were as fair and judicious.
* The primary source of labor force statistics for the population of the United States is the Current Population Survey (CPS), sponsored jointly by the United States Department of Labor, Bureau of Labor Statistics and the United States Census Bureau (Census). CPS is the source of numerous high-profile economic statistics, including the national unemployment rate. CPS defines “employed persons” broadly to include anyone who has done any paid work during the reference week, who worked for themselves or a family member, or who was temporarily absent from work.
“Employed persons” as defined by CPS are used to calculate the “employment-population ratio,” which resembles the “percent employed” statistics reported by law schools.
“Employed Persons” includes all persons 16 years and over in the noninstitutional population who, during the reference week, did any work at all (at least 1 hour) as paid employees; worked in their own business, profession, or on their own farm; or worked 15 hours or more as unpaid workers in an enterprise operated by a member of the family; as well as all those who were not working but who had jobs or businesses from which they were temporarily absent because of bad weather, childcare problems, maternity or paternity leave, labor-management dispute, job training, or other family or personal reasons, whether or not they were paid for the time off or were seeking other jobs. . . .
** The BLS also reports Employment-Population Ratios for specific education levels and age groups, such as bachelor’s degree holders and above, ages 25 to 34. These statistics are also reported by the United States Department of Education, National Center for Education Statistics. (To the extent economists have tried to define and measure “underemployment” (see here and here), it appears to be at least as common among bachelor’s degree holders as among similar law degree holders).
*** The “labor force” as defined by CPS consists only of persons who are either “employed” or “unemployed” under CPS definitions.
**** The ABA’s new data protocol counts individuals as “Unemployed” who would instead be considered “Not-in-labor-force” by the U.S. government. The ABA subcategory, “Unemployed—Seeking” is probably the closest to the standard definition of unemployment. This misalignment between ABA definitions and standard government definitions of unemployment could lead individuals comparing ABA data to standard and widely used government employment data to erroneously conclude that unemployment for law school graduates is higher relative to other groups than it really is.
Tuesday, March 31, 2015
The Absence of Evidence for Structural Change: Growth in Lawyer Employment and Earnings (Michael Simkovic)
There have been a lot of doom-and-gloom reports about layoffs and collapsing job opportunities for lawyers. As we’ve noted before, the relevant question for valuing legal education is the boost to earnings from the law degree across occupations, not the more specific question of what is happening to lawyers, or even more specifically, big law firms.
But for the sake of argument, focusing more narrowly on the under-inclusive category of lawyers only, what does the data actually show about lawyer employment? Are doom-and-gloom predictions justified for lawyers even if not for law degree holders? According to many of the proponents of the structural change hypothesis, signs of structural change were showing up as early as 2010, or perhaps even as early as 2008. We now have several years of historical data beyond that point to consider whether their predictions, thus far, have proven correct.
Lawyer employment is growing. This is true both in absolute numbers, and also relative to overall employment. In other words, lawyers are becoming a larger share of the U.S. workforce.
The data in the chart above is from the U.S. Department of Labor, Bureau of Labor Statistics (BLS), Occupational Employment Statistics (OES), which is a survey of establishments (employers). The blue columns, scaled to the left axis, represent the absolute number of lawyers, while the red line, scaled to the right axis, represents lawyers as a percentage of the total labor force. As can be seen from the chart, both numbers are trending upward.
One limitation of BLS OES is that it focuses on employees, not owners, and therefore excludes law firm partners and solo practitioners. Another leading source of data, the U.S. Census Bureau’s Current Population Survey (CPS), is a survey of households, and includes solos and law firm partners.
CPS shows much the same trend as BLS OES. Lawyer employment is increasing, both in absolute terms and as a share of total employment. The charts below show CPS data.
The leading government data sources show the same thing—growth in employment of lawyers is faster than (or at least as fast as) overall employment growth.
The practice of law is also becoming more lucrative, at least over the long term. In a recent draft paper, Richard Sander and E. Douglass Williams find long-term growth in real (inflation-adjusted) lawyer earnings after controlling for changes in the demographic composition of the legal profession. (Sander and Williams use IPUMS-CPS data and focus on white males, since historical data is not as readily available for women and minorities, who have joined the legal profession in large numbers only in recent years; to understand the importance of controlling for demographic changes in the profession, consider Simpson’s Paradox).
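Why composition controls matter here can be seen in a minimal, entirely invented instance of Simpson's Paradox:

```python
# Invented numbers: (share of profession, mean real earnings) for two
# demographic groups in two periods. Within each group earnings rise,
# but the profession's composition shifts toward the group that, for
# historical reasons, starts from lower earnings.
period_1 = [(0.9, 100_000), (0.1, 60_000)]
period_2 = [(0.6, 110_000), (0.4, 70_000)]

overall_1 = sum(share * earn for share, earn in period_1)   # 96,000
overall_2 = sum(share * earn for share, earn in period_2)   # 94,000

# Every group earns more in period 2 (100k -> 110k, 60k -> 70k), yet
# the unadjusted overall average falls (96k -> 94k). Controlling for
# composition recovers the within-group growth.
```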
Data from the Sander and Williams study is provided in the chart below.
After 2010, the picture for lawyer earnings is more mixed. BLS OES data suggests modest declines in real earnings of lawyers of around 6 percent by 2014. By contrast, the CPS suggests modest real growth in lawyer incomes of around 3 percent by 2014. Overall, it’s likely that real lawyer earnings have been close to flat. Flat earnings are consistent with what has been happening elsewhere in the labor market (see here and here). Given lawyers’ highly advantageous starting position relative to most other occupations, flat earnings or even modest declines suggest that lawyers have maintained a large relative advantage even as they have grown in relative numbers. (It’s possible that incomes for lawyers may have become more dispersed over time, notwithstanding the averages—indeed, it would be surprising if that were not the case, given the general trend toward widening income dispersion across the economy).
As noted previously, changes in entry level earnings and employment, though larger than those for the profession as a whole, are consistent with changes at the entry level for the rest of the labor market and established historical patterns. Young law graduates continue to earn substantially more than young bachelor’s degree holders post 2008.
Within a few years of graduation, about as large a proportion of employed young professional degree holders were working as lawyers after 2008 as before 2008.
Some critics of legal education have focused on “legal services” (mostly law firms). This is not a good measure of either the value of a law degree, or of the labor market for lawyers. Most employees in “legal services” are not lawyers, but rather support personnel such as secretaries, paralegals, and business and technology specialists. Many lawyers and law degree holders work outside law firms.
Changes taking place in “legal services” might be affecting the non-lawyers who work there rather than the lawyers. Changes in “legal services” affecting lawyers could be offset by changes affecting lawyers working in other industries. In other words, legal work could be moving out of the law firms and in house or into other professional service firms such as accounting firms.
BLS and CPS data for “lawyers” provides a much clearer picture of the legal employment market, while law earnings premiums across occupations are the most useful measure of the value of a law degree.
Growth in earnings and employment has been slower in recent years than in the past, to be sure, but that is generally true across the economy. The case for massive structural change in the legal profession eroding the value of a law degree is not well supported by the data.
Monday, March 30, 2015
Thursday, March 26, 2015
Some have claimed that deteriorating outcomes for recent law school graduates are a sign of permanent structural change in the legal industry and that these changes are reducing the value of legal education.
There are two important problems with this claim. First, the same changes are taking place across the labor market, and are not a law-specific problem. Indeed, the law degree has maintained its value relative to a bachelor’s degree. Second, entry-level employment and starting salaries are known to be volatile and cyclical, so large swings aren’t a sign of much of anything other than business as usual in a recession or boom.
The Economic Value of a Law Degree lacked data on those who graduated after 2008 because of limitations of the Survey of Income and Program Participation (SIPP). Timing Law School supplements this data with additional information from the American Community Survey (ACS). Using ACS we look at young lawyers and young professional degree holders excluding those in medical occupations—two proxies for law graduates, one under inclusive, the other over inclusive. Both of these proxies (along with SIPP data) suggest that recent law graduates have maintained a large advantage relative to similar bachelor’s degree holders. The ACS data is presented below.
(A log earnings premium is similar to a percentage difference in earnings. A 0.6 log earnings premium means that young lawyers earn about 82 percent more than young bachelor's degree holders.)
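The conversion from a log premium to a percentage difference is exp(p) − 1, which is easy to check directly:

```python
import math

# A log earnings premium p means the difference in mean log earnings
# is p; the implied percentage earnings difference is e**p - 1.
log_premium = 0.6
pct_difference = math.exp(log_premium) - 1   # about 0.822, i.e. ~82% more

# For small premiums the log value and the percentage nearly coincide,
# which is why log points are often read loosely as percentages.
small = math.exp(0.05) - 1                   # about 0.0513, vs. 0.05
```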
What many in the press and some law professors mistook for a law-specific crisis was in fact a widely known phenomenon in labor economics—employment and salaries for inexperienced workers are more volatile and sensitive to economic cycles than employment and salaries for those with more experience.
Predicting structural change on the basis of established cyclical patterns is analogous to drawing conclusions about permanent climate change on the basis of temperature changes between summer and winter. Occasionally, the person making the prediction might get lucky and turn out to be right, but the evidence is weak, the analysis fails to test more plausible rival hypotheses, and the conclusion of permanent change is little more than a wild guess.
Climate scientists are more careful than this. They use a back-testing approach similar to the one Frank McIntyre and I use in Timing Law School. Back-testing suggests that the prediction methods used to support the structural change hypothesis are baseless, at least with respect to such changes degrading the value of legal education.
Structural change can mean different things to different people. By structural change, some people may simply mean that subjectively, the practice of law feels different than it used to, not that law graduates are getting any less value for the money. Or they may mean more generally that the kind of work law graduates do is different, even if not relatively less well compensated. This softer, humanistic view of structural change may have merit, although once again, it may also reflect broader trends in the economy rather than law-specific issues.
Over the last two weeks I’ve discussed the case for structural change—Bureau of Labor Statistics projections, entry-level outcomes, etc.—and found little support for the hypothesis that the value of a law degree has permanently declined.
Next week I’ll discuss another pillar of the structural change argument—growth rates in the “legal services” industry.
Law schools and prospective law students may be paying more attention to employment outcomes shortly after graduation than this short-term data deserves. One potential use of the aggregate data about entry level employment and salaries is to assess whether now is a good or bad time to apply to law school. But fluctuations in employment outcomes for recent graduates do not predict fluctuations in employment outcomes 3 or 4 years in the future when those currently deciding whether to enroll would graduate.
Nevertheless, law students and the press pay close attention to the short-term outcome data. Starting salary data from the National Association for Law Placement (NALP) is covered by the press and is a good predictor of the number of law school applicants two years later. (We assume a one-year lag for data collection and dissemination and a one-year lag to apply to law school.)
Why are students responding to this data even though it does not predict their own short-term outcomes? And does the responsiveness of enrollment to short-term outcomes mean that law students care only about the short term?
Law students likely think more long term. If law students were so impatient that they only cared about one or a few years of earnings, it is doubtful that law students would have completed college, since college also makes sense only as a long-term investment. Indeed, students who were so focused on the short term might not even have finished high school. While temporal preferences can change over time, education appears to shift people toward thinking more long term. Aging from adolescence through the age of 30 is also associated with becoming more oriented toward the future.
Perhaps students are focused on the short term because they mistakenly believe that swings in short-term outcomes predict more than they do. Students would not be alone in this error.
Some widely read back-of-the-envelope analyses started with initial salaries, assumed unrealistically low earnings growth along with high discount rates or an arbitrary payback period (reflecting a lack of concern for the future), and reached the erroneous conclusion that going to law school does not make sense financially. (For a discussion, see here; for examples of erroneous studies, see here and here.)
In short, students may focus on the short term because they mistakenly believe it predicts more than it does, or because it is the only information readily available to them.
Legal educators and the press can and should make greater efforts to inform students of the long term as opposed to the short-term consequences of legal education. We should also shift the discussion away from raw outcomes and toward estimates of causation and value-added relative to the next best option.
This will be a challenge. Short-term raw outcome data is embedded in American Bar Association-required disclosures, in NALP’s data collection efforts, and in the U.S. News rankings. Thinking in value-added terms requires us all to understand basic principles of causal inference and labor economics. But shifting toward long-term value added is ultimately the right thing to do if we are serious about providing students with meaningful disclosure and facilitating informed decision making.
This is not meant to justify indifference to the plight of young people who have suffered the misfortune of graduating into an unfavorable economic climate over the last several years. To help alleviate youth unemployment, we must understand that the cause of this misfortune is the macro-economy, not higher education. Education is an important part of the solution. Among those who are young and inexperienced, those with more education continue to do better in the labor market than those with less, and this difference appears to be largely caused by the differences in level of education.
Insurance programs like income-based repayment of student loans and flexible and extended repayment plans can help young people manage the unpredictable and uncontrollable risk that they might happen to graduate into a bad economy. If this insurance leads to more people pursuing higher education, earning higher incomes, and paying more taxes, it will benefit not only students and educators, but also the federal government and the broader economy.
Wednesday, March 25, 2015
Many legal educators believe that shrinking class sizes will help the students they do admit find higher paid work more easily and boost the value of legal education. They reason that if the supply of law graduates shrinks, then the market price law graduates can command should increase.
According to another hypothesis, now popular in the press, a decline in the number of law school applicants reflects the wisdom of the crowds. Students now realize that a law degree simply isn’t worth it, and smaller class sizes reflect a consensus prediction of worse outcomes for law graduates in the future.
Frank McIntyre and I investigated whether changes in law cohort size predict earnings premiums. Historically, they have not. Not for recent graduates, and not for law graduates overall. Nor have changes in cohort size predicted much of anything about the entry-level measures used by the National Association for Law Placement (NALP)—starting salary, initial employment, initial law firm employment.
How can both of these theories be wrong? One possibility is that they are both right, but the two effects offset each other. This seems unlikely, however. If neither macroeconomic data nor Bureau of Labor Statistics (BLS) employment projections can predict law employment conditions at graduation, then how likely is it that recent college graduates with less information and less expertise could make a better prediction?
A more likely possibility is that there are other factors at play that prevent any strong predictions about the relationship between cohort size and outcomes or value added. For example, law schools may become less selective as cohort size shrinks and more selective as it increases. In addition, the resources available to law schools, and therefore the quality of education and training they are able to provide, may also change with cohort size. Since physical facilities expenses are not particularly variable in the ordinary course, most budgetary adjustments at law schools presumably take place with respect to personnel.
Anecdotally, many law schools appear to be managing the recent decline in enrollments by shrinking their faculties and administrations and using remaining personnel to teach classes and perform functions outside of their areas of expertise. Reduced specialization and a lack of economies of scale could affect the quality of service provided to students, offsetting any benefits to students from less competition at graduation.
Previous research in labor economics has found that resources per student are an important predictor of value added by college education, and that the use of adjuncts can lead to worse outcomes for students. (See here for a review)
Much of this is speculative—we do not yet understand why changes in cohort size do not predict law graduate outcomes, only that they do not predict outcomes. Given the historical data, it is probably not advisable to read too much into what the decline in law school enrollment means for students who will graduate over the next few years.
Instead, we should focus on the long-term historical data and the value of a law degree across economic cycles and enrollment levels.
Tuesday, March 24, 2015
UPDATE: As of March 20, applicants are now down only 2.9% from last year. My guess is that we have hit bottom in terms of the applicant decline.
Monday, March 23, 2015
Labor economists have long cautioned against the misuse of Bureau of Labor Statistics (BLS) employment projections.
In 2004, Michael Horrigan at the BLS explained that the BLS projections should not be used to value education or to attempt to predict shortages or surpluses of educated labor. Instead, the value of education should be measured based on earnings premiums—the measure used in The Economic Value of a Law Degree and Timing Law School.
The general problem with addressing the question whether the U.S. labor market will have a shortage of workers in specific occupations over the next 10 years is the difficulty of projecting, for each detailed occupation, the dynamic labor market responses to shortage conditions. . . . Since the late 1970s, average premiums paid by the labor markets to those with higher levels of education have increased.
It is the growing distance, on average, between those with more education, compared with those with less, that speaks to a general preference on the part of employers to hire those with skills associated with higher levels of education.
The BLS takes the same position in its FAQ. The BLS does not project labor shortages or surpluses.
In 2006, Richard Freeman back-tested the BLS projections and found that “the projections of future demands for skills lack the reliability to guide policies on skill development.”
The BLS employment projections are not only unreliable. Comparing occupation-specific employment projections to the number of graduates in related fields systematically underestimates the value of higher education.
In 2011 David Neumark, Hans Johnson, & Marisol Cuellar Mejia wrote:
If there are positive returns to education levels above those indicated as the skill requirement for an occupation in the BLS data – and especially if these wage premia are similar to those in other occupations – then relying on the BLS skill requirements likely substantially understates projected skill demands.
For nearly every occupational grouping, wage returns are higher for more highly-educated workers even if the BLS says such high levels of education are not necessary. For example . . . for management occupations, the estimated coefficients for Master’s, professional, and doctoral degrees are all above the estimated coefficient for a Bachelor’s degree, which is the BLS required level. . . .
If the BLS numbers are correct, we might expect to see higher unemployment and greater underemployment of more highly-educated workers in the United States. As noted earlier, we do not find evidence of this kind of underemployment based on earnings data. Similarly, labor force participation rates are higher and unemployment rates are lower for more highly educated workers.
Neumark et al. also noted that recent BLS projections appeared to be much too low for managerial and legal services occupations.
Starting around 2012, many law professors and pundits argued that the number of job openings for lawyers projected by the BLS relative to the number of expected law graduates suggested that too many students were attending law school and that they would not get much value out of their degrees.
The Bureau [of Labor Statistic]’s occupational employment projections . . . answer the very question that many law school applicants want to know: How many new lawyers will the economy be able to absorb this decade?
The Bureau currently estimates that the economy will create 218,800 job openings for lawyers and judicial law clerks during the decade stretching from 2010 through 2020. That number, unfortunately, falls far short of the number of aspiring lawyers that law schools are graduating.
The oversupply of entry-level lawyers deprives many graduates of any opportunity to practice law. At the same time, the lawyer surplus constrains entry-level salaries.
Merritt notes the possibility that law might be a versatile degree with value outside of legal practice.
Further evidence that law degrees are unlikely to become more valuable going forward can be found in the projections of the Bureau of Labor Statistics (BLS) . . . [which suggest many more law graduates than job openings].
In 2013, Brian Tamanaha wrote:
The U.S. Bureau of Labor Statistics estimates about 22,000 lawyer openings annually through 2020 (counting departures and newly created jobs). Yet law schools yearly turn out more than 40,000 graduates. This bleak job market coexists with astronomically high tuition.
Several other professors and journalists also started comparing BLS projections to the number of law graduates to make much the same argument.
In 2013, unaware of the problems with job openings projections, I (Simkovic) suggested that projections might be used to make adjustments to more objective historical baselines for risk-based student loan pricing.
On the chance that BLS projections, which perform poorly in other contexts, might perform well in the legal education context, Frank McIntyre and I analyzed the extent to which BLS projections predict law graduate outcomes (earnings premiums). The answer is: no better than random chance.
As in other areas, BLS employment projections are not reliable or meaningful for predicting earnings premiums and are therefore not useful for valuing legal education.
But what about the number of job openings for lawyers? Can BLS projections at least predict that reasonably well?
It is unclear at this point if the new job opening projections method will predict earnings premiums better than the old ones. In any case, that was never their intended purpose, and it would be safer to predict earnings premiums and value education based on historical earnings premiums.
It remains likely that many law school graduates will not practice law. Such has been the case in the past, and such is the case in other fields. Many engineering, math and science graduates do not work as engineers, mathematicians or scientists in their fields of study. Most fields of study do not have a one-to-one correspondence with a particular occupation, but are more broadly useful in the labor market, and law is no exception. In spite of many individuals working outside their degree fields, higher education typically has been, and likely will remain, an investment with positive returns.
The best way to tell whether there is too much or too little investment in education is to consider relative returns that take into account risk and variability in employment. Are the returns to education higher or lower than returns that can be had elsewhere with similar levels of risk? The returns to education are generally much higher, and risk does not appear to explain this difference adequately. The high relative returns to education suggest underinvestment in education.
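One crude way to make this kind of comparison concrete is a Sharpe-style ratio of excess return to volatility. The sketch below is purely illustrative: the return series, the risk-free rate, and the `risk_adjusted` helper are all made-up assumptions for exposition, not estimates from the labor economics literature.

```python
from statistics import mean, stdev

# Hypothetical annual real returns (entirely made up for illustration).
education = [0.12, 0.09, 0.15, 0.08, 0.11]     # assumed returns to a degree
alternative = [0.07, -0.02, 0.10, 0.04, 0.06]  # assumed financial alternative

def risk_adjusted(returns, risk_free=0.02):
    """Sharpe-style ratio: mean excess return per unit of volatility."""
    excess = [r - risk_free for r in returns]
    return mean(excess) / stdev(excess)

print(risk_adjusted(education), risk_adjusted(alternative))
```

On these assumed numbers, education earns more per unit of risk than the alternative; the point of the sketch is only the comparison being made, not the magnitudes.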
Friday, March 20, 2015
Thursday, March 19, 2015
How can we test predictions about the future when we don’t yet have data showing what will happen in the future? One answer is hindcasting. You already believe in hindcasting if you believe in the science behind global warming (see also here and here).
“Hindcasting” (or “backtesting”) is using historical data to test prediction methods and it is widely used in finance, engineering, and climate science. The basic idea is that a prediction method can be reduced to a set of rules or mathematical formulas. Historical data from the more distant past can be fed into these rules and formulas, and the resulting predictions about the “future” (relative to the distant past that provided the data) will also be predictions about the past (relative to the period in which the researcher conducts the backtest).
Since data about the “future” is now available, predictions generated by the prediction method can be compared to what actually happened. A prediction method does not have to be correct all of the time to be useful; if a prediction method performs a bit better than random chance, it might still be useful in many contexts, especially in investment management. If it performs better than the next best prediction method, then it is still useful even if it is imperfect. But if a prediction method does not perform any better than random chance, it is discredited and discarded.
Using this hindcasting approach, Frank McIntyre and I test popular prediction methods used by various pundits and professors to try to predict whether now is a good or bad time to go to law school. (See Timing Law School) As in our previous research, our primary outcome variable of interest is law earnings premiums—the earnings of law school graduates relative to the earnings of similar bachelor’s degree holders. This is the relevant measure, because it goes to the value added by law school, and can be compared to the cost of attendance.
The peer-reviewed labor economics literature finds that a law degree has been a lucrative investment for the overwhelming majority of law school graduates compared to entering the labor market with just a bachelor’s degree. Nevertheless, questions persist about whether now is an unusually good or bad time to start law school.
According to one popular hypothesis, now is an unusually bad time to go to law school because employment outcomes for recent graduates 9 months after graduation have deteriorated. These graduates, it is argued, will not have the same career success as law school graduates in the past. Moreover, deterioration in outcomes for those who graduated last year predicts poor outcomes three or four years in the future and beyond for those who are entering law school now.
According to another popular hypothesis, now is an unusually good time to go to law school because so few people are doing it. When these small cohorts of law students eventually graduate, they will all be more likely to find a high paying job than the larger cohorts of the past. A variation on this argument is that now is still a bad time to go to law school in spite of falling enrollments because the number of law school graduates will still be greater than the number of BLS projected job openings for lawyers. (For a discussion of newer BLS projection methods showing more job openings, see here)
Our analysis includes graduates from 1964 through 2008 and earnings data from 1984 to 2013. This period captures numerous economic booms and recessions. As in The Economic Value of a Law Degree, our main source of data is the U.S. Census Bureau’s Survey of Income and Program Participation. We were able to backfill the data to include older versions of the survey and capture more years of macroeconomic variation thanks to grant funding from Access Group, Inc., and LSAC. (Because the older data has some limitations, those who are interested in the value of a law degree rather than the size of cohort effects should still consult our 2014 article).
None of the prediction methods we tested perform better than random chance. Cohort size is not predictive. Cohort size relative to BLS projections is equally useless. Although those who graduate in a boom when unemployment is low do indeed have higher earnings premiums in their first few years after graduation than those who graduate when unemployment is low or moderate, the effect fades after the first four years. More importantly, it is not possible to predict whether unemployment will be high or low four years in the future based on currently available data. Even those who are unlucky enough to graduate into a weak economy still generally benefit substantially from their law degrees.
Delaying law school to attempt to “time the market” is an imprudent strategy. It does not improve one’s chances of graduating into a favorable economic climate. It entails substantial opportunity costs in the form of fewer years of higher, post-law-school earnings. The cost of every year of delay averages tens of thousands of dollars. Popular prediction methods for market timing are not only scientifically baseless; they also appear to be financially toxic to prospective students who take them seriously.
The best guide to the future continues to be the long-term historical data. Short-term fluctuations around these averages are not readily predictable. Instead of trying to predict the unpredictable, it may be more prudent to focus on helping students manage these risks, for example through insurance programs similar to Income-Based Repayment of student loans. (See also here)
But what about more recent graduates? How much can we say about those who graduated after 2008, and is this time different? How can we explain our results in light of previous research on cohort effects focused on bachelor’s degree holders?
For answers to some of these questions, look for our next blog post.