Thursday, March 19, 2015
How can we test predictions about the future when we don’t yet have data showing what will happen in the future? One answer is hindcasting. You already believe in hindcasting if you believe in the science behind global warming (see also here and here).
“Hindcasting” (or “backtesting”) is using historical data to test prediction methods and it is widely used in finance, engineering, and climate science. The basic idea is that a prediction method can be reduced to a set of rules or mathematical formulas. Historical data from the more distant past can be fed into these rules and formulas, and the resulting predictions about the “future” (relative to the distant past that provided the data) will also be predictions about the past (relative to the period in which the researcher conducts the backtest).
Since data about the “future” is now available, predictions generated by the prediction method can be compared to what actually happened. A prediction method does not have to be correct all of the time to be useful; if a prediction method performs a bit better than random chance, it might still be useful in many contexts, especially in investment management. If it performs better than the next best prediction method, then it is still useful even if it is imperfect. But if a prediction method does not perform any better than random chance, it is discredited and discarded.
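The walk-forward logic described above can be sketched in a few lines. This is a minimal illustration of the general backtesting idea, not the authors' actual model; the prediction rule and the data series below are made up for the example.

```python
# Minimal hindcasting (backtesting) sketch. A "prediction rule" sees only
# data from before year t, makes a prediction for year t, and is then
# scored against what actually happened in year t.

def naive_rule(history):
    """Hypothetical rule: predict next year as the mean of the last 3 years."""
    return sum(history[-3:]) / 3

def hindcast(series, rule, start):
    """Walk forward through the series, predicting each year from its past,
    and return the mean absolute prediction error."""
    errors = []
    for t in range(start, len(series)):
        prediction = rule(series[:t])   # rule only sees data before year t
        errors.append(abs(prediction - series[t]))
    return sum(errors) / len(errors)

# Hypothetical annual earnings-premium series (made-up numbers):
premiums = [1.45, 1.52, 1.38, 1.60, 1.41, 1.55, 1.48, 1.62, 1.50, 1.44]

mae = hindcast(premiums, naive_rule, start=3)
print(round(mae, 3))
```

The error of the candidate rule can then be compared against the error of random guessing or of a simple long-run average; a rule that does no better than those baselines is discarded, as described above.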
Using this hindcasting approach, Frank McIntyre and I test popular prediction methods used by various pundits and professors to try to predict whether now is a good or bad time to go to law school. (See Timing Law School) As in our previous research, our primary outcome variable of interest is law earnings premiums—the earnings of law school graduates relative to the earnings of similar bachelor’s degree holders. This is the relevant measure, because it goes to the value added by law school, and can be compared to the cost of attendance.
The peer-reviewed labor economics literature finds that a law degree has been a lucrative investment for the overwhelming majority of law school graduates compared to entering the labor market with just a bachelor’s degree. Nevertheless, questions persist about whether now is an unusually good or bad time to start law school.
According to one popular hypothesis, now is an unusually bad time to go to law school because employment outcomes for recent graduates 9 months after graduation have deteriorated. These graduates, it is argued, will not have the same career success as law school graduates in the past. Moreover, deterioration in outcomes for those who graduated last year predicts poor outcomes three or four years in the future and beyond for those who are entering law school now.
According to another popular hypothesis, now is an unusually good time to go to law school because so few people are doing it. When these small cohorts of law students eventually graduate, they will all be more likely to find a high paying job than the larger cohorts of the past. A variation on this argument is that now is still a bad time to go to law school in spite of falling enrollments because the number of law school graduates will still be greater than the number of BLS projected job openings for lawyers. (For a discussion of newer BLS projection methods showing more job openings, see here)
Our analysis includes graduates from 1964 through 2008 and earnings data from 1984 to 2013. This period captures numerous economic booms and recessions. As in The Economic Value of a Law Degree, our main source of data is the U.S. Census Bureau’s Survey of Income and Program Participation. We were able to backfill the data to include older versions of the survey and capture more years of macroeconomic variation thanks to grant funding from Access Group, Inc., and LSAC. (Because the older data has some limitations, those who are interested in the value of a law degree rather than the size of cohort effects should still consult our 2014 article).
None of the prediction methods we tested perform better than random chance. Cohort size is not predictive. Cohort size relative to BLS projections is equally useless. Although those who graduate in a boom when unemployment is low do indeed have higher earnings premiums in their first few years after graduation than those who graduate when unemployment is high, the effect fades after the first four years. More importantly, it is not possible to predict whether unemployment will be high or low four years in the future based on currently available data. Even those who are unlucky enough to graduate into a weak economy still generally benefit substantially from their law degrees.
Delaying law school to attempt to “time the market” is an imprudent strategy. It does not improve one’s chances of graduating into a favorable economic climate. It entails substantial opportunity costs in the form of fewer years of higher, post-law-school earnings. The cost of every year of delay averages tens of thousands of dollars. Popular prediction methods for market timing are not only scientifically baseless; they also appear to be financially toxic to prospective students who take them seriously.
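The opportunity-cost arithmetic behind the paragraph above can be illustrated with a back-of-the-envelope present-value comparison. Every figure here (the annual premium, the 40-year career, the 3% discount rate) is a hypothetical placeholder for illustration, not an estimate from the paper.

```python
# Sketch of the cost of delaying law school by one year: delaying pushes
# the entire stream of post-degree premium years one year further into
# the future, which lowers its present value.

def pv_of_premium(annual_premium, start_year, career_years=40, rate=0.03):
    """Present value of a flat earnings-premium stream that begins in
    start_year and lasts career_years, discounted at the given rate."""
    return sum(
        annual_premium / (1 + rate) ** t
        for t in range(start_year, start_year + career_years)
    )

premium = 50_000  # hypothetical annual pre-tax earnings premium

enroll_now = pv_of_premium(premium, start_year=3)    # 3-year J.D., work years 3-42
enroll_later = pv_of_premium(premium, start_year=4)  # one-year delay, years 4-43

print(round(enroll_now - enroll_later))  # cost of a one-year delay, today's dollars
```

Under these illustrative assumptions a single year of delay costs on the order of tens of thousands of dollars in present value, consistent with the magnitude described above.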
The best guide to the future continues to be the long-term historical data. Short-term fluctuations around these averages are not readily predictable. Instead of trying to predict the unpredictable, it may be more prudent to focus on helping students manage these risks, for example through insurance programs similar to Income-Based Repayment of student loans. (See also here)
But what about more recent graduates? How much can we say about those who graduated after 2008, and is this time different? How can we explain our results in light of previous research on cohort effects focused on bachelor’s degree holders?
For answers to some of these questions, look for our next blog post.
Wednesday, March 18, 2015
...about his continuing work on the labor economics of legal markets.
Monday, March 16, 2015
Last week, a website, USNews.com, released its annual rankings of law and other professional schools. (Since I was off-line last week, my comments had to wait.) Commentary, predictably, focused on the overall rank assigned by the website--what is known to "insiders" as the "nonsense number," since it is the upshot of an inexplicable weighting of 12 different factors, many of them self-reported and so of dubious accuracy, and the amalgamation basically stipulative in any case. Fortunately for USNews.com, many superficial journalists report the nonsense number, and changes in the nonsense number, as though they meant something.
So, for example, much ink was spilled on the "fact" that the University of Michigan Law School had a nonsense number of 11th, just outside "the top ten" where it usually resides. I did not see any journalist note, however, that just one raw score point (83 vs. 84) separated Michigan from Duke, Virginia, and Berkeley, all with a nonsense number of 8th. In other words, even by its own terms, the USNews.com demotion of Michigan to 11th was meaningless. (For those paying attention, Yale, because of its off-the-charts per capita expenditures, got a raw score of 100, Harvard and Stanford got 96, Columbia and Chicago 93, NYU 89, Penn 88, and then Duke et al. with 84.)
The most interest was in the nonsense number for UC Irvine, ranked for the first time this year. UCI came in at #30, the highest nonsense number debut I've ever seen in USNews.com. In USNews.com land, this puts UCI third in the UC system--behind Berkeley and UCLA, and ahead of UC Davis (31st) and UC Hastings (59th). (Hastings has probably been the most dramatic victim, over many years, of the small, private school bias in the USNews.com rankings.) Interestingly, UCI got this result despite weaker reputational scores: 29th in reputation among lawyers/judges (a rather good result, though, for a new school), and only 42nd among academics. Almost every school with a nonsense number around UCI had a higher academic reputation score, and my guess is UCI's will now improve accordingly. (The evidence for the echo chamber effect of the "overall rank" on the reputation scores in subsequent years is even greater now than in the past.) My guess is all those annoyed by the UC system starting a new law school penalized UCI in the reputational survey--if USNews.com published the median and mode, we'd have some idea, but I wouldn't be surprised if the distribution was skewed in that way. Counting against UCI is that it is still very small, and will presumably have to grow, which may affect other metrics.
USNews.com reported some curious data in various categories. I note two examples. Columbia, for the first time, reported the best student-faculty ratio in the nation: 6.3 to 1. Yale was 7.6 to 1, Stanford 7.3 to 1. Virginia reported 97.3% employed at graduation, but only 97% nine months out. This may reflect the fact that USNews.com for the first time did not give schools full credit for graduates in law school funded positions--though many of these positions are quite legitimate.
The USNews.com reign of terror has now been going on for 25 years. It has been a disaster for legal education, though a boon for students with the favored numbers. In the 1990s, I used to try to reason with the USNews folks, and they in fact corrected some of their worst mistakes--for example, using starting salary data without taking into account regional differences; doing reputation surveys based on "quartiles" (meaning the dumbest evaluator--the one who forgot to put Harvard in the top quartile--determined the score); and failing to adjust expenditures for cost-of-living differences. But with regard to the basic problems--namely, that the weightings of the inputs are arbitrary, and that a lot of the data relied upon is bogus--they've done nothing. The only remedy for the USNews.com reign of terror will be competing systems, hopefully ones that do not simply replicate the USNews.com mistakes, though replication is mostly what we have had so far.
Tuesday, March 10, 2015
For the first time in a half-century, the Encyclopedia Britannica has commissioned a new essay on "Philosophy of Law," written by myself and a former student, Michael Sevel, now at the University of Sydney. Hopefully ours will have a half-century run as well!
Friday, March 6, 2015
...at least for me, though my co-blogger Dan Filler may have some items. I'm also recuperating (alas) from another outpatient eye surgery for my on-going retinal detachment issues. I will probably get back on the blog the week of March 23 at the latest, though may have one or two items in the interim.