Saturday, January 23, 2016

With empirical work the fashion of the moment in the legal academy...

...I'm curious what folks have to say about the eminent psychologist Richard Nisbett's take on multiple regression analysis:

I hope that in the future, if I’m successful in communicating with people about this, that there’ll be a kind of upfront warning in New York Times articles: These data are based on multiple regression analysis. This would be a sign that you probably shouldn’t read the article because you’re quite likely to get non-information or misinformation.

Do read the whole interview.  Thoughts from readers?

https://leiterlawschool.typepad.com/leiter/2016/01/with-empirical-work-the-fashion-of-the-moment-in-the-legal-academy.html


Comments

Regression analysis has its flaws, but I'm not convinced that they are generally the flaws he is pointing out, which are more about flaws in design and execution than in the statistical method. Much of what he's talking about is omitted variable bias, which is a definite methodological concern, not a knock on regression analysis altogether. If he thinks that income has an effect on length of marriage AND on elaborateness of weddings, then include income in the model and see if that's true.
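A minimal simulation sketch of that check, assuming Python with numpy and statsmodels; the variables (income, wedding_cost, marriage_years) are hypothetical illustrations, not data from any actual study:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
income = rng.normal(size=n)                          # the confounder
wedding_cost = 0.8 * income + rng.normal(size=n)     # pricier weddings among higher earners
marriage_years = 0.5 * income + rng.normal(size=n)   # longer marriages among higher earners
# By construction, wedding_cost has no causal effect on marriage_years.

# Naive model: omit income, and wedding_cost picks up a spurious "effect".
naive = sm.OLS(marriage_years, sm.add_constant(wedding_cost)).fit()

# Controlled model: include income, and the spurious coefficient collapses toward zero.
controlled = sm.OLS(marriage_years,
                    sm.add_constant(np.column_stack([wedding_cost, income]))).fit()

print(naive.params[1], controlled.params[1])
```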

Experiments are often better, as he notes, but experiments have their own problems depending on how representative your population is (as his Yale sleep example shows).

Posted by: Michael Risch | Jan 23, 2016 7:46:47 AM

It seems he’s more concerned with hasty generalizations made from regression than with the analysis itself. If I were a competent researcher looking at his Vitamin E example, I would have accounted for the “healthy user” bias by measuring all sorts of other health and disease indicators. Regression analyses could then tell me whether Vitamin E has a benefit over and above the other behaviors that go along with it. This type of technique is the entire basis of quasi-experimentation, and it is something that certainly shouldn’t simply be ignored. It’s pretty surprising that someone like Nisbett is making such a radical claim about a statistical tool – it’s not the math that’s bad, just the users who don’t know how to use or interpret it.
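A rough sketch of the kind of adjustment described here, assuming statsmodels and pandas are available; the variables (vitamin_e, exercise, smoker, heart_disease) are hypothetical stand-ins for measured health indicators:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 10_000
health_conscious = rng.normal(size=n)   # trait driving both supplement use and healthy behavior
df = pd.DataFrame({
    "vitamin_e": (health_conscious + rng.normal(size=n) > 0).astype(int),
    "exercise": health_conscious + rng.normal(size=n),
    "smoker": (rng.normal(size=n) - health_conscious > 0.5).astype(int),
})
# Disease risk depends on the behaviors, not on the vitamin itself.
risk = -0.8 * df["exercise"] + 1.0 * df["smoker"] + rng.normal(size=n)
df["heart_disease"] = (risk > 0).astype(int)

# Unadjusted, Vitamin E looks protective; adjusted for the measured behaviors, it does not.
unadjusted = smf.logit("heart_disease ~ vitamin_e", data=df).fit(disp=0)
adjusted = smf.logit("heart_disease ~ vitamin_e + exercise + smoker", data=df).fit(disp=0)
print(unadjusted.params["vitamin_e"], adjusted.params["vitamin_e"])
```

Of course, this only works when the healthy-user behaviors are actually measured, which is exactly the design question Nisbett is pressing on.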

Posted by: Nick Schweitzer | Jan 23, 2016 9:17:47 AM

While it is, indeed, surprising that someone like Nisbett would go after the tool rather than the user, even more surprising would be the alternative: that he is ignoring or conflating the distinction by mistake. If Nisbett has actually become persuaded that the revealed tendency of multiple regression analysis to be misinterpreted now outweighs its benefits as a general matter, advocating a preference against the tool seems rational.

Posted by: Saurabh Vishnubhakat | Jan 23, 2016 3:14:54 PM

The point in the linked-to article is not news in the law world, and much of contemporary analysis by legal empiricists struggles with (a) choosing the right control variables [e.g., the background health conditions and self-care in the first example in the article are rather obvious] and (b) finding "natural experiments" that aren't subject to the same critique. The minimum wage studies are a classic example: look at employment effects in two adjacent areas separated only by a state line, where one state increases the minimum wage. Or "regression discontinuity" designs, in which we see effects diverging around some artificial (because law- or regulation-made) line.

Yet a general concern with regression analysis, even if well done, nevertheless remains. Take high blood pressure management. Observational studies with standard regression controls suggest that the optimal systolic level is in the low 130s, and that treatment to obtain a lower level is associated with a "J-curve" effect of increased morbidity/mortality. Yet a recent experimental study, in which random assignment was used, showed that the optimal level is 120 and below! The study was halted early because the health benefits of tighter control were so apparent that it was deemed unethical to maintain patients at the higher level. Clinical practice is likely to change.
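For readers unfamiliar with the minimum-wage design mentioned above, a compact difference-in-differences sketch, assuming Python with pandas and statsmodels; the data and column names are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 4_000
df = pd.DataFrame({
    "raised_state": rng.integers(0, 2, n),   # 1 = state that raised its minimum wage
    "post": rng.integers(0, 2, n),           # 1 = period after the increase
})
# Simulated employment: a common shock in the post period plus a small treatment effect.
df["employment"] = (
    100 - 2 * df["post"] + 1.5 * df["raised_state"]
    - 1.0 * df["raised_state"] * df["post"]   # the effect of interest
    + rng.normal(scale=5, size=n)
)

# The interaction coefficient is the difference-in-differences estimate.
did = smf.ols("employment ~ raised_state * post", data=df).fit()
print(did.params["raised_state:post"])
```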

We all know the weakness of armchair empiricism. How we best know the world is a fundamental, persistent question.

Posted by: anon | Jan 24, 2016 9:06:29 AM

What Michael said. Yeah, observational studies (not just the ones that use linear regression) are subject to omitted variable bias. Yeah, experimental studies done right are the gold standard.

But does he have something better to offer for the many, many cases where experimental research is either unethical (if I can put on my political scientist's hat for a moment, we can't start wars, rig elections, etc. etc.---and legal empiricists can't swap judges, convince lawyers to make different arguments, etc.), or impractical?

Posted by: Paul Gowder | Jan 24, 2016 8:14:06 PM

I think that Mr Nisbett's piece is simultaneously pointing at two different logic problems.

First, that correlation is not causation, especially in the real world. (And a corollary here — pun intended — is "don't confuse samples with populations"; know which tools are empirical, which are epidemiological, and be consistent.) It's not the regression that's the problem; it's the conclusions drawn from that regression, which are usually recast in soundbite form that inherently overstates everything. Sometimes the only possible conclusion is "it's complicated," but that doesn't make for a publishable article (whether in the academic or the popular domain).

Second — and especially in anything involving volition or nonrandom resource restrictions, such as nearly everything in law — beware of the difference between inductive and deductive reasoning... and the fallacies associated with each. It's all well and good to say that "X leads to Y" — but not so much "every X leads to every Y," or "X leads to this instance of Y without any alternate explanation."

Posted by: C.E. Petit | Jan 25, 2016 11:37:49 AM

"But does he have something better to offer for the many, many cases where experimental research is either unethical ... or impractical?"

I'm sympathetic, Paul, but surely something more than this is needed as a defense. If the multiple regression analysis approach is as bad as suggested (I can't say myself), then the fact that we don't have any other practical alternatives doesn't make it okay to use. The approach needs to be defended on its own merits, not on whether there are any clear alternatives.

Posted by: Matt | Jan 25, 2016 11:42:02 AM

Matt, I can't agree. If observational studies are our best way of getting knowledge about many aspects of the world---aspects that we desperately need to get knowledge about, especially in areas like public health, international relations, etc., where the wellbeing and sometimes lives of billions of people are at stake---then it would seem to me that we have not only an entitlement but an obligation to use them, albeit as carefully as possible. (Many of Nisbett's complaints are about *uncareful* observational studies; with those I heartily agree, but to the extent his complaint is about the method itself, this objection applies.)

Posted by: Paul Gowder | Jan 26, 2016 9:23:33 AM

This article is absolutely correct about the pervasive misuse of statistical analysis and the weakness of using regression analysis to account for a non-experimental setting. I, too, am obsessed with this issue and at work on a project to combat this and other common inferential mistakes people make.

Microeconomists have been very concerned about these problems for at least 30 years, but those concerns still haven’t made their way into public health, nutrition, fitness studies, or much of psychology (maybe Nisbett will change this). If you don’t believe me, just open up the paper and read the latest regression-based nutrition recommendation. Then check back in two weeks for the opposite rec, also from a regression study.

The notion that regression heals all is also very widespread in the law. Brian’s wishful thinking aside (that empirical studies are merely the fashion of the moment), it is clear that empirical analysis is here to stay and will likely only increase.

What does this mean for the average law prof? It doesn’t mean learn Stata. It means learn the logic behind the regression approach, and the flaws in it. Omitted variable bias is just one inferential impediment, but it is a major one, and leaving out potentially relevant variables doesn’t just mean a regression is almost right: it means it could be dead wrong. And there are very few situations where one can include all relevant controls.
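A tiny numeric illustration of the "dead wrong" point, assuming numpy and statsmodels; x, z, and y are hypothetical variables constructed so that the true effect of x is positive, yet the regression that omits z estimates a negative one:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 20_000
z = rng.normal(size=n)
x = -1.5 * z + rng.normal(size=n)            # x is negatively related to the omitted variable
y = 1.0 * x + 3.0 * z + rng.normal(size=n)   # true effect of x on y is +1.0

without_z = sm.OLS(y, sm.add_constant(x)).fit()
with_z = sm.OLS(y, sm.add_constant(np.column_stack([x, z]))).fit()
print(without_z.params[1], with_z.params[1])  # roughly -0.4 vs +1.0: the sign flips
```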

The good news for law is that logical thinking is our forte! It takes logical, not mathematical, skill to recognize what makes for a good study and what makes for a bad one; when we can learn something and when we can’t. Once that mode of thinking is mastered, one can get Stata and happily report… summary statistics. Because the regression won’t add much.

BL COMMENT: It is not my wish that there not be empirical work! There has been interesting empirical work going on in law schools for many decades. I think it is fair to say that now there is quite a bit more, as reflected on the entry-level market--an indication that it has become "fashionable." Even when it is less fashionable, I'm sure such work will continue to be done, as it should!

Posted by: David Abrams | Jan 27, 2016 2:08:34 AM

Objections to MRA very similar to Nisbett's were raised many decades ago by Paul Meehl, "Nuisance Variables and the Ex Post Facto Design," 4 Minnesota Studies in the Philosophy of Science 373-402 (1970).

Posted by: Howard Ulan | Jan 28, 2016 7:30:03 AM

http://www.econ.ucla.edu/workingpapers/wp239.pdf
Quick post now; perhaps more later. The first econometrics course I took, in college in 1983, included Ed Leamer's "Let's Take the Con Out of Econometrics" (draft linked above). Legal academic wags might take the title to suggest the article is the product of some crit mindset, but the piece was published in the American Economic Review and provides some good insights on how to do, and how not to do, regression analysis.

Posted by: Shubha Ghosh | Feb 3, 2016 12:04:08 PM
