Friday, April 10, 2015
Information overload and response rates (Michael Simkovic)
Did law schools behave unethically by providing employment and earnings information without simultaneously reporting survey response rates? Or is this standard practice?
The answer is that not reporting response rates is standard practice in communication with most audiences. For most users of employment and earnings data, response rates are a technical detail that is neither relevant nor interesting. The U.S. Government and other data providers routinely report earnings and employment figures separately from survey response rates.*
Sometimes, too much information can be distracting.** It’s often best to keep communication simple and focus only on the most important details.
Nonresponse is not the same thing as nonresponse bias, a distinction the law school critics do not seem to understand. A problem arises only if the individuals who respond are systematically different from those who do not respond along the dimensions being measured. Weighting and imputation can often alleviate these problems. The critics’ claims about the existence, direction, and magnitude of biases in the survey data are unsubstantiated.
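To illustrate the weighting idea, here is a minimal sketch in Python. The group names, response rates, and incomes are entirely hypothetical, not actual law school data; the point is only that when respondents’ group membership is known for the full population, reweighting respondents back to known population shares can remove the bias that differential nonresponse would otherwise cause:

```python
# Minimal sketch of nonresponse weighting. All group names, shares, and
# incomes below are hypothetical, not actual law school data.

# Suppose practice setting is known for every graduate (e.g., from school
# records), but income is known only for survey respondents.
population_shares = {"big_firm": 0.30, "small_firm": 0.50, "government": 0.20}
respondent_shares = {"big_firm": 0.20, "small_firm": 0.55, "government": 0.25}

# Hypothetical mean reported income within each group of respondents.
mean_income = {"big_firm": 160_000, "small_firm": 75_000, "government": 60_000}

# Unweighted estimate: pretends respondents are representative overall.
unweighted = sum(respondent_shares[g] * mean_income[g] for g in mean_income)

# Weighted estimate: restores each group to its known population share,
# correcting for big-firm graduates responding at a lower rate.
weighted = sum(population_shares[g] * mean_income[g] for g in mean_income)

print(f"Unweighted mean income: ${unweighted:,.0f}")  # $88,250 (biased low)
print(f"Weighted mean income:   ${weighted:,.0f}")    # $97,500
```

The correction works only insofar as respondents and nonrespondents within each group are similar on the dimension being measured, which is precisely the assumption weighting makes.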
High nonresponse rates on questions about income are not a sign of something amiss; they are normal and expected. The U.S. Census Bureau routinely finds that questions about income have lower response rates (and therefore higher allocation rates) than other questions.
Law school critics claim that law school graduates who do not respond to questions about income are likely to have lower incomes than those who do respond. This claim is not consistent with the evidence. To the contrary, high-income individuals often value privacy and are reluctant to share details about their finances.***
Another potential problem is “response bias,” in which individuals answer survey questions in a way that differs systematically from the underlying value being measured. For example, some individuals may under-report or over-report their incomes.
The best way to determine whether we have a nonresponse bias or response bias problem is to gather additional information about both respondents and nonrespondents.
Researchers have compared income reported to Census surveys with administrative earnings data from the Social Security Administration and the Internal Revenue Service. Assuming the administrative data are more accurate than the survey data, they find that highly educated, high-income individuals systematically under-report their incomes, while less educated, lower-income individuals over-report.
Part of the problem seems to be that bonuses are underreported, and bonuses can be substantial. Another problem seems to be that high-income workers sometimes report their take-home pay (after tax withholding and deductions for benefits) rather than their gross pay.
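As a concrete illustration of how such validation studies work, here is a minimal sketch in Python. Every record below is fabricated toy data, and the linkage between survey answers and administrative earnings is assumed for illustration; the sketch simply computes mean reporting error by education group:

```python
from collections import defaultdict

# Toy validation-study sketch. All records below are fabricated; the linkage
# of survey answers to administrative earnings is assumed for illustration.
records = [
    # (education, survey_reported_income, administrative_income)
    ("professional_degree", 150_000, 175_000),  # omits bonus -> under-reports
    ("professional_degree", 120_000, 140_000),
    ("high_school",          35_000,  32_000),  # over-reports slightly
    ("high_school",          28_000,  27_000),
]

errors = defaultdict(list)
for education, reported, actual in records:
    # Reporting error as a fraction of the administrative ("true") figure.
    errors[education].append((reported - actual) / actual)

for education, errs in errors.items():
    mean_error = sum(errs) / len(errs)
    print(f"{education}: mean reporting error {mean_error:+.1%}")
```

In this framework, a negative mean error for the high-income group is what systematic under-reporting looks like.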
Other studies have also found that response bias and nonresponse bias lead to underestimation of earnings and employment figures.
In other words, there may indeed be biases in law school earnings data, but if there are, they likely run in the opposite direction of the one the law school critics have claimed.
Of course, the presence of such biases in law school data would not necessarily be a problem if the same biases exist in employment and earnings data for the alternatives to law school. After all, earnings and employment data are only useful when compared with data for a likely alternative to law school.
As with gross employment data, the critics are yet again claiming that an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government, is somehow scandalous when done by law schools.
The only thing the law school critics have demonstrated is their unfamiliarity with basic statistical concepts that are central to their views.
------
* Reporting earnings and employment estimates without response rates in communication intended for a general audience—and even some fairly technically sophisticated audiences—is standard practice for U.S. government agencies such as the U.S. Census Bureau and the U.S. Department of Labor, Bureau of Labor Statistics. A few examples below:
- Earnings and unemployment by education level
- Unemployment rates
- Employment-population ratio
- Tabular summaries from the Survey of Income and Program Participation and the American Community Survey
** Information on response rates is available for researchers working with microdata to develop their own estimates, and for those who want to scour the technical and methodological documentation. But response rates aren’t of much interest to most audiences.
*** The After the JD researchers noted that young law graduates working in large urban markets—presumably a relatively high-income group—were particularly reluctant to respond to the survey. From After the JD III:
“Responses . . . varied by urban and rural or regional status, law school rank, and practice setting. By Wave 2, in the adjusted sample, the significant difference between respondents and nonrespondents continued to be by geographic areas, meaning those from larger legal markets (i.e. New York City) were less likely to respond to the survey. By Wave 3, now over 12 years out into practice, nonrespondents and respondents did not seem to differ significantly in these selected characteristics.”
In the first wave of the study, nonrespondents were also more likely to be male and black. All in all, it may be hard to say in which direction any nonresponse bias with respect to income might run. A reasonable assumption is that respondents and nonrespondents are fairly similar with respect to income, at least within job categories.