April 11, 2015
Deborah Merritt and Kyle McEntee conflated “response rates” with nonresponse bias and response bias. After I brought this error to light, Professor Merritt explained that she and Mr. McEntee were not confused about basic statistical terminology, but rather were being intentionally vague in their critique to be more polite* to the law schools.
Professor Merritt also changed the topic of conversation from Georgetown’s employment statistics—which had been mentioned in The New York Times and discussed by me, Professor Merritt, and Kyle McEntee—to the employment statistics of the institution where I teach.**
What Professor Merritt meant to say is that law schools have not been properly weighting their data to take into account nonresponse bias. This is an interesting critique. However, proper weights and adjustments to data should take into account all forms of nonresponse bias and response bias, not just the issue of over-representation of large law firms in NALP salary data raised by Professor Merritt.
While such over-representation would have an effect on the mean, it is unclear how much impact, if any, it would have on reported medians—the measure of central tendency used by The New York Times and critiqued by Mr. McEntee.
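A quick simulation makes the mean/median distinction concrete. All numbers below are invented for illustration and do not reflect actual NALP data:

```python
import random
from statistics import mean, median

random.seed(0)

# Hypothetical salary population (invented numbers): 70% of graduates at
# $60k, 30% at large firms paying $160k -- a stylized bimodal distribution.
population = [60_000] * 700 + [160_000] * 300

# Suppose large-firm graduates respond to the salary survey at twice the
# rate of everyone else (80% vs. 40%).
sample = [s for s in population
          if random.random() < (0.8 if s == 160_000 else 0.4)]

# Over-representation of the high end pulls the sample mean well above the
# population mean, but the median stays put unless the over-sampling pushes
# high salaries past the 50th percentile.
bias_in_mean = mean(sample) - mean(population)        # substantial
bias_in_median = median(sample) - median(population)  # zero in this setup
```

In this toy setup the sample mean is biased upward by tens of thousands of dollars, while the sample median is identical to the population median.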
Other biases such as systematic under-reporting of incomes by highly educated individuals,*** under-reporting of bonuses and outside income, and the like should be taken into account.**** To the extent that these biases cut in opposite directions, they can offset each other. It’s possible that in aggregate the data are unbiased, or that the bias is much smaller than examination of a single bias would suggest.
Moreover, focusing on first year salaries as indicative of the value of a lifetime investment is itself a bias. As The Economic Value of a Law Degree showed, incomes tend to rise rapidly among law graduates, and they do not appreciably decrease until the fourth decade of employment.
If Professor Merritt’s view is that differences between NALP, ABA, and U.S. Census Bureau data collection and reporting conventions make law school-collected data more difficult to compare to other data sources and make law school data less useful, then I am glad to see Professor Merritt coming around to a point I have made repeatedly.
I have gone further and suggested that perhaps the Census Bureau and other government agencies should be collecting all data for graduate degree programs to ensure the accuracy and comparability of data across programs and avoid wasting resources on duplicative data collection efforts.
This could also help avoid an undue focus on short-term outcomes. Given how rapidly law graduate earnings grow with experience, a short-term focus can mislead students who are unaware of that growth trajectory and of how it compares to the trajectory of likely earnings without a law degree.
** This tactic of bringing up the employment statistics of the institution where those with whom she disagrees teach is something of a habit for Professor Merritt. (See her response to Anders Walker at St. Louis.)
*** Law graduates outside of the big firms are still highly educated, high-income individuals compared to most other individuals in the United States. That is the benchmark researchers used when they identified the reporting biases in census data that lead to under-reporting of incomes.
**** The risk of under-reporting income in law may be particularly high because of opportunities for tax evasion for those who run small businesses or have income outside of their salary.
UPDATE (4/14/2015): I just confirmed with NALP that their starting salary data does not include end of year bonuses.
April 10, 2015
Did law schools behave unethically by providing employment and earnings information without simultaneously reporting survey response rates? Or is this standard practice?
The answer is that not reporting response rates is standard practice in communication with most audiences. For most users of employment and earnings data, response rates are a technical detail that is neither relevant nor interesting. The U.S. Government and other data providers routinely report earnings and employment figures separately from survey response rates.*
Sometimes, too much information can be distracting.** It’s often best to keep communication simple and focus only on the most important details.
Nonresponse is not the same thing as nonresponse bias. Law school critics do not seem to understand this distinction. A problem only arises if the individuals who respond are systematically different from those who do not respond along the dimensions being measured. Weighting and imputation can often alleviate these problems. The critics’ claims about the existence, direction, and magnitude of biases in the survey data are unsubstantiated.
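A minimal sketch of one such adjustment, post-stratification weighting, assuming (hypothetically) that a school knows each graduate's job sector from its own records even when salaries come only from survey responders. All figures are invented:

```python
# Toy illustration of post-stratification weighting (all numbers invented).
# sector -> (graduates in class, survey responders, mean reported salary)
frame = {
    "large_firm": (100, 80, 160_000),
    "other":      (300, 120, 65_000),
}

# Unweighted responder mean over-weights large firms: they are 80 of 200
# responders but only 100 of 400 graduates.
n_resp = sum(r for _, r, _ in frame.values())
unweighted = sum(r * s for _, r, s in frame.values()) / n_resp

# Weight each sector's responders up to that sector's share of the class.
n_all = sum(n for n, _, _ in frame.values())
weighted = sum(n * s for n, _, s in frame.values()) / n_all
```

Here the unweighted responder mean is $103,000, while the weighted estimate is $88,750; the weighting removes the nonresponse bias across sectors (though not any bias within a sector, or response bias in the reported salaries themselves).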
High non-response rates to questions about income are not a sign of something amiss, but rather are normal and expected. The U.S. Census Bureau routinely finds that questions about income have lower response rates (higher allocation rates) than other questions.
Law school critics claim that law school graduates who do not respond to questions about income are likely to have lower incomes than those who do respond. This claim is not consistent with the evidence. To the contrary, high-income individuals often value privacy and are reluctant to share details about their finances.***
Another potential problem is “response bias,” in which individuals respond to survey questions in a way that is systematically different from the underlying value being measured. For example, some individuals may under-report or over-report their incomes.
The best way to determine whether nonresponse bias or response bias is a problem is to gather additional information about responders and non-responders.
Researchers have compared income reported to Census surveys with administrative earnings data from the Social Security Administration and Internal Revenue Service. They find that highly educated, high-income individuals systematically under-report their incomes, while less educated, lower-income individuals over-report (assuming the administrative data is more accurate than the survey data).
Part of the problem seems to be that bonuses are underreported, and bonuses can be substantial. Another problem seems to be that high-income workers sometimes report their take-home pay (after tax withholding and deductions for benefits) rather than their gross pay.
Other studies have also found that response bias and nonresponse bias lead to underestimation of earnings and employment figures.
In other words, there may indeed be biases in law school earnings data, but if there are, they likely run in the opposite direction of the one the law school critics have claimed.
Of course, the presence of such biases in law school data would not necessarily be a problem if the same biases exist in data on employment and earnings for alternatives to law school. After all, earnings and employment data is only useful when compared to a likely alternative to law school.
As with gross employment data, the critics are yet again claiming that an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government, is somehow scandalous when done by law schools.
The only thing the law school critics have demonstrated is their unfamiliarity with basic statistical concepts that are central to their views.
* Reporting earnings and employment estimates without response rates in communication intended for a general audience—and even some fairly technically sophisticated audiences—is standard practice for U.S. government agencies such as the U.S. Census Bureau and the U.S. Department of Labor, Bureau of Labor Statistics. A few examples below:
- Earnings and unemployment by education level
- Unemployment rates
- Employment population ratio
- Tabular summaries from
** Information on response rates is available for researchers working with microdata to develop their own estimates, and for those who want to scour the technical and methodological documentation. But response rates aren’t of much interest to most audiences.
*** After the JD researchers noted that young law graduates working in large urban markets—presumably a relatively high-income group—were particularly reluctant to respond to the survey. From After the JD III:
“Responses . . . varied by urban and rural or regional status, law school rank, and practice setting. By Wave 2, in the adjusted sample, the significant difference between respondents and nonrespondents continued to be by geographic areas, meaning those from larger legal markets (i.e. New York City) were less likely to respond to the survey. By Wave 3, now over 12 years out into practice, nonrespondents and respondents did not seem to differ significantly in these selected characteristics.”
In the first wave of the study, non-respondents were also more likely to be male and black. All in all, it may be hard to say what the overall direction of any nonresponse bias might be with respect to incomes. A fairly reasonable assumption might be that the responders and non-responders are reasonably close with respect to income, at least within job categories.
April 08, 2015
Opportunity costs and tradeoffs are foundational principles of micro-economics. Comparison between earnings with a law degree and earnings with likely alternatives to law school is the core of The Economic Value of a Law Degree.
In her recent post, Professor Merritt raises interesting questions about whether some students who now go to law school could have had more success elsewhere if they had majored in a STEM (Science Technology Engineering & Math) field rather than humanities or social sciences.
These questions, however, don’t invalidate our analysis. Some of those who major in STEM fields, of course, go on to law school, and our data suggest that they too receive a large boost to their earnings compared to a bachelor’s degree. Some studies suggest that among those who go to law school, STEM and economics majors earn more than the rest.
Research on college major selection reveals that many more individuals intend to major in STEM fields than ultimately complete those majors. STEM/Econ majors who persist have higher standardized test scores than humanities/social science majors at the same institution and also higher scores than those who switch from STEM/Econ to humanities or social science. Those who switch out of STEM received lower grades in their STEM classes than those who persist. Compared to Humanities and Social Science majors, the STEM majors spend more time studying, receive lower grades, and take longer to complete their majors.
In other words, many of the individuals who end up majoring in the humanities and social sciences may have attempted, unsuccessfully, to major in STEM fields. (For a review of the literature, see Risk Based Student Loans and The Knowledge Tax).
In The Economic Value of a Law Degree, Frank McIntyre and I investigated whether the subset of humanities majors who go to law school had unusually high earning potential and found no evidence suggesting this. The humanities majors who attend law school are about as much above the average humanities major in terms of earning potential as the STEM majors who attend law school are above the average STEM major.
In her recent post, Professor Merritt does not suggest alternatives to law school. Instead she selectively discusses occupations other than being a lawyer. These are generally very highly paid and desirable occupations, such as senior managerial roles, and many individuals who pursue such jobs will be unable to obtain them. In other words, these highly paid jobs cited by Professor Merritt are not the likely alternative outcome for most of those who now go to law school if they chose another path. (Indeed, given the high earnings premium to law school, including the 40 percent of graduates who do not practice law, a law degree probably increases the likelihood of obtaining highly paid jobs other than practicing law).
Occupations are outcomes. Education is a treatment. Students choose education programs (subject to restrictive admissions policies and the challenges of completing different programs), but have more limited control over their ultimate occupation. Comparing occupations as if they were purely choices would be an error. Not every MBA who sets out to be a Human Resources Manager will land that job, just as not every law school graduate will become a lawyer at a big firm. Analysis of nationally representative data from the U.S. Census Bureau, using standard statistical techniques from labor economics to consider realistic earnings opportunities rather than a selective focus on the very highest paid occupations tracked by the BLS, suggests that most of the folks who go to law school would be in much less attractive positions if they had stuck with a bachelor’s degree.
Frank McIntyre and I have previously noted the importance of additional research into how the value of a law degree varies by college major, and how the causal effect of different kinds of graduate degrees varies for different sorts of people.
We appreciate Professor Merritt’s interest in these issues and look forward to discussing them in the future when more methodologically rigorous research becomes available. Professor Merritt raises some interesting ancillary issues about response rates, but discussion of those issues will have to wait for a future post.
After my first post on employment definitions, a law school dean emailed me to suggest that perhaps the ABA felt it needed to be extra tough because it was worried it couldn’t trust some of the law schools to make close judgment calls in categorizing employment data.
The Census Bureau does a wonderful job collecting and reporting earnings and employment data using standard methods and definitions. Why not empower the Census Bureau to collect the relevant data about law schools and all programs of higher education?
There are two potential uses of employment outcome data of law school graduates.
(1) Comparing law school to alternatives to law school
(2) Comparing law schools to each other
Census Bureau data is very well suited to the first use, and could also be useful for high-level information about geography or rank even if not for comparisons of individual institutions. If the Current Population Survey and the American Community Survey—which have larger sample sizes and release data more regularly than the Survey of Income and Program Participation—were expanded to include questions on graduate education field (i.e., law, medicine, business) as well as level (B.A., Master’s, PhD, or professional degree), along with information about the institution attended or its caliber and geography, that would go a long way toward making law school data redundant. Census surveys will not have data on every law graduate, but as long as the sample is representative, that is not much of a problem.
The Census Bureau data would likely be superior to law school data in the most important respects because it would be comparable to data for those with other educational backgrounds. Since Census Bureau data is for a representative sample of the population, it would not encourage an unhealthy and misleading fixation on short-term outcomes.
As far as comparing individual law schools to each other, student loan default data from the Department of Education might serve this function at least as well as ABA data. To the extent we are concerned about poor outcomes at any particular law school, such poor outcomes will show up in higher student loan default rates.
Default rates will reflect outcomes not only for graduates, but also for those who fail to complete the program. This data would also not be sensitive to response bias on the low end—individuals who do not respond to their student loan bills will be counted as defaulters. Another advantage of this data is that it can be compared with other educational programs. Of course, we would still need to be mindful of the issue of selection versus causation. (Although we could quibble about how the Department of Education calculates its default rates (they publish more than one), the specifics of the definition are far less important than the fact that it is applied consistently across institutions, is used for comparative purposes, and is correlated with other validated measures).
If the Department of Education required colleges and universities to release separate default rate data for every field of graduate study (and perhaps for every college major), that would go a long way to helping inform students and increasing comparability of information about risk levels across programs. (I’ve discussed the merits of this kind of granular disclosure before).
The data won’t capture differences in the boost to earnings across law schools for students in the middle or high end of the distribution, since relatively few students default on their loans. It also won’t tell us anything about the students who don’t need to borrow. Nor will it tell us which schools have the strongest alumni networks in specific geographies or industries. That purpose might be better served by expanding longitudinal studies like After the JD, Baccalaureate and Beyond, National Longitudinal Survey of Youth, and the National Survey of College Graduates to include larger samples, better information about pre-law school differences in characteristics, and more long term information on post-graduate earnings and employment.
The Census Bureau’s ethics and incentives are unimpeachable. Putting data collection in its capable hands and into the hands of similar agencies charged with broad-based data collection would enable these agencies to do more of what they do best and free law schools from the burdens of a task they may not be well equipped to handle.
Resources that are now wasted collecting very precise but not very useful data about initial outcomes for law graduates could instead be redeployed to analyzing the higher quality data. (Or if we still think short term ABA and NALP data provide incremental value that exceeds the costs of collecting, reporting, and interpreting the data—and the costs of predictable misinterpretation and misuse—we could have that much more data to work with).
Food for thought.
April 07, 2015
Recently, The New York Times reported on law school and the legal profession based on hard data and peer reviewed research rather than anecdote and innuendo. The New York Times came to the conclusion that anyone looking honestly at the data would naturally come to—law school seems to be a pretty good investment, at least compared to a terminal bachelor’s degree.
Mr. McEntee suggests incorrectly that The New York Times reported Georgetown’s median private sector salary without providing information on what percentage of the class or of those employed were working in the private sector. (Mr. McEntee also seems to be confused about the difference between response rates—the percentage of those surveyed who respond to the survey or to a particular question—and nonresponse bias—whether those who respond to a survey are systematically different along the measured variable from those who do not).
The New York Times wrote:
Last year, 93.2 percent of the 645 students of the Georgetown Law class of 2013 were employed. Sixty percent of the 2013 graduates were in the private sector with a median starting salary of $160,000.
Deborah Merritt disputes the accuracy of these numbers, suggesting that the correct figure is 60 percent of the 93.2 percent of the graduating class who were employed. That would come to 56 percent of the class employed in the private sector, a small enough difference that The New York Times may simply have rounded up.
In any case, it is clear that The New York Times provided information about the percent of graduates working in the private sector.
Mr. McEntee also repeats the odd claim that by reporting employment numbers that appear to be close to consistent with the standard definition of “employment” established by the U.S. Census Bureau and promulgated internationally by the International Labor Organization, The New York Times is somehow misleading its readers.
To the contrary, it is Mr. McEntee’s non-standard definitions of employment, taken out of context, that are likely to mislead those attempting to compare law school statistics to the employment statistics of the next best alternative. Mr. McEntee discusses full-time employment statistics for law schools without noting that the full-time employment rate for law graduates is higher than the full-time employment rate for bachelor’s degree holders with similar levels of experience and backgrounds under consistent definitions and survey methods. And he overlooks the evidence that those who do not practice law still benefit from their law degrees.
Mr. McEntee also inaccurately describes my research with Frank McIntyre, claiming incorrectly that we do not take into account those who graduated after 2008. Timing Law School was specifically designed to address this limitation of our earlier research.
Timing Law School includes an analysis of two proxies for law school graduates from the American Community Survey: (1) young professional degree holders excluding those working in medical professions, and (2) young lawyers. This analysis includes individuals who graduated as recently as 2013, and finds no evidence of a decline in recent law graduates’ outcomes relative to those of similar bachelor’s degree holders. (See also here for a discussion of recent data for the subset of law graduates who work as lawyers).
Timing Law School also simulates the long term effects on the earnings premium of graduating into a recession based on the experiences of those who have graduated into previous recessions. The differences between graduating into a recession and graduating into an average economy are not very large (there is a large boost for those graduating into a boom, but booms and recessions are not predictable at the time of law school matriculation).
Moreover, in Timing Law School we find that fluctuations in short-term outcomes for recent graduates are not good predictors of outcomes for those who are currently deciding whether or not to enter law school; long term historical data is a better predictor.
The Economic Value of a Law Degree did not include data on those who graduated after 2008 because such data was not available in the Survey of Income and Program Participation. However, it did include earnings data through 2013, and found no evidence of the earnings premium for law graduates declining in recent years to below its historical average.
Frank and I have noted repeatedly that our analysis compares a law degree to a terminal bachelor’s degree and that we think an important area for future research is careful comparative analysis of alternate graduate degrees, being mindful of selection effects (read The Economic Value of a Law Degree or for the most recent example, see our post from two days ago). While a casual (i.e., not causal) examination of raw data suggests that a law degree likely compares reasonably well to most alternatives other than a medical degree, we’ve noted that it’s possible that more rigorous analysis will reveal that another graduate degree is a better option for some prospective law students, especially when subjective preferences are taken into account along with financial considerations.
Mr. McEntee claims incorrectly that when it comes to other graduate degrees, “McIntyre and Simkovic don’t know and don’t care; they’re convinced that the value of a law degree [is] as immutable as the laws of nature.”
Mr. McEntee insists that law graduates, even at the higher ranked schools, will find it challenging to repay their student loans. However, data from After the JD shows that law school graduates from the class of 2000/2001 have been paying down their loans rapidly.
What about those who entered repayment more recently, when tuition was higher and job prospects less plentiful?
Data from the U.S. Department of Education shows that law students, even at low ranked law schools, remain much less likely to default than most student borrowers. This is true even though law students typically graduate with higher debt levels.
Indeed, The Economic Value of a Law Degree suggests that law graduates generally have higher incomes after taxes and after paying additional debt service than they likely would have had with a terminal bachelor’s degree, even before taking into account debt forgiveness available under Income Based Repayment plans.
Based in part on our research, private student lenders have noticed how unlikely law graduates are to fail to repay their loans. These lenders offer refinancing at substantially lower rates than those charged by the federal government, further reducing the costs of legal education for many graduates (while earning a profit in the process).
No matter what new information becomes available, Mr. McEntee insists that law school is financially disastrous. This is curious for a public figure who claims that his goal is providing prospective law students more accurate information about law school.
April 05, 2015
The choice of whether or not to go to law school is always a choice between law school and the next best alternative. College graduates do not vanish from the face of the earth if they choose not to go to law school. They still must find work, continue their education, or find some other source of financial support.
The question everyone who decides not to go to law school, and every critic of law schools, must answer is—what else out there is better?*
To enable prospective students to compare law school to the next best alternative, we need standardized measurements that apply to both law school and alternatives to law school.
Professor Merritt objects to the standard definition of employment used by the United States government, which she believes is too loose, since it includes an individual as employed if the individual works just one hour during the week of the interview. (This is also the international standard promulgated by the International Labor Organization and widely used around the world).
Using the standard definition of employment and consistent survey and reporting methods reveals that law graduates are more likely to be employed than similar bachelor’s degree holders.
The important thing is not the measurement itself, but rather the relative (causal) differences between law school and the next best alternative. Only by using consistent measurements for law school and the alternatives to law school can we understand those differences.
A single measurement like employment status may not provide all of the information we want. (As a discrete variable, employment status will contain less information than a continuous variable like earnings or hours of work). The solution is to use several standard measurements consistently to compare two different populations. For example, in addition to employment, we might consider work hours, the percent of individuals working “full-time” (i.e., more than 35 hours per week), earnings, or wages (earnings per hour).
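A sketch of the idea, using invented records; the point is that identical definitions are applied to both groups, so it is the comparison, not any single figure, that carries information:

```python
# Toy records (invented): (weekly_hours, annual_earnings); hours == 0 means
# the graduate was not employed in the reference week.
law_grads = [(50, 120_000), (40, 70_000), (0, 0), (45, 160_000), (20, 30_000)]
ba_grads  = [(40, 55_000), (0, 0), (0, 0), (30, 40_000), (40, 60_000)]

def measures(group):
    """Apply the same standard definitions to any group of records."""
    employed = [(h, e) for h, e in group if h > 0]  # ILO-style: any work counts
    return {
        "employment_rate": len(employed) / len(group),
        "full_time_rate": sum(h >= 35 for h, _ in group) / len(group),
        "mean_earnings": sum(e for _, e in employed) / len(employed),
    }

law_stats, ba_stats = measures(law_grads), measures(ba_grads)
# In this toy data every measure points in the same direction; the
# between-group comparison, not any single number, is what is informative.
```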
In The Economic Value of a Law Degree, Frank McIntyre and I find that no matter which of these measurements we use, the results always point in the same direction. Law graduates participate more actively in the work force and are much better paid than similar bachelor’s degree holders.
If what students really care about is whether law school is a good investment financially, then no isolated measurement taken 9 or 10 months after graduation will provide much insight. (Especially since other educational programs and data collection agencies are not specifically collecting data 9 or 10 months after graduation).
To answer the investment question, we need estimates of the causal effect of education on the present value of lifetime earnings—what Frank and I try to do in The Economic Value of a Law Degree.
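As a stylized sketch of what that calculation involves (all figures are invented for illustration and are not the paper's estimates or methods):

```python
# Stylized present-value sketch (all numbers invented).
def present_value(cash_flows, rate):
    """Discount a stream of annual cash flows back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical annual earnings premium over a bachelor's degree: starts at
# $20k, grows with experience, and flattens at 2.5x over a 40-year career.
premium = [20_000 * min(1 + 0.05 * t, 2.5) for t in range(40)]

pv_premium = present_value(premium, rate=0.03)
# The year-one number (premium[0]) reveals little about pv_premium, which
# depends on the entire 40-year trajectory and the discount rate.
```

Even in this toy version, the first-year premium is a small fraction of the discounted lifetime value, which is why a snapshot taken 9 or 10 months after graduation is a poor proxy for the investment value of the degree.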
To the extent that measurements at or shortly after graduation are useful at all, it is only for purposes of comparison, and only then while being mindful of the differences between outcomes and causation.
Using non-standard definitions means that ABA data at best can facilitate comparisons between different law schools,** but cannot readily be used to compare law school to any alternative.
Professor Merritt argues that the American Bar Association should require law schools to use a uniquely stringent system of measuring employment. On her view, to demonstrate that the legal profession holds itself to a higher standard of ethics, law schools should report lower employment rates than everyone else by using less inclusive, non-standard definitions of employment.
I disagree with the premise that different definitions lead to higher standards. Professor Merritt’s proposal would mean that law school statistics could not be compared to any other employment statistics and, if history is any guide, would contribute greatly to student confusion and error.
The right thing to do is to report standardized measurements so that law school statistics can be readily compared to statistics of other education programs, as well as used to compare law schools to each other.
* Another graduate degree might be better than law school for a particular individual, especially when preferences for certain kinds of education or work are taken into account along with differences in financial value added. One of the frontiers of labor economics research is comparative analysis of the causal effects of different kinds of graduate education.
** I have reservations about the extent to which ABA initial outcome data should be used to compare the value added by one law school to the value added by another. There are large differences between the student bodies of different law schools along dimensions that predict earning potential—standardized test scores, GPA, college quality and college major, socioeconomic status, and demographics. The differences between students matriculating to the highest and lowest ranked law schools appear to be much larger than the differences between the average college graduate and the average law student. While law schools disclose information about their entering classes, they do not reveal information about their graduates. Entering characteristics could be different from graduating characteristics for schools that accept large numbers of transfer students or have unusually high attrition. In addition, the average growth rate of earnings at different law schools might be different and comparing only initial earnings could lead to misleading results.