Brian Leiter's Law School Reports

Brian Leiter
University of Chicago Law School

A Member of the Law Professor Blogs Network

Monday, April 20, 2015

Latest NALP salary data

The percentage of firms paying $160,000 to start is up quite a bit since last year, but not yet back to 2009 levels, among other tidbits.


April 20, 2015 in Legal Profession, Of Academic Interest, Professional Advice, Student Advice | Permalink

Signs of the times: cutbacks at Catholic University...

Saturday, April 18, 2015

The richest (most endowed) private universities...

Thursday, April 16, 2015

Justice Scalia on Justice Ginsburg...

Wednesday, April 15, 2015

Video of University of Oregon's announcement of Dean Schill's appointment as President

Signs of the times: Loyola Law School (Los Angeles) shrinking enrollment by 25%...

Tuesday, April 14, 2015

Chicago's Dean Mike Schill to become next President of U of Oregon

It breaks my heart to have to post this, since Mike Schill has been a terrific Dean here the last 5 1/2 years, but we all knew he was in demand elsewhere:  he will be the new President of the University of Oregon, come July 1.  Oregon is damn lucky, and I know I speak for everyone at Chicago in saying that Mike Schill will be greatly missed here.


April 14, 2015 in Faculty News | Permalink

Monday, April 13, 2015

Paul Campos's final implosion

Stephen Diamond (Santa Clara) has the details.  Campos won't be missed, it's fair to say.

UPDATE:  Ohio Northern Dean Richard Bales raises the question whether Campos should be fired, without even noting that Campos himself has confessed in print to being a "fraud." 


April 13, 2015 in Law Professors Saying Dumb Things, Of Academic Interest | Permalink

Saturday, April 11, 2015

Offsetting Biases (Michael Simkovic)

Deborah Merritt and Kyle McEntee conflated “response rates” with nonresponse bias and response bias.  After I brought this error to light, Professor Merritt explained that she and Mr. McEntee were not confused about basic statistical terminology, but rather were being intentionally vague in their critique to be more polite* to the law schools.

Professor Merritt also changed the topic of conversation from Georgetown’s employment statistics—which had been mentioned in The New York Times and discussed by me, Professor Merritt, and Kyle McEntee—to the employment statistics of the institution where I teach.**  

What Professor Merritt meant to say is that law schools have not been properly weighting their data to take into account nonresponse bias.  This is an interesting critique.  However, proper weights and adjustments should account for all forms of nonresponse bias and response bias, not just the over-representation of large law firms in NALP salary data that Professor Merritt raises.

While such over-representation would have an effect on the mean, it is unclear how much impact, if any, it would have on reported medians—the measure of central tendency used by The New York Times and critiqued by Mr. McEntee.
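
A quick simulation with invented salary figures (a $160,000 big-firm cluster and a lower-paid remainder) illustrates the point: over-sampling the high-salary cluster pulls the sample mean up substantially, while the median moves much less so long as the cluster remains above it.

```python
import random
import statistics

random.seed(0)

# Hypothetical graduating class: 200 big-firm jobs at $160,000 and 800
# other jobs clustered around $60,000 (all figures invented).
big_firm = [160_000.0] * 200
others = [random.gauss(60_000, 8_000) for _ in range(800)]
population = big_firm + others

# A survey in which big-firm graduates respond at twice the rate of
# everyone else, so large firms are over-represented among respondents.
sample = big_firm + random.sample(others, 400)

print(f"population mean:   {statistics.mean(population):>10,.0f}")
print(f"sample mean:       {statistics.mean(sample):>10,.0f}")    # pulled up sharply
print(f"population median: {statistics.median(population):>10,.0f}")
print(f"sample median:     {statistics.median(sample):>10,.0f}")  # moves far less
```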

Other biases such as systematic under-reporting of incomes by highly educated individuals,*** under-reporting of bonuses and outside income, and the like should be taken into account.****   To the extent that these biases cut in opposite directions, they can offset each other.  It’s possible that in aggregate the data are unbiased, or that the bias is much smaller than examination of a single bias would suggest.  
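
A small variation on the same toy simulation (again, all figures invented) shows the offsetting mechanism: over-representation of big-firm respondents biases the observed mean upward, while under-reporting by high earners pushes it back toward the true value.

```python
import random
import statistics

random.seed(1)

big_firm = [160_000.0] * 200
others = [random.gauss(60_000, 8_000) for _ in range(800)]
true_mean = statistics.mean(big_firm + others)

# Bias 1 (upward): big-firm graduates respond at twice the rate of others.
respondents = big_firm + random.sample(others, 400)

# Bias 2 (downward): high earners under-report by ~10% (e.g., omitted
# bonuses, or reporting take-home rather than gross pay).
reported = [s * 0.90 if s > 100_000 else s for s in respondents]

print(f"true mean:                {true_mean:>10,.0f}")
print(f"with over-representation: {statistics.mean(respondents):>10,.0f}")
print(f"with both biases:         {statistics.mean(reported):>10,.0f}")  # closer to the truth
```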

Moreover, focusing on first-year salaries as indicative of the value of a lifetime investment is itself a bias. As The Economic Value of a Law Degree showed, incomes tend to rise rapidly among law graduates, and they do not appreciably decrease until the fourth decade of employment.

[Figure: earnings trajectory of law graduates by years of experience, from The Economic Value of a Law Degree]

If Professor Merritt’s view is that differences among NALP, ABA, and U.S. Census Bureau data collection and reporting conventions make law school-collected data harder to compare to other data sources, and therefore less useful, then I am glad to see Professor Merritt coming around to a point I have made repeatedly.

I have gone further and suggested that perhaps the Census Bureau and other government agencies should be collecting all data for graduate degree programs to ensure the accuracy and comparability of data across programs and avoid wasting resources on duplicative data collection efforts.

This could also help avoid undue focus on short-term outcomes, which can be misleading in light of the rapid growth of law graduate earnings as they gain experience, particularly if students are not aware of that growth trajectory and how it compares to the likely trajectory of earnings without a law degree.

*    Readers of Professor Merritt’s blog posts will be familiar with Professor Merritt’s general level of politeness.   In her latest, Professor Merritt describes me as “clueless.”

**   This tactic, bringing up the employment statistics of the institution where those with whom she disagrees teach, is something of a habit for Professor Merritt.  See her response to Anders Walker (St. Louis).

***  Law graduates outside of the big firms are highly educated, high-income individuals compared to most other individuals in the United States.  That is the benchmark researchers used when they identified the reporting biases in census data that lead to under-reporting of incomes.

**** The risk of under-reporting income in law may be particularly high because of opportunities for tax evasion for those who run small businesses or have income outside of their salary.

UPDATE (4/14/2015):  I just confirmed with NALP that their starting salary data does not include end-of-year bonuses.


April 11, 2015 in Guest Blogger: Michael Simkovic, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink

Friday, April 10, 2015

Information overload and response rates (Michael Simkovic)

Did law schools behave unethically by providing employment and earnings information without simultaneously reporting survey response rates?  Or is this standard practice?   

The answer is that not reporting response rates is standard practice in communication with most audiences.  For most users of employment and earnings data, response rates are a technical detail that is neither relevant nor interesting.  The U.S. Government and other data providers routinely report earnings and employment figures separately from survey response rates.*

Sometimes, too much information can be distracting.**  It’s often best to keep communication simple and focus only on the most important details.

Nonresponse is not the same thing as nonresponse bias.  Law school critics do not seem to understand this distinction.  A problem only arises if the individuals who respond are systematically different from those who do not respond along the dimensions being measured.  Weighting and imputation can often alleviate these problems.  The critics’ claims about the existence, direction, and magnitude of biases in the survey data are unsubstantiated.
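
For readers curious what weighting looks like in practice, here is a minimal sketch using hypothetical strata, response rates, and salaries: each respondent is weighted by the inverse of the response rate in his or her stratum, so that the weighted sample matches the population's composition.

```python
import statistics

# Hypothetical population composition and survey respondents by stratum.
population_counts = {"big_firm": 200, "other": 800}
respondent_salaries = {
    "big_firm": [160_000] * 160,  # 80% of big-firm graduates responded
    "other":    [60_000] * 320,   # only 40% of everyone else responded
}

# Unweighted mean: distorted, because big firms are over-represented.
all_reported = [s for sals in respondent_salaries.values() for s in sals]
print(f"unweighted mean: {statistics.mean(all_reported):>10,.0f}")

# Inverse-probability weighting: each respondent stands in for
# (stratum population size / stratum respondent count) graduates.
weighted_total = 0.0
weight_sum = 0.0
for stratum, salaries in respondent_salaries.items():
    weight = population_counts[stratum] / len(salaries)
    weighted_total += weight * sum(salaries)
    weight_sum += weight * len(salaries)

# Recovers the true population mean here, but only because the strata
# capture the dimension along which response rates actually differ.
print(f"weighted mean:   {weighted_total / weight_sum:>10,.0f}")
```

As the final comment notes, weighting corrects nonresponse bias only along the dimensions the strata capture, which is one reason claims about residual bias require evidence rather than assumption.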

High nonresponse rates to questions about income are not a sign of something amiss, but rather are normal and expected.  The U.S. Census Bureau routinely finds that questions about income have lower response rates (higher allocation rates) than other questions.

Law school critics claim that law school graduates who do not respond to questions about income are likely to have lower incomes than those who do respond.  This claim is not consistent with the evidence.  To the contrary, high-income individuals often value privacy and are reluctant to share details about their finances.*** 

Another potential problem is “response bias,” in which individuals respond to survey questions in a way that is systematically different from the underlying value being measured.  For example, some individuals may under-report or over-report their incomes.

The best way to determine whether we have nonresponse bias or response bias problems is to gather additional information about responders and non-responders.

Researchers have compared income reported to Census surveys with administrative earnings data from the Social Security Administration and Internal Revenue Service.  They find that highly educated, high-income individuals systematically under-report their incomes, while less educated, lower-income individuals over-report (assuming the administrative data are more accurate than the survey data).

Part of the problem seems to be that bonuses are under-reported, and bonuses can be substantial.  Another problem seems to be that high-income workers sometimes report their take-home pay (after tax withholding and deductions for benefits) rather than their gross pay.
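
The validation approach described above can be sketched as follows. The linked records here are fabricated, but the computation, comparing each survey report to the matched administrative figure and averaging the gap by income level, captures the basic idea.

```python
import statistics

# (income reported to the survey, income in administrative records);
# all pairs fabricated for illustration.
linked_records = [
    (46_000, 44_000),    # lower earners tend to over-report slightly...
    (52_000, 50_000),
    (58_000, 57_000),
    (150_000, 168_000),  # ...while higher earners tend to under-report
    (175_000, 205_000),  # (omitted bonuses, take-home vs. gross pay)
]

low = [(r, a) for r, a in linked_records if a < 100_000]
high = [(r, a) for r, a in linked_records if a >= 100_000]

for label, group in (("lower-income", low), ("higher-income", high)):
    mean_gap = statistics.mean((r - a) / a for r, a in group)
    print(f"{label} respondents: mean reporting gap {mean_gap:+.1%}")
```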

Other studies have also found that response bias and nonresponse bias lead to underestimation of earnings and employment figures.

In other words, there may indeed be biases in law school earnings data, but if there are, they are likely in the opposite direction of the one the law school critics have claimed.

Of course, the presence of such biases in law school data would not necessarily be a problem if the same biases exist in data on employment and earnings for alternatives to law school.  After all, earnings and employment data is only useful when compared to a likely alternative.

As with gross employment data, the critics are yet again claiming that an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government, is somehow scandalous when done by law schools. 

The only thing the law school critics have demonstrated is their unfamiliarity with basic statistical concepts that are central to their views.

 

------

*    Reporting earnings and employment estimates without response rates in communication intended for a general audience—and even some fairly technically sophisticated audiences—is standard practice for U.S. government agencies such as the U.S. Census Bureau and the U.S. Department of Labor's Bureau of Labor Statistics.

**  Information on response rates is available for researchers working with microdata to develop their own estimates, and for those who want to scour the technical and methodological documentation.  But response rates aren’t of much interest to most audiences.

*** The After the JD researchers noted that young law graduates working in large urban markets—presumably a relatively high-income group—were particularly reluctant to respond to the survey. From After the JD III:

“Responses . . . varied by urban and rural or regional status, law school rank, and practice setting.  By Wave 2, in the adjusted sample, the significant difference between respondents and nonrespondents continued to be by geographic areas, meaning those from larger legal markets (i.e. New York City) were less likely to respond to the survey.  By Wave 3, now over 12 years out into practice, nonrespondents and respondents did not seem to differ significantly in these selected characteristics.”

In the first wave of the study, non-respondents were also more likely to be male and black.  All in all, it may be hard to say what the overall direction of any nonresponse bias might be with respect to incomes.  A fair assumption might be that responders and non-responders are reasonably close with respect to income, at least within job categories.


April 10, 2015 in Guest Blogger: Michael Simkovic, Law in Cyberspace, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink