Brian Leiter's Law School Reports

Brian Leiter
University of Chicago Law School

A Member of the Law Professor Blogs Network

April 23, 2015

LSAC reports applicants down just 2.6% as of April 17...

...here and see the chart, which suggests we've hit bottom in terms of the applicant pool (barring dramatic economic jolts, that is, which could move things either way).  Of course, this is a bottom last seen in the 1970s, when there were 50 fewer law schools.  But given how many law schools have refrained from hiring faculty due to uncertainty about the future, my guesstimate is that we'll see a slight uptick in law school hiring next year, since many schools have unfilled needs.


April 23, 2015 in Advice for Academic Job Seekers, Legal Profession, Of Academic Interest | Permalink

April 22, 2015

A somewhat more balanced piece on law schools and the legal profession...

April 21, 2015

Standardized measurement and its discontents

At The Faculty Lounge, Professor Bernard Burk of the University of North Carolina echoes questions raised earlier by Professor Merritt of Ohio State about whether it is unethical or misleading for law schools to report employment using the international standard definition of employment.  I have discussed these issues extensively before.*

Employment statistics are primarily useful for purposes of comparing alternatives.  Comparison requires standard measurements.  Standardization is efficient because it reduces the number of definitions that must be learned to use data.  The standard definition of employment is meaningful and useful because, notwithstanding preferences for particular kinds of work, a job of some kind is generally preferable to no job at all.  This does not mean that employment is the only measurement one should consider, but rather that it is a useful measurement.

Because international standards exist, it is not necessary to explain to a college graduate what a centimeter means when describing the length of an object.  Similarly, it is not necessary to explain to college graduates contemplating law school what employment means when using the international standard definition of employment.**

College educated individuals who are unfamiliar with standard terminology can easily look up or inquire about the relevant definitions, and once they have learned them, they can begin to understand a world of data.  The standard definitions of employment and unemployment can be quickly discovered through intuitive internet searches (see searches for unemployment and employment definitions).  These definitions are neither obscure nor technically challenging.

In addition, many law schools disclose bar passage rates that are lower than their employment rates.  It seems doubtful that many college educated adults contemplating law school—in particular, the subset basing their decisions on outcome data such as employment and bar passage rates—would assume that every law graduate who is employed shortly after graduation is working as a lawyer when many of those graduates cannot legally practice law.

Critiquing international standardized measurements as inherently immoral is not without precedent.   

According to Martin Gardner, during the 1800s, a U.S. group attacked the French metric system as atheistic and immoral. 

“The president of the Ohio group, a civil engineer who prided himself on having an arm exactly one cubit in length, had this to say . . . : "We believe our work to be of God; we are actuated by no selfish or mercenary motive. We depreciate personal antagonisms of every kind, but we proclaim a ceaseless antagonism to that great evil, the French Metric System. . . . The jests of the ignorant and the ridicule of the prejudiced, fall harmless upon us and deserve no notice. . . . It is the Battle of the Standards. May our banner be ever upheld in the cause of Truth, Freedom, and Universal Brotherhood, founded upon a just weight and a just measure, which alone are acceptable to the Lord."”

“A later issue printed the words and music of a song, the fourth verse of which ran:

        Then down with every "metric" scheme
        Taught by the foreign school,
        We'll worship still our Father's God!
        And keep our Father's "rule"!
        A perfect inch, a perfect pint,
        The Anglo's honest pound,
        Shall hold their place upon the earth,
        Till time's last trump shall sound!”


A catchy tune, although I’m not sure it’s one many scientists, mathematicians or engineers would appreciate!

Many thoughtful people believe the U.S.’s non-standard approach to measurement undermines U.S. competitiveness in science, math, engineering, and industry.  Time is wasted learning and converting to and from a redundant and inefficient measurement system.  This entails opportunity cost and leads to unnecessary and avoidable errors.

Law schools, the American Bar Association, and the National Association for Law Placement would be better served by using standard definitions for labor market measurements when standard definitions are available and widely in use elsewhere, or at least labeling non-standard definitions with names that will not be readily confused with standard definitions.

The ABA currently requires law schools to describe individuals as “Unemployed” who under standard definitions would be defined as either “Not in Labor Force” or “Unemployed.”  In other words, “unemployment” as reported under ABA definitions will be higher than unemployment under the standard and most widely used government definition.  A number of people have been confused by this, incorrectly claiming that “unemployment” for law graduates is unusually high in comparison to everyone else.  In fact, under consistent measurements, the fraction of recent law graduates who are employed is higher than the overall proportion of the population that is employed.   (Law graduates also do relatively well on the percent employed full-time).
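
To make the point concrete, here is a minimal arithmetic sketch in Python (with invented numbers, not actual ABA or BLS figures) of how folding "Not in Labor Force" graduates into "Unemployed," as described above, mechanically inflates the reported rate relative to the standard definition:

# Hypothetical class of 200 graduates; all numbers are invented for illustration.
employed = 170
unemployed_seeking = 10      # standard "Unemployed": not working, actively seeking work
not_in_labor_force = 20      # e.g., pursuing further study or not seeking work

# Standard (BLS-style) definition: unemployment rate = unemployed / labor force
labor_force = employed + unemployed_seeking
standard_rate = unemployed_seeking / labor_force                      # 10 / 180, about 5.6%

# ABA-style reporting counts graduates not in the labor force as "Unemployed"
aba_rate = (unemployed_seeking + not_in_labor_force) / (
    employed + unemployed_seeking + not_in_labor_force)               # 30 / 200 = 15.0%

print(f"Standard unemployment rate:    {standard_rate:.1%}")
print(f"ABA-style 'unemployment' rate: {aba_rate:.1%}")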

I agree with Professor Burk that additional information about occupational categories could be useful to some users of data.  However, I do not agree that presenting standard summary statistics is inherently misleading or unethical, particularly for the sophisticated audience using the data —college educated, internet savvy adults.


Continue reading


April 21, 2015 in Guest Blogger: Michael Simkovic, Legal Profession, Of Academic Interest, Professional Advice, Science, Weblogs | Permalink

April 20, 2015

Latest NALP salary data

The percentage of firms paying $160,000 to start is up quite a bit since last year, but not yet back to 2009 levels, among other tidbits.


April 20, 2015 in Legal Profession, Of Academic Interest, Professional Advice, Student Advice | Permalink

Signs of the times: cutbacks at Catholic University...

April 16, 2015

Justice Scalia on Justice Ginsburg...

April 15, 2015

Signs of the times: Loyola Law School (Los Angeles) shrinking enrollment by 25%...

April 11, 2015

Offsetting Biases (Michael Simkovic)

Deborah Merritt and Kyle McEntee conflated “response rates” with nonresponse bias and response bias.  After I brought this error to light, Professor Merritt explained that she and Mr. McEntee were not confused about basic statistical terminology, but rather were being intentionally vague in their critique to be more polite* to the law schools.

Professor Merritt also changed the topic of conversation from Georgetown’s employment statistics—which had been mentioned in The New York Times and discussed by me, Professor Merritt, and Kyle McEntee—to the employment statistics of the institution where I teach.**  

What Professor Merritt meant to say is that law schools have not been properly weighting their data to take into account nonresponse bias.  This is an interesting critique.  However, proper weights and adjustments to data should take into account all forms of nonresponse bias and response bias, not just the issue of over-representation of large law firms in NALP salary data raised by Professor Merritt.

While such over-representation would have an effect on the mean, it is unclear how much impact, if any, it would have on reported medians—the measure of central tendency used by The New York Times and critiqued by Mr. McEntee.

Other biases such as systematic under-reporting of incomes by highly educated individuals,*** under-reporting of bonuses and outside income, and the like should be taken into account.****   To the extent that these biases cut in opposite directions, they can offset each other.  It’s possible that in aggregate the data are unbiased, or that the bias is much smaller than examination of a single bias would suggest.  
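
As a minimal simulation sketch of the two points above, with entirely invented salary figures and response rates (not NALP data): over-representing large-firm respondents pulls the unweighted mean up by more, in dollar terms, than it pulls the median, and re-weighting respondents back to known population shares largely corrects the mean.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Invented population: 20% large-firm graduates at $160k, 80% others around $60k.
big_firm = rng.random(n) < 0.20
salary = np.where(big_firm, 160_000.0, rng.normal(60_000, 15_000, n))

# Suppose large-firm graduates respond at 90% and everyone else at 40% (over-representation).
responds = rng.random(n) < np.where(big_firm, 0.90, 0.40)
resp_salary, resp_big = salary[responds], big_firm[responds]

print(f"True mean / median:       {salary.mean():,.0f} / {np.median(salary):,.0f}")
print(f"Respondent mean / median: {resp_salary.mean():,.0f} / {np.median(resp_salary):,.0f}")

# Post-stratification: weight respondents back to the known 20/80 population split.
weights = np.where(resp_big, 0.20 / resp_big.mean(), 0.80 / (1 - resp_big.mean()))
print(f"Weighted respondent mean: {np.average(resp_salary, weights=weights):,.0f}")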

Moreover, focusing on first year salaries as indicative of the value of a lifetime investment is itself a bias.  As The Economic Value of a Law Degree showed, incomes tend to rise rapidly among law graduates, and they do not appreciably decrease until the fourth decade of employment.

[Chart from The Economic Value of a Law Degree: law graduate earnings over the course of a career]

If Professor Merritt’s view is that differences between NALP, ABA, and U.S. Census Bureau data collection and reporting conventions make law school-collected data more difficult to compare to other data sources and make law school data less useful, then I am glad to see Professor Merritt coming around to a point I have made repeatedly.

I have gone further and suggested that perhaps the Census Bureau and other government agencies should be collecting all data for graduate degree programs to ensure the accuracy and comparability of data across programs and avoid wasting resources on duplicative data collection efforts.

This could also help avoid an undue focus on short-term outcomes, which can be misleading in light of how rapidly law graduates' earnings grow as they gain experience, particularly if students are not aware of that growth trajectory and how it compares to the likely trajectory of earnings without a law degree.

*    Readers of Professor Merritt’s blog posts will be familiar with Professor Merritt’s general level of politeness.   In her latest, Professor Merritt describes me as “clueless.”

**   This tactic, bringing up the employment statistics of the institution where those with whom she disagrees teach, is something of a habit for Professor Merritt.  See her response to Anders Walker (at St. Louis).

***  Law graduates outside of the big firms are highly educated, high-income individuals compared to most other individuals in the United States.  That is the benchmark used by researchers when they identified the reporting biases in census data that lead to under-reporting of incomes.

**** The risk of under-reporting income in law may be particularly high because of opportunities for tax evasion for those who run small businesses or have income outside of their salary.


UPDATE (4/14/2015):  I just confirmed with NALP that their starting salary data does not include end-of-year bonuses.


April 11, 2015 in Guest Blogger: Michael Simkovic, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink

April 10, 2015

Information overload and response rates (Michael Simkovic)

Did law schools behave unethically by providing employment and earnings information without simultaneously reporting survey response rates?  Or is this standard practice?   

The answer is that not reporting response rates is standard practice in communication with most audiences.  For most users of employment and earnings data, response rates are a technical detail that is neither relevant nor interesting.  The U.S. Government and other data providers routinely report earnings and employment figures separately from survey response rates.*

Sometimes, too much information can be distracting.**  It’s often best to keep communication simple and focus only on the most important details.

Nonresponse is not the same thing as nonresponse bias.  Law school critics do not seem to understand this distinction.  A problem only arises if the individuals who respond are systematically different from those who do not respond along the dimensions being measured.  Weighting and imputation can often alleviate these problems.  The critics’ claims about the existence, direction, and magnitude of biases in the survey data are unsubstantiated.
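
A minimal sketch of the distinction, again with made-up numbers: a 50% response rate by itself does not bias an estimate; bias arises only when the probability of responding is related to the quantity being measured.

import numpy as np

rng = np.random.default_rng(1)
income = rng.lognormal(mean=11.0, sigma=0.5, size=50_000)   # invented income distribution

# Case 1: nonresponse unrelated to income -- 50% respond completely at random.
random_resp = rng.random(income.size) < 0.5

# Case 2: roughly the same 50% overall response rate, but higher-income people respond more often.
p_respond = np.where(income > np.median(income), 0.8, 0.2)
related_resp = rng.random(income.size) < p_respond

print(f"True mean income:                      {income.mean():,.0f}")
print(f"Mean among random nonresponse:         {income[random_resp].mean():,.0f}")   # close to the truth
print(f"Mean among income-related nonresponse: {income[related_resp].mean():,.0f}")  # biased upward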

High nonresponse rates to questions about income are not a sign of something amiss, but rather are normal and expected.  The U.S. Census Bureau routinely finds that questions about income have lower response rates (higher allocation rates) than other questions.

Law school critics claim that law school graduates who do not respond to questions about income are likely to have lower incomes than those who do respond.  This claim is not consistent with the evidence.  To the contrary, high-income individuals often value privacy and are reluctant to share details about their finances.*** 

Another potential problem is “response bias,” in which individuals' answers differ systematically from the underlying value being measured.  For example, some individuals may under-report or over-report their incomes.

The best way to determine whether or not we have nonresponse bias or response bias problems is to gather additional information about non-responders and responders.

Researchers have compared income reported to Census surveys with administrative earnings data from the Social Security Administration and the Internal Revenue Service.  They find that highly educated, high-income individuals systematically under-report their incomes, while less educated, lower-income individuals over-report (assuming the administrative data is more accurate than the survey data).

Part of the problem seems to be that bonuses are underreported, and bonuses can be substantial.  Another problem seems to be that high-income workers sometimes report their take-home pay (after tax withholding and deductions for benefits) rather than their gross pay.
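
A back-of-the-envelope sketch, with invented figures, of how each of these reporting habits understates income relative to administrative records:

# Invented example of the two response-bias mechanisms described above.
base_salary = 120_000
bonus = 20_000
gross_income = base_salary + bonus          # what administrative records would show (140,000)

# Mechanism 1: the respondent omits the bonus.
reported_no_bonus = base_salary             # 120,000, about 14% understated

# Mechanism 2: the respondent reports take-home pay instead of gross pay.
withholding_and_benefits = 0.30             # hypothetical combined rate
reported_take_home = gross_income * (1 - withholding_and_benefits)   # 98,000, 30% understated

for label, reported in [("omits bonus", reported_no_bonus),
                        ("reports take-home pay", reported_take_home)]:
    print(f"{label}: reports {reported:,.0f} vs. actual {gross_income:,.0f} "
          f"({1 - reported / gross_income:.0%} understated)")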

Other studies have also found that response bias and nonresponse bias lead to underestimation of earnings and employment figures.

In other words, there may indeed be biases in law school earnings data, but if there are, they likely run in the opposite direction from the one the law school critics have claimed.

Of course, the presence of such biases in law school data would not necessarily be a problem if the same biases exist in data on employment and earnings for alternatives to law school.  After all, earnings and employment data is only useful when compared to data for a likely alternative to law school.

As with gross employment data, the critics are yet again claiming that an uncontroversial and nearly universal data reporting practice, regularly used by the United States Government, is somehow scandalous when done by law schools. 

The only thing the law school critics have demonstrated is their unfamiliarity with basic statistical concepts that are central to their views.


------

*    Reporting earnings and employment estimates without response rates in communication intended for a general audience—and even some fairly technically sophisticated audiences—is standard practice for U.S. government agencies such as the U.S. Census Bureau and the U.S. Department of Labor, Bureau of Labor Statistics.  A few examples below:

 **  Information on response rates is available for researchers working with microdata to develop their own estimates, and for those who want to scour the technical and methodological documentation.  But response rates aren’t of much interest to most audiences. 

*** The After the JD researchers noted that young law graduates working in large urban markets—presumably a relatively high-income group—were particularly reluctant to respond to the survey.  From After the JD III:

“Responses . . . varied by urban and rural or regional status, law school rank, and practice setting.  By Wave 2, in the adjusted sample, the significant difference between respondents and nonrespondents continued to be by geographic areas, meaning those from larger legal markets (i.e. New York City) were less likely to respond to the survey.  By Wave 3, now over 12 years out into practice, nonrespondents and respondents did not seem to differ significantly in these selected characteristics.”

In the first wave of the study, non-respondents were also more likely to be male and black.  All in all, it may be hard to say what the overall direction of any nonresponse bias might be with respect to incomes.  A fairly reasonable assumption might be that the responders and non-responders are reasonably close with respect to income, at least within job categories.


April 10, 2015 in Guest Blogger: Michael Simkovic, Law in Cyberspace, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink

April 08, 2015

Opportunities, College Majors, and Occupations (Compared to what? continued) (Michael Simkovic)

Opportunity costs and tradeoffs are foundational principles of microeconomics.  Comparison between earnings with a law degree and earnings with likely alternatives to law school is the core of The Economic Value of a Law Degree.

In her recent post, Professor Merritt raises interesting questions about whether some students who now go to law school could have had more success elsewhere if they had majored in a STEM (Science Technology Engineering & Math) field rather than humanities or social sciences. 

These questions, however, don’t invalidate our analysis.  A percentage of those who major in STEM fields of course go on to law school, and our data suggest that they also receive a large boost to their earnings compared to a bachelor’s degree.  Some studies suggest that among those who go to law school, the STEM and economics majors earn more than the rest. 

Research on college major selection reveals that many more individuals intend to major in STEM fields than ultimately complete those majors.  STEM/Econ majors who persist have higher standardized test scores than humanities/social science majors at the same institution, and also higher scores than those who switch from STEM/Econ to humanities or social science.  Those who switch out of STEM receive lower grades in their STEM classes than those who persist.  Compared to humanities and social science majors, STEM majors spend more time studying, receive lower grades, and take longer to complete their majors.

In other words, many of the individuals who end up majoring in the humanities and social sciences may have attempted, unsuccessfully, to major in STEM fields. (For a review of the literature, see Risk Based Student Loans and The Knowledge Tax).

In The Economic Value of a Law Degree, Frank McIntyre and I investigated whether the subset of humanities majors who go to law school had unusually high earning potential and found no evidence suggesting this.  The humanities majors who attend law school are about as much above the average humanities major in terms of earning potential as the STEM majors who attend law school are above the average STEM major.

In her recent post, Professor Merritt does not suggest alternatives to law school.  Instead she selectively discusses occupations other than being a lawyer.  These are generally very highly paid and desirable occupations, such as senior managerial roles, and many individuals who pursue such jobs will be unable to obtain them.  In other words, the highly paid jobs cited by Professor Merritt are not the likely alternative outcome for most of those who now go to law school if they chose another path.  (Indeed, given the high earnings premium to a law degree, estimated on a sample that includes the 40 percent of graduates who do not practice law, a law degree probably increases the likelihood of obtaining highly paid jobs other than practicing law.)

Occupations are outcomes.  Education is a treatment.  Students choose education programs (subject to restrictive admissions policies and challenges of completing different programs), but have more limited control over their ultimate occupation.  Comparing occupations as if they were purely choices would be an error.  Not every MBA who sets out to be a Human Resources Manager will land that job, just as not every law school graduate will become a lawyer at a big firm.  Analysis of nationally representative data from the U.S. Census Bureau using standard statistical techniques from labor economics to consider realistic earnings opportunities--rather than selective focus on the very highest paid occupations tracked by the BLS--suggests that most of the folks who go to law school would be in much less attractive positions if they had stuck with a bachelor’s degree.
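
For readers curious what "standard statistical techniques from labor economics" typically look like in practice, below is a stylized sketch of a Mincer-style earnings regression with a degree indicator, run on simulated data.  The variable names, coefficients, and data are all invented for illustration; this is not the model or the data used in The Economic Value of a Law Degree.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 20_000

# Simulated worker-level data (all values invented for illustration).
df = pd.DataFrame({
    "law_degree": (rng.random(n) < 0.05).astype(int),
    "experience": rng.integers(0, 40, n),
    "female": (rng.random(n) < 0.5).astype(int),
})

# Log earnings generated with a built-in 0.45 log-point premium for the law degree
# and a concave experience profile.
df["log_earnings"] = (
    10.5
    + 0.45 * df["law_degree"]
    + 0.06 * df["experience"] - 0.001 * df["experience"] ** 2
    - 0.10 * df["female"]
    + rng.normal(0, 0.6, n)
)

# Mincer-style OLS regression: the coefficient on law_degree estimates the log-earnings premium.
model = smf.ols("log_earnings ~ law_degree + experience + I(experience ** 2) + female",
                data=df).fit()
print(model.params["law_degree"])   # recovers roughly the built-in 0.45

The premium here is built into the simulation; the substantive point is simply that the comparison runs between realistic earnings with and without the degree, controlling for observable characteristics, rather than between hand-picked occupations.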

Frank McIntyre and I have previously noted the importance of additional research into how the value of a law degree varies by college major, and how the causal effect of different kinds of graduate degrees varies for different sorts of people.

We appreciate Professor Merritt’s interest in these issues and look forward to discussing them in the future when more methodologically rigorous research becomes available.  Professor Merritt raises some interesting ancillary issues about response rates, but discussion of those issues will have to wait for a future post. 



April 8, 2015 in Guest Blogger: Michael Simkovic, Legal Profession, Of Academic Interest, Professional Advice, Science, Student Advice, Weblogs | Permalink