April 30, 2016
New research from Dan Schwarcz and Dion Farganis at Minnesota argues that giving first-year law students practice problems and exercises similar to their final exams, along with individual feedback before the final examination, can improve their grades.
Schwarcz and Farganis tracked the performance of first-year students who were randomly assigned to sections and, as a result, took courses with professors who either provided exercises and individual feedback before the final examination or did not provide feedback.
When the students who studied under feedback professors and the students who studied under no-feedback professors took a separate required class together, the feedback students received higher grades after controlling for several factors that predict grades, such as LSAT scores, undergraduate GPA, gender, race, and country of birth. The increase in grades appears to be larger for students toward the bottom half of the distribution. The paper also attempts to control for variation in instructor ability using student evaluations of teacher clarity.
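The logic of this kind of controls-based comparison can be sketched with a toy regression. Everything below is synthetic and illustrative, not the authors' data or model: random section assignment plus ordinary least squares with credential controls, where the coefficient on the feedback dummy recovers the (made-up) feedback effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Synthetic student characteristics (illustrative distributions only)
lsat = rng.normal(160, 6, n)        # LSAT score
ugpa = rng.normal(3.4, 0.3, n)      # undergraduate GPA
feedback = rng.integers(0, 2, n)    # 1 if randomly assigned to a feedback section

# Simulated grades: credentials matter, plus a true "feedback effect" of 0.30
grades = (0.05 * (lsat - 160) + 0.8 * (ugpa - 3.4)
          + 0.30 * feedback + rng.normal(0, 0.5, n))

# OLS with controls: the coefficient on `feedback` estimates the effect of
# feedback holding LSAT and UGPA constant
X = np.column_stack([np.ones(n), feedback, lsat, ugpa])
beta, *_ = np.linalg.lstsq(X, grades, rcond=None)
print(f"estimated feedback effect: {beta[1]:.2f}")
```

Because section assignment is random, the estimated coefficient should land near the true value of 0.30; the controls mainly sharpen the estimate. The harder problem the post raises next is that instructors, unlike sections, are not randomly assigned to give feedback.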
It’s an interesting paper, and part of a welcome trend toward assessing proposed pedagogical reform through quasi-experimental methods.
The interpretation of these results raises a number of questions which I hope the authors will address more thoroughly as they revise the paper and in future research.
For example, are the differences due to instructor effects rather than feedback effects? Students are randomly assigned to instructors who happen to voluntarily give pre-final exam feedback. These might be instructors who are more conscientious, dedicated, or skilled and who also happen to give pre-exam feedback. Requiring other instructors to give pre-exam feedback—or having the same instructors provide no pre-exam feedback—might not affect student performance.
Controlling for instructor ability based on teaching evaluations is not entirely convincing, even if students are ostensibly evaluating teacher clarity. There is not very strong evidence that teaching evaluations reflect how much students learn. An easier instructor who covers less substance might receive higher teaching evaluations across the board than a rigorous instructor who does more to prepare students for practice. Teaching evaluations might reflect friendliness or liveliness or attractiveness or factors that do not actually affect student learning outcomes but that have consumption value for students. Indeed, high feedback professors might receive lower teaching evaluations for the same quality of teaching because they might make students work harder and because they might provide negative feedback to some students, leading students to retaliate on teaching evaluations.
These issues could be addressed in future research by asking the same instructor to teach two sections of the same class in different ways and measuring both long-term student outcomes and teaching evaluations.
Another question is: are students simply learning how to take law school exams? Or are they actually learning the material better in a way that will provide long-term benefits, either in bar passage rates or in job performance? At the moment, the data is not sufficient to know one way or the other.
A final question is how much providing individualized feedback will cost in faculty time, and whether the putative benefits justify the costs.
It’s a great start, and I look forward to more work from these authors, and from others, using quasi-experimental designs to investigate pedagogical variations.
April 22, 2016
Professor Paula Franzese of Seton Hall Law School is something of a patron saint of law students. Widely known for her upbeat energy, kindness, and tendency to break into song to help students remember a particularly challenging point of law, Paula has helped hundreds of thousands of lawyers pass the bar exam through her videotaped Property lectures for BarBri.
Paula is such a gifted teacher that she won teacher of the year almost every year until Seton Hall implemented a rule to give others a chance: no professor can win teacher of the year more than two years in a row. Since the rule was implemented, Paula has won every other year. She’s also incredibly generous, leading seminars and workshops to help her colleagues improve their teaching.
Paula recently wrote a book encouraging law students to have a productive, upbeat, happy, and grateful outlook on life (A Short & Happy Guide to Being a Law School Student).
Paula’s well-intentioned book has, rather bizarrely, been attacked by scambloggers as “dehumanizing,” “vain,” “untrustworthy,” and “insidious.” The scambloggers are not happy people, and they reacted as if burned by Paula’s sunshine. They worry that Paula’s thesis implies that “their failure must be due to their unwillingness to think happy and thankful thoughts.”
Happiness and success tend to go together. Some people assume that success leads to happiness, but a growing number of psychological studies suggest that happiness causes success (here and here). Happiness often precedes and predicts success, and it appears to be strongly influenced by genetic factors.
Leaving aside the question of how much people can change their baseline level of happiness, being happier—or at least outwardly appearing to be happier—probably does contribute to success, and being unhappy probably is a professional and personal liability.
People like working with happy people. They don’t like working with people who are unhappy or unpleasant. This does not mean that people who are unhappy are to blame for their unhappiness, any more than people who are born with disabilities are to blame for being deaf or blind.
But it does raise serious questions about whether studies of law graduates’ levels of happiness are measuring causation or selection. We would not assume that differences between the height of law graduates and the rest of the population were caused by law school attendance, and we probably should not assume that law school affects happiness very much either.
April 01, 2016
March 16, 2016
Statistician and data visualization expert Hans Rosling recently took the media to task for misleading readers and viewers using unrepresentative anecdotes and ignoring contradictory data.
Rosling says "You can't trust the news outlets if you want to understand the world. You have to be educated and then research basic facts."
While journalists often depict the developing world as full of "wars, conflicts, chaos," Rosling says "That is wrong. [The press] is completely wrong. . . . You can choose to only show my shoe, which is very ugly, but that is only a small part of me. . . . News outlets only care about a small part but [they] call it the world."
Rosling complains that the slow but steady march of progress is not considered news.
Rosling is famous for his data visualizations, especially this video briefly illustrating 200 years of global progress toward health and prosperity. It's optimism for the data-driven set (and is a big hit in my business law classes).
March 05, 2016
That’s the question Frank McIntyre and I try to answer in Value of a Law Degree by College Major. Economics seems to be the “best” major for aspiring law students, with both high base earnings with a bachelor’s degree and a large boost to earnings from a law degree. History and philosophy/religion majors get a similarly large boost from a law degree but start from a lower undergraduate base and, among those with law degrees, typically end up earning substantially less than economics majors.
The abstract and a figure are below:
We estimate the increase in earnings from a law degree relative to a bachelor’s degree for graduates who majored in different fields in college. Students with humanities and social sciences majors comprise approximately 47 percent of law degree holders compared to 23 percent of terminal bachelor’s. Law degree earnings premiums are highest for humanities and social sciences majors and lowest for STEM majors. On the other hand, among those with law degrees, overall earnings are highest for STEM and business majors. This effect is fairly small at the low end of the earnings distribution, but quite large at the top end. The median annual law degree earnings premium ranges from approximately $29,000 for STEM majors to $45,000 for humanities majors.
These results raise an intriguing question: should law schools offer larger scholarships to those whose majors suggest they will likely benefit less from their law degrees? Conversely, should law schools charge more to those who will likely benefit the most?
Figure 3: ACS Mean Earnings for Professional Degree Holders (Narrow) by Selected Field of Study* (2014 USD Thousands)
* Includes degree fields with more than 700 professional degree holders in the sample.
COMMENT FROM BRIAN LEITER: The lumping of philosophy majors together with religion invariably pulls down the performance of philosophy majors!
February 09, 2016
The latest unscientific fad among law school watchers is comparing job openings projections for lawyers from the Bureau of Labor Statistics* with the number of students expected to graduate from law school. Frank McIntyre and I tested this method of predicting earnings premiums--the financial benefits of a law degree--using all of the available historical projections from the BLS going back decades. This method of prediction does not perform any better than random chance.** Labor economists--including those working at the BLS--have explicitly stated that BLS projections should not be used to try to value particular courses of study. Instead, higher education should be valued based on earnings premiums.
Bloggers who report changes in BLS projections and compare projected job openings to the number of students entering law school might as well advise prospective law students to make important life decisions by flipping a coin.
Many law graduates won't practice law. Many engineering graduates won't become engineers. Many students in every field end up working jobs that are not directly related to what they studied. They still typically benefit financially from their degrees by using them in other occupations where additional education boosts earnings and likelihood of employment.
And if one's goal really is to practice law even if practicing law is not more lucrative than other opportunities opened by a law degree, then studying law may not be a guarantee, but it still dramatically improves the odds.
* BLS job opening projections--which are essentially worthless as predictors for higher education--should not be confused with BLS occupational employment statistics, which provide useful data about earnings and employment in many occupations, including for lawyers.
** There isn’t even strong evidence that changes in the ratio between BLS projected lawyer job openings and law class size predict changes in the percent of law graduates who will practice law, although the estimates are too noisy to be definitive. Historically, the ratio of BLS projected openings to law graduates (or first year enrollments 3 years prior) has systematically under-predicted by a wide margin the proportion of law graduates practicing law shortly after graduation, although it is clear that a large minority of law graduates do not practice law.
February 9, 2016 | Guest Blogger: Michael Simkovic
February 03, 2016
February 02, 2016
Smaller or Larger Law Class Sizes Don’t Predict Changes in Financial Benefits of Law School (Michael Simkovic)
One of the most surprising and controversial findings from Timing Law School was that changes in law school graduating class size do not predict changes in the boost to earnings from a law degree.* Many law professors, administrators, and critics believe that shrinking the supply of law graduates must surely improve their outcomes, because if supply goes down, then price—that is, earnings of law graduates—should go up.
In a new version of Timing Law School, Frank McIntyre and I explore our counterintuitive results more thoroughly. (The new analysis and discussion appear primarily in Part III.C, “Interpreting zero correlation for cohort size and earnings premium,” on pages 18-22 of the Feb. 1, 2016 draft, and in Table 10 on the final page.)
Our finding of no relationship between class size and earnings premiums was robust to many alternative definitions of cohort size that incorporated changes in the number of law graduates over several years. The remaining question is whether our findings are merely predictive or should be given a causal interpretation.
We considered several interpretations that could reconcile our results with a supply and demand model and with the data. The most plausible interpretation seemed to be that when law class sizes change, law graduates switch between practicing law and other employment opportunities that are equally financially rewarding. While changes in the number of law graduates might have an impact on the market for entry-level lawyers, such changes are much less likely to have a discernible impact on the much larger market for highly educated labor.
Although law graduates who practice law on average earn more than those who do not, at the margin, those who switch between practicing law and other options seem to have law and non-law opportunities that are similarly lucrative. We found that the proportion of recent law graduates who practice law does decline as class size increases, but earnings premiums remain stable. This is consistent with the broader literature on underemployment, and supports the view of law school as a flexible degree that provides benefits that extend to many occupations. (See here and here).
A related explanation is that relatively recent law school graduates may be reasonably good substitutes for each other for several years, blunting the impact of changes in class size on earnings.
Interpretations that depend on law students and law schools perfectly adjusting class size in anticipation of demand for law graduates or future unemployment seem implausible given the unrealistic degree of foresight this would require. Offsetting changes in the quality of students entering law school—an explanation we proposed in an earlier version of the paper—seems able to explain at most a very small supply effect. Although credentials of entering classes appear to decline with class sizes, these changes in credentials are relatively small even amid dramatic changes in class size, and probably do not predict very large changes in earnings. Imprecision in our estimates is another possibility, although for graduates with more than a few years of experience, our estimates are fairly precise.
Even if there are effects of law class size on law earnings premiums, they are probably neither very large nor very long-lasting.
* The finding is consistent with mixed results for cohort size in other econometric studies of cohort effects, but nevertheless was contrary to many readers’ intuitions.
December 02, 2015
Developer of Law School Admission Test (LSAT) Disputes Advocacy Group's Bar Exam Claims (Michael Simkovic)
The Law School Admission Council (LSAC)--the non-profit organization that develops and administers the Law School Admission Test (LSAT)--recently issued a press release disputing claims by the advocacy group "Law School Transparency" about the relationship between LSAT scores and bar passage rates. "Law School Transparency," headed by Kyle McEntee, prominently cited the LSAC National Longitudinal Bar Passage Study (1998) as a key source for its claims that many law schools are admitting students who, based largely on their LSAT scores, are unlikely to pass the bar exam. The group's claims of bar passage risk were widely disseminated by the national press.
However, according to LSAC, the Longitudinal Bar Passage Study does not provide much support for "Law School Transparency's" claims. Moreover, "Law School Transparency's" focus on first time bar passage rates is potentially misleading:
"The LSAC [National Longitudinal Bar Passage] study did state that 'from the perspective of entry to the profession, the eventual pass rate is a far more important outcome than first-time pass rate.' This statement is as true today as it was 25 years ago. As noted by LST, the LSAC study participants who scored below the (then) average LSAT score had an eventual bar passage rate of over 90 percent.
Kyle McEntee and David Frakt responded to some of LSAC's critiques--partly on substance, by pointing to disclaimers in the full version of "Law School Transparency's" claims, and partly by smearing the technical experts at LSAC as shills for law schools--but notably did not explain why "Law School Transparency" chose to focus on first-time bar passage rates rather than the seemingly more important--and much higher--eventual bar passage rates.
Eventual bar passage rates were the focus of the National Longitudinal Bar Passage Study. The LSAC study's executive summary highlights eventual bar passage rates, and detailed data are presented on pages 32 and 33. Even among graduates of the lowest "cluster" of law schools, around 80 percent eventually passed the bar exam.
According to LSAC:
"The LSAC National Longitudinal Bar Passage Study was undertaken primarily in response to rumors and anecdotal reports suggesting that bar passage rates were so low among examinees of color that potential applicants were questioning the wisdom of investing the time and resources necessary to obtain a legal education."
"Law School Transparency" has revived similar concerns, but without a specific focus on racial minorities.*
There may be legitimate concerns about long-term eventual bar passage rates for some law students. However, "Law School Transparency's" back-of-the-envelope effort, focused on short-term outcomes, does not provide much insight into long-term questions. The most rigorous study of this issue to date--the LSAC National Longitudinal Bar Passage Study--concluded that "A demographic profile that could distinguish first-time passing examinees from eventual-passing or never-passing examinees did not emerge from these data. . . . Although students of color entered law school with academic credentials, as measured by UGPA and LSAT scores, that were significantly lower than those of white students, their eventual bar passage rates justified admission practices that look beyond those measures."

Unfortunately, some newspapers reported "Law School Transparency's" bar passage risk claims in ways that suggested the claims were blessed by LSAC, or even originated from LSAC. For example, one prominent newspaper's editorial board wrote that "In 2013, the median LSAT score of students admitted to [one law school] was in the bottom quarter of all test-takers nationwide. According to the test’s administrators, students with scores this low are unlikely to ever pass the bar exam."
October 28, 2015
N.Y. Times is Mistaken: Law Student Loans are Safe and Profitable for the Government (Michael Simkovic)
This weekend, The New York Times Editorial Board published a sensationalist lead editorial, “The Law School Debt Crisis,” claiming that law student borrowing is harmful to taxpayers. The New York Times is mistaken.
The Times cited Florida Coastal School of Law, a for-profit institution, as its prime example of law schools “vacuuming up hordes of young people, charging them outrageously high tuition and, after many of the students fail to become lawyers, sticking taxpayers with the tab for their loan defaults.” Florida Coastal seems like an easy target—even a Federal Court which dismissed a fraud suit against Florida Coastal described it as having “some of the lowest admissions standards of accredited or provisionally accredited law schools in the nation.” The Times has repeatedly criticized for-profit colleges, which it deems “predatory” based on their unusually high student loan default rates. (See opinion, upshot, news and news again).
If the Editorial Board's accusations were true—if the “majority of law schools” really were running “a scam” in which they load down their students with “crushing amounts of debt” which “they can’t repay”—Florida Coastal and other law schools should have among the highest default rates of any institutions of higher education in the country.
They don’t and they aren’t.
For the cohort entering repayment in 2012--the most recent year for which data are available*--the national three-year cohort default rate on federal student loans was 11.8 percent. The comparable figure for Florida Coastal was only 1.1 percent--roughly one-tenth the national rate.
Other measures tracked by the Department of Education, like repayment rates, also show law school borrowers performing as well or better than most.
We see the same pattern across law schools, going back as many decades as data are available.** Even low-ranked law schools with allegedly “outrageously high” tuition generally have much lower student loan default rates than either the national average or the average for institutions that grant bachelor’s or advanced degrees.
Law students not only have higher debts than most student loan borrowers; as professional students, they also pay higher interest rates on government loans than undergraduates.
Law students rarely default because the financial benefits they receive from attending law school are usually far greater than the costs.*** Law school typically boosts annual earnings by around $30,000 (median) to $60,000 per year (mean) compared to a bachelor’s degree.**** Even at the 25th percentile, toward the low end of the distribution, the annual boost to earnings is around $20,000 per year—more than enough to repay typical law school loans over the course of a career.
Taxpayers also benefit. For every extra dollar a law graduate earns, the federal government receives an extra 30 to 40 cents in payroll and income taxes. The federal government charges far more in taxes than most law schools charge in tuition.
But the government isn’t paying for most law graduates’ education. In fact, loans to law students are among the most profitable in the federal government’s student loan portfolio, thanks to high interest rates and low default rates. Many law graduates are such good credit risks, and are overcharged so much by the government, that private lenders have offered to refinance law graduate loans for substantially lower interest rates.
There are cases in which particular individuals have unusually bad outcomes and struggle to repay their loans. Thankfully, these situations are relatively rare among law graduates.
Incomes for law graduates may seem low when they first graduate, but typically climb rapidly over the next several decades. Education loans exist precisely so that borrowed money can be repaid later in life, when employment is more stable and incomes are usually higher.
The New York Times is right that many law school graduates—around 40 percent—do not practice law. But law graduates do not have to practice law or earn spectacular salaries to benefit financially from their degrees and repay their loans over their careers. They need only earn roughly $10,000 per year more than they would have earned without a law degree. The overwhelming majority of law graduates, including those not practicing law, receive substantially larger boosts to their earnings.
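A back-of-the-envelope amortization shows why a premium of this size can service typical law school debt. The loan balance, interest rate, and term below are hypothetical round numbers of my choosing, not figures from the post; only the $20,000 and $30,000 premium figures come from the estimates cited above.

```python
# Hypothetical loan terms (illustrative assumptions, not from any study)
debt = 120_000.0   # assumed law school loan balance
rate = 0.07        # assumed annual interest rate (professional loans cost more)
years = 20         # assumed repayment term

# Standard fixed-payment amortization formula
annual_payment = debt * rate / (1 - (1 + rate) ** -years)
print(f"annual payment: ${annual_payment:,.0f}")

# Compare against the earnings premiums cited in the post
median_boost = 30_000.0    # median annual earnings premium
low_end_boost = 20_000.0   # ~25th percentile annual premium
print(f"payment is {annual_payment / median_boost:.0%} of the median premium")
```

Even under these assumptions the payment comes to roughly $11,000 per year, below even the 25th-percentile premium, which is consistent with the claim that the typical boost to earnings is more than enough to repay typical law school loans over a career.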
Thanks to income-based repayment programs with debt forgiveness and to progressive taxation, the overwhelming majority of law school graduates who succeed effectively offset the investment risks borne by the rare, unfortunate individuals who do not benefit as much from their educations.
It would be a mistake to let the small tail of defaults wag the much larger dog of public benefits.
Scaling back law students' access to federal student loans will not benefit taxpayers. To the contrary, the loss of revenue would mean larger deficits for the government, and eventually higher taxes for the rest of us.