April 30, 2016
New research from Dan Schwarcz and Dion Farganis at Minnesota argues that giving first-year law students practice problems similar to their final exams, along with individualized feedback before the final examination, can help improve their grades.
Schwarcz and Farganis tracked the performance of first year students who were randomly assigned to sections, and as a result took courses with professors who either provided exercises and individual feedback prior to the final examination, or who did not provide feedback.
When the students who studied under feedback professors and the students who studied under no-feedback professors took a separate required class together, the feedback students received higher grades after controlling for several factors that predict grades, such as LSAT scores, undergraduate GPA, gender, race, and country of birth. The increase in grades appears to be larger for students toward the bottom half of the distribution. The paper also attempts to control for variation in instructor ability using student evaluations of teacher clarity.
It’s an interesting paper, and part of a welcome trend toward assessing proposed pedagogical reform through quasi-experimental methods.
The interpretation of these results raises a number of questions which I hope the authors will address more thoroughly as they revise the paper and in future research.
For example, are the differences due to instructor effects rather than feedback effects? Students are randomly assigned to instructors who happen to voluntarily give pre-final exam feedback. These might be instructors who are more conscientious, dedicated, or skilled and who also happen to give pre-exam feedback. Requiring other instructors to give pre-exam feedback—or having the same instructors provide no pre-exam feedback—might not affect student performance.
Controlling for instructor ability based on teaching evaluations is not entirely convincing, even if students are ostensibly evaluating teacher clarity. There is not very strong evidence that teaching evaluations reflect how much students learn. An easier instructor who covers less substance might receive higher teaching evaluations across the board than a rigorous instructor who does more to prepare students for practice. Teaching evaluations might reflect friendliness, liveliness, attractiveness, or other factors that do not actually affect student learning outcomes but that have consumption value for students. Indeed, high-feedback professors might receive lower teaching evaluations for the same quality of teaching, because they make students work harder and because they provide negative feedback to some students, who may retaliate on evaluations.
These issues could be addressed in future research by asking the same instructor to teach two sections of the same class in different ways and measuring both long-term student outcomes and teaching evaluations.
Another question is: are students simply learning how to take law school exams? Or are they actually learning the material better in a way that will provide long-term benefits, either in bar passage rates or in job performance? At the moment, the data is not sufficient to know one way or the other.
A final question is how much providing individualized feedback will cost in faculty time, and whether the putative benefits justify the costs.
It’s a great start, and I look forward to more work from these authors, and from others, using quasi-experimental designs to investigate pedagogical variations.
April 22, 2016
Professor Paula Franzese of Seton Hall Law School is something of a patron saint of law students. Widely known for her upbeat energy, kindness, and tendency to break into song for the sake of helping students remember a particularly challenging point of law, Paula has literally helped hundreds of thousands of lawyers pass the bar exam through her videotaped Property lectures for BarBri.
Paula is such a gifted teacher that she won teacher of the year almost every year until Seton Hall implemented a rule to give others a chance: no professor can win teacher of the year more than two years in a row. Since the rule was implemented, Paula has won every other year. She’s also incredibly generous, leading seminars and workshops to help her colleagues improve their teaching.
Paula recently wrote a book encouraging law students to have a productive, upbeat, happy, and grateful outlook on life (A Short & Happy Guide to Being a Law School Student).
Paula’s well-intentioned book has rather bizarrely been attacked by scambloggers as “dehumanizing”, “vain”, “untrustworthy” and “insidious.” The scambloggers are not happy people, and reacted as if burned by Paula’s sunshine. They worry that Paula’s thesis implies that “their failure must be due to their unwillingness to think happy and thankful thoughts.”
Happiness and success tend to go together. Some people assume that success leads to happiness. But an increasing number of psychological studies suggest that happiness causes success. (here and here) Happiness often precedes and predicts success, and happiness appears to be strongly influenced by genetic factors.
Leaving aside the question of how much people can change their baseline level of happiness, being happier—or at least outwardly appearing to be happier—probably does contribute to success, and being unhappy probably is a professional and personal liability.
People like working with happy people. They don’t like working with people who are unhappy or unpleasant. This does not mean that people who are unhappy are to blame for their unhappiness, any more than people who are born with disabilities are to blame for being deaf or blind.
But it does raise serious questions about whether studies of law graduates’ levels of happiness are measuring causation or selection. We would not assume that differences between the height of law graduates and the rest of the population were caused by law school attendance, and we probably should not assume that law school affects happiness very much either.
March 24, 2016
CBS News reported as follows:
"Alaburda filed her lawsuit in 2011, seeking $125,000 in damages on claims of false advertising and misrepresentations by TJSL and an order preventing it from misleading students. Jurors awarded her nothing. . . .
Michael Sullivan, the attorney for the law school, said the jury verdict showed that TJSL does its best to provide accurate information on its graduates . . . Sullivan told the jury that Alaburda, 37, did not suffer any damages and that she went to TJSL because it was the only law school where she got accepted.
Once there, the plaintiff was awarded a $20,000 scholarship to help with tuition, making her total debt $32,000 after three years, Sullivan said. Alaburda decided not to work during her first two years of law school and within two months of graduating, had two job offers in the legal field, the attorney said.
Sullivan said the process of gathering employment data for graduates is "difficult" and a "challenge" for the school, but said there was "not a pattern of mistakes" by TJSL. . . .
Eventually, Alaburda got a $60,000 job offer from a San Bernardino law firm and took a $70,000-a-year job with a legal publisher . . ."
March 16, 2016
Statistician and data visualization expert Hans Rosling recently took the media to task for misleading readers and viewers with unrepresentative anecdotes and for ignoring contradictory data.
Rosling says "You can't trust the news outlets if you want to understand the world. You have to be educated and then research basic facts."
While journalists often depict the developing world as full of "wars, conflicts, chaos," Rosling says "That is wrong. [The press] is completely wrong. . . . You can choose to only show my shoe, which is very ugly, but that is only a small part of me. . . . News outlets only care about a small part but [they] call it the world."
Rosling complains that the slow but steady march of progress is not considered news.
Rosling is famous for his data visualizations, especially this video briefly illustrating 200 years of global progress toward health and prosperity. It's optimism for the data-driven set (and is a big hit in my business law classes).
March 05, 2016
That’s the question Frank McIntyre and I try to answer in Value of a Law Degree by College Major. Economics seems to be the “best” major for aspiring law students, with both high base earnings with a bachelor’s degree and a large boost to earnings from a law degree. History and philosophy/religion get a similarly large boost from a law degree but start from a lower undergraduate base and, among those with law degrees, typically end up earning substantially less than economics majors.
The abstract and a figure are below:
We estimate the increase in earnings from a law degree relative to a bachelor’s degree for graduates who majored in different fields in college. Students with humanities and social sciences majors comprise approximately 47 percent of law degree holders compared to 23 percent of terminal bachelor’s. Law degree earnings premiums are highest for humanities and social sciences majors and lowest for STEM majors. On the other hand, among those with law degrees, overall earnings are highest for STEM and business majors. This effect is fairly small at the low end of the earnings distribution, but quite large at the top end. The median annual law degree earnings premium ranges from approximately $29,000 for STEM majors to $45,000 for humanities majors.
These results raise an intriguing question: should law schools offer larger scholarships to those whose majors suggest they will likely benefit less from their law degrees? Conversely, should law schools charge more to those who will likely benefit the most?
Figure 3: ACS Mean Earnings for Professional Degree Holders (Narrow) by Selected Field of Study* (2014 USD Thousands)
* Includes degree fields with more than 700 professional degree holders in sample.
COMMENT FROM BRIAN LEITER: The lumping of philosophy majors together with religion invariably pulls down the performance of philosophy majors!
February 09, 2016
The latest unscientific fad among law school watchers is comparing job openings projections for lawyers from the Bureau of Labor Statistics* with the number of students expected to graduate from law school. Frank McIntyre and I tested this method of predicting earnings premiums--the financial benefits of a law degree--using all of the available historical projections from the BLS going back decades. This method of prediction does not perform any better than random chance.** Labor economists--including those working at the BLS--have explicitly stated that BLS projections should not be used to try to value particular courses of study. Instead, higher education should be valued based on earnings premiums.
Bloggers who report changes in BLS projections and compare projected job openings to the number of students entering law school might as well advise prospective law students to make important life decisions by flipping a coin.
Many law graduates won't practice law. Many engineering graduates won't become engineers. Many students in every field end up working jobs that are not directly related to what they studied. They still typically benefit financially from their degrees by using them in other occupations where additional education boosts earnings and likelihood of employment.
And if one's goal really is to practice law even if practicing law is not more lucrative than other opportunities opened by a law degree, then studying law may not be a guarantee, but it still dramatically improves the odds.
* BLS job opening projections--which are essentially worthless as predictors for higher education--should not be confused with BLS occupational employment statistics, which provide useful data about earnings and employment in many occupations, including for lawyers.
** There isn’t even strong evidence that changes in the ratio between BLS projected lawyer job openings and law class size predict changes in the percent of law graduates who will practice law, although the estimates are too noisy to be definitive. Historically, the ratio of BLS projected openings to law graduates (or first year enrollments 3 years prior) has systematically under-predicted by a wide margin the proportion of law graduates practicing law shortly after graduation, although it is clear that a large minority of law graduates do not practice law.
February 9, 2016 | Guest Blogger: Michael Simkovic
February 02, 2016
Smaller or Larger Law Class Sizes Don’t Predict Changes in Financial Benefits of Law School (Michael Simkovic)
One of the most surprising and controversial findings from Timing Law School was that changes in law school graduating class size do not predict changes in the boost to earnings from a law degree.* Many law professors, administrators, and critics believe that shrinking the supply of law graduates must surely improve their outcomes, because if supply goes down, then price—that is, earnings of law graduates—should go up.
In a new version of Timing Law School, Frank McIntyre and I explore our counterintuitive results more thoroughly. (The new analysis and discussion appear primarily in Part III.C. “Interpreting zero correlation for cohort size and earnings premium” on page 18-22 of the Feb. 1, 2016 draft and in Table 10 on the final page).
Our results of no relationship between class size and earnings premiums were robust to many alternative definitions of cohort size that incorporated changes in the number of law graduates over several years. This raises questions about whether our findings are merely predictive, or should be given a causal interpretation.
We considered several interpretations that could reconcile our results with a supply and demand model and with the data. The most plausible interpretation seemed to be that when law class sizes change, law graduates switch between practicing law and other employment opportunities that are equally financially rewarding. While changes in the number of law graduates might have an impact on the market for entry-level lawyers, such changes are much less likely to have a discernible impact on the much larger market for highly educated labor.
Although law graduates who practice law on average earn more than those who do not, at the margin, those who switch between practicing law and other options seem to have law and non-law opportunities that are similarly lucrative. We found that the proportion of recent law graduates who practice law does decline as class size increases, but earnings premiums remain stable. This is consistent with the broader literature on underemployment, and supports the view of law school as a flexible degree that provides benefits that extend to many occupations. (See here and here).
A related explanation is that relatively recent law school graduates may be reasonably good substitutes for each other for several years, blunting the impact of changes in class size on earnings.
Interpretations that depend on law students and law schools perfectly adjusting class size in anticipation of demand for law graduates or future unemployment seem implausible given the unrealistic degree of foresight this would require. Offsetting changes in the quality of students entering law school—an explanation we proposed in an earlier version of the paper—seems able to explain at most a very small supply effect. Although credentials of entering classes appear to decline with class sizes, these changes in credentials are relatively small even amid dramatic changes in class size, and probably do not predict very large changes in earnings. Imprecision in our estimates is another possibility, although for graduates with more than a few years of experience, our estimates are fairly precise.
Even if there are effects of law class size on law earnings premiums, they are probably not very large and not very long lasting.
* The finding is consistent with mixed results for cohort size in other econometric studies of cohort effects, but nevertheless was contrary to many readers’ intuitions.
December 14, 2015
In the era of Google Maps, instant language translation, and digital music libraries, law students still spend countless hours flipping pages to find the right subclause or definition in a statute. This process can and should be automated.
Computerized calculations have liberated STEM students from tedious, repetitive tasks so that they can focus on the more intellectually stimulating and creative aspects of math, science, and engineering. Word processing software has freed us all from applying whiteout and waiting for it to dry, and from manually retyping manuscripts to correct a few errors. This has enabled us to focus on our ideas rather than the mechanics of fixing them permanently on paper.
Law is an inherently conservative field, focused on precedent, tradition, and risk avoidance. But when the case for change is compelling, we are prepared to try new tools.
I’ve been thinking about the problems of statutory interpretation for years and how automation could streamline the process. I’m very excited to announce a new electronic statutory supplement, LawEdge. (Full disclosure: I helped develop it).
LawEdge aims to do for statutory interpretation what the calculator did for mathematics.
The U.S. Code includes thousands of defined terms. To understand a provision, a reader must know the meaning of every defined term it contains.
Unfortunately, defined terms are not always clearly labeled. Even when they are, understanding one provision may require flipping back and forth among several other locations in the code. This process can be slow and cumbersome with paper statutes. Even electronic statutes often will not take users to the precise location in the code where a definition appears, but will instead take readers to the section containing the definition, forcing readers to search for the definition within it.
LawEdge makes working with defined terms simple. Defined terms are clearly labeled, and clicking on one opens a popup window showing its definition. Each defined term is also hyperlinked to its definition.
Definitions are context-specific and do not apply to all sections of the code. For example, the definition of “property” in Section 317(a) of the Internal Revenue Code does not apply to Section 351 of the same title. LawEdge recognizes context and links definitions appropriately.
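The scoping behavior described above can be sketched as a lookup table in which each definition records the span of sections it governs, so a lookup succeeds only when the citing section falls inside that span. The data layout and the scope boundaries below are illustrative assumptions, not LawEdge’s actual implementation:

```python
# Hypothetical sketch of context-aware definition lookup. Each entry records
# where a term is defined and the (illustrative) range of sections its
# definition governs.
definitions = [
    # "property" as defined in IRC s. 317(a), scoped here (as an assumption)
    # to the part of the Code spanning ss. 301-318.
    {"term": "property", "defined_in": "317(a)", "scope": (301, 318)},
]

def find_definition(term, section):
    """Return where `term` is defined for a reader at `section`, else None."""
    for d in definitions:
        low, high = d["scope"]
        if d["term"] == term and low <= section <= high:
            return d["defined_in"]
    return None

print(find_definition("property", 310))  # within scope -> '317(a)'
print(find_definition("property", 351))  # outside scope -> None
```

Under this design, linking a definition is just a scope check at render time, which is why the same word can be hyperlinked in one section and left plain in another.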
LawEdge is easy to navigate. For example, suppose that you wish to read § 21(b)(2)(B). With a paper statutory supplement, you could flip to section 21, then look for subsection (b), then read down to paragraph (2) and finally find subparagraph (B). The entire process might take 30 seconds, and along the way you might accidentally look at the wrong provision. With LawEdge, this process is nearly instantaneous and error free. You would simply type s21b2b in the search bar. This feature works all the way down to the subclause level.
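A compact query like s21b2b can be resolved by mapping alternating runs of digits and letters onto the statutory hierarchy in order. The sketch below is a guess at how such a parser might work, not LawEdge’s actual code; the function name and level labels are my own:

```python
import re

def parse_cite(query):
    """Parse a compact citation like 's21b2b' into named structural parts.

    Assumes the usual hierarchy: section, then subsection, paragraph,
    subparagraph, clause, subclause, with each level an alternating
    run of letters or digits.
    """
    m = re.fullmatch(r"s?(\d+)([a-z0-9]*)", query.lower())
    if not m:
        raise ValueError("unrecognized citation: " + query)
    levels = ["subsection", "paragraph", "subparagraph", "clause", "subclause"]
    parts = {"section": m.group(1)}
    # Split the tail into runs of digits or letters and assign each run
    # to the next level down in the hierarchy.
    for level, run in zip(levels, re.findall(r"\d+|[a-z]+", m.group(2))):
        parts[level] = run
    return parts

print(parse_cite("s21b2b"))
# {'section': '21', 'subsection': 'b', 'paragraph': '2', 'subparagraph': 'b'}
```

Because the query names the target exactly, the viewer can jump straight to the subparagraph rather than scrolling, which is what makes the lookup nearly instantaneous.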
Browsing a statute is also easier and more intuitive. Structural components are color coded to be more recognizable.
The underlying technology is algorithmic rather than manual, which makes the supplement easy to update and support as the U.S. Code changes.
LawEdge has all of the benefits of paper—notes, highlights, bookmarks, offline access—and many advantages only available electronically. It can be used on exams with the latest version of ExamSoft, which offers an option to block internet access without blocking the hard drive.
If you’re interested in trying it for your class, please feel free to contact me for an evaluation copy.