Tuesday, February 12, 2013

National Jurist in Competition to Displace Thomas Cooley Rankings as Biggest Joke in Legal Academia

Years ago, when Texas had the misfortune to be #1 in the Cooley rankings, the law school was asked by the public affairs department whether we wanted to produce a press release; the immediate answer was, "No, don't mention it, it's an embarrassment to be #1 in the Cooley rankings."  National Jurist has now replicated the Cooley feat, with a somewhat more baroque methodology that can only make Bob Morse and the U.S. News editors smile, since it makes their approach look like rocket science.  Like U.S. News, the National Jurist uses a multitude of different factors, all inexplicably weighted (5% for bar pass rate and diversity, but 12.5% for the number of Super Lawyer alumni!), some of which are independently interesting but which, aggregated, make no sense.

But the coup de grace is that 20% of the overall score is based on Rate My Professors, the notorious on-line rating site used mainly by undergraduates, and hardly at all by law students.  (In a remarkable display of editorial good judgment, Jack Crittenden, the editor, decided not to incorporate the "hotness" score, however.)

What kind of sample was this?  In the case of the University of Chicago Law listings on Rate My Professors, it consisted of 54 responses total for ten faculty over a period of six years--but only 23 responses for actual full-time law school faculty!  Indeed, Rate My Professors lists one person, Smigelskis, who has never even taught in the Law School here (he accounts for almost 20% of the responses, and also had the lowest scores).  (Actual law faculty, like me, didn't even appear, because Rate My Professors had me listed in the wrong unit!)  I've gotten e-mails from colleagues elsewhere reporting similar anomalies.

In short, 20% of the overall score is fraudulent on its face.  And it's that 20% that explains all the variance.  Stanford, Harvard, Virginia, Chicago, Michigan, and Yale all get A and A+ scores in all the employment categories (NLJ 200 partners, Super Lawyers, etc.); what differentiates them is the fraudulent Rate My Professors data.  This means the National Jurist has one advantage over the Cooley rankings:  its absurdity isn't quite as obvious.
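To make the arithmetic concrete: if two schools are effectively tied on the other 80% of the weight, the entire gap in their overall scores is just 20% of the gap in their Rate My Professors inputs.  A minimal sketch, assuming a simple linear weighting (National Jurist does not publish its exact normalization, so the scale here is illustrative; the 3.97 and 2.83 inputs are the Alabama and Chicago RMP figures discussed below):

```python
# Illustrative only: assumes a simple linear weighted sum, which the
# published methodology suggests but does not fully specify.
RMP_WEIGHT = 0.20  # 20% of the overall score, per the methodology

def overall_score(other_factors: float, rmp: float) -> float:
    """Weighted total: 80% everything else, 20% Rate My Professors."""
    return (1 - RMP_WEIGHT) * other_factors + RMP_WEIGHT * rmp

# Two hypothetical schools that tie (A/A+) on every employment category
# but differ on the Rate My Professors input:
tied = 4.0                             # identical score on the other 80%
high_rmp = overall_score(tied, 3.97)   # e.g., Alabama's RMP number
low_rmp = overall_score(tied, 2.83)    # e.g., Chicago's RMP number

# The entire gap between the two schools is 20% of the RMP gap:
print(round(high_rmp - low_rmp, 3))           # 0.228
print(round(RMP_WEIGHT * (3.97 - 2.83), 3))   # 0.228 -- the same number
```

So when the other factors are tied, any noise in the Rate My Professors input passes straight through to the final ranking.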

I've talked to Jack Crittenden on occasion, including, in the past, about how to improve on what U.S. News tries to do.  He never mentioned that he was thinking of incorporating bogus and inaccurate pseudo-data in order to rank law schools.

I hope Mr. Crittenden will have the good sense to issue a retraction and an apology for putting this misinformation into circulation.  It's the second time in recent months that the National Jurist has put out misleading rankings.  Maybe this signals desperation; I don't know.

If readers catch any law schools publicizing their National Jurist ranking, please let me know.

UPDATE:  Shame on the University of Oklahoma and Texas Tech!  (Thanks to a reader who asked for anonymity.)

ANOTHER:  A colleague elsewhere writes:  "The methodology says that the RateMyProfessors score was only used if there were more than 40 ratings. Wash U is listed as “NA” in that column, but a perusal of that site shows 63 professors in the “law” department with over 200 ratings – far more than most schools. Haven’t checked to see if their number is accurate…."

ANOTHER READER points out that the National Jurist article falsely states that "Rate My Professors" has been found valid in "scientific" studies.  No citations are given, for good reason.  One study, at the University of Maine, focusing on undergraduate teaching, found a strong correlation only between the overall course evaluation and the easiness or difficulty of the course (not teaching quality), and found that the correlations were strongest for highly rated faculty but fell apart thereafter.  There is no evidence--as in none--that Rate My Professors has any validity for measuring teaching at law schools, or for measuring the distribution of teaching quality at an institution of higher education.

SHAME WATCH CONTINUES:  North Carolina joins the hall of shame. (UPDATE:  UNC seems to have removed the announcement--kudos to them for their good sense!)

MORE 'RATE MY PROFESSORS' VALIDITY INDICATORS:  Steven Freedman (Kansas) writes:  "If you search for Mickey Mouse on RateMyProfessor.com, you will find out he’s a Chemical Engineering professor at Ohio State.  I googled to make sure there wasn’t an actual Professor Mouse at Ohio State, but there seems to be no record of that."

ANOTHER:  The "Above the Law" blog calls the National Jurist ranking (not unfairly) "pure ridiculousness."

AND MORE ERRORS:  A colleague who examined this with some care writes:

I did some data analysis to try to figure out how they calculated the number they used for RMP. I did the top 5 schools (three of which didn’t have their numbers used) and the lowest score on the whole thing, which happens to be your school. Here’s the takeaway:

There are two possible numbers they could be using – the overall average, and the average of the Helpfulness and Clarity scores (which is what they say they did in the methodology). In all cases I went to the school page, selected the “law” department and included anyone with at least one rating.

 

School        NJ's score   Score used?   Overall avg   Help+Clar avg
Stanford      3.74         Y             3.7311        3.7177
Virginia      3.47         N             3.3125        3.3187
Berkeley      3.93         N             3.9678        3.9464
Vanderbilt    3.66         N             3.6613        3.6677
Alabama       3.97         Y             3.9772        3.9646
Chicago       2.83         Y             3.6793        3.6731

As you can see, in no case does my calculation exactly match the NJ number, but, at least for four of the top five schools, it's within a few hundredths of a point, and my two numbers are so close as to be statistically identical anyway. So it's fair to say that I have the method pretty close to correct. But note that the discrepancy inflated Virginia's score from 3.32 to 3.47, and deflated Chicago's from 3.67 to 2.83. (This is irrelevant for Virginia, of course, because their RMP score was not used.)

In addition, I looked at the three schools on this list who had their numbers used to see how many of the ratings were for actual law school professors. Stanford fares best – all the ratings are for current or former faculty (and only three are former, representing 18 of the 93 ratings). But at Alabama, only 12 of the 20 ever served on the law school faculty (taking the most generous view of “on the law faculty” possible) and they account for only 98 of the 219 ratings. Without the inclusion of the non-law faculty (most of whom are current or former members of the Economics, Finance, and Legal Studies Department), Alabama’s score goes from 3.96 to 3.60. (Chicago’s would go from a 3.67 to a 4.03 without the completely non-law professor you noted earlier – the others have all at least taught law courses.)

So basically, not only did they use bad data, but they used it badly, and they appear to have miscalculated it in multiple cases (2 of the 6 I looked at -- one egregiously).
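For anyone who wants to replicate this colleague's check, here is a minimal sketch of the two candidate school-level numbers and the law-faculty-only correction.  The per-professor data structure is hypothetical (the figures would have to be copied by hand from the site's "law" department pages), and weighting by rating counts is an assumption consistent with the letter's figures (98 of 219 Alabama ratings from law faculty), not a documented part of National Jurist's method:

```python
from dataclasses import dataclass

@dataclass
class Prof:
    overall: float        # professor's overall RMP average
    helpfulness: float    # professor's Helpfulness average
    clarity: float        # professor's Clarity average
    n_ratings: int        # number of ratings the professor has received
    on_law_faculty: bool  # ever served on the law faculty?

def school_numbers(profs: list[Prof]) -> tuple[float, float]:
    """The two candidate school-level numbers: the ratings-weighted overall
    average, and the ratings-weighted mean of the Helpfulness and Clarity
    scores (the latter is what the methodology claims to use)."""
    total = sum(p.n_ratings for p in profs)
    overall_avg = sum(p.overall * p.n_ratings for p in profs) / total
    help_clar_avg = sum((p.helpfulness + p.clarity) / 2 * p.n_ratings
                        for p in profs) / total
    return overall_avg, help_clar_avg

def law_faculty_only(profs: list[Prof]) -> list[Prof]:
    """The correction described in the letter: drop anyone who never served
    on the law faculty (the step that moves Alabama from ~3.96 to ~3.60)."""
    return [p for p in profs if p.on_law_faculty]

# Usage sketch with made-up numbers (not real RMP data):
profs = [
    Prof(4.5, 4.7, 4.4, 30, True),   # well-rated law professor
    Prof(2.0, 2.1, 1.9, 25, False),  # non-law professor dragging the mean down
]
print(school_numbers(profs))                    # non-law ratings included
print(school_numbers(law_faculty_only(profs)))  # law faculty only
```

On this kind of ratings-weighted calculation, a single heavily rated non-law professor can move a school's number substantially, which is exactly the pattern the letter reports for Alabama and Chicago.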

MORE SHAME:  Houston.  But why tout being ranked 45th?  Schools need to pause and recognize that if they legitimate this nonsense by broadcasting it, it will come back to bite them the next time around, given that the National Jurist ranking seems to make even less sense than U.S. News's.  UPDATE:  Houston has also removed it--good for them!

ANOTHER UPDATE (FEB. 15):  The National Jurist editor has admitted to one of our assistant deans that if they dropped the fraudulent Rate My Professors data (which they concede seems problematic), Chicago would be in the top five overall.  One wonders how many other schools were penalized in this way for no reason.  In any case, this simply confirms that the variance is all due to the fraudulent data input, and that the whole thing is "pure ridiculousness."

THE LATEST:  Here.
