December 10, 2019

On filling out the USNEWS.com "peer assessment" (i.e., academic reputation) survey

A young legal scholar elsewhere writes:

I'm my faculty's most recently tenured member, so I got a US News peer assessment survey. Or, I should say, peer "assessment," since it doesn't actually ask for any assessment of anything. I knew that the methodology was shoddy for these things, but I'm still kind of shocked at what this is: just a list of all the law schools and a request to rate them on a 5 point scale. No faculty or publications or any information about them. It's just a test of what schools I happen to have heard good things about lately.


So, given that this survey cannot produce any credible measure of quality or anything else (except of who I happen to have heard good things about lately), what should I do? Should I simply ignore this nonsense? Or is there some penalty (to me? to others?) if people who recognize this as nonsense refuse to participate? Should I rank everyone outstanding? Everyone except the top twenty schools?

A few observations and suggestions: 

(1)  Any recently tenured faculty member (and that certainly goes for this young scholar) will, in fact, know a fair bit about the quality of scholarship (at least in his or her fields, and often cognate fields) at anywhere from a dozen to several dozen law schools.   Evaluate those schools, being either generous or stingy with the scores as you see fit:  e.g., give just five or six schools a "5," or give two dozen schools a "5."  In general, I think evaluators should be generous, especially since higher scores will have more influence on the overall results:  avoid 1s and 2s (unless you really are confident in the weakness of a particular school), and there's no harm in giving lots of 4s and 3s.  (In the past, USNEWS.com dropped a percentage of the highest and lowest scores as a check on strategic voting; I'm not sure whether it still does.)   Most importantly, when you "don't know" much about a school, choose "don't know."  "Don't know" does not count against (or for) a school.

(2)  The academic reputation survey is, in fact, one of the few "reality checks" in the whole USNEWS.com charade:  without it, the rankings would be based on nothing more than wealth and the extent to which schools "massage" the self-reported data, like employment statistics and expenditures.  Unfortunately, the academic reputation survey increasingly tracks the prior year's overall rank in USNEWS.com, which impedes its utility as a reality check.  (This is one reason why adding citation data would, if done right, be salutary.)   But evaluators can counteract that by (1) actually thinking about the quality of scholarship produced by a school's faculty (not the school's name!), and (2) looking at other data as a check on their impressions.

Here's a suggestion:  everyone should give the University of San Diego at least a "4" this year in the peer assessment survey, since its overall USNEWS.com rank is preposterously low relative to the strength of the faculty (which is made up of folks who have had tenured positions or offers at lots of excellent schools, including Berkeley, Northwestern, Cornell, Minnesota, George Washington, Boston University, and elsewhere).  If this works, in future years I'll nominate more schools that deserve a boost for their faculty excellence, even as they are punished by USNEWS.com on other metrics.


December 10, 2019 in Professional Advice, Rankings | Permalink

December 04, 2019

Jonathan Turley (George Washington) is not "the second-most cited law professor in the country"...

as The New York Times misleadingly reports today; indeed, he's not even one of the ten most-cited members of the GW law faculty.   On Professor Turley's website (the source for the NYT claim), the context was clearer:  in Judge Posner's 2003 book Public Intellectuals, Turley was the second-most cited law professor due almost entirely to references to him in the media.  On the other hand, he is poised soon to displace Alan Dershowitz as the "most-cited law professor by Donald Trump"!

UPDATE:  This is not atypical of the reception accorded Professor Turley's performance today.


December 4, 2019 in Faculty News, Rankings | Permalink

December 02, 2019

Are "transformative gifts" really transformative?

Law.com has a list of naming gifts to law schools over the last few decades, the majority of them coming in the last twenty years.  Here are the biggest gifts, by year:

    1998:    $115 million to the University of Arizona

    2001:    $30 million to Ohio State University

    2001:    $30 million to the University of Utah

    2008:    $35 million to Indiana University, Bloomington

    2011:    $30 million to the University of Maryland

    2013:    $50 million to Chapman University

    2014:    $50 million to Drexel University

    2015:    $100 million to Northwestern University

    2016:    $30 million to George Mason University

    2019:    $50 million to Pepperdine University

    2019:    $125 million to the University of Pennsylvania

For some of these gifts, it's too soon to say what their effects will be, and some of them served more, one suspects, to help newer schools stay afloat and continue to grow during tough times (e.g., Chapman, Drexel).  On the other hand, George Mason's gift has already resulted in a lot more hiring by that school.   But Ohio State, Utah, and Indiana all seem to be roughly where they were at the time of the gifts:  strong state flagships, not much better off, and certainly not worse.   The same goes for the most remarkable gift of them all, the one to Arizona, much lauded at the time.   I gather a good chunk of that gift went to bricks and mortar, rather than to expanding the size of a fairly small faculty.  Northwestern's more recent major gift was nonetheless followed a few years later by belt-tightening.

It remains to be seen whether any of these gifts will really change the strength and status of the schools that received them.  Given how recent many of the largest gifts are, we'll probably have a clearer idea of their impact in ten years.


December 2, 2019 in Legal Profession, Of Academic Interest, Rankings | Permalink

November 19, 2019

"Penn Carey Law"

In the wake of the outcry from students and alumni, Dean Ruger at Penn has sent them a letter announcing that "the Law School will continue to use Penn Law as our short-form name until the start of the 2022-23 academic year, after which we will use Penn Carey Law."  A reasonable compromise.


November 19, 2019 in Of Academic Interest, Professional Advice, Rankings | Permalink

November 13, 2019

Blast from the past: law school rankings in the 1970s

Here.


November 13, 2019 in Rankings | Permalink

November 11, 2019

Some students and alumni of "Penn Law" are not happy about becoming students and alumni of "Carey Law"

They have a point.  The University of Maryland already has a "Carey Law School," which makes losing the connection to Penn a particularly bad idea.   Why not "Penn Law-Carey"?


November 11, 2019 in Legal Profession, Of Academic Interest, Professional Advice, Rankings | Permalink

November 08, 2019

$125 million gift to Penn Law...

November 07, 2019

Measuring law faculty scholarly impact by citations

Professor Gregory Sisk (St. Thomas) comments.

UPDATE:  And more thoughts from my colleagues Adam Chilton and Jonathan Masur.


November 7, 2019 in Rankings | Permalink

October 29, 2019

Society for Empirical Legal Studies (SELS) objects to the use of HeinOnline citation data to measure "scholarly impact"

Their letter to USNEWS.com editor Robert Morse is here.  I agree with a lot of this; herewith a few comments, sometimes expanding on the points made in the SELS letter, sometimes disagreeing.

The SELS letter states that while "no ranking system is perfect, one strength of the existing ranking approach--as U.S. News officials themselves have argued--is that it provides several accurate metrics for consumers to evaluate for themselves."   This did make me laugh, although I understand the good intentions behind the statement.  In fact, as we all know, USNEWS.com has regularly provided consumers with misinformation, since it never audits the self-reported data schools submit, whether about expenditures or job placement.  

The letter continues: 

Unlike other indicators like graduation rate and bar-passage rate, however, HeinOnline’s current citation system does not appear to accurately capture what it represents to [measure]. HeinOnline’s metric would purportedly measure a faculty member’s “scholarly impact.” But the method suffers from a variety of systemic measurement flaws so significant that they undermine its validity as a measure of scholarly impact—and with it, the validity of any metric incorporating it. Making the HeinOnline data part of the Best Law Schools ranking would therefore deviate from your longstanding practice of offering readers accurate information.

A small point:  while the U.S. News college rankings incorporate graduation rates, the law school rankings do not.  The main concern of the SELS letter is that USNEWS.com may add the Hein impact data to the overall ranking formula.   I hereby predict with confidence that USNEWS.com will do exactly that within the next two years.  The trend in all its professional school rankings in the last few years has been to try to add "objective" indicia, and citation data is the best candidate in the case of law schools.

Of course, the Hein data has exactly the problems that the SELS letter notes (and we have discussed previously):   books and book chapters are invisible, and partly because of that, and partly because Hein is a database only of law-related journals, interdisciplinary scholarship will get less weight in the "scholarly impact" measure.   To be sure, it might reasonably be said that "scholarly impact" for a law school should be reflected in law publications, not, e.g., in impact in philosophy or economics journals.  (The two examples given--a highly cited article co-authored by Lucian Bebchuk [Harvard] in a non-law journal and the highly cited historian Samuel Moyn [Yale], whose citations derive primarily from books--are apt, but probably not typical.  Bebchuk will surely do extremely well by a Hein-only measure even if that one article is excluded, while Moyn won't; but does anyone think that would have factored into Yale's hiring decision?)

But the real question about adding the Hein data is a comparative one.  Right now the USNEWS.com ranking of law schools measures [sic] the scholarly quality of faculties through an academic reputation survey that has become simply an echo chamber:  if a school's overall USNEWS.com rank increases, the reputation score increases, and vice versa.   The Hein data--or any scholarly impact data--would make the measurement of scholarly quality independent of the reputation echo chamber.   (In USNEWS.com, Harvard, Stanford, and Yale typically tie at #1 in academic reputation, while Chicago, Columbia, and sometimes NYU come in at #4; contrast that with what scholarly impact data reveals.  The differences with impact data become even more dramatic further down the academic reputation hierarchy.)

So if the choice is between the academic reputation survey with no measure of scholarly impact, on the one hand, and adding the Hein impact data, on the other, I'd vote for the latter.  (I agree with the SELS letter that Google Scholar would be a better metric, but USNEWS.com policy is not to do anything that requires real work on its part, and using Google Scholar would be time- and labor-intensive.)



October 29, 2019 in Of Academic Interest, Rankings | Permalink | Comments (0)

October 15, 2019

Blast from the past: when "The National Jurist" went off the rails...

...back in 2013.  I gather they're still around, but I've not read them since.


October 15, 2019 in Of Academic Interest, Rankings | Permalink